Show simple item record

dc.contributor.authorBlumentrath, Stefan
dc.contributor.authorPuliti, Stefano
dc.contributor.authorMolværsmyr, Sindre
dc.contributor.authorHamre, Øyvind
dc.date.accessioned2022-05-03T11:59:53Z
dc.date.available2022-05-03T11:59:53Z
dc.date.issued2022
dc.identifier.isbn978-82-426-4925-6
dc.identifier.issn1504-3312
dc.identifier.urihttps://hdl.handle.net/11250/2993915
dc.description.abstractBlumentrath, S., Puliti, S., Molværsmyr, S. & Hamre, Ø. 2022. Wheel rut mapping with high resolution ortho-imagery – a comparison of data and methods. NINA Report 2137. Norwegian Institute for Nature Research. The number of registered all-terrain vehicles (ATVs) in Norway has increased steadily over the last decade. Driving these vehicles off-road requires special permission because of the potentially severe damage it can cause to nature. The number of registered vehicles, however, suggests that illegal driving occurs. To better monitor this issue, efficient mapping techniques are required to cover the large areas that may be affected. Remote sensing has previously been explored as an efficient technique for this purpose, and recent technological advances, such as the availability of unmanned aerial vehicles (UAVs) and deep learning for image analysis, may further increase its potential. The aim of this project has therefore been to develop a coherent workflow for detecting wheel ruts in drone- and/or plane-based aerial imagery that can serve as a starting point for practical monitoring tools. Method development was conducted for two case study sites, one in northern and one in southern Norway, where drone imagery from autumn 2020 was available. The developed workflow covers all relevant steps of image analysis, from preparation of input data to post-processing of modelling results, and was made publicly available as a set of Python scripts. Results show that deep learning performed better than more traditional image analysis techniques, and the initial deep-learning models developed in this project produce fair to good results for both plane- and drone-based imagery at both study sites. Models utilizing drone data perform slightly better than models based on aerial images with regard to correctly capturing wheel ruts. Models based on drone imagery capture more detail but currently also show a larger degree of noise and scattered false-positive classifications. Models from aerial images perform best in open areas, while they struggle more in forested areas. The developed post-processing routine improves the quality of the final products and can produce condensed and more usable representations of the results. However, the classification of the results during post-processing with regard to the severity of damage, especially to soil and terrain, should undergo systematic evaluation and be re-adjusted if needed. Together with the modelling results, re-processing of the raw drone images illustrates that the main benefit of using drone imagery is timely data acquisition, both in terms of time of year and with regard to, for example, local urgency to monitor an area in more detail. Drone data can thus be seen as an on-demand technology that complements the aerial images taken on a regular basis for the Norwegian orthophoto program every 5 to 10 years. Other potential benefits of drone imagery, such as the possibility to capture photogrammetric terrain models or multispectral imagery, currently do not in themselves seem to justify the extra effort needed to acquire drone imagery, because their contribution to modelling accuracy may even be negative due to data quality issues.
In particular, monitoring the soil impact of wheel ruts by means of repeated collection of photogrammetric terrain models seems hard, if not impossible, to conduct at the extent and scale used in this project, with a study area of ~6 km². Due to the limited range of situations covered by the current models, further training across different seasons, light conditions, vegetation types, and so on would be necessary to make the models more transferable and thus applicable in practical management. Since the models for drone and aerial imagery show comparable performance, images from the Norwegian orthophoto program are a natural starting point for increasing the amount of training data and thereby the number of conditions the models are trained on. In that context, a more systematic evaluation of the effect of image resolution should be conducted, and other available data sources, such as ultra-high-resolution satellite images (with up to 30 cm resolution), may be considered. It should also be investigated whether it is feasible and adequate from an end-user point of view to consolidate the deep-learning models, which currently differ between drone and aerial imagery, into one coherent model, in order to reduce maintenance effort and at the same time increase the amount of both training data and imagery the model can be trained on. To that end, recent methods for limiting the required amount of test and training data, such as few-shot learning (see e.g. Wang et al. 2020), should also be explored in order to increase practical applicability in a monitoring context. Finally, even if the developed workflow is already usable, technical improvements can further increase its practical applicability.en_US
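The abstract describes a deep-learning segmentation workflow followed by post-processing of the modelling results into more usable map products. As a rough illustration only, the hypothetical Python sketch below shows what one tile-wise step of such a pipeline could look like: running a pre-trained segmentation model on an orthophoto tile and vectorizing the resulting wheel-rut mask into georeferenced polygons. The function name, model interface, and threshold are assumptions made for illustration and are not taken from the report's published scripts.

import rasterio
import torch
from rasterio import features

def predict_wheel_ruts(ortho_path, model, threshold=0.5):
    """Segment wheel ruts in one orthophoto tile and return georeferenced polygons.
    Hypothetical example: 'model' is assumed to be a pre-trained PyTorch
    segmentation network returning a single-channel logit map."""
    with rasterio.open(ortho_path) as src:
        image = src.read().astype("float32") / 255.0   # shape: (bands, rows, cols)
        transform = src.transform

    # Add a batch dimension and run inference without tracking gradients.
    with torch.no_grad():
        logits = model(torch.from_numpy(image).unsqueeze(0))
        prob = torch.sigmoid(logits).squeeze().numpy()

    # Post-processing: threshold the probability map and convert the binary
    # mask into polygons in the tile's coordinate reference system.
    mask = (prob > threshold).astype("uint8")
    polygons = [geom for geom, value in features.shapes(mask, transform=transform) if value == 1]
    return polygons

In the report's actual workflow, this kind of step would be preceded by data preparation and tiling and followed by further post-processing, such as classifying detected ruts by severity, which is not shown here.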
dc.language.isoengen_US
dc.publisherNorsk institutt for naturforskning (NINA)en_US
dc.relation.ispartofseriesNINA Report;2137
dc.subjectUAVen_US
dc.subjectdroneen_US
dc.subjectortho photoen_US
dc.subjectimage analysisen_US
dc.subjectdeep learningen_US
dc.subjectwheel ruten_US
dc.subjectmonitoringen_US
dc.subjectGISen_US
dc.subjectmethoden_US
dc.subjectortofotoen_US
dc.subjectbildeanalyseen_US
dc.subjectdyp læringen_US
dc.subjectkjøresporen_US
dc.subjectovervåkningen_US
dc.subjectmetodeutviklingen_US
dc.titleWheel rut mapping with high resolution ortho-imagery – a comparison of data and methodsen_US
dc.typeResearch reporten_US
dc.rights.holder© Norwegian Institute for Nature Research. The publication may be freely cited where the source is acknowledgeden_US
dc.source.pagenumber54en_US


Files in this item


This item appears in the following Collection(s)

  • NINA Rapport/NINA Report [2297]
    NINA's most common form of reporting to commissioning clients after completed research, monitoring, or assessment work.
