Invasive species detection from RGB aerial imagery: investigating links between plant characteristics and transfer learning success

dc.contributor.advisorPerroy, Ryan L.
dc.contributor.authorTa, Erica
dc.contributor.departmentTropical Conservation Biology & Environmental Science
dc.date.accessioned2022-02-01T18:44:05Z
dc.date.available2022-02-01T18:44:05Z
dc.date.issued2021-12
dc.description.degreeM.S.
dc.identifier.urihttp://hdl.handle.net/10790/6868
dc.subjectConservation biology
dc.subjectArtificial intelligence
dc.subjectRemote sensing
dc.subjectdeep learning
dc.subjectHawaiʻi
dc.subjectinvasive species
dc.subjectobject detection
dc.subjecttransfer learning
dc.titleInvasive species detection from RGB aerial imagery: investigating links between plant characteristics and transfer learning success
dc.typeThesis
dcterms.abstractAdvancements in remote sensing techniques and deep learning applications like object detection have improved invasive species monitoring systems. Deep learning typically uses large training sets, up to millions of images, to consistently recognize targets, but generating these training sets may not be practical for incipient invasive species targets of interest. When large datasets are unavailable for training, one approach is to use transfer learning to overcome data limitations. The process applies knowledge learned from a source network (pre-trained on a task for which large datasets are available) to a target problem that has limited data samples. Here I examine how object detection performance for the following invasive species of interest in Hawaiʻi differs with the inclusion of cross-species transfer learning: miconia (Miconia calvescens), Guinea grass (Megathyrsus maximus), and four symptomatic visible classes of Rapid ʻŌhiʻa Death (ROD): red, brown, fine white, and skeleton. I also measured visual plant features of contrast, shape, size, and texture to understand how different plant morphologies provide easier or more challenging scenarios for plant object detection using aerial visible imagery. I found that 9 out of 30 transfer learning instances had significantly higher mean average precision (mAP) scores than instances without transfer learning (p < 0.00167; Bonferroni-corrected α = 0.05/30 = 0.00167). Transfer learning was most effective between the red, brown, fine white, and skeleton ROD classes and least effective between miconia and Guinea grass. The feature measurement of contrast was significantly correlated with source model mAP (R = 0.82, p = 0.045), texture was strongly but not significantly correlated (R = 0.77, p = 0.073), size was moderately correlated (R = 0.54, p = 0.27), and circularity was weakly correlated (R = -0.096, p = 0.86).
My results indicate advantages for plant detection when transfer learning pairs similar source and target candidates, and when source candidates' image data exhibit higher contrast and texture measurements. Overall, this study may inform future workflows for detecting plants from aerial imagery by demonstrating how available data can be best leveraged or repurposed through transfer learning to detect a plant target from limited datasets.
dcterms.extent62 pages
dcterms.languageen
dcterms.publisherUniversity of Hawaii at Hilo
dcterms.rightsAll UHH dissertations and theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission from the copyright owner.
dcterms.typeText
local.identifier.alturihttp://dissertations.umi.com/hilo.hawaii:10215

Files

Original bundle
Name: Ta_hilo.hawaii_1418O_10215.pdf
Size: 1.84 MB
Format: Adobe Portable Document Format