Invasive species detection from RGB aerial imagery: investigating links between plant characteristics and transfer learning success

dc.contributor.advisor Perroy, Ryan L.
dc.contributor.author Ta, Erica
dc.contributor.department Tropical Conservation Biology & Environmental Science
dc.date.accessioned 2022-02-01T18:44:05Z
dc.date.available 2022-02-01T18:44:05Z
dc.date.issued 2021-12
dc.description.degree M.S.
dc.identifier.uri http://hdl.handle.net/10790/6868
dc.subject Conservation biology
dc.subject Artificial intelligence
dc.subject Remote sensing
dc.subject deep learning
dc.subject Hawaiʻi
dc.subject invasive species
dc.subject object detection
dc.subject transfer learning
dc.title Invasive species detection from RGB aerial imagery: investigating links between plant characteristics and transfer learning success
dc.type Thesis
dcterms.abstract Advancements in remote sensing techniques and deep learning applications such as object detection have improved invasive species monitoring systems. Deep learning typically requires large training sets, up to millions of images, to consistently recognize targets, but generating such training sets may not be practical for incipient invasive species targets of interest. When large datasets are unavailable for training, one approach is to use transfer learning to overcome data limitations: knowledge learned by a source network (pre-trained on a task for which large datasets are available) is applied to a target problem with limited data samples. Here I examine how object detection performance for the following invasive species of interest in Hawaiʻi differs with the inclusion of cross-species transfer learning: miconia (Miconia calvescens), Guinea grass (Megathyrsus maximus), and four visible symptom classes of Rapid ʻŌhiʻa Death (ROD): red, brown, fine white, and skeleton. I also measured the visual plant features of contrast, shape, size, and texture to understand how different plant morphologies provide easier or more challenging scenarios for plant object detection from aerial visible imagery. I found that 9 of 30 transfer learning instances had significantly higher mean average precision (mAP) scores than instances without transfer learning (p < α = 0.00167, Bonferroni-corrected from 0.05/30). Transfer learning was most effective among the red, brown, fine white, and skeleton ROD classes and least effective for miconia and Guinea grass. Of the measured features, contrast was significantly correlated with source-model mAP (R = 0.82, p = 0.045), texture was strongly but not significantly correlated (R = 0.77, p = 0.073), size was moderately correlated (R = 0.54, p = 0.27), and circularity was weakly correlated (R = -0.096, p = 0.86). My results indicate that plant detection benefits from pairing similar source and target candidates in transfer learning, and from using source candidates whose imagery exhibits higher contrast and texture measurements. Overall, this study may inform future workflows for detecting plants from aerial imagery by demonstrating how available data can be leveraged or repurposed through transfer learning to detect a plant target when only limited datasets are available.
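The statistical framing in the abstract (a Bonferroni-corrected threshold for the 30 with/without transfer-learning comparisons, and Pearson correlations between plant feature measurements and source-model mAP) can be sketched as follows. This is a minimal illustration only, not the thesis code: the feature and mAP values are random placeholders, and only the structure of the analysis is shown.

```python
"""Illustrative sketch of the analysis structure described in the abstract.

NOT the thesis code: feature and mAP values are random placeholders.
"""
import numpy as np
from scipy.stats import pearsonr

# Bonferroni correction for the 30 with/without transfer-learning comparisons
n_comparisons = 30
alpha = 0.05 / n_comparisons  # = 0.00167, the threshold quoted in the abstract

# Hypothetical per-class feature measurements and source-model mAP scores
# (six target classes: miconia, Guinea grass, and four ROD symptom classes)
rng = np.random.default_rng(0)
features = {
    "contrast":    rng.random(6),
    "texture":     rng.random(6),
    "size":        rng.random(6),
    "circularity": rng.random(6),
}
source_map = rng.random(6)

# Pearson correlation of each feature measurement with source-model mAP
for name, values in features.items():
    r, p = pearsonr(values, source_map)
    print(f"{name:12s} R = {r:+.2f}  p = {p:.3f}")

print(f"Bonferroni-corrected alpha for detector comparisons: {alpha:.5f}")
```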
dcterms.extent 62 pages
dcterms.language en
dcterms.publisher University of Hawaii at Hilo
dcterms.rights All UHH dissertations and theses are protected by copyright. They may be viewed from this source for any purpose, but reproduction or distribution in any format is prohibited without written permission from the copyright owner.
dcterms.type Text
local.identifier.alturi http://dissertations.umi.com/hilo.hawaii:10215
Files
Original bundle
Name: Ta_hilo.hawaii_1418O_10215.pdf
Size: 1.84 MB
Format: Adobe Portable Document Format