Automated detection of Hainan gibbon calls for passive acoustic monitoring

Bibliographic Details
Title: Automated detection of Hainan gibbon calls for passive acoustic monitoring
Authors: Ian N. Durbach, James P. Hansford, Wenyong Li, Zhiwei Liu, Zhaoli Zhou, Christina S. Stender, Qing Chen, Amanda Hoepfner, Heidi Ma, Samuel T. Turvey, Emmanuel Dufourq, Jessica V. Bryant
Contributors: University of St Andrews. School of Mathematics and Statistics
Source: Remote Sensing in Ecology and Conservation, Vol 7, Iss 3, Pp 475-487 (2021)
Publication Information: Wiley, 2021.
Publication Year: 2021
Subject Terms: Passive acoustic monitoring, Bioacoustics, Species identification, Hainan gibbons, Nomascus hainanus, Mammal, Convolutional neural networks, Deep learning, Artificial neural network, Pattern recognition, Artificial intelligence, Spectrogram, Ecology, Evolution, Behavior and Systematics, Nature and Landscape Conservation, Computers in Earth Sciences, Computer science, QA75 Electronic computers. Computer science, QH301 Biology, QH540-549.5
Description: Funding: Fieldwork was funded by an Arcus Foundation grant to STT and a Wildlife Acoustics grant to JVB. ID is supported in part by funding from the National Research Foundation of South Africa (Grant IDs 90782, 105782). ED is supported by a postdoctoral fellowship from the African Institute for Mathematical Sciences South Africa, Stellenbosch University and the Next Einstein Initiative. This work was carried out with the aid of a grant from the International Development Research Centre, Ottawa, Canada (www.idrc.ca), and with financial support from the Government of Canada, provided through Global Affairs Canada (GAC; www.international.gc.ca).
Abstract: Extracting species calls from passive acoustic recordings is a common preliminary step in ecological analysis. For many species, particularly those occupying noisy, acoustically variable habitats, call extraction remains largely manual, a time-consuming and increasingly unsustainable process. Deep neural networks offer excellent performance across a range of acoustic classification applications but remain relatively underused in ecology. We describe the steps involved in developing an automated classifier for a passive acoustic monitoring project, using the identification of calls of the Hainan gibbon Nomascus hainanus, one of the world's rarest mammal species, as a case study. These steps include preprocessing (selecting a temporal resolution, windowing and annotation); data augmentation; processing (choosing and fitting appropriate neural network models); and post-processing (linking model predictions to replace, or more likely facilitate, manual labelling). Our best model converted acoustic recordings into spectrogram images on the mel frequency scale and used these to train a convolutional neural network. Model predictions were highly accurate, with per-second false positive and false negative rates of 1.5% and 22.3%, respectively. Nearly all false negatives occurred at the fringes of calls, adjacent to segments where the call was correctly identified, so very few calls were missed altogether. A post-processing step identifying intervals of repeated calling reduced an 8-h recording to, on average, 22 min for manual processing, and did not miss any calling bouts over 72 h of test recordings. Gibbon calling bouts were detected regularly in multi-month recordings from all selected survey points within Bawangling National Nature Reserve, Hainan. We demonstrate that passive acoustic monitoring incorporating an automated classifier is an effective tool for remote detection of one of the world's rarest and most threatened species. Our study highlights the viability of using neural networks to automate or greatly assist the manual labelling of data collected by passive acoustic monitoring projects. We emphasize that model development and implementation should be informed and guided by ecological objectives, and we increase the accessibility of these tools with a series of notebooks that allow users to build and deploy their own acoustic classifiers.
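Note: The abstract outlines a concrete pipeline: fixed-length windows of audio are converted to mel-scale spectrogram images, a convolutional neural network classifies each window, and a post-processing step merges positive predictions into calling bouts for manual review. As a rough illustration of that pipeline, the Python sketch below uses librosa and TensorFlow as stand-in libraries; the window length, spectrogram parameters, network architecture and bout-gap threshold are all assumptions chosen for readability, not the settings reported in the paper (the authors' own notebooks, referenced in the abstract, remain the authoritative implementation).

# Illustrative sketch only: parameters and architecture are assumptions,
# not the configuration used in the study.
import numpy as np
import librosa
import tensorflow as tf

def audio_to_mel_windows(path, window_s=10, hop_s=5, n_mels=128):
    # Slice a recording into fixed-length windows and convert each window to a
    # log-scaled mel spectrogram image (the input representation described above).
    y, sr = librosa.load(path, sr=None)
    win, hop = int(window_s * sr), int(hop_s * sr)
    spectrograms = []
    for start in range(0, max(len(y) - win + 1, 1), hop):
        segment = y[start:start + win]
        mel = librosa.feature.melspectrogram(y=segment, sr=sr, n_mels=n_mels)
        spectrograms.append(librosa.power_to_db(mel, ref=np.max))
    return np.array(spectrograms)[..., np.newaxis]  # add a channel axis for the CNN

def build_cnn(input_shape):
    # A small binary classifier (gibbon call vs. background); layer sizes are placeholders.
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=input_shape),
        tf.keras.layers.Conv2D(16, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(32, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(1, activation="sigmoid"),
    ])

def group_into_bouts(positive_times_s, max_gap_s=300):
    # Merge nearby positive detections (times in seconds) into calling bouts so that
    # only short intervals of a long recording need manual review.
    bouts = []
    for t in sorted(positive_times_s):
        if bouts and t - bouts[-1][1] <= max_gap_s:
            bouts[-1][1] = t
        else:
            bouts.append([t, t])
    return [tuple(b) for b in bouts]

In practice the model would be compiled with a binary cross-entropy loss, fitted on annotated windows, and its thresholded per-window predictions passed to group_into_bouts to produce the short review intervals described in the abstract.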
File Description: application/pdf
Language: English
ISSN: 2056-3485
Access URL: https://explore.openaire.eu/search/publication?articleId=doi_dedup___::2d631287279787948db51efa58154bfd
https://doaj.org/article/3182ba7f901e4d8fac6f622c40ce117f
Rights: OPEN
Accession Number: edsair.doi.dedup.....2d631287279787948db51efa58154bfd
Database: OpenAIRE