
The Necessity, Promise and Challenge of Automated Biodiversity Surveys

Published online by Cambridge University Press: 18 July 2019

Justin Kitzes* and Lauren Schricker
Department of Biological Sciences, University of Pittsburgh, Fifth and Ruskin Avenues, Pittsburgh, PA 15260, USA
*Author for correspondence: Justin Kitzes, Email: [email protected]

Summary

We are in the midst of a transformation in the way that biodiversity is observed on the planet. Direct human observation, combining the efforts of professional and citizen scientists, has recently generated unprecedented amounts of data on species distributions and populations. Within just a few years, however, we believe that these data will be swamped by indirect biodiversity observations generated by autonomous sensors and machine learning classification models. In this commentary, we discuss three important elements of this shift towards indirect, technology-driven observation. First, we note that the biodiversity data sets available today cover a very small fraction of all places and times that could potentially be observed, underscoring the need for new approaches that can gather such data at far larger scales and lower costs. Second, we highlight tools and efforts already available today that demonstrate the promise of automated methods to radically increase biodiversity data collection. Finally, we discuss one specific outstanding challenge for automated biodiversity surveys: how to extract useful knowledge from observations that are inherently uncertain. Throughout, we focus on one particular type of biodiversity data, point occurrence records, which are frequently produced by citizen science projects, museum records and systematic biodiversity surveys. As indirect observation methods increase the spatiotemporal scope of these point occurrence records, ecologists and conservation biologists will be better able to predict shifting species distributions, track changes in populations over time and understand the drivers of biodiversity occurrence.
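To make the last challenge concrete: if a classifier's per-recording true- and false-positive rates can be estimated from labelled validation data, uncertain machine detections can still be converted into a probability of species presence at a site. The Python sketch below is a minimal illustration of this general idea, not a method from the article; the function name and all rates, counts and the prior are hypothetical values chosen for the example.

import math

def presence_posterior(k, n, p_tp, p_fp, prior):
    """Posterior probability that a species is present at a site,
    given k classifier detections across n recorded clips.

    p_tp  -- per-clip detection probability if the species is present
    p_fp  -- per-clip false-positive probability if it is absent
    prior -- prior probability of presence (e.g., regional occupancy)
    All inputs are hypothetical calibration values for illustration.
    """
    def binom_pmf(k, n, p):
        # Binomial probability of k successes in n independent trials
        return math.comb(n, k) * p**k * (1 - p) ** (n - k)

    like_present = binom_pmf(k, n, p_tp)
    like_absent = binom_pmf(k, n, p_fp)
    numerator = prior * like_present
    return numerator / (numerator + (1 - prior) * like_absent)

# Example: 3 flagged clips out of 240 one-minute recordings.
# When false positives are rare (0.1% per clip), even a few
# uncertain detections yield strong evidence of presence (~0.97).
print(presence_posterior(k=3, n=240, p_tp=0.02, p_fp=0.001, prior=0.3))

More formal false-positive occupancy models extend this same logic across many sites and repeat visits; the sketch above only shows the single-site calculation under the stated assumptions.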

Type: Comment
Copyright: © Foundation for Environmental Conservation 2019


Supplementary material

Kitzes and Schricker supplementary material (File, 7.5 KB)
Table S1 (File, 9.8 KB)