Seismic Savanna: Classifying wildlife and behaviors using vibration data

Team

  • Alexandre Szenicer (University of Oxford)
  • Michael Reinwald (University of Oxford)
  • Ben Moseley (University of Oxford)
  • Tarje Nissen-Meyer (University of Oxford)
  • Zacharia Mutinda Muteti (Mpala Research Centre, Kenya)
  • Sandy Oduor (Mpala Research Centre, Kenya)
  • Alex McDermott-Roberts (University of Oxford)
  • Atılım Güneş Baydin (University of Oxford)
  • Beth Mortimer (University of Oxford)

Abstract

We develop a machine learning approach to detect and discriminate elephants from other species, and to recognise important behaviours such as running and rumbling, based only on seismic data generated by the animals. We demonstrate our approach using data acquired in the Kenyan savanna, consisting of 8000 hours of seismic recordings and 250,000 camera trap pictures. Our classifiers, different convolutional neural networks trained on seismograms and spectrograms, achieved 80–90% balanced accuracy in detecting elephants up to 100 meters away, and over 90% balanced accuracy in recognising running and rumbling behaviours from the seismic data. We release the dataset used in this study: SeisSavanna represents a unique collection of seismic signals with the associated wildlife species and behaviour. Our results suggest that seismic data offer substantial benefits for monitoring wildlife, and we propose to further develop our methods using dense arrays that could result in a seismic shift for wildlife monitoring.
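The abstract reports balanced accuracy rather than plain accuracy: balanced accuracy averages the per-class recall, so it is not inflated when one class (e.g. windows with no elephant present) dominates the recordings. A minimal sketch of the metric, using hypothetical labels rather than the SeisSavanna data:

```python
import numpy as np

def balanced_accuracy(y_true, y_pred):
    """Mean of per-class recall; robust to class imbalance."""
    recalls = []
    for c in np.unique(y_true):
        mask = y_true == c
        recalls.append(np.mean(y_pred[mask] == c))
    return float(np.mean(recalls))

# Hypothetical window labels: 8 non-elephant (0) and 2 elephant (1)
y_true = np.array([0, 0, 0, 0, 0, 0, 0, 0, 1, 1])
y_pred = np.array([0, 0, 0, 0, 0, 0, 0, 1, 1, 0])

# Per-class recall: 7/8 = 0.875 and 1/2 = 0.5, so balanced accuracy = 0.6875
# (plain accuracy would report 8/10 = 0.8, flattered by the majority class)
print(balanced_accuracy(y_true, y_pred))  # 0.6875
```

The same averaging extends to the multi-class species and behaviour settings by taking the mean recall over all classes.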

Publications

  1. Szenicer, Alexandre, Michael Reinwald, Ben Moseley, Tarje Nissen-Meyer, Zacharia Mutinda Muteti, Sandy Oduor, Alex McDermott-Roberts, Atılım Güneş Baydin, and Beth Mortimer. 2021. “Seismic Savanna: Machine Learning for Classifying Wildlife and Behaviours Using Ground-Based Vibration Field Recordings.” Remote Sensing in Ecology and Conservation 8 (2). John Wiley & Sons and Zoological Society of London: 236–250. doi:10.1002/rse2.242.

    We develop a machine learning approach to detect and discriminate elephants from other species, and to recognise important behaviours such as running and rumbling, based only on seismic data generated by the animals. We demonstrate our approach using data acquired in the Kenyan savanna, consisting of 8000 hours of seismic recordings and 250,000 camera trap pictures. Our classifiers, different convolutional neural networks trained on seismograms and spectrograms, achieved 80–90% balanced accuracy in detecting elephants up to 100 meters away, and over 90% balanced accuracy in recognising running and rumbling behaviours from the seismic data. We release the dataset used in this study: SeisSavanna represents a unique collection of seismic signals with the associated wildlife species and behaviour. Our results suggest that seismic data offer substantial benefits for monitoring wildlife, and we propose to further develop our methods using dense arrays that could result in a seismic shift for wildlife monitoring.

    @article{szenicer-2021-seismic,
      title = {Seismic savanna: Machine learning for classifying wildlife and behaviours using ground-based vibration field recordings},
      author = {Szenicer, Alexandre and Reinwald, Michael and Moseley, Ben and {Nissen-Meyer}, Tarje and Muteti, Zacharia Mutinda and Oduor, Sandy and {McDermott-Roberts}, Alex and Baydin, Atılım Güneş and Mortimer, Beth},
      journal = {Remote Sensing in Ecology and Conservation},
      publisher = {John Wiley & Sons and Zoological Society of London},
      year = {2021},
      volume = {8},
      number = {2},
      pages = {236--250},
      url = {https://doi.org/10.1002/rse2.242},
      doi = {10.1002/rse2.242}
    }
    

Acknowledgments

We would like to express our gratitude to all the staff at the Mpala Research Centre for assisting with fieldwork operations and for creating a warm and welcoming environment, in particular Dino Martins and Cosmas Nzomo. We thank Frank Pope and the staff at Save the Elephants for help obtaining research permits and Kenya Wildlife Service affiliations, and Paula Koelemeijer for help with sensor deployment. A. Szenicer thanks the anonymous donor of his PhD grant. We thank the National Geographic Society (NGS50019R-18), the Royal Society (URF R1 191033), the John Fell Oxford University Press Research Fund, and the Royal Commission for the Exhibition of 1851 for funding. This research has been supported by the Centre for Doctoral Training in Autonomous Intelligent Machines and Systems at the University of Oxford, Oxford, UK, and the UK Engineering and Physical Sciences Research Council. A. G. Baydin is supported by EPSRC/MURI grant EP/N019474/1 and by Lawrence Berkeley National Lab.