Introduction


Build your own insect-detecting camera trap!

This website provides instructions on hardware assembly, software setup, programming, model training, and deployment of a smart DIY camera trap that can be used for automated insect monitoring.

Background

Long-term monitoring data at a high spatiotemporal resolution is essential to investigate potential drivers and their impact on the widespread decline of insect abundance and diversity (Wagner, 2020), as well as to design effective conservation strategies (Harvey et al., 2020). Automated monitoring methods can extend the ecologists' toolbox and yield multidimensional data as output, with a comparatively low time and labor input (Besson et al., 2022). Standardized methods that are easily accessible and reproducible could furthermore decentralize monitoring efforts and strengthen the integration of independent biodiversity observations (e.g. Citizen Science) (Kühl et al., 2020).

A range of different sensors can be used for automated insect monitoring (van Klink et al., 2022). These include acoustic (e.g. Kawakita & Ichikawa, 2019) and opto-electronic sensors (e.g. Potamitis et al., 2015; Rydhmer et al., 2022), as well as cameras (overview in Høye et al., 2021). Several low-cost DIY camera trap systems for insects use video or time-lapse recordings, which are analyzed in a subsequent processing step (e.g. Droissart et al., 2021; Geissmann et al., 2022). Other systems use motion detection software as a trigger for image capture (e.g. Bjerge et al., 2021a; overview in Pegoraro et al., 2020). As with traditional camera traps used for mammal monitoring, the large amount of image data produced in this way can be processed and analyzed most efficiently with machine learning (ML) and especially deep learning (DL) methods, to extract information such as species identity, abundance and behaviour (Tuia et al., 2022).

Small DL models with relatively low computational costs can be run on suitable devices "on the edge" to enable real-time detection of the objects the model was trained on. The appearance and detection of an insect can thereby be used as a trigger to start a recording. Integrating the information extraction into the recording process in this way can drastically reduce the amount of data that has to be stored. A "smart" camera trap for automated monitoring of pollinators was developed by Bjerge et al. (2021b), using the NVIDIA Jetson Nano in combination with an HD webcam. A custom-trained YOLOv3 model runs in parallel with the time-lapse image recording and can thereby detect and classify insects in each image in real time on the device (~0.5 fps). Filtering of false detections and tracking of insects were performed in a subsequent step on a remote computer. An updated dataset is presented in a preprint by Bjerge et al. (2022) and can be seen as an important benchmark for insect detection and classification models with complex backgrounds. The smaller versions of the YOLOv5 models trained on this dataset (available in the same Zenodo repository) could also be used on edge devices to detect insects in real time against similar backgrounds.

The necessity of automated biodiversity monitoring

"We believe that the fields of ecology and conservation biology are in the midst of a rapid and discipline-defining shift towards technology-mediated, indirect biodiversity observation. [...] Finally, for those who remain sceptical of the value of indirect observations, it is also useful to remember that we can never predict the advances in methods that may occur in the future. Unlike humans in the field, automated sensors produce a permanent visual or acoustic record of a given location and time that is far richer than a simple note that 'species X was here at time Y'. Similar to museum specimens, these records will undoubtedly be reanalysed by future generations of ecologists and conservation biologists using better tools than we have available now in order to extract information and answer questions that we cannot imagine today. And these future researchers will undoubtedly thank us, as we thank previous generations of naturalists, for having the foresight to collect as many observations as possible of the rapidly changing species and habitats on our planet." (Kitzes & Schricker, 2019)


Overview

Camera trap insect detection

The solar-powered DIY camera trap can be used for continuous automated monitoring of flower-visiting insects

The proposed DIY camera trap for automated insect monitoring combines low-cost off-the-shelf hardware with completely open-source software and can be easily assembled and set up by following the provided instructions. All Python scripts for testing the system, data collection and continuous automated monitoring can be adapted to different use cases by changing only a few lines of code. The labeled datasets and trained models for insect detection and classification that are provided should be seen as a starting point to train your own models, e.g. adapted to different backgrounds or insect taxa (classes for detection/classification). Especially when deploying the camera trap system in new environments, edge cases (low confidence score or false detection/classification) should be identified and the models retrained with this new data. This iterative Active Learning loop of retraining and redeploying can ensure a high detection and classification accuracy over time. With the combination of Roboflow for annotation and dataset management and Google Colab as a cloud training platform, this can be achieved in a straightforward way, free of charge and without prior experience or special hardware.
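To illustrate what "changing only a few lines" typically means: a recording script usually exposes its tunable settings as a handful of constants near the top of the file. The sketch below is purely illustrative; the variable names and values are hypothetical placeholders and not taken from the actual insect-detect scripts.

    # Hypothetical configuration constants of a recording script.
    # All names and values are illustrative placeholders.
    REC_TIME = 60 * 40       # duration of one recording session (seconds)
    CONF_THRESHOLD = 0.5     # minimum confidence score to accept a detection
    JPEG_QUALITY = 90        # quality of the saved insect crops (0-100)
    LABELS = ["insect"]      # class labels of the custom-trained detection model

Adapting the system to a new use case (e.g. longer sessions or a stricter detection threshold) then amounts to editing these values before deployment.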

Insect Detect Active Learning loop

The proposed processing pipeline can increase detection and classification accuracy over time if new real-world data is integrated via an Active Learning loop

The use of an artificial flower platform provides a homogeneous, constant background, which standardizes the visual attraction for insects and leads to higher detection and tracking accuracy while requiring less data for model training. Because of the flat platform design, insects landing on the platform take on more uniform postures, which can lead to better classification results with less data input. The biggest disadvantage at the moment is the bias in attraction for different insect groups. We are currently studying various shapes, colors and materials to enhance the visual attraction for specific pollinator groups, and are testing the possible use of dispensers with artificial floral scent bouquets to add an olfactory component to the attraction.

Implemented functions

  • non-invasive, continuous automated monitoring of flower-visiting insects
  • standardized artificial flower platform as visual attractant
  • on-device detection and tracking with a custom-trained YOLOv5 model (4-5 fps)
  • save images of detected insects, cropped from high-resolution frames (4K), to .jpg (see the sketch below)
  • easy to build and deploy with low-cost off-the-shelf hardware components
  • low power consumption (< 4 W) and fully solar-powered
  • weatherproof enclosure
  • automated classification and analysis in subsequent step on local PC
  • completely open source software with detailed documentation
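
As a rough illustration of the cropping step, the sketch below cuts a detected bounding box out of a high-resolution frame with OpenCV and writes it to .jpg. The frame path, box coordinates and file name are hypothetical placeholders; the actual scripts in the insect-detect repo perform this on the output of the OAK-1 detection pipeline.

    import cv2  # pip install opencv-python

    # Hypothetical 4K frame and one detection box in normalized
    # (xmin, ymin, xmax, ymax) coordinates - illustrative values only
    frame = cv2.imread("frame.jpg")
    bbox = (0.42, 0.55, 0.47, 0.61)

    h, w = frame.shape[:2]
    x1, y1 = int(bbox[0] * w), int(bbox[1] * h)
    x2, y2 = int(bbox[2] * w), int(bbox[3] * h)

    crop = frame[y1:y2, x1:x2]  # numpy slicing: rows (y) first, then columns (x)
    cv2.imwrite("insect_0001.jpg", crop, [cv2.IMWRITE_JPEG_QUALITY, 90])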

Not implemented (yet)

  • high attraction for a wide range of insect taxa or specific groups
  • selection of different detection model architectures (coming soon)
  • on-device classification and analysis
  • real-time data transfer (e.g. via LTE stick/module)
  • validation with traditional monitoring methods

In the Hardware section of this website you will find a list of all required components and detailed instructions on how to build and assemble the camera trap system. Only some standard tools are necessary; these are listed in the Hardware overview.

In the Software section, all steps to get the camera trap up and running are explained. We will start by installing the necessary software on your local PC to communicate with the Raspberry Pi Zero 2 W. Once the Raspberry Pi is configured, you can dig deeper into the Python scripts, with details on how to adapt them to your use case.

The Model Training section will show you tools to annotate your own images and use them to train a custom YOLOv5 object detection model that can be deployed on the OAK-1 camera. In a next step, you can also train a custom classification model, which runs on your local PC (no GPU necessary), to classify the cropped insect images. All of the model training can be done in Google Colab, where you will have access to a free cloud GPU for fast training. All you need is a Google account; no special hardware is required.
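
To sketch what the Colab training step boils down to, the snippet below calls the training entry point of the ultralytics/yolov5 repository. It assumes the repository has been cloned and its requirements installed inside the Colab session; the dataset path and hyperparameters are placeholders to be replaced with your own.

    # Run inside Google Colab, from within a cloned ultralytics/yolov5 repo
    import train  # yolov5/train.py

    train.run(
        data="dataset.yaml",    # dataset config, e.g. exported from Roboflow
        weights="yolov5s.pt",   # start from a small pretrained model
        imgsz=640,              # input image size
        epochs=100,             # placeholder; tune for your dataset
        batch_size=16,
    )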

The Deployment section will give you details on each step of the processing pipeline: from on-device detection and tracking, to classification of the cropped insect images on your local PC, to the subsequent automated analysis of the combined results with the provided Python script.
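
As a minimal sketch of that last analysis step, the snippet below summarizes a metadata .csv with pandas. The column names (track_ID, label) are assumptions for illustration; the actual layout of the generated metadata files is described in the Deployment section.

    import pandas as pd

    # Load the metadata .csv generated during on-device detection and tracking.
    # Column names below are illustrative assumptions, not the guaranteed schema.
    df = pd.read_csv("metadata.csv")

    # Number of unique tracked insects per predicted class
    print(df.groupby("label")["track_ID"].nunique())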

OAK-1 Raspberry Pi PiJuice Zero

The OAK-1 camera, Raspberry Pi Zero 2 W and PiJuice Zero pHAT provide all necessary Hardware functions in a tiny form factor

GitHub repositories

  • insect-detect GitHub repo

    Python scripts for testing and deploying the camera trap system for automated insect monitoring. Includes a basic YOLOv5s insect detection model.

  • insect-detect-ml GitHub repo

    Jupyter notebooks to run in Google Colab for YOLOv5 detection and classification model training, a modified YOLOv5 classification script with a basic YOLOv5s insect classification model, and a Python script for automated analysis of the generated metadata .csv files.

  • insect-detect-docs GitHub repo

    Source files and assets of this documentation website, based on Material for MkDocs.


Datasets

  • Detection Dataset

    Dataset to train an insect detection model, with annotated images collected in 2022 with the DIY camera trap and the proposed flower platform as background.

  • Classification Dataset

    Dataset to train an insect classification model, which contains the cropped bounding boxes with insects, exported from the Detection Dataset.


Citation

Until the corresponding paper is published, please cite this project as:

Sittinger, M. (2022). Insect Detect - Software for automated insect monitoring
with a DIY camera trap system (v1.3). Zenodo. https://doi.org/10.5281/zenodo.7554302



Acknowledgements

Many thanks to:


License


This documentation website and its content are licensed under the Creative Commons Attribution-ShareAlike 4.0 International License (CC BY-SA 4.0).

All Python scripts mentioned on this website are licensed under the GNU General Public License v3.0 (GNU GPLv3).