About BirdNET
BirdNET is a research collaboration using machine learning to monitor global biodiversity. We develop open-source tools that transform bioacoustics from a specialized field into a scalable solution for conservation.
Science-led conservation.
BirdNET aims to lower the barrier to using sound for biodiversity monitoring. By combining deep learning with open tools and citizen science, we help track bird populations and support conservation decisions at local to global scales.
BirdNET is a joint effort between:
- K. Lisa Yang Center for Conservation Bioacoustics, Cornell Lab of Ornithology
- Chair of Media Informatics, Chemnitz University of Technology
Supported by researchers, engineers, educators, and community contributors.
Acoustic Monitoring
Many bird species are more easily detected by sound than sight. Passive monitoring captures activity without human disturbance across remote habitats and seasons.
Machine Learning
Manually reviewing thousands of hours of recordings is not feasible. Machine learning automates species identification at scale, extracting acoustic features from noisy soundscapes with consistent accuracy.
Open Ecosystem
BirdNET provides core models for edge devices and reusable embeddings, supporting a global community of researchers and conservationists.
From soundscapes to insights
The BirdNET model is trained on thousands of hours of curated bird vocalizations. It is optimized to be robust against background noise while remaining efficient enough for real-time mobile use.
Design Priorities
- Noise robustness
- Real-time & Batch optimization
- High species coverage (6k+)
- Continuous data-driven updates
Audio is captured at 48 kHz and divided into 3-second segments, a window length that balances model input size against the typical duration of avian vocalizations.
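The segmentation step can be sketched as follows. This is a minimal illustration, assuming mono input; the zero-padding of the final partial segment is an assumption for the example, not necessarily BirdNET's exact behavior.

```python
import numpy as np

SAMPLE_RATE = 48_000           # Hz, as described above
SEGMENT_SECONDS = 3.0          # analysis window length
SEGMENT_SAMPLES = int(SAMPLE_RATE * SEGMENT_SECONDS)

def split_into_segments(audio: np.ndarray) -> list[np.ndarray]:
    """Split a mono waveform into consecutive 3-second segments.

    The final partial segment is zero-padded to full length so every
    chunk has the same shape expected by the model.
    """
    segments = []
    for start in range(0, len(audio), SEGMENT_SAMPLES):
        chunk = audio[start:start + SEGMENT_SAMPLES]
        if len(chunk) < SEGMENT_SAMPLES:
            chunk = np.pad(chunk, (0, SEGMENT_SAMPLES - len(chunk)))
        segments.append(chunk)
    return segments

# Example: a 10-second recording yields four segments (the last padded).
recording = np.zeros(SAMPLE_RATE * 10)
segments = split_into_segments(recording)
print(len(segments))  # 4
```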
Each segment is converted into two log-scaled mel spectrograms, covering 0 to 3 kHz and 150 Hz to 15 kHz, so that both low- and high-frequency vocalization patterns are represented in detail.
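A log-mel spectrogram of the kind described can be computed from first principles as below. The FFT size, hop length, and number of mel bands here are illustrative assumptions, not BirdNET's actual parameters; the band limits match the high band mentioned in the text.

```python
import numpy as np

def hz_to_mel(f):
    return 2595.0 * np.log10(1.0 + f / 700.0)

def mel_to_hz(m):
    return 700.0 * (10.0 ** (m / 2595.0) - 1.0)

def mel_filterbank(n_mels, n_fft, sr, fmin, fmax):
    """Triangular mel filters mapping linear FFT bins to mel bands."""
    mel_points = np.linspace(hz_to_mel(fmin), hz_to_mel(fmax), n_mels + 2)
    bins = np.floor((n_fft + 1) * mel_to_hz(mel_points) / sr).astype(int)
    fb = np.zeros((n_mels, n_fft // 2 + 1))
    for i in range(n_mels):
        left, center, right = bins[i], bins[i + 1], bins[i + 2]
        for j in range(left, center):
            fb[i, j] = (j - left) / max(center - left, 1)
        for j in range(center, right):
            fb[i, j] = (right - j) / max(right - center, 1)
    return fb

def log_mel_spectrogram(audio, sr=48_000, n_fft=1024, hop=512,
                        n_mels=64, fmin=150.0, fmax=15_000.0):
    """Log-scaled mel spectrogram over the 150 Hz - 15 kHz band."""
    window = np.hanning(n_fft)
    n_frames = 1 + (len(audio) - n_fft) // hop
    frames = np.stack([audio[i * hop:i * hop + n_fft] * window
                       for i in range(n_frames)])
    power = np.abs(np.fft.rfft(frames, axis=1)) ** 2
    mel = power @ mel_filterbank(n_mels, n_fft, sr, fmin, fmax).T
    return np.log10(mel + 1e-10)  # log scaling, floored to avoid log(0)

# A 3-second synthetic chirp rendered as a 64-band log-mel image.
t = np.linspace(0, 3, 3 * 48_000, endpoint=False)
chirp = np.sin(2 * np.pi * (500 + 2000 * t) * t)
spec = log_mel_spectrogram(chirp)
print(spec.shape)  # (frames, mel bands)
```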
A convolutional neural network (CNN) scans these spectrograms, using millions of trained weights to detect species-specific patterns.
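The core operation a CNN applies to a spectrogram is a learned 2-D convolution. This toy sketch shows a single hand-written kernel for illustration; a real network stacks many learned kernels across multiple layers.

```python
import numpy as np

def conv2d_valid(image: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Slide a kernel over a spectrogram ('valid' cross-correlation).

    Each output value measures how strongly the local patch matches the
    kernel's pattern; a CNN stacks many such learned kernels.
    """
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for r in range(out.shape[0]):
        for c in range(out.shape[1]):
            out[r, c] = np.sum(image[r:r + kh, c:c + kw] * kernel)
    return out

# A diagonal kernel responds strongly where the (toy 5x5) spectrogram
# contains a rising frequency contour.
kernel = np.eye(3)
spec = np.zeros((5, 5))
spec[np.arange(5), np.arange(5)] = 1.0
response = conv2d_valid(spec, kernel)
print(response.max())  # strongest where the diagonal aligns
```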
Initial predictions are cross-referenced with local metadata (location and date) to produce high-confidence species probabilities.
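The metadata step can be sketched as reweighting raw model scores by an occurrence prior for the recording's location and date. The species names, prior values, and the multiplicative combination here are all hypothetical, chosen only to show the idea.

```python
import numpy as np

def apply_location_prior(scores: dict[str, float],
                         site_prior: dict[str, float]) -> dict[str, float]:
    """Combine raw model scores with a location/date occurrence prior.

    `site_prior` maps species to an expected-occurrence weight for the
    recording's location and week of the year (hypothetical values);
    species unlikely at the site are down-weighted before ranking.
    """
    combined = {sp: s * site_prior.get(sp, 0.01) for sp, s in scores.items()}
    total = sum(combined.values()) or 1.0
    return {sp: v / total for sp, v in combined.items()}

# Two acoustically similar candidates; the prior resolves the ambiguity
# for a hypothetical European site in May.
raw = {"Common Blackbird": 0.60, "Hermit Thrush": 0.55}
prior = {"Common Blackbird": 0.90, "Hermit Thrush": 0.05}
ranked = apply_location_prior(raw, prior)
```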
Supported by a global network
Institutional Support
Work at the K. Lisa Yang Center for Conservation Bioacoustics is made possible by the generosity of K. Lisa Yang, supporting innovative conservation technologies that scale global biodiversity monitoring.
Project Funding
- German Federal Ministry of Research, Technology and Space (FKZ 01IS22072)
- German Federal Ministry for the Environment (FKZ 67KI31040E)
- German Federal Ministry of Economic Affairs and Energy (FKZ 16KN095550)
- Deutsche Bundesstiftung Umwelt (39263/01)
- European Social Fund
Research Collaboration
BirdNET is a joint effort of partners from academia and industry. This multidisciplinary collaboration enables us to bridge the gap between AI research and practical field ecology.
Representative partner logos. See publications and tools for additional collaborators.