WildMetrics Earth

Three Sensors. Five Models. Zero Guesswork.

Most biodiversity scores cannot survive an audit. WildMetrics Earth does not estimate – it detects, classifies, and logs every species event to a traceable timeseries, grounded in physical sensors and peer-reviewed models.

What Makes WildMetrics Earth Different

Physical Evidence. Not Modeled Assumptions.

Biodiversity claims built on proxies and desktop models fail when scrutiny arrives. WildMetrics Earth processes real sensor events – field recordings, satellite passes, and camera trap images – through validated ML models. Every metric traces back to a date, a location, and a physical source.

Three sensor pipelines

Acoustic · Satellite · Camera trap – each with dedicated ML processing

Five ML models

BirdNET, BatDetect2, MegaDetector, SpeciesNET, Acoustic Indices – purpose-built for field biodiversity

Structured timeseries output

Species detections, habitat quality, and land cover change – delivered to your dashboard and reporting pipeline

Processing Pipelines

Three Sensor Pillars. One Unified Evidence Layer.

Each pipeline ingests raw sensor data, runs validated ML models, and writes structured metrics to a shared timeseries database. All three feed the same reporting and dashboard layer.
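
To make the write path concrete, here is a minimal sketch of formatting one detection event as an InfluxDB Line Protocol row, the wire format QuestDB ingests. The table, tag, and field names below are hypothetical illustrations, not the platform's actual schema.

```python
from datetime import datetime, timezone

def detection_to_ilp(table, site, sensor, species, confidence, ts):
    """Format one species detection as an InfluxDB Line Protocol row.

    Hypothetical schema: `site` and `sensor` as indexed tags, the species
    name and confidence as fields, and a nanosecond UTC timestamp.
    """
    nanos = int(ts.replace(tzinfo=timezone.utc).timestamp()) * 1_000_000_000
    return (
        f"{table},site={site},sensor={sensor} "
        f'species="{species}",confidence={confidence} {nanos}'
    )

row = detection_to_ilp(
    "detections", "site_a", "audiomoth_4",
    "Turdus merula", 0.91, datetime(2024, 3, 14, 5, 42),
)
print(row)
```

Because all three pipelines emit rows in the same shape, the dashboard layer can query them uniformly by site, sensor, and time range.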

Pipeline 1

Acoustic Monitoring

AudioMoth field recorders capture WAV audio. WildMetrics Earth processes every recording through three models – identifying bird species, bat calls, and broader soundscape health in a single pass.

BirdNET v2

Bird species identification from raw audio. Detects and classifies thousands of species – producing per-recording species lists, confidence scores, and detection timeseries.

Species ID · Aves · AudioMoth · WAV
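
The per-recording species list can be sketched as a simple reduction over detection events. The record shape and the confidence threshold below are illustrative stand-ins, not BirdNET's actual output schema.

```python
def species_lists(detections, min_conf=0.7):
    """Collapse per-recording detection events into a species list.

    `detections` is assumed to be a list of dicts with "species" and
    "confidence" keys -- a simplified stand-in for real model output.
    """
    best = {}
    for d in detections:
        if d["confidence"] >= min_conf:
            # Keep the highest-confidence event per species.
            if d["species"] not in best or d["confidence"] > best[d["species"]]:
                best[d["species"]] = d["confidence"]
    return sorted(best.items())

events = [
    {"species": "Turdus merula", "confidence": 0.91},
    {"species": "Erithacus rubecula", "confidence": 0.55},
    {"species": "Turdus merula", "confidence": 0.78},
]
print(species_lists(events))  # → [('Turdus merula', 0.91)]
```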

BatDetect2

Bat echolocation call detection and species classification. Identifies bat activity in the ultrasonic spectrum – producing call event timeseries, species-level classification, and nightly activity profiles.

Call Detection · Chiroptera · Ultrasonic

Acoustic Indices

Soundscape biodiversity proxy metrics computed from spectrogram analysis: ACI (Acoustic Complexity), BI (Bioacoustic Index), NDSI (Normalized Difference Soundscape Index), ADI (Acoustic Diversity), and AEI (Acoustic Evenness).

ACI · BI · NDSI · ADI · AEI · Soundscape
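
Of these indices, the ACI (Pieretti et al. 2011) is simple enough to sketch directly: per frequency band, sum the absolute intensity changes between adjacent time frames and normalise by the band's total intensity. This is a minimal toy version operating on a plain list-of-lists spectrogram; production pipelines compute it per temporal block from an FFT.

```python
def acoustic_complexity_index(spectrogram):
    """Acoustic Complexity Index on a toy spectrogram.

    `spectrogram` is rows of intensity values, one row per frequency
    band, columns ordered in time.
    """
    aci = 0.0
    for band in spectrogram:
        total = sum(band)
        if total == 0:
            continue  # silent band contributes nothing
        # Absolute intensity change between adjacent frames, normalised
        # by the band's total intensity.
        diffs = sum(abs(b - a) for a, b in zip(band, band[1:]))
        aci += diffs / total
    return aci

# A constant band scores 0; a rapidly fluctuating band scores high.
print(acoustic_complexity_index([[1, 1, 1, 1], [0, 2, 0, 2]]))  # → 1.5
```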

Outputs

  • Per-species detection timeseries (birds, bats)
  • Diurnal activity histograms (24h)
  • IUCN threat category per detected species
  • Acoustic diversity indices over time
  • Soundscape health trend indicators

Pipeline 2

Satellite Intelligence

Sentinel-2 and Landsat imagery is processed through atmospheric correction, spectral index computation, and change detection – producing a longitudinal land cover intelligence layer for every project site.

Spectral Vegetation Indices

NDVI, EVI, and NDWI computed per acquisition – tracking vegetation health, canopy density, and water stress over time.

NDVI · EVI · NDWI · Sentinel-2
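
These three indices are simple band ratios. The sketch below uses the standard formulas on 0–1 reflectance values, with the usual Sentinel-2 band mapping noted in a comment; the NDWI shown is the McFeeters water variant (NDMI, the moisture variant, uses NIR/SWIR instead).

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def ndwi(green, nir):
    """McFeeters NDWI (open water)."""
    return (green - nir) / (green + nir)

def evi(nir, red, blue):
    """Enhanced Vegetation Index, standard MODIS/Sentinel-2 coefficients."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)

# Sentinel-2 band mapping: B8 = NIR, B4 = red, B3 = green, B2 = blue.
print(round(ndvi(0.45, 0.05), 3))  # dense canopy → 0.8
```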

Land Cover Change Detection

Automated change detection identifies land cover transitions – capturing deforestation, degradation, fire aftermath, and restoration progress. Integrates ESA WorldCover, GFW, and GEDI data.

Change Detection · GFW · GEDI · ESA WorldCover
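
At its simplest, change detection is index differencing between two acquisitions. The sketch below flags per-pixel NDVI drops and reports a magnitude score; the threshold and scoring are deliberately simple illustrations, not the platform's actual algorithm.

```python
def change_events(ndvi_before, ndvi_after, drop_threshold=0.2):
    """Flag per-pixel vegetation loss between two acquisitions.

    Inputs are flat lists of per-pixel NDVI values from two dates;
    the 0.2 drop threshold is illustrative.
    """
    events = []
    for i, (before, after) in enumerate(zip(ndvi_before, ndvi_after)):
        drop = before - after
        if drop >= drop_threshold:
            # Magnitude score = size of the NDVI drop.
            events.append({"pixel": i, "magnitude": round(drop, 3)})
    return events

print(change_events([0.8, 0.7, 0.6], [0.75, 0.3, 0.55]))
# → [{'pixel': 1, 'magnitude': 0.4}]
```

Real pipelines add temporal context (to separate seasonal dips from true loss) and cross-check against GFW and WorldCover layers before emitting an event.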

Temporal Band Statistics

Per-pixel temporal statistics across all Sentinel-2 acquisitions – median, percentile bands, standard deviation – building a statistical fingerprint of each landscape.

Time Series · 20+ Years · Landsat + S2
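
A per-pixel fingerprint of this kind can be sketched with the standard library alone. The field names and percentile choices below are illustrative; a real implementation would run vectorised over raster stacks.

```python
import statistics

def pixel_profile(values):
    """Statistical fingerprint of one pixel across all acquisitions.

    `values` is that pixel's index value (e.g. NDVI) at each acquisition
    date. Returns median, 10th/90th percentiles, and standard deviation.
    """
    # quantiles(n=10) yields 9 cut points: index 0 is the 10th
    # percentile, index -1 the 90th.
    q = statistics.quantiles(values, n=10, method="inclusive")
    return {
        "median": statistics.median(values),
        "p10": q[0],
        "p90": q[-1],
        "std": statistics.stdev(values),
    }

profile = pixel_profile(list(range(1, 11)))
print(profile["median"])  # → 5.5
```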

Outputs

  • Vegetation health timeseries (NDVI/EVI/NDWI)
  • Land cover change events with magnitude scores
  • Permanence risk signals (deforestation, degradation)
  • Per-pixel statistical profiles across all acquisitions
  • GeoTIFF raster layers published as WMS/WFS

Pipeline 3

Camera Trap Detection

Field camera traps collect images and video of wildlife. WildMetrics Earth runs two sequential ML models – first detecting and localizing animals, then classifying them to species level with IUCN threat attribution.

MegaDetector v5

Animal, person, and vehicle detection with bounding box localization. Filters every image for wildlife presence – dramatically reducing processing cost on large deployments.

Object Detection · Bounding Box · GPU Accelerated
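
The filtering step can be sketched against MegaDetector's batch-output JSON, where category "1" marks animal detections; the file names, thresholds, and sample record below are illustrative.

```python
ANIMAL = "1"  # MegaDetector batch output: "1" animal, "2" person, "3" vehicle

def animal_images(md_results, min_conf=0.2):
    """Keep only images with at least one animal detection above threshold,
    so the downstream species classifier never sees empty frames."""
    keep = []
    for image in md_results["images"]:
        if any(d["category"] == ANIMAL and d["conf"] >= min_conf
               for d in image.get("detections", [])):
            keep.append(image["file"])
    return keep

results = {"images": [
    {"file": "trap01/0001.jpg",
     "detections": [{"category": "1", "conf": 0.92,
                     "bbox": [0.1, 0.2, 0.3, 0.4]}]},
    {"file": "trap01/0002.jpg", "detections": []},
]}
print(animal_images(results))  # → ['trap01/0001.jpg']
```

Only the surviving images are handed to SpeciesNET, which is where the cost saving on large deployments comes from.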

SpeciesNET

Fine-grained wildlife species classification. Produces species names, common names, and IUCN Red List threat categories – enabling direct TNFD-ready reporting from camera trap records.

Species Classification · IUCN Categories · Taxonomy

Video Frame Extraction

Automated frame extraction from MP4 video clips – enabling the same species detection pipeline on motion-triggered video as on still images.

MP4 · JPG · Frame Extraction · Mixed Deployments
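
The scheduling half of frame extraction reduces to picking frame numbers at a fixed interval. This is a sketch of that step only; actual decoding would use a tool such as ffmpeg or OpenCV, and the one-second default interval is an assumption.

```python
def frame_indices(duration_s, fps, interval_s=1.0):
    """Frame numbers to sample from a clip: one frame every `interval_s`
    seconds, starting at frame 0."""
    step = max(1, round(fps * interval_s))
    total = int(duration_s * fps)
    return list(range(0, total, step))

# A 5-second clip at 30 fps, sampled once per second:
print(frame_indices(5, 30))  # → [0, 30, 60, 90, 120]
```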

Outputs

  • Per-species detection counts and timeseries
  • IUCN Red List exposure per site
  • Diurnal activity patterns per taxonomic class
  • Species richness by Mammalia, Aves, and other classes
  • Spot-check sample images linked to detections

Analytics & Reporting

From Raw Sensor Data to Conservation Reports

All three pipelines write to a shared timeseries database. WildMetrics Earth aggregates across sensor types and delivers structured reports – automatically, on a schedule.

Species Summary

Unique species count per taxonomic class – Aves (BirdNET), Mammalia (MegaDetector + SpeciesNET), and Chiroptera (BatDetect2) – aggregated from acoustic recorders, camera traps, and satellite-derived habitat layers. Filter by date range, site, or sensor deployment to compare biodiversity across projects.

QuestDB timeseries · Cross-sensor · TNFD-ready
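
The aggregation behind the species summary is a grouped distinct count with a date filter. The record shape below ("class", "species", "date" keys) is a simplified stand-in for the real timeseries rows.

```python
from collections import defaultdict
from datetime import date

def species_summary(detections, start, end):
    """Unique species count per taxonomic class within a date range."""
    per_class = defaultdict(set)
    for d in detections:
        if start <= d["date"] <= end:
            per_class[d["class"]].add(d["species"])
    return {cls: len(species) for cls, species in sorted(per_class.items())}

rows = [
    {"class": "Aves", "species": "Turdus merula", "date": date(2024, 3, 14)},
    {"class": "Aves", "species": "Turdus merula", "date": date(2024, 3, 15)},
    {"class": "Mammalia", "species": "Panthera onca", "date": date(2024, 3, 20)},
]
print(species_summary(rows, date(2024, 3, 1), date(2024, 3, 31)))
# → {'Aves': 1, 'Mammalia': 1}
```

Note the repeated blackbird on two dates still counts once: the summary measures richness, not detection volume.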

Detection Timeline

Daily detection counts grouped by sensor type and pipeline – acoustic events from AudioMoth deployments, camera trap triggers from field units, and satellite acquisition passes from Sentinel-2. Track monitoring coverage gaps, seasonal migration patterns, and dataset completeness across the full project lifecycle.

Daily resolution · 3 pipelines · Gap detection
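
Gap detection at daily resolution can be sketched as finding calendar days with zero events between the first and last observed date; real deployments would apply this per sensor and per site, which this toy version omits.

```python
from datetime import date, timedelta

def coverage_gaps(detection_dates):
    """Days with no detections between the first and last observed date."""
    observed = set(detection_dates)
    day, last = min(observed), max(observed)
    gaps = []
    while day <= last:
        if day not in observed:
            gaps.append(day)
        day += timedelta(days=1)
    return gaps

days = [date(2024, 3, 1), date(2024, 3, 2), date(2024, 3, 4)]
print(coverage_gaps(days))  # → [datetime.date(2024, 3, 3)]
```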

IUCN Breakdown

Every detected species is matched against the IUCN Red List – from Least Concern through Critically Endangered. The breakdown maps directly to TNFD disclosure requirements, giving project developers and corporate buyers an auditable view of threatened species presence at each monitoring site.

IUCN Red List API · TNFD aligned · Per-site
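
The breakdown itself is a count of species per Red List category, ordered from least to most threatened. The species-to-category mapping below is an illustrative input; in the platform it would come from the Red List lookup.

```python
# IUCN Red List categories, least to most threatened.
IUCN_ORDER = ["LC", "NT", "VU", "EN", "CR"]

def iucn_breakdown(species_categories):
    """Count species per Red List category.

    `species_categories` maps species name -> category code.
    """
    counts = {cat: 0 for cat in IUCN_ORDER}
    for cat in species_categories.values():
        if cat in counts:
            counts[cat] += 1
    return counts

site = {
    "Turdus merula": "LC",
    "Panthera onca": "NT",
    "Ara militaris": "VU",
}
print(iucn_breakdown(site))
# → {'LC': 1, 'NT': 1, 'VU': 1, 'EN': 0, 'CR': 0}
```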

Project Overview

The cross-pillar summary combines acoustic biodiversity indices, camera trap species richness, satellite-derived vegetation health (NDVI/EVI), and land cover change events into a single project-level dashboard. This is the view project developers share with verification bodies, buyers use for due diligence, and government planners use for jurisdictional reporting.

6 data pillars Shareable Verra / Gold Standard

Hourly Activity Profiles

24-hour diurnal detection histograms per taxonomic class – built from timestamped BirdNET, BatDetect2, and MegaDetector events. Dawn chorus peaks for birds, dusk echolocation bursts for bats, nocturnal mammal activity from camera traps. A key quality signal that demonstrates ecological realism and validates habitat assessment claims.

24h histogram · Per-class · Ecological validation
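
Binning timestamped events into a 24-hour profile is a one-liner over event hours; the example timestamps are illustrative.

```python
from collections import Counter
from datetime import datetime

def diurnal_histogram(timestamps):
    """24-bin histogram of detection events by hour of day (UTC or local,
    whichever the timestamps carry)."""
    by_hour = Counter(ts.hour for ts in timestamps)
    return [by_hour.get(h, 0) for h in range(24)]

events = [
    datetime(2024, 3, 14, 5, 42),   # dawn chorus
    datetime(2024, 3, 14, 5, 55),
    datetime(2024, 3, 15, 19, 10),  # dusk bat activity
]
hist = diurnal_histogram(events)
print(hist[5], hist[19])  # → 2 1
```

A bird class whose histogram peaks at dawn, or a bat class peaking at dusk, is the ecological-realism signal the paragraph above describes.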

Spot-Check Sampling

Randomly sampled detection records per project – linking each species event to its source audio clip, camera trap image, or satellite tile. Auditors and rating agencies can independently verify any claim by tracing it back to the raw sensor data. No black boxes, no unverifiable scores.

Traceable · Audit-ready · Source-linked

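
For audit purposes the sample must be reproducible: a fixed seed lets an auditor re-run the draw and see the same records. This sketch assumes each record already carries a link to its source file; the field names and seed are illustrative.

```python
import random

def spot_check_sample(detections, k=5, seed=42):
    """Reproducible random sample of detection records for manual audit."""
    rng = random.Random(seed)       # fixed seed -> repeatable draw
    k = min(k, len(detections))     # never ask for more than exists
    return rng.sample(detections, k)

records = [{"id": i, "source": f"clip_{i}.wav"} for i in range(100)]
sample = spot_check_sample(records, k=3)
print(len(sample))  # → 3
```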
Our Principle

Sensor Data You Can Trace. Outputs You Can Defend.

Every metric in WildMetrics Earth traces back to a physical sensor event – a WAV file, a satellite acquisition, an image frame. The models that process those events are published, peer-reviewed, and used in the field conservation community. There are no proprietary black-box scores.

When a buyer, auditor, or rating agency asks "where does this biodiversity score come from?" – you have an answer. BirdNET v2 detected Turdus merula on AudioMoth unit 4 at 05:42 on 14 March. SpeciesNET classified a jaguar at the northeast transect on 3 separate camera nights.

Explainable outputs are not a feature. They are the baseline requirement for any data that enters a carbon or nature credit verification cycle.

Data Sources

Built on Open Science and Field-Proven Sensors

Field Sensors

  • AudioMoth acoustic recorders (WAV, 16–384 kHz)
  • Camera traps β€” JPG images and MP4 video clips

Satellite Data

  • ESA Sentinel-2 MSI (10m resolution, L1C → L2A)
  • Landsat archive (30m, multidecadal)
  • ESA WorldCover land classification
  • Global Forest Watch (GFW) tree cover loss
  • GEDI spaceborne LiDAR (forest structure)

ML Models

  • BirdNET v2 (Cornell Lab / TU Chemnitz)
  • BatDetect2 (University College London)
  • MegaDetector v5 (Microsoft AI for Earth)
  • SpeciesNET (iNaturalist / Google)
  • Acoustic Indices via librosa
  • Sen2Cor atmospheric correction (ESA)
Roadmap

What's Coming to WildMetrics Earth

The current platform is the evidence layer. The roadmap extends into predictive intelligence and real-time alerting.

Real-Time Monitoring Alerts

Automated early-warning signals triggered by acoustic anomalies, NDVI drops, or camera trap disturbance events – delivered before the next verification cycle.

Predictive Degradation Modeling

Machine learning on multi-decade satellite time series – predicting permanence risk trajectories before they appear in standard vegetation indices.

Cross-Sensor Fusion

Integrated inference across acoustic, satellite, and camera trap data – species habitat models calibrated simultaneously against multiple independent sensor types.

Ready to See WildMetrics Earth in Action?

Book a demo and we'll walk you through the dashboard for your project type – carbon project, corporate portfolio, or government planning program.