The AI That Counts Species: When Artificial Intelligence Serves the Living World
300 million biodiversity observations collected by citizens. Algorithms tracking poachers in real time. 200 million protein structures free for the entire planet. The same technology powering our news feeds can also protect life — if we choose what it serves.
TL;DR: We often hear that artificial intelligence is inherently extractive — a technology in service of surveillance, targeted advertising, and the concentration of power. This reading is incomplete. The same infrastructure is today being used to identify endangered species, protect nature reserves against poaching, and accelerate the discovery of medicines for the forgotten diseases of the poor. The question is not “AI yes or no” but “in whose service, under what rules?”
Here is a statistic that should appear in every report on the future of AI: in 2025, ordinary citizens submitted nearly 300 million observations of plants, animals, fungi, and other organisms to a shared platform. That data has fed into more than 4,000 scientific papers on global biodiversity.
These observers are not researchers in white coats. They are hikers with a phone, curious gardeners, children on school trips. Their tool is called iNaturalist. It combines artificial intelligence and collective intelligence to do something that governments and laboratories alone could never fund: map life at a planetary scale, in real time.
That is the use of AI this article documents. Not the promise — the facts.
The 4.3 Million Eyes of the Living World
iNaturalist was born from a master’s thesis at Berkeley in 2008. Today, the platform has 4.3 million registered users, with 400,000 active each month. Its visual identification algorithm analyses your photos and suggests likely species; the community then confirms or corrects those identifications.
The result is an unprecedented stream of ecological data. Scientists use these observations to track the expansion or retreat of invasive species, detect unknown populations of rare plants, measure the effects of climate change on flowering cycles, and document species in regions where no researcher is present.
Here, AI does not replace human observation. It amplifies it. The visual recognition algorithm processes what 4.3 million observers collect. Community validation corrects its errors. The result is more reliable than if either intelligence were operating alone.
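The logic of that human–machine loop can be sketched in a few lines. This is not iNaturalist’s actual code — the function name, the two-thirds threshold, and the tuple format are illustrative assumptions — but it captures the principle: community votes override the model only once they reach clear agreement.

```python
from collections import Counter

def consensus_species(model_suggestions, community_ids, agreement=2/3):
    """Return (species, provenance) for an observation.

    model_suggestions: list of (species, score) from the classifier,
                       sorted by descending score.
    community_ids:     species labels submitted by human identifiers.

    Community votes win only when they reach the agreement threshold;
    otherwise we fall back to the model's top suggestion.
    """
    if community_ids:
        votes = Counter(community_ids)
        species, count = votes.most_common(1)[0]
        if count / len(community_ids) >= agreement:
            return species, "community-validated"
    return model_suggestions[0][0], "model-only"
```

With two of three identifiers agreeing, the community label prevails; with no votes at all, the classifier’s best guess stands, flagged as unvalidated.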
One honest note: the platform is not without geographic bias. Northern Europe and North America are over-represented. Sub-Saharan Africa and rural regions of South-East Asia — where biodiversity is often richest — remain under-documented. The map of life that iNaturalist draws also reflects the map of unequal access to smartphones.
The Sentinel of the Parks
Deforestation and poaching take place in hard-to-reach locations, often at night, always beyond the reach of understaffed field teams. That is where the second AI application worth discussing comes in.
EarthRanger, developed by the Allen Institute for Artificial Intelligence, is a data management platform for conservation teams. It aggregates in real time data from camera traps, acoustic sensors, GPS tags on animals, and field team reports. The AI identifies patterns — a vehicle stopped too long, an abnormal nocturnal movement in a sensitive zone — and alerts rangers.
According to secondary reports, the system is deployed in 76 countries and 650 protected areas. A review published in 2025 in Frontiers in Ecology and Evolution — whose lead author is affiliated with the EarthRanger programme — documents how this generation of tools “revolutionises the detection and disruption of illegal wildlife trade”, combining open-source software, AI, acoustic sensors, and satellite imagery.
Important nuance: the deployment figures (76 countries, 650 parks) come from secondary communications and not from the Frontiers article itself. They should be treated as indicative orders of magnitude rather than independently verified data.
What can be stated with certainty: the approach has changed the nature of conservation work. Teams that were previously reactive — discovering carcasses after the fact — can now be preventive. Poaching remains a massive problem; these tools do not solve it alone, but they shift the odds in favour of those defending wildlife.
Humanity’s Free Protein
In 2022, Google DeepMind published a database of 200 million protein structures predicted by AlphaFold, freely accessible to anyone in the world. Determining the 3D structure of a protein previously took months or years; AlphaFold does it in seconds.
These structures matter to fundamental biology as much as to pharmacology. To understand how a protein folds is to understand how an organism works — or fails to work. In the field of conservation, proteins are biomarkers: they allow diseases to be detected in wild animals, mechanisms of climate adaptation to be understood, and treatments for endangered species to be developed.
More than 500,000 researchers from 190 countries have accessed the AlphaFold database. Among them, teams working with the Drugs for Neglected Diseases initiative (DNDi) on leishmaniasis and Chagas disease — two diseases that kill tens of thousands of people per year in the poorest regions, and that hold no interest for pharmaceutical laboratories whose model depends on patients who can pay.
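“Freely accessible” here is concrete: an entry in the database is one HTTP request away. The sketch below builds the public URLs for a given UniProt accession; the file-naming scheme (`AF-<accession>-F1-model_v4`) reflects the current EMBL-EBI convention and may evolve, so treat it as an assumption to check against their documentation.

```python
def alphafold_urls(uniprot_accession, version=4):
    """Build public AlphaFold DB URLs for one UniProt accession.

    Returns the JSON metadata endpoint plus direct links to the
    predicted structure in PDB and mmCIF formats.
    """
    base = "https://alphafold.ebi.ac.uk"
    entry = f"AF-{uniprot_accession}-F1-model_v{version}"
    return {
        "api": f"{base}/api/prediction/{uniprot_accession}",
        "pdb": f"{base}/files/{entry}.pdb",
        "cif": f"{base}/files/{entry}.cif",
    }
```

For example, `alphafold_urls("P02185")` points at the predicted structure of sperm whale myoglobin — no licence negotiation, no paywall, no institutional subscription required.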
This is where the question of access becomes political. A technology whose results are locked behind patents remains in service of those who can pay. A technology whose results are freely shared becomes a common good — and can serve the forgotten diseases that the market ignores.
AlphaFold was published in open access. That choice was not inevitable; it was deliberate.
Extractive AI Versus Regenerative AI: How to Tell the Difference
When faced with a project or product presented as “AI for good”, a few questions help distinguish greenwashing from reality:
1. Who controls the data? iNaturalist is managed by a non-profit organisation; the data is open access. EarthRanger is open source. AlphaFold is freely published. Compare this with an ecological AI tool whose data remains the property of a publicly listed company: the business model will eventually steer decisions.
2. Who benefits from the result? Regenerative AI produces common goods — an accessible database, a patent-free medicine, a shared map. Extractive AI captures value for its shareholders. The same technology can serve either depending on the governance structure.
3. Is the system auditable? An algorithm that influences conservation decisions — where to patrol, which zone to protect — must be open to scrutiny by local stakeholders. The opacity of proprietary models poses a democratic problem as much as a technical one.
4. Does AI concentrate or distribute agency? iNaturalist gives a secondary school student in Brittany or a researcher in Nairobi the means to contribute to science. That is the distribution of agency. A system that concentrates decision-making capacity in the hands of a handful of operators moves in the opposite direction.
These criteria are not a perfect checklist. They are a starting point for asking the right questions — before accepting that the next technology “in service of the planet” actually is.
What You Can Do
The most direct contribution is also the most accessible: install iNaturalist and photograph the living world around you. A single observation in your garden, your street, your local forest enriches a global database used by researchers you will never meet.
Beyond individual gestures, institutional choices carry more weight. Supporting open-access publication policies, demanding that AI projects funded by public money share their data, and favouring companies that publish their algorithms are structural levers.
The dividing line between extractive AI and regenerative AI is not technological. It is political. It is drawn in legal statutes, licences, governance structures — and in the collective choices we make about what technology should serve.
300 million biodiversity observations. 200 million protein structures. Parks protected across four continents. These figures exist because institutions chose to put their tools in common. Others made the opposite choice.
The question is not whether AI will “save the planet”. That is a poorly framed question. The right question: which AI do we trust with our attention, our time, and our votes?
Sources
- iNaturalist — Wikipedia — verified 2026-05-02
- The rising tide of conservation technology — Frontiers in Ecology and Evolution — published 27 May 2025, verified 2026-05-02
- AlphaFold Protein Structure Database — EMBL-EBI — verified 2026-05-02
- AlphaFold reveals the structure of the protein universe — Google DeepMind — published July 2022, verified 2026-05-02