Why this exists
The One Health research landscape spans thousands of journals across veterinary medicine, wildlife biology, epidemiology, and public health. The evidence is there. Finding it, comparing it, and trusting it is the hard part.
The problem
A veterinarian treating a zoonotic infection and an epidemiologist tracking the same pathogen in wildlife populations are often reading different journals, using different search tools, and arriving at different bodies of evidence — even though the underlying science overlaps.
The One Health principle recognizes that human, animal, and environmental health are deeply interconnected. But the research infrastructure hasn't caught up. Evidence is siloed by discipline, scattered across publishers, and locked behind search interfaces that return papers, not answers.
When a clinician needs to know whether a treatment is supported by evidence, they don't need a list of PDFs. They need to know: how many studies, what types, in which species, with what results, and how strong is the evidence? That's the question we set out to answer.
What we built
onehealth.science is an AI-powered evidence search platform. We continuously index metadata and abstracts from research relevant to One Health — veterinary medicine, zoonotic disease, wildlife health, epidemiology, antimicrobial resistance, and the interfaces between them.
For every paper, our extraction pipeline produces a structured Evidence Card: study type, PICO fields (population, intervention, comparison, outcome), consensus-validated claims, and a GRADE-aligned quality level. These claims are then normalized, embedded using PubMedBERT, and indexed for semantic search — enabling you to find related evidence across species and terminology.
The result is a searchable evidence index where you can ask a question and get back not just relevant papers, but structured evidence: what study designs were used, what each one found, and how strong the evidence is — all with GRADE quality levels and full provenance to the source.
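To make the pipeline concrete, here is a minimal sketch of what an Evidence Card and a semantic lookup over embedded claims might look like. The field names, the toy two-dimensional vectors, and the brute-force ranking are illustrative assumptions only; the production system uses PubMedBERT embeddings and a proper vector index.

```python
# Illustrative sketch only: field names, toy vectors, and brute-force
# ranking are assumptions, not the production pipeline.
from dataclasses import dataclass
import math

@dataclass
class EvidenceCard:
    title: str
    study_type: str   # e.g. "RCT", "cohort", "case series"
    pico: dict        # population, intervention, comparison, outcome
    claims: list      # consensus-validated claim strings
    grade: str        # GRADE-aligned level: "high" .. "very low"
    source_doi: str   # provenance back to the source paper

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def search(query_vec, index):
    """Rank (embedding, card) pairs by similarity to the query vector."""
    ranked = sorted(index, key=lambda item: cosine(query_vec, item[0]),
                    reverse=True)
    return [card for _, card in ranked]
```

In practice the query itself is embedded with the same model as the claims, so "doxycycline in dogs" and "tetracycline therapy, canine" land near each other even though the words differ — that is what lets the index match evidence across species and terminology.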
What we are (and aren't)
We are an independent research tool that makes published evidence more accessible, structured, and comparable. We help people find and evaluate the evidence that already exists.
We are not a journal, a clinical decision engine, or a replacement for expert judgment. Our automated grading is transparent, structured triage, not peer review. We never make clinical recommendations. We report what the evidence shows and where it's uncertain.
Independence
onehealth.science is not affiliated with the World Health Organization, the CDC, the World Organisation for Animal Health (WOAH, formerly OIE), any university, or any publisher. "One Health" is a scientific framework used by researchers and institutions worldwide. We use it because it accurately describes the scope of our platform: evidence across the human-animal-environment interface.
We are funded independently. Our methodology is public. Our grading rubric is versioned and documented. We have no financial relationship with any journal, publisher, or pharmaceutical company that would create a conflict of interest in how we present evidence.
The One Health framework
One Health is the understanding that the health of people, animals, and the environment are closely linked and interdependent. It's not a new idea — the connection between animal and human disease has been recognized for centuries — but it has become an increasingly formal framework for research, policy, and practice.
Roughly 75% of new or emerging infectious diseases in people come from animals. Antimicrobial resistance travels across species and ecosystems. Climate change reshapes the distribution of vectors and pathogens. These aren't separate problems; they're facets of the same system.
The research addressing these interconnections is published across dozens of disciplines and hundreds of journals. A study on leptospirosis in wildlife might be published in an ecology journal. The veterinary treatment data might be in a clinical veterinary journal. The human epidemiology might be in a public health journal. Our platform connects these threads.
Our origin
onehealth.science was built by the team behind PupPilot, a platform focused on improving pet health outcomes through better information. Working in the veterinary space, we kept running into the same problem: the evidence existed, but finding it, evaluating it, and comparing it across studies was enormously time-consuming, even for professionals who do this regularly.
We realized that the infrastructure for evidence synthesis in veterinary and One Health research hadn't kept pace with what's now possible computationally. The tools that exist are either general-purpose academic search engines (which don't understand species, study design, or evidence quality) or manually curated databases (which are excellent but can't scale to cover the full literature).
So we built what we wished existed: a system that reads the research, extracts the claims, grades the evidence, links it across species and disciplines, and makes the whole thing searchable. Automatically, transparently, and continuously.
Contact
We welcome questions about our methodology, suggestions for coverage, and feedback from the research community. Reach us at hello@onehealth.science.
For questions about our grading rubric or extraction pipeline, see our methodology documentation. For details on which journals and databases we index, see sources & coverage.