Research · In Progress

Arctic Tracker

Conservation intelligence for Arctic species trade

A data platform integrating 473,000+ CITES trade records, IUCN assessments, and illegal seizure data for 43 Arctic species. Co-authored preprint with Dr. Tom Barry now under review.

Role: Lead Developer & Researcher
Duration: Ongoing
Client: University of Akureyri
Status: In Progress

The Challenge

Conservation data for Arctic species is scattered across CITES trade databases, IUCN assessments, CMS listings, and NAMMCO catch records. No single platform integrates these sources, making it nearly impossible to assess whether trade conventions are actually protecting Arctic wildlife. Researchers rely on fragmentary data — individual trade records, single-species assessments — without the cross-referencing needed to detect systemic patterns. Meanwhile, climate change accelerates threats faster than traditional assessment cycles can track.

The Approach

Arctic Tracker unifies five international data sources into a single analytical platform. Python ETL pipelines process raw data from CITES (473,102 trade records), IUCN (322 assessments), CMS (31 species listings), and NAMMCO (958+ catch records) and load it into a Supabase PostgreSQL database of 18 normalized tables. The React/TypeScript frontend provides species-level detail pages with tabbed views across all data dimensions. Seven publication-ready interactive visualizations built with Plotly support the research paper. A pre-aggregation layer reduces query response times from 3–5 seconds to under 500 milliseconds. An MCP server lets AI assistants query the database directly for real-time cross-referencing during research.
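To make the ingestion step concrete, the sketch below shows the general shape of one such pipeline stage in Python: read a raw CITES trade export with pandas, normalize a few columns, and upsert the rows into Supabase in batches. The table and column names (cites_trade_records, taxon, quantity, and so on) are illustrative assumptions rather than the project's actual schema, and the real scripts do considerably more validation.

```python
# Illustrative ETL stage (assumed table/column names, not the project's schema):
# normalize a raw CITES trade export and upsert it into Supabase in batches.
import os

import pandas as pd
from supabase import create_client  # supabase-py client

supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_SERVICE_KEY"])

def load_cites_trades(csv_path: str, batch_size: int = 1000) -> int:
    """Read a raw CITES trade CSV, keep the key fields, and upsert in batches."""
    df = pd.read_csv(csv_path)
    df = df.rename(columns={"Taxon": "taxon", "Year": "year", "Quantity": "quantity",
                            "Importer": "importer", "Exporter": "exporter"})
    df = df.dropna(subset=["taxon", "year"])       # drop rows missing key fields
    df["year"] = df["year"].astype(int)
    df = df.where(pd.notnull(df), None)            # NaN is not valid JSON; send NULLs instead
    records = df.to_dict(orient="records")
    for start in range(0, len(records), batch_size):   # batched upserts keep requests small
        supabase.table("cites_trade_records").upsert(records[start:start + batch_size]).execute()
    return len(records)
```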

Outcomes

  • Trade records: 473,102 CITES trade records processed, covering 1975–2024 across 43 Arctic species
  • Species monitored: 43 Arctic CITES-listed species, including polar bears, walruses, narwhals, whales, and rare birds
  • Preprint: under review; co-authored with Dr. Tom Barry, published on Research Square in December 2025
  • Illegal seizures: 881 documented illegal wildlife trade seizure records integrated for enforcement analysis
  • Codebase: 94,000+ lines spanning the React/TypeScript frontend, Python ETL, Supabase backend, MCP server, and 7 visualizations
  • Query performance: pre-aggregated trade summaries reduced page loads from 3–5 seconds to under 500 ms

Arctic Tracker is a data platform and analytical environment built to answer a question that conservation policy has skirted for decades: are international trade conventions actually protecting Arctic wildlife? The platform integrates 473,102 CITES trade records, 322 IUCN assessments, 881 illegal trade seizures, and NAMMCO catch statistics for 43 Arctic species — making visible what was previously scattered across incompatible international databases.

The project is a collaboration with Dr. Tom Barry at the University of Akureyri. A preprint — Trading for Conservation — was published on Research Square in December 2025 and is currently under review.

The Research

The study evaluates CITES effectiveness across five decades. The findings are sobering: only 18.6% of the 43 species (8 species) show stable or improving populations, while a further 69.8% (30 species) continue to decline despite decades of trade protection. Legal trade volumes have fallen, but illegal trade persists — and the data reveals that climate change, not trade, has become the dominant threat to most Arctic species.

This matters because conservation policy still treats trade regulation as the primary lever. The data suggests a more complex reality: protection regimes designed in the 1970s are encountering environmental pressures they were never built to address. Arctic Tracker makes these patterns visible so that evidence — not assumption — drives the next generation of conservation decisions.

Architecture

The platform comprises three components: a Python ETL backend for data processing, a Supabase PostgreSQL database with 18 normalized tables, and a React/TypeScript frontend with 45+ page components.

  • Backend — Python 3.12 with 30+ ETL scripts for CITES, IUCN, CMS, and NAMMCO data ingestion. Pandas and NumPy for analysis. Automated quality assessment and database auditing.
  • Database — Supabase PostgreSQL with 18 tables covering species, trade records, assessments, catch data, illegal seizures, conservation measures, and distribution ranges. Pre-aggregated summary tables deliver sub-500ms query performance (a minimal aggregation sketch follows this list).
  • Frontend — React 18 with TypeScript, Vite, TanStack Query, Shadcn UI, and Tailwind CSS. Species detail pages with tabbed navigation across trade, assessment, catch, illegal trade, and timeline data. Admin panel for data management.
  • Visualizations — Seven publication-ready interactive figures built with Plotly: trade patterns, enforcement analysis, partner networks, conservation status trends, status change heatmaps, illegal seizure analysis, and wild-origin trade flows.
  • MCP Server — A TypeScript Model Context Protocol server connects AI assistants directly to the Supabase database, enabling real-time cross-referencing of species data during research workflows.
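The sub-500 ms figure rests on the pre-aggregated summary tables mentioned in the Database item above. The sketch below illustrates the idea under assumed names (species_trade_summary and its columns are hypothetical, and pagination of the select is elided): collapse the raw trade rows into one summary row per species per year, so the frontend reads a few hundred summary rows instead of scanning the full trade table.

```python
# Minimal pre-aggregation sketch (assumed table/column names, not the real schema):
# rebuild a per-species, per-year trade summary from the raw trade records.
import os

import pandas as pd
from supabase import create_client

supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_SERVICE_KEY"])

def rebuild_trade_summary() -> None:
    # Pull the raw rows (pagination of large result sets is elided for brevity).
    rows = supabase.table("cites_trade_records").select("taxon,year,quantity").execute().data
    df = pd.DataFrame(rows)
    # One row per species per year: total traded quantity plus record count.
    summary = (df.groupby(["taxon", "year"], as_index=False)
                 .agg(total_quantity=("quantity", "sum"), record_count=("quantity", "size")))
    supabase.table("species_trade_summary").upsert(summary.to_dict(orient="records")).execute()
```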

Data Governance

No automated scraping of unverified sources. Every record is traced to its international source database. Data quality scripts run automated audits across the full dataset.
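As an illustration of what such an audit pass can look like, the sketch below checks two simple invariants under assumed table and column names: every trade record should reference a known species, and every trade year should fall inside the 1975–2024 range the dataset covers.

```python
# Sketch of an automated audit (assumed table/column names): flag trade rows whose
# taxon has no matching species entry, and rows with implausible years.
import os

from supabase import create_client

supabase = create_client(os.environ["SUPABASE_URL"], os.environ["SUPABASE_SERVICE_KEY"])

def audit_trade_records() -> dict:
    species = {row["taxon"] for row in supabase.table("species").select("taxon").execute().data}
    trades = supabase.table("cites_trade_records").select("id,taxon,year").execute().data
    orphaned = [t["id"] for t in trades if t["taxon"] not in species]
    bad_years = [t["id"] for t in trades if not 1975 <= t["year"] <= 2024]
    return {"orphaned_records": len(orphaned), "out_of_range_years": len(bad_years)}
```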

Technology Stack

Supabase · React · Vite · TypeScript · MCP Server

People

Dr. Tom Barry – Research Lead & Co-author

Lessons Learned

  • Integrating multiple international databases reveals patterns invisible in any single source — cross-referencing trade volume with IUCN status changes showed 69.8% of species continue declining despite CITES protection.
  • Climate change, not illegal trade, emerged as the dominant threat. The data challenges the assumption that stronger trade enforcement alone can protect Arctic species.
  • Pre-aggregation is essential for research platforms. Summary tables transformed the experience without sacrificing analytical depth.
  • An MCP server connecting AI assistants directly to the database created research workflows that would have taken weeks manually.
