03 · Impact

1,247 citations, 38% YoY — and a public replication tracker.

Citation counts are a noisy signal of research quality. We pair them with a transparent replication tracker so a reader can tell which claims have been independently rebuilt and which are still load-bearing on a single team.

  • 1,247 citations indexed
  • +38% YoY growth
  • 31 independent replications
[Figure: IMPACT · TRK-26 indexing — citation timeline, 2020–2026, reaching 1,247. Index: OpenAlex + arXiv. Last sync 2026-04-22.]

Citations are necessary, not sufficient

A citation count tells you that other researchers have read a paper. It does not tell you that the result is correct, that it has been independently rebuilt, or that the field has even tried. The lab tracks both numbers — the citation timeline and the replication ratio — and publishes them side by side.


A claim crosses the 100-citation threshold → replication audit opens.

Once a claim has accumulated 100 citations, the lab automatically opens a replication audit and posts the results in the public journal. We do this whether the claim is one of ours or not.
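The trigger rule above can be sketched in a few lines. This is an illustrative sketch only; the constant and function names are hypothetical, not the lab's actual tooling.

```python
# Hypothetical sketch of the audit trigger described above.
AUDIT_THRESHOLD = 100

def should_open_audit(citations: int, audit_already_open: bool) -> bool:
    """Open a replication audit once a claim crosses the citation
    threshold, unless one is already open for it."""
    return citations >= AUDIT_THRESHOLD and not audit_already_open
```

The check applies to any claim in the index, not only the lab's own, which is why it keys on the citation count alone.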

What we count, and what we do not

We count appearances in the OpenAlex and arXiv indexes, weighted by venue type, with self-citations excluded by default. We do not count blog mentions or social-media posts in the headline number, though those are tracked in the secondary feed for context.

Source                 Weight   Self-cite filter
Peer-reviewed venue    1.00     yes
Workshop               0.65     yes
Pre-print              0.40     yes
Thesis                 0.50     yes
Book chapter           0.85     yes
Patent                 0.20     no
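The weighting rule can be made concrete with a short sketch. The weight values are the published ones from the table above; the code structure and names are assumptions, not the lab's implementation.

```python
# Illustrative weighted citation count. Weights come from the
# published table; everything else here is a sketch.
WEIGHTS = {
    "peer_reviewed": 1.00,
    "workshop": 0.65,
    "preprint": 0.40,
    "thesis": 0.50,
    "book_chapter": 0.85,
    "patent": 0.20,
}
# Self-citations are filtered for every source type except patents.
SELF_CITE_FILTERED = set(WEIGHTS) - {"patent"}

def weighted_count(citations):
    """citations: iterable of (venue_type, is_self_citation) pairs.
    Returns the venue-weighted total with self-citations excluded."""
    total = 0.0
    for venue, is_self in citations:
        if is_self and venue in SELF_CITE_FILTERED:
            continue  # excluded by default, per the table
        total += WEIGHTS[venue]
    return total
```

A peer-reviewed citation plus a workshop citation therefore contributes 1.65 to the headline number, while a self-citation at any filtered venue contributes nothing.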

Replication tracker

Every cited paper from the lab carries a status flag in the public tracker:

  • Replicated — at least one independent group has rebuilt the central figure to within reported error bands.
  • Pending — replication is open but not yet complete; reviewer assigned.
  • Adjusted — replication revealed a bound that needs updating; the paper carries an addendum.
  • Withdrawn — replication failed and the claim has been withdrawn. We publish the negative result with the same prominence as the original.
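The four flags form a closed set, which a consumer of the tracker might model as an enum. This is a hypothetical schema mirroring the list above; the tracker's real data model is not published here.

```python
from enum import Enum

# Hypothetical model of the four public status flags.
class ReplicationStatus(Enum):
    REPLICATED = "replicated"  # rebuilt within reported error bands
    PENDING = "pending"        # audit open, reviewer assigned
    ADJUSTED = "adjusted"      # bound updated; addendum attached
    WITHDRAWN = "withdrawn"    # replication failed; claim withdrawn

def is_independently_confirmed(status: ReplicationStatus) -> bool:
    """Only REPLICATED means an independent group has rebuilt the
    central figure; all other flags leave the claim unconfirmed."""
    return status is ReplicationStatus.REPLICATED
```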

Top venues

The headline citation count masks a long tail. The top venues below account for 62% of the indexed citations.

NeurIPS

Reinforcement learning and agentic-systems track. Largest single venue contribution.

412 citations · 7 papers

Nature Methods

Methodology pieces on reproducibility pipelines and content-addressed artefact management.

198 citations · 2 papers

IEEE S&P

Cryptographic protocol verification, including the 18-protocol audit series.

156 citations · 3 papers

ICML

Adversarial robustness benchmarks and counterfactual cohort analysis.

127 citations · 4 papers

How to read the impact page

If you have arrived here from a citation list and want to know whether a particular paper has been replicated: the public tracker is queryable by DOI. If the tracker says Pending, the lab welcomes your replication — and will cite it.
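A DOI lookup against the tracker might look like the sketch below. The DOIs shown are placeholders (the `10.0000` test prefix) and the mapping stands in for whatever interface the tracker actually exposes, which is not documented here.

```python
# Sketch of a DOI status lookup. The tracker's real interface and
# these DOIs are placeholders, not real records.
TRACKER = {
    "10.0000/placeholder.2024.001": "replicated",
    "10.0000/placeholder.2025.014": "pending",
}

def status_for_doi(doi: str) -> str:
    """Return the replication flag for a DOI, or 'not tracked'."""
    return TRACKER.get(doi, "not tracked")
```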

Continue exploring the lab

Open the rest of the research surface.

Each tile of the bento grid is a primitive of the workflow — instrumentation, modelling, peer review, publication. Together they form a transparent chain of custody from raw signal to citable claim.