Nature Study: Over 140,000 Fake AI Citations in Research Papers in 2025 Alone
A study published in Nature found more than 140,000 AI-generated fake citations across four major research repositories, covering both peer-reviewed papers and preprints published in 2025 alone. The finding represents the first large-scale empirical quantification of LLM hallucination contaminating the academic record. On the same day the study appeared, arXiv updated its Code of Conduct to clarify that authors bear full responsibility for all paper content, regardless of how it was generated.
Why It Matters
Academic literature is a primary training source for future AI models. Fake citations entering the published record at this scale create a feedback loop: AI-generated errors become training data for the next generation of models, which can then reproduce and compound those errors across successive training generations.