Section 01
Introduction: Citation Hallucination in AI Deep Research Agents and Its Solutions
A large-scale study found that 3-13% of the citation URLs generated by commercial LLMs and deep research agents are fabricated links produced by hallucination, and the more strongly an agent bills itself as capable of "deep research", the higher its share of fake citations. The researchers have open-sourced urlhealth, a tool that detects and helps correct this problem: in self-correction experiments, non-resolvable citations were reduced by factors of 6 to 79, bringing the final share below 1%. This article examines the background, research methodology, core findings, and solutions in depth.
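To make the detection step concrete, the sketch below shows one way a link-health check could work: fetch each cited URL and flag those that fail to resolve or return an HTTP error status. This is an illustrative assumption about the general technique, not the actual urlhealth implementation; the function names, thresholds, and the audit report format are all hypothetical.

```python
# Minimal sketch of a citation link-health check (illustrative only;
# not the urlhealth API). It flags URLs that fail to resolve or return
# an HTTP error -- the class of "fake citation" the study measures.
import requests

def check_citation_url(url: str, timeout: float = 10.0) -> bool:
    """Return True if the URL resolves to a non-error HTTP response."""
    try:
        # Try a lightweight HEAD request first; some servers reject HEAD,
        # so fall back to GET before declaring the link dead.
        resp = requests.head(url, timeout=timeout, allow_redirects=True)
        if resp.status_code >= 400:
            resp = requests.get(url, timeout=timeout,
                                allow_redirects=True, stream=True)
        return resp.status_code < 400
    except requests.RequestException:
        # DNS failure, connection refused, or timeout:
        # the citation is non-resolvable.
        return False

def audit_citations(urls: list[str]) -> dict:
    """Summarize the share of non-resolvable citations in a report."""
    dead = [u for u in urls if not check_citation_url(u)]
    return {
        "total": len(urls),
        "non_resolvable": len(dead),
        "dead_urls": dead,
        "fake_rate": len(dead) / len(urls) if urls else 0.0,
    }

if __name__ == "__main__":
    report = audit_citations([
        "https://example.com/",               # resolves
        "https://example.com/made-up-paper",  # likely 404: flagged as fake
    ])
    print(f"{report['fake_rate']:.1%} of citations did not resolve")
```

In a self-correction loop of the kind the study describes, URLs flagged by a check like this would be returned to the model with a request to replace or remove them, which is how repeated passes can drive the non-resolvable share below 1%.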