Section 01
[Introduction] InftyThink: Breaking the Length Limit of Long-Context Reasoning for Large Language Models
The InftyThink framework, developed by the REAL Lab at Zhejiang University, breaks the length limit of long-context reasoning in traditional large language models (LLMs) through a segmented reasoning mechanism, enabling efficient understanding of and reasoning over ultra-long texts. The work, accepted at ICLR 2026, targets core weaknesses of current LLMs such as diluted attention and the "Lost in the Middle" phenomenon: it builds a hierarchical reasoning architecture that mimics human reading patterns, balancing computational efficiency with deep-understanding capability.
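The segmented reasoning idea described above can be sketched as a loop that processes one bounded chunk at a time while carrying forward a compact summary, so no single model call needs the full context. This is a minimal illustrative sketch only; the function names (`call_llm`, `segment`, `segmented_reason`) and the prompt format are assumptions for demonstration, not the InftyThink reference implementation.

```python
# Hypothetical sketch of segmented ("iterate and summarize") reasoning.
# All names here are illustrative assumptions, not InftyThink's actual API.

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM call; returns a trivial stub here."""
    return f"[model output for prompt of {len(prompt)} chars]"

def segment(text: str, size: int) -> list[str]:
    """Split a long document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def segmented_reason(document: str, question: str, seg_size: int = 2000) -> str:
    """Reason over a long document one segment at a time, carrying
    forward a bounded summary instead of the full raw context."""
    summary = ""
    for chunk in segment(document, seg_size):
        prompt = (
            f"Summary so far: {summary}\n"
            f"Next segment: {chunk}\n"
            f"Question: {question}\n"
            "Update the summary with anything relevant to the question."
        )
        summary = call_llm(prompt)  # each call sees only one chunk + summary
    # The final answer is produced from the compact summary, not the raw text.
    return call_llm(f"Summary: {summary}\nAnswer the question: {question}")
```

The key property this loop illustrates is that per-call context length stays bounded by `seg_size` plus the summary, regardless of total document length.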