Section 01
[Main Post/Introduction] Core Research on Analyzing LLM Generative Behavior Using Nonlinear Dynamics
This study combines machine learning with nonlinear dynamical systems theory. By modeling the text sequences an LLM generates as symbolic trajectories in a state space, it examines how sampling temperature and random seeds relate to generative stability. Focusing on GPT-2, the research characterizes the dynamics of the generative process, offering a new perspective for understanding LLM behavior.
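The idea of treating generation as a symbolic trajectory can be illustrated with a minimal sketch. The toy next-token model below is a hypothetical stand-in for GPT-2 (a fixed random logit table, not the real model); it shows how temperature-scaled sampling with a fixed seed produces a discrete trajectory, and how low versus high temperature changes how much of the symbol space the trajectory visits.

```python
import numpy as np

def sample_trajectory(logits_fn, vocab_size, length, temperature, seed):
    """Generate a symbolic trajectory by temperature-scaled sampling.

    logits_fn maps the current token to next-token logits; here it is a
    toy stand-in for an LLM such as GPT-2.
    """
    rng = np.random.default_rng(seed)
    token = 0
    trajectory = [token]
    for _ in range(length):
        logits = logits_fn(token) / temperature
        logits -= logits.max()            # numerical stability
        probs = np.exp(logits)
        probs /= probs.sum()
        token = int(rng.choice(vocab_size, p=probs))
        trajectory.append(token)
    return trajectory

# Hypothetical next-token model: a fixed random logit table per token.
V = 8
table = np.random.default_rng(0).normal(size=(V, V))
logits_fn = lambda t: table[t]

# Same seed, different temperatures: low T is near-deterministic,
# high T explores more of the symbol space.
low = sample_trajectory(logits_fn, V, 20, temperature=0.1, seed=42)
high = sample_trajectory(logits_fn, V, 20, temperature=2.0, seed=42)
print("T=0.1:", low)
print("T=2.0:", high)
```

In this framing, each generated sequence is one orbit of a stochastic dynamical system; temperature plays the role of a control parameter, and the seed fixes the realization of the noise.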