Section 01
Introduction: Panorama of Policy Distillation Technology and Resource Trove
Policy distillation is a key technique for making Large Language Models (LLMs) smaller and more efficient. This article introduces the GitHub project "awesome-on-policy-distillation", maintained by chrisliu298: a carefully curated resource collection covering core papers, technical reports, open-source frameworks, and practical tools that helps researchers and engineers get up to speed in this field quickly.