Section 01
Knowledge Distillation Empowers Sequential Recommendation: A New Path to Injecting LLM Semantic Understanding
Sequential recommendation models excel at capturing temporal patterns in user behavior but struggle to capture semantic information. This article proposes a knowledge distillation method that distills text-based user profiles, generated offline by a pre-trained LLM, into a sequential recommendation model. Because the LLM is never invoked at serving time, the approach injects semantic understanding while balancing recommendation quality against system efficiency.
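A minimal sketch of this idea, with illustrative names and dimensions that are assumptions rather than details from the article: the LLM encodes each user's text profile offline into an embedding, and the sequential model's user representation is trained to align with a projection of that embedding via a distillation term added to the usual recommendation loss.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the article)
d_llm, d_rec, n_users = 768, 64, 4

# Precomputed offline: LLM embeddings of text-based user profiles (the "teacher")
llm_profile_emb = rng.normal(size=(n_users, d_llm))

# User vectors produced by the sequential recommender (the "student")
rec_user_emb = rng.normal(size=(n_users, d_rec))

# Projection aligning teacher and student dimensions; in training this
# would be a learnable matrix, here it is random for the sketch
W = rng.normal(size=(d_llm, d_rec)) / np.sqrt(d_llm)

def distill_loss(teacher, student, W):
    """Cosine-based distillation: pull each student user vector toward
    the projected LLM profile embedding of the same user."""
    t = teacher @ W
    t = t / np.linalg.norm(t, axis=1, keepdims=True)
    s = student / np.linalg.norm(student, axis=1, keepdims=True)
    # 1 - cosine similarity, averaged over users; 0 means perfect alignment
    return float(np.mean(1.0 - np.sum(t * s, axis=1)))

loss = distill_loss(llm_profile_emb, rec_user_emb, W)
# In training, the total objective would combine this with the
# recommendation loss, e.g. total = rec_loss + lam * loss
```

At serving time only the student model runs, so the LLM adds no inference cost; the distillation term is paid entirely during training.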