Section 01
Introduction: delta-Mem, an Efficient Solution to the Long-Conversation Memory Problem in LLMs
The delta-Mem framework, developed by Declare Lab at the Singapore University of Technology and Design, targets the context-forgetting problem that large language models (LLMs) face in long conversations. Instead of reprocessing the full dialogue history at every turn, it uses an incremental memory update mechanism, which noticeably improves the coherence and accuracy of multi-turn dialogue while keeping computational overhead low. The framework thus offers an efficient and practical approach to memory augmentation for LLMs.
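To make the idea of incremental memory updates concrete, here is a minimal sketch: a memory store absorbs a small per-turn delta rather than re-reading the whole history, and only its compact summary is injected into the prompt. This is purely illustrative; delta-Mem's actual design is not specified here, and every name below (`ConversationMemory`, `extract_delta`, the `key=value` convention) is a hypothetical stand-in.

```python
from dataclasses import dataclass, field

@dataclass
class ConversationMemory:
    # Memory maps fact keys to their latest values; each turn contributes
    # only a small delta instead of reprocessing the full history.
    facts: dict = field(default_factory=dict)
    turns_seen: int = 0

    def update(self, delta: dict) -> None:
        # Merge the per-turn delta: new keys are added, stale keys overwritten.
        self.facts.update(delta)
        self.turns_seen += 1

    def context(self) -> str:
        # Compact summary to inject into the prompt instead of raw history.
        return "; ".join(f"{k}: {v}" for k, v in self.facts.items())

def extract_delta(user_turn: str) -> dict:
    # Stand-in for an LLM call that extracts new facts from the latest turn.
    # A trivial "key=value" convention is used here purely for illustration.
    delta = {}
    for token in user_turn.split(","):
        if "=" in token:
            key, value = token.split("=", 1)
            delta[key.strip()] = value.strip()
    return delta

memory = ConversationMemory()
memory.update(extract_delta("name=Alice, city=Singapore"))
memory.update(extract_delta("city=Tokyo"))  # a later turn overrides the old value
print(memory.context())  # → name: Alice; city: Tokyo
```

The key property the sketch demonstrates is that per-turn cost depends only on the size of the new delta, not on the conversation length, which is what keeps the overhead low as dialogues grow.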