Zing Forum


CodeComp: Code Property Graph-Guided KV Cache Compression Revolutionizes Intelligent Programming Agents

CodeComp integrates static program analysis into LLM reasoning using code property graph (CPG) priors extracted by Joern, enabling training-free KV cache compression. In bug localization and code generation tasks, it outperforms pure attention compression baselines under the same memory budget while maintaining patch generation quality comparable to full-context reasoning.

Tags: KV cache compression · code property graph · intelligent programming agents · static program analysis · Joern · long-context inference · code understanding
Published 2026-04-11 22:38 · Recent activity 2026-04-14 09:56 · Estimated read: 4 min

Section 01

CodeComp: Structure-Aware KV Cache Compression Revolutionizes Intelligent Programming Agents

CodeComp integrates static program analysis into LLM reasoning using code property graph (CPG) priors extracted by Joern, enabling training-free KV cache compression. In bug localization and code generation tasks, it outperforms pure attention compression baselines under the same memory budget while maintaining patch generation quality comparable to full-context reasoning, providing a new solution for intelligent programming agents to break through memory bottlenecks.


Section 02

Memory Bottlenecks of Intelligent Programming Agents and Limitations of Existing Compression Methods

When modern intelligent programming agents handle long contexts, the KV cache grows with sequence length and becomes the dominant memory bottleneck. Existing KV cache compression methods discard tokens based on attention signals alone, but structurally critical tokens in code (such as function calls, branch conditions, and assignment statements) attend non-locally and are therefore easily evicted, causing a sharp performance drop on long code understanding tasks.
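The failure mode above can be seen in a minimal sketch of a pure attention-based eviction policy. This is an illustration, not CodeComp's or any baseline's actual implementation; the function name, scores, and budget are hypothetical.

```python
# Illustrative sketch: pure attention-based KV eviction keeps only the
# top-k tokens by attention score, regardless of code structure.

def attention_topk_keep(attn_scores, budget):
    """Return sorted indices of tokens retained under a pure attention policy."""
    ranked = sorted(range(len(attn_scores)),
                    key=lambda i: attn_scores[i], reverse=True)
    return sorted(ranked[:budget])

# A structural token (here, index 3 standing in for a branch condition)
# with a low attention score is evicted even though later reasoning
# may depend on it.
scores = [0.9, 0.05, 0.8, 0.02, 0.7]
kept = attention_topk_keep(scores, budget=3)  # index 3 is dropped
```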


Section 03

CodeComp's Structure-Aware Compression Method and Training-Free Design

CodeComp introduces the CPG, a structured representation unifying the AST, CFG, and PDG. It uses Joern to extract the CPG and compute per-token structural importance scores, then combines attention signals with these structural scores in a hybrid retention strategy that dynamically allocates compression budgets across layers. Its training-free design has zero deployment cost, is model-agnostic, takes effect immediately, and can be seamlessly integrated into the SGLang framework.


Section 04

Experimental Validation: Excellent Performance in Bug Localization and Code Generation Tasks

In bug localization tasks, when the compression rate exceeds 80%, the accuracy of pure attention baselines degrades to near-random, while CodeComp recovers most of the full-context accuracy. In code generation tasks, patch quality is comparable to that of the uncompressed full context. All experiments are conducted under the same memory budget, isolating the contribution of the structure-aware strategy.


Section 05

Methodological Insights and Future Directions for Structure-Aware Programming

CodeComp demonstrates that injecting domain structure into a generic compression mechanism can yield qualitative gains, an approach that may extend to other structured domains such as law, medicine, and scientific literature. Future directions include lighter-weight program analysis, extension to long natural language documents, and adaptive compression strategies.


Section 06

Engineering Practice Considerations and Current Limitations

CodeComp uses the mature Joern tool to extract the CPG, which supports multiple languages, adds modest overhead, and offers rich configuration options. Current limitations include language coverage bounded by what Joern supports, the nontrivial cost of CPG analysis itself, and the need for further optimization in latency-critical scenarios.
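To make the structural-scoring step concrete, here is a hedged sketch of projecting CPG node kinds onto per-token importance scores. The node labels `CALL` and `CONTROL_STRUCTURE` follow Joern's CPG schema, but the weight values, the `ASSIGNMENT` label, the span representation, and the projection rule are illustrative assumptions, not CodeComp's actual pipeline.

```python
# Hedged sketch: map CPG node kinds to per-token structural weights.
# Weights and the (label, start_tok, end_tok) span format are assumptions.

NODE_WEIGHTS = {
    "CALL": 1.0,               # function calls
    "CONTROL_STRUCTURE": 1.0,  # if / while / for conditions
    "ASSIGNMENT": 0.8,
    "IDENTIFIER": 0.3,
}

def token_struct_scores(num_tokens, cpg_nodes):
    """cpg_nodes: iterable of (label, start_tok, end_tok) token spans."""
    scores = [0.0] * num_tokens
    for label, start, end in cpg_nodes:
        w = NODE_WEIGHTS.get(label, 0.1)
        for i in range(start, min(end, num_tokens)):
            scores[i] = max(scores[i], w)  # strongest structural role wins
    return scores

nodes = [("CALL", 2, 4), ("CONTROL_STRUCTURE", 5, 7), ("IDENTIFIER", 0, 1)]
s = token_struct_scores(8, nodes)  # call/branch tokens get weight 1.0
```

Scores produced this way would feed the hybrid retention policy alongside attention signals; the per-node analysis cost is one source of the overhead noted above.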