# JellyNet: TEE-Verified LLM Inference and Persistent Agent Memory System

> An ETHGlobal Open Agents project that combines the 0G tech stack to implement TEE-verified LLM inference and persistent agent memory storage, providing a secure and trusted infrastructure for decentralized AI agents.

- Board: [Openclaw Llm](https://www.zingnex.cn/en/forum/board/openclaw-llm)
- Posted: 2026-05-03T11:44:06.000Z
- Last activity: 2026-05-03T11:50:37.279Z
- Popularity: 137.9
- Keywords: TEE (Trusted Execution Environment), LLM inference, decentralized AI, 0G Storage, ETHGlobal, AI agents
- Page URL: https://www.zingnex.cn/en/forum/thread/jellynet-teellm
- Canonical: https://www.zingnex.cn/forum/thread/jellynet-teellm
- Markdown source: floors_fallback

---

## Project Introduction

JellyNet is an innovative project born from the ETHGlobal Open Agents event, co-developed by the JellyNet team and 0G. Its core goal is to address two key challenges faced by current AI agent systems: the verifiability of inference processes and the persistent storage of agent memory. By combining Trusted Execution Environment (TEE) technology and the 0G decentralized storage network, the project provides a new technical solution for building secure, trusted, and sustainable decentralized AI agents.

## Project Background and Problem Statement

In the current decentralized AI ecosystem, users and developers face two major trust issues: how can they verify that an AI model's inference was actually executed as claimed and not tampered with, and how can an agent's memory and learning outcomes be preserved securely over the long term without relying on centralized services? JellyNet addresses both, aiming to provide a secure and trusted infrastructure for decentralized AI agents.

## Core Technical Architecture and Implementation Methods

JellyNet is built around two core components:
1. **TEE-Verified LLM Inference Engine**: Built on the TEE capabilities of the 0G Compute platform, it makes LLM inference verifiable and secure through secure boot, remote attestation, and signing of inputs and outputs, while keeping prompts and model data confidential.
2. **Persistent Memory System Based on 0G Storage KV**: It uses the 0G decentralized key-value storage service to provide highly available, durable, and cost-effective agent memory, persisting both short-term working memory and long-term knowledge bases so that an agent's learning accumulates across sessions.
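The verifiability mechanism in component 1 can be illustrated with a minimal sketch: the enclave binds hashes of the input and output into a signed receipt that any client can check. Everything here is hypothetical, including the function names and the HMAC stand-in key; a real deployment would sign with the enclave's attested asymmetric keypair on 0G Compute, not a shared secret.

```python
import hashlib
import hmac
import json

# Hypothetical enclave signing key. In a real TEE the private key never
# leaves the enclave and is bound to a remote-attestation quote; an HMAC
# shared secret is used here only to keep the sketch self-contained.
ENCLAVE_KEY = b"demo-enclave-key"

def sign_inference(prompt: str, completion: str) -> dict:
    """Runs inside the TEE: bind input and output hashes into one receipt."""
    payload = {
        "input_hash": hashlib.sha256(prompt.encode()).hexdigest(),
        "output_hash": hashlib.sha256(completion.encode()).hexdigest(),
    }
    message = json.dumps(payload, sort_keys=True).encode()
    payload["signature"] = hmac.new(ENCLAVE_KEY, message, hashlib.sha256).hexdigest()
    return payload

def verify_receipt(prompt: str, completion: str, receipt: dict) -> bool:
    """Runs on any client: recompute the hashes and check the signature."""
    expected = sign_inference(prompt, completion)
    return hmac.compare_digest(expected["signature"], receipt["signature"])

receipt = sign_inference("What is 0G?", "0G is a modular AI chain.")
assert verify_receipt("What is 0G?", "0G is a modular AI chain.", receipt)
assert not verify_receipt("tampered prompt", "0G is a modular AI chain.", receipt)
```

Because the signature covers both hashes, tampering with either the prompt or the completion invalidates the receipt, which is what lets downstream consumers trust the inference without re-running it.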

## Application Scenarios and Value Proposition

JellyNet's technical solution supports multiple decentralized AI application scenarios:
- **DeFi Smart Agents**: Execute trading strategies inside a TEE, keeping the strategy logic confidential while making its execution verifiable; memory stored on the 0G network lets the agent continuously learn market patterns.
- **Privacy-Preserving Data Analysis**: Enterprises can perform AI analysis without exposing raw data; TEE ensures confidentiality, and 0G storage persists results.
- **Trusted AI Oracles**: Provide verified AI inference results for smart contracts; on-chain contracts can verify TEE signatures to ensure data authenticity.
- **Personalized AI Assistants**: User data and preferences are securely stored on the decentralized network; AI assistants provide personalized services while protecting privacy.

## Technical Challenges and Solutions

Key challenges addressed during implementation:
- **Balance Between Performance and Security**: Optimizations such as model preloading and request batching reduce inference latency while preserving the TEE's security guarantees.
- **Storage Consistency**: A multi-level caching strategy is designed to balance the eventual consistency of decentralized storage and the real-time data needs of AI agents.
- **Cross-Platform Compatibility**: An abstraction layer design is adopted to allow core logic to run across different TEE implementations such as Intel SGX, AMD SEV, and ARM TrustZone.
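The storage-consistency strategy above can be sketched as a write-through local cache in front of an eventually-consistent remote store: the agent always reads its own latest writes, even before the decentralized layer has converged. `RemoteKV` is an illustrative stand-in that simulates delayed visibility, not the actual 0G Storage KV API, and the key layout is an assumption.

```python
class RemoteKV:
    """Stand-in for an eventually-consistent decentralized KV store:
    writes become visible to readers only after an explicit sync."""
    def __init__(self):
        self._visible = {}   # what readers currently see
        self._pending = {}   # writes not yet propagated

    def put(self, key: str, value: str) -> None:
        self._pending[key] = value

    def get(self, key: str):
        return self._visible.get(key)

    def sync(self) -> None:
        self._visible.update(self._pending)
        self._pending.clear()

class AgentMemory:
    """Write-through cache giving the agent read-your-writes semantics
    on top of an eventually-consistent backing store."""
    def __init__(self, remote: RemoteKV):
        self.remote = remote
        self.cache = {}

    def remember(self, key: str, value: str) -> None:
        self.cache[key] = value      # immediately readable locally
        self.remote.put(key, value)  # durably persisted, eventually visible

    def recall(self, key: str):
        if key in self.cache:
            return self.cache[key]
        return self.remote.get(key)  # cold read falls back to remote

remote = RemoteKV()
memory = AgentMemory(remote)
memory.remember("agent1/long_term/eth_trend", "bullish")
# Read-your-writes holds even before the remote store converges:
assert memory.recall("agent1/long_term/eth_trend") == "bullish"
assert remote.get("agent1/long_term/eth_trend") is None
remote.sync()
assert remote.get("agent1/long_term/eth_trend") == "bullish"
```

The local cache absorbs the real-time needs of the agent loop, while the remote layer provides the durability; the trade-off is that other agents only see updates after propagation, which is acceptable for memory that is private to one agent.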

## Future Outlook and Ecosystem Significance

JellyNet demonstrates an important development direction for decentralized AI infrastructure: combining hardware security technology with decentralized storage to build trusted, persistent AI services. As TEE technology matures and decentralized storage networks improve, such solutions will find broader application. For developers and researchers working at the intersection of AI and Web3, JellyNet is a practical case worth studying in depth, and a useful reference point for the broader "trusted AI" agenda.
