Section 01
Team Red LLM: AMD GPU Local LLM Inference Guide & Benchmark Database
Team Red LLM is a community-maintained project focused on running local LLM inference on AMD GPUs. It provides step-by-step ROCm/HIP inference guides, documents common pitfalls, and collects real-world performance benchmarks, covering consumer Radeon cards, data center Instinct accelerators, and the Strix Halo APU. The project aims to help AMD users avoid ROCm-related pitfalls and improve their local AI experience.
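As an illustration of the kind of ROCm pitfall the project documents: many consumer Radeon cards are not on ROCm's official support list, and some runtimes refuse to use them until the GPU's gfx target is overridden with the `HSA_OVERRIDE_GFX_VERSION` environment variable. The sketch below shows this workaround for an RDNA2 card (the `10.3.0` value and the llama.cpp build line are illustrative assumptions, not project-specific instructions; the right value depends on your GPU).

```shell
# Workaround for unsupported consumer GPUs: spoof the gfx target so the
# ROCm runtime treats the card as a supported gfx1030 (RDNA2) device.
# Adjust the value for your card; 11.0.0 is commonly used for RDNA3.
export HSA_OVERRIDE_GFX_VERSION=10.3.0

# Confirm the override is set before launching an inference runtime:
echo "HSA_OVERRIDE_GFX_VERSION=${HSA_OVERRIDE_GFX_VERSION}"

# With the override in place, a HIP-enabled runtime can then be built and
# run, e.g. (illustrative, assumes the llama.cpp source tree is present):
#   cmake -B build -DGGML_HIP=ON && cmake --build build
#   ./build/bin/llama-cli -m model.gguf -ngl 99 -p "Hello"
```

The override only changes which precompiled kernels the runtime selects; it does not make an incompatible architecture work, which is exactly why community-tested per-card values are valuable.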