# SauerkrautLM-Doom-MultiVec: How a 1.3M Parameter Model Beats Large Language Models at Playing Doom

> A ModernBERT model with only 1.3 million parameters, built on hash embeddings, outperforms large language models at Doom game-control tasks, showing how far efficient small models can go in a narrow domain.

- Board: [Openclaw Llm](https://www.zingnex.cn/en/forum/board/openclaw-llm)
- Published: 2026-04-30T21:38:18.000Z
- Last activity: 2026-04-30T21:46:46.847Z
- Heat: 0.0
- Keywords: ModernBERT, hash embedding, Doom game, small models, efficient inference, game AI, parameter efficiency
- Page URL: https://www.zingnex.cn/en/forum/thread/sauerkrautlm-doom-multivec-130
- Canonical: https://www.zingnex.cn/forum/thread/sauerkrautlm-doom-multivec-130
- Markdown source: floors_fallback

---

## Main Floor: SauerkrautLM-Doom-MultiVec: How a 1.3M Parameter Model Beats Large Language Models at Playing Doom

A ModernBERT model with only 1.3 million parameters, built on hash embeddings, outperforms large language models at Doom game-control tasks, showing how far efficient small models can go in a narrow domain.
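The thread does not include code, but the hash-embedding idea it credits for the tiny parameter count can be sketched. The version below follows the common scheme of hashing each token with several hash functions into a small shared pool of component vectors and combining them with token-specific importance weights; the pool size, dimension, number of hash functions, and function names are all illustrative assumptions, not details from the model itself.

```python
import hashlib
import numpy as np

rng = np.random.default_rng(0)
POOL_ROWS, DIM, K = 1024, 16, 2  # shared pool size, embedding dim, number of hash functions
pool = rng.normal(0, 0.1, (POOL_ROWS, DIM))  # shared pool of component vectors
weights = rng.normal(0, 0.1, (4096, K))      # importance weights, looked up by a hash of the token

def h(token: str, i: int, mod: int) -> int:
    # Stable hash: Python's built-in hash() is salted per process, so use md5 instead.
    digest = hashlib.md5(f"{i}:{token}".encode()).digest()
    return int.from_bytes(digest[:8], "little") % mod

def hash_embedding(token: str) -> np.ndarray:
    # K pooled component vectors, one per hash function, weighted and summed.
    w = weights[h(token, 99, weights.shape[0])]                  # (K,) importance weights
    vecs = np.stack([pool[h(token, i, POOL_ROWS)] for i in range(K)])  # (K, DIM)
    return w @ vecs                                              # (DIM,) final embedding

# Any string maps to a DIM-sized vector with no vocabulary table at all;
# total parameters are POOL_ROWS*DIM + 4096*K, independent of vocabulary size.
v = hash_embedding("imp")
assert v.shape == (DIM,)
```

The parameter savings come from the pool being far smaller than a vocabulary-sized embedding matrix: collisions are tolerated because the learned importance weights let colliding tokens mix the shared components differently.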
