Zing Forum


llm-docs-builder: A Powerful Tool for Optimizing Documents for LLM and RAG Systems

A Ruby tool that converts Markdown documents into AI-friendly formats, automatically generates llms.txt, reduces token consumption by 67-95%, and supports RAG retrieval enhancement.

Tags: llm-docs-builder, documentation optimization, RAG, LLM, Markdown, token optimization, llms.txt, technical documentation, AI-friendly
Published 2026-04-04 18:40 · Recent activity 2026-04-04 18:47 · Estimated read: 5 min

Section 01

Introduction: llm-docs-builder – An Open-Source Tool for Optimizing Documents for LLM and RAG

llm-docs-builder is an open-source Ruby tool developed by Maciej Mensfeld, designed to solve the redundancy problem when AI reads technical documents. It can convert Markdown documents into AI-friendly formats, automatically generate llms.txt index files, reduce token consumption by 67-95%, and support Retrieval-Augmented Generation (RAG) systems, improving the efficiency and accuracy of AI's document understanding.


Section 02

Background: The Dilemma of AI Reading Documents

When LLMs (such as ChatGPT or Claude) crawl technical documentation, the human-oriented HTML pages they fetch contain redundant elements such as navigation bars, footers, and JavaScript. These elements can occupy 70-90% of the context window, drowning out the core information. This not only increases API call costs but also reduces the accuracy of the model's answers to technical questions. Developers need an automated way to clean and optimize documents.


Section 03

Core Capabilities: Key Features of llm-docs-builder

The core features of the tool are:

1. Document conversion and optimization, removing unnecessary elements.
2. Automatic generation of standardized llms.txt index files.
3. Conversion of HTML content into clean Markdown.
4. Enhanced RAG retrieval through hierarchical heading context and metadata.

In tests on a sample of the Karafka documentation, processing reduced token consumption by an average of 83%.
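For context, the llms.txt format referenced above follows a simple convention: an H1 project name, a short blockquote summary, and sections of Markdown links pointing to the most useful pages. A minimal hypothetical example (the project name, URLs, and descriptions here are illustrative, not taken from any real llms.txt):

```markdown
# My Project

> One-paragraph summary of what the project does, written for LLM consumption.

## Docs

- [Getting Started](https://example.com/docs/getting-started.md): installation and first steps
- [Configuration](https://example.com/docs/configuration.md): all supported settings

## Optional

- [Changelog](https://example.com/docs/changelog.md): release history
```

An AI crawler that fetches this single file gets a compact, link-annotated map of the documentation instead of having to parse the site's HTML navigation.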


Section 04

Core Mechanism: How Document Optimization Is Implemented

1. Intelligent content cleaning: removes comments, badges, and front-matter metadata, normalizes whitespace, and can optionally strip images and block quotes.
2. Link normalization: converts relative paths to absolute URLs and removes unnecessary anchors.
3. Hierarchical title enhancement: rewrites headings into full hierarchical paths (e.g., "Configuration / Consumer Settings / auto_offset_reset") so that document fragments retain their context during RAG retrieval.
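The hierarchical title enhancement described in step 3 can be sketched in a few lines of Ruby. This is an illustrative reimplementation of the idea, not the tool's actual code: it tracks the current ancestor heading at each level and rewrites every Markdown heading to carry its full path.

```ruby
# Sketch of hierarchical title enhancement (assumption: this is not
# llm-docs-builder's actual implementation, just the technique it describes).
# Each Markdown heading is rewritten to include its ancestor titles, so a
# chunk retrieved in isolation still carries its context.
def enhance_headings(markdown)
  stack = [] # ancestor titles indexed by heading level - 1
  markdown.each_line.map do |line|
    if line =~ /\A(\#{1,6})\s+(.+)\z/m
      level = Regexp.last_match(1).length
      title = Regexp.last_match(2).strip
      stack = stack.first(level - 1) # discard siblings and deeper levels
      stack[level - 1] = title
      "#{'#' * level} #{stack.compact.join(' / ')}\n"
    else
      line
    end
  end.join
end
```

Running this over a nested document turns a bare "### auto_offset_reset" heading into "### Configuration / Consumer Settings / auto_offset_reset", which is exactly the kind of self-describing fragment a RAG retriever benefits from.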

Section 05

Practical Application: Installation and Usage Guide

Installation: via Docker (docker pull mensfeld/llm-docs-builder:latest) or as a Ruby gem (gem install llm-docs-builder). Common commands: compare (shows token savings), transform (converts a single file), bulk-transform (batch conversion), and generate (produces llms.txt). A configuration file controls the conversion options and RAG-enhancement settings, and web servers such as Nginx can be set up to serve the optimized versions to AI crawlers automatically.
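The AI-crawler hand-off mentioned above could be wired up in Nginx roughly as follows. This is a sketch under assumptions: the user-agent names and the convention of keeping a pre-transformed .md file alongside each .html page are illustrative, not prescribed by llm-docs-builder.

```nginx
# Illustrative Nginx fragment (hypothetical user-agent list and file layout).
# The map block must live at http level; it flags known AI crawlers.
map $http_user_agent $is_ai_crawler {
    default                              0;
    ~*(GPTBot|ClaudeBot|PerplexityBot)   1;
}

server {
    listen 80;
    root /var/www/docs;

    location / {
        # For AI crawlers, rewrite page.html to its pre-built page.md
        # sibling (produced earlier by a bulk-transform run).
        if ($is_ai_crawler) {
            rewrite ^/(.*)\.html$ /$1.md last;
        }
        try_files $uri $uri/ =404;
    }
}
```

The effect is that human visitors keep getting the styled HTML site while crawlers receive the lean Markdown variants, with no change to the published URLs.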


Section 06

Practical Significance: Value and Impact of the Tool

1. Cost savings: significantly reduces token consumption, lowering API call costs.
2. Improved AI accuracy: removing distracting content lets models focus on the core material.
3. Better RAG performance: hierarchical title enhancement improves retrieval quality and answer relevance.
4. Standards promotion: the llms.txt format helps establish a shared convention for AI-friendly documentation.

Section 07

Summary and Outlook

llm-docs-builder represents a new approach to documentation processing, serving the reading needs of both humans and AI. As LLM and RAG systems see broader adoption, such tools will only grow in importance. Open-source maintainers, documentation authors, and AI developers can use it to cut costs, improve efficiency, and enhance the user experience, and it may well become standard practice for AI-friendly documentation. Project address: https://github.com/mensfeld/llm-docs-builder.