Zing Forum

machine-core: A Flexible AI Agent Framework Supporting MCP Protocol

machine-core is a flexible framework for building AI Agents, supporting MCP (Model Context Protocol) integration, dynamic OpenAPI tool generation, vector-based RAG tool filtering, and multiple LLM and Embedding providers.

Tags: AI Agent, MCP, LLM, Embedding, RAG, OpenAPI, Python framework
Published 2026-04-10 03:50 · Recent activity 2026-04-10 04:22 · Estimated read: 6 min

Section 02

What is machine-core?

machine-core is a flexible framework designed specifically for building AI Agents. Its goal is to provide a clear, scalable infrastructure that lets developers quickly build intelligent agent systems with tool calling, knowledge retrieval, and multi-model support.

Unlike traditional Agent frameworks, machine-core emphasizes a clean architectural separation: the infrastructure layer (AgentCore) is decoupled from the execution-mode layer (BaseAgent), so developers can choose the Agent type that fits their scenario without being constrained by a preset framework structure.

Section 03

1. Multi-provider LLM + Embedding Support

machine-core has built-in support for 7 LLM providers and 3 Embedding providers:

LLM Providers:

  • Ollama (local/cloud, default: qwen3-vl:32b)
  • Azure (Azure OpenAI, default: gpt-4o-2)
  • Grok (x.ai, default: grok-2-latest)
  • Groq (Groq Cloud, default: llama-3.3-70b-versatile)
  • Google Gemini
  • Vertex Gemini
  • Vertex Claude

Embedding Providers:

  • Ollama (default: nomic-embed-text)
  • Azure (default: text-embedding-3-large)
  • Google Cloud

This multi-provider architecture lets developers switch models via environment variables or configuration, enabling flexible model routing and fallback strategies.
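A minimal sketch of env-var-driven routing with a fallback, using the providers and default models listed above. The `LLM_PROVIDER`/`LLM_MODEL` variable names and the registry itself are assumptions for illustration, not machine-core's actual configuration keys:

```python
import os

# Hypothetical provider registry; defaults mirror the list above.
PROVIDERS = {
    "ollama": "qwen3-vl:32b",
    "azure": "gpt-4o-2",
    "grok": "grok-2-latest",
    "groq": "llama-3.3-70b-versatile",
}


def resolve_model(env=None, fallback: str = "ollama") -> tuple[str, str]:
    """Pick a provider from LLM_PROVIDER, falling back when it is unknown."""
    env = os.environ if env is None else env
    name = env.get("LLM_PROVIDER", fallback).lower()
    if name not in PROVIDERS:
        name = fallback  # fallback strategy: unknown provider -> default
    # LLM_MODEL overrides the provider's default model.
    return name, env.get("LLM_MODEL", PROVIDERS[name])


print(resolve_model({"LLM_PROVIDER": "groq"}))  # ('groq', 'llama-3.3-70b-versatile')
```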

Section 04

2. MCP (Model Context Protocol) Integration

MCP is an open protocol proposed by Anthropic that standardizes how AI models interact with external tools and data sources. machine-core supports MCP natively:

  • Loads and validates MCP tool sets from JSON configurations
  • Discovers and invokes tools dynamically
  • Integrates seamlessly with existing Agent workflows

This means you can quickly integrate any MCP-compliant tools (such as file system access, database queries, API calls, etc.) into your Agent.
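Loading and validating such a JSON configuration can be sketched as follows. The `mcpServers` key follows the convention common among MCP clients; the validation rule and function name are assumptions, not machine-core's actual schema:

```python
import json

# Toy config: one well-formed server entry and one missing its launch command.
RAW = """
{
  "mcpServers": {
    "filesystem": {"command": "npx",
                   "args": ["-y", "@modelcontextprotocol/server-filesystem", "/tmp"]},
    "broken": {"args": ["missing-command"]}
  }
}
"""


def load_mcp_servers(text: str) -> dict:
    """Parse the config and keep only entries that pass a basic sanity check."""
    config = json.loads(text)
    servers = {}
    for name, spec in config.get("mcpServers", {}).items():
        if "command" not in spec:  # validate: every server needs a launch command
            continue
        servers[name] = spec
    return servers


print(sorted(load_mcp_servers(RAW)))  # ['filesystem']
```

A real integration would then spawn each server and ask it for its tool list, making those tools callable from the Agent.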

Section 05

3. Dynamic OpenAPI Tool Generation

This is a very practical feature: machine-core can dynamically generate pydantic-ai tools based on OpenAPI specifications. The workflow is as follows:

  1. Obtain the OpenAPI specification (fetch_openapi_spec)
  2. Use ToolFilterManager to index and filter tools
  3. Select tools based on task relevance
  4. Dynamically generate tools and rebuild the Agent (rebuild_agent)

This design allows the Agent to dynamically adjust the available tool set according to current task requirements, avoiding context bloat caused by loading too many irrelevant tools at once.
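The four steps above can be sketched with a toy inline spec. `fetch_openapi_spec`, `ToolFilterManager`, and `rebuild_agent` are the framework's names; everything else below, including the keyword-overlap filter standing in for vector relevance, is illustrative:

```python
# Toy OpenAPI fragment (what fetch_openapi_spec would return).
SPEC = {
    "paths": {
        "/users": {"get": {"operationId": "listUsers", "summary": "List all users"}},
        "/orders": {"post": {"operationId": "createOrder", "summary": "Create a new order"}},
    }
}


def extract_operations(spec: dict) -> list[dict]:
    """Steps 1-2: walk the spec and index each operation as a candidate tool."""
    ops = []
    for path, methods in spec["paths"].items():
        for method, op in methods.items():
            ops.append({"id": op["operationId"], "summary": op["summary"],
                        "method": method.upper(), "path": path})
    return ops


def select_tools(ops: list[dict], task: str) -> list[dict]:
    """Step 3: crude relevance filter by keyword overlap (stand-in for vectors)."""
    words = set(task.lower().split())
    return [op for op in ops if words & set(op["summary"].lower().split())]


ops = extract_operations(SPEC)
tools = select_tools(ops, "create an order for a customer")
print([t["id"] for t in tools])  # ['createOrder']
# Step 4 would pass `tools` to rebuild_agent so only relevant tools enter context.
```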

Section 06

4. RAG Tool Filtering

machine-core has a built-in vector-based tool filtering mechanism:

  • ToolFilterManager indexes and filters tools by vector similarity
  • Automatically selects the most relevant tools based on the task description
  • Supports essential_tools to force inclusion of specific tools

This mechanism significantly improves the Agent's tool selection efficiency, especially in scenarios with a large number of tools.
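The mechanism can be sketched with cosine similarity over tiny hand-made vectors; in a real setup the embeddings would come from one of the Embedding providers, and only `essential_tools` is a name taken from the framework:

```python
import math

# Hand-made "embeddings" of tool descriptions (illustrative only).
TOOL_VECS = {
    "read_file":  [1.0, 0.0, 0.1],
    "send_email": [0.0, 1.0, 0.0],
    "query_db":   [0.9, 0.1, 0.3],
}


def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))


def filter_tools(task_vec, top_k: int = 1, essential_tools=()):
    """Rank tools by similarity to the task embedding, keep the top_k."""
    ranked = sorted(TOOL_VECS, key=lambda t: cosine(task_vec, TOOL_VECS[t]), reverse=True)
    selected = ranked[:top_k]
    # essential_tools are always included, regardless of similarity.
    for name in essential_tools:
        if name not in selected:
            selected.append(name)
    return selected


print(filter_tools([1.0, 0.0, 0.0], essential_tools=["send_email"]))
# ['read_file', 'send_email']
```

With hundreds of tools available, only the top-ranked handful (plus any essentials) ever reaches the model's context.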

Section 07

5. File Processing Capabilities

The framework ships with several built-in file-processing capabilities:

  • PDF text extraction
  • Image OCR
  • VLM (Vision Language Model) preprocessing
  • Batch upload processing

These capabilities enable the Agent to handle multimodal inputs, expanding application scenarios.
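One way such a pipeline is commonly structured is to route each upload to a handler by file type; the routing below is a pure sketch, and the handler names are placeholders for the PDF/OCR/VLM pipelines above, not machine-core's real functions:

```python
from pathlib import Path


def classify(path: str) -> str:
    """Decide which processing pipeline a file should go through."""
    suffix = Path(path).suffix.lower()
    if suffix == ".pdf":
        return "pdf_text_extraction"
    if suffix in {".png", ".jpg", ".jpeg"}:
        return "image_ocr_or_vlm"
    return "plain_text"


def process_batch(paths: list[str]) -> dict[str, str]:
    """Batch upload: map each file to the pipeline that should handle it."""
    return {p: classify(p) for p in paths}


print(process_batch(["report.pdf", "scan.png", "notes.txt"]))
```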

Section 08

6. Vector Storage

machine-core provides a vector storage implementation based on LanceDB:

  • Cross-table search
  • DocumentStore facade pattern
  • Integration with Embedding providers

This provides a solid infrastructure for RAG (Retrieval-Augmented Generation) applications.
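The facade idea can be shown with an in-memory stand-in for the LanceDB backend: a single DocumentStore entry point that searches across multiple tables. DocumentStore is the framework's name; the methods, table names, and brute-force cosine search are illustrative assumptions:

```python
import math


def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))


class DocumentStore:
    """Facade over several tables; callers never touch the backend directly."""

    def __init__(self):
        self.tables: dict[str, list[tuple[str, list[float]]]] = {}

    def add(self, table: str, doc: str, vec: list[float]) -> None:
        self.tables.setdefault(table, []).append((doc, vec))

    def search(self, query_vec: list[float], top_k: int = 2) -> list[str]:
        """Cross-table search: rank every document from every table together."""
        hits = [(cosine(query_vec, vec), doc)
                for rows in self.tables.values() for doc, vec in rows]
        hits.sort(reverse=True)
        return [doc for _, doc in hits[:top_k]]


store = DocumentStore()
store.add("manuals", "reset procedure", [1.0, 0.0])
store.add("faq", "billing question", [0.0, 1.0])
print(store.search([0.9, 0.1], top_k=1))  # ['reset procedure']
```

The facade keeps RAG callers decoupled from the storage backend, so the in-memory dict here could be swapped for LanceDB tables without changing the calling code.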