Zing Forum

Ollixir: A First-Class Ollama Client for Elixir Developers

Tags: Ollama · Elixir · Local LLM · Large Language Models · Streaming · Tool Calling · Multimodal · HuggingFace · Embeddings · MCP
Published 2026-04-05 07:39 · Recent activity 2026-04-05 07:55 · Estimated read: 10 min

Section 01

Ollixir: A First-Class Ollama Client for Elixir Developers

Abstract: Ollixir is a fully-featured Elixir client library equivalent in functionality to the official ollama-python library. It supports complete features such as chat, generation, embeddings, function calling, structured output, and multimodal image processing, and integrates with HuggingFace Hub and Ollama Cloud API.

This article covers Ollixir's background, core features, advanced capabilities, and quick start, and closes with a brief assessment.

Section 02

Background: The Gap in Local LLM Clients for Elixir Developers

As large language model (LLM) technology has spread, more and more developers want to run LLMs on local or private infrastructure for stronger data privacy and lower operating costs. Ollama, a popular solution for running LLMs locally, provides easy-to-use command-line tools and a REST API. For Elixir developers, however, the ecosystem previously lacked a fully featured, elegantly designed client library.

Ollixir fills this gap. As an Elixir client equivalent in functionality to the official ollama-python library, Ollixir not only provides full API coverage but also makes full use of Elixir's language features to bring developers a programming experience that aligns with the language's conventions.

Section 03

Core Features: Complete Functionality Combined with Elixir's Traits

Ollixir offers a rich and complete set of features, covering almost all of Ollama's capabilities:

1. Full API Coverage

Ollixir supports all core APIs of Ollama:

  • Chat: Supports multi-turn conversations, the foundation for building interactive applications
  • Text Generation (Generate/Completion): Single text completion, suitable for simple generation tasks
  • Embeddings: Converts text into vector representations, used for semantic search and RAG applications
  • Model Management: List, pull, create, copy, and delete models
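As a rough sketch of what this coverage might look like in practice. The function names below (`Ollixir.init/0`, `Ollixir.chat/2`, `Ollixir.generate/2`, `Ollixir.embed/2`, `Ollixir.list_models/1`) are assumptions modeled on the ollama-python API, not confirmed Ollixir signatures:

```elixir
# Hypothetical usage sketch — verify names against Ollixir's actual docs.
client = Ollixir.init()

# Chat: multi-turn conversation
{:ok, chat_resp} =
  Ollixir.chat(client,
    model: "llama3.2",
    messages: [%{role: "user", content: "Why is the sky blue?"}]
  )

# Generate: single-shot text completion
{:ok, gen_resp} =
  Ollixir.generate(client, model: "llama3.2", prompt: "Once upon a time")

# Embeddings: turn text into vectors for semantic search / RAG
{:ok, embed_resp} =
  Ollixir.embed(client, model: "nomic-embed-text", input: ["hello", "world"])

# Model management
{:ok, models} = Ollixir.list_models(client)
```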

2. Stream Response Handling

Elixir's Stream abstraction perfectly aligns with Ollama's streaming API. Developers can direct streaming output to enumerables or send it directly to a specified process, enabling true asynchronous processing. This design allows non-blocking handling of streaming output from large models, which is ideal for real-time interaction scenarios such as chatbots and real-time content generation.
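The two streaming modes described above might look like the following sketch; the `stream: true` (return an Enumerable) and `stream: pid` (send chunks to a process) option shapes are assumptions, as is the chunk format:

```elixir
# Hypothetical streaming sketch — option names are assumptions.
client = Ollixir.init()

# Mode 1: consume the response lazily as an Enumerable
{:ok, stream} =
  Ollixir.chat(client,
    model: "llama3.2",
    messages: [%{role: "user", content: "Tell me a short story."}],
    stream: true
  )

stream
|> Stream.each(fn chunk -> IO.write(chunk["message"]["content"]) end)
|> Stream.run()

# Mode 2: have chunks delivered as messages to a process (non-blocking)
{:ok, _task} =
  Ollixir.chat(client,
    model: "llama3.2",
    messages: [%{role: "user", content: "Tell me a short story."}],
    stream: self()
  )
```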

3. Function Calling

Ollixir supports Ollama's function calling feature, allowing models to call external functions. Developers can manually define tools or directly pass Elixir functions, letting Ollixir automatically convert them into tool definitions. This design greatly reduces the complexity of building Agent applications, enabling models to interact with external systems, perform calculations, query databases, or call APIs.
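A sketch of the function-passing style might look like this. The `Weather` module is a stub invented for illustration, and the `tools:` option and tool-call response shape are assumptions modeled on ollama-python:

```elixir
defmodule Weather do
  @doc "Returns the current temperature for a city (stub for illustration)."
  def get_weather(city), do: "22°C in #{city}"
end

client = Ollixir.init()

{:ok, resp} =
  Ollixir.chat(client,
    model: "llama3.2",
    messages: [%{role: "user", content: "What's the weather in Paris?"}],
    # Pass the function itself; the client derives the tool definition
    tools: [&Weather.get_weather/1]
  )

# Inspect any tool calls the model requested
for call <- resp["message"]["tool_calls"] || [] do
  IO.inspect(call["function"], label: "tool call")
end
```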

4. Structured Output

Through JSON Schema, Ollixir allows models to output structured data. This is very useful for applications that need to extract structured information from model outputs, such as data extraction, classification, entity recognition, etc. Developers only need to define the schema, and the model will return data in the specified format, facilitating subsequent processing.
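A minimal sketch of schema-constrained output. The `format:` option name is an assumption (it mirrors Ollama's REST `format` parameter), and Jason is assumed as the JSON decoder:

```elixir
client = Ollixir.init()

schema = %{
  "type" => "object",
  "properties" => %{
    "name" => %{"type" => "string"},
    "age" => %{"type" => "integer"}
  },
  "required" => ["name", "age"]
}

{:ok, resp} =
  Ollixir.chat(client,
    model: "llama3.2",
    messages: [%{role: "user", content: "Alice is 30 years old."}],
    # Constrain the reply to the JSON Schema above
    format: schema
  )

# The model's reply is JSON conforming to the schema
{:ok, person} = Jason.decode(resp["message"]["content"])
```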

5. Multimodal Support

Ollixir supports image input and can handle vision-language models (such as LLaVA). The client automatically encodes images into Base64; developers only need to provide the image path or binary data. This makes it easy to build applications that can "see" images.
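An image-input sketch, assuming an `images:` field on the message (the file path is illustrative):

```elixir
client = Ollixir.init()

{:ok, resp} =
  Ollixir.chat(client,
    model: "llava",
    messages: [
      %{
        role: "user",
        content: "What is in this picture?",
        # A file path or raw binary; Base64 encoding is handled by the client
        images: ["photos/cat.png"]
      }
    ]
  )

IO.puts(resp["message"]["content"])
```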

6. Typed Responses and Options

Ollixir provides structured response and option types. Developers can choose to receive structs instead of raw maps, getting better IDE support and compile-time checks. The Options struct provides preset configurations (like creative, precise, etc.) to easily adjust generation parameters.
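A sketch of what typed responses and presets could look like. The struct names, the `structs:` flag, and the `creative/0` preset helper are assumptions, not confirmed API:

```elixir
# Hypothetical sketch — consult Ollixir's docs for the real names.
client = Ollixir.init()

opts =
  Ollixir.Options.creative()      # assumed preset: e.g. higher temperature
  |> Map.put(:num_predict, 256)

{:ok, %Ollixir.ChatResponse{message: message}} =
  Ollixir.chat(client,
    model: "llama3.2",
    messages: [%{role: "user", content: "Write a haiku about OTP."}],
    options: opts,
    structs: true                 # receive structs instead of raw maps
  )

IO.puts(message.content)
```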

Section 04

Advanced Features: Expanding Model Ecosystem and Cloud Support

HuggingFace Hub Integration

Ollixir optionally integrates with HuggingFace Hub, allowing discovery and running of over 45,000 GGUF format models. This integration greatly expands the range of available models; developers are no longer limited to models officially supported by Ollama and can try various professional models contributed by the community.

Ollama Cloud API Support

Ollixir supports the Ollama Cloud API, including web search, web page retrieval, and access to cloud-based large models. The cloud model lineup is strong, covering programming and Agentic models (such as deepseek-v3.1, qwen3-coder), general reasoning models (such as glm-5, gemini-3-flash), and multimodal models (such as kimi-k2.5, qwen3-vl). This provides a convenient way for applications that need stronger model capabilities.

MCP Server Support

Ollixir includes an MCP (Model Context Protocol) server example that works with any MCP client supporting the stdio transport, such as Cursor, Claude Desktop, Cline, Continue, and Open WebUI. This allows Ollixir to serve as the underlying model provider for these tools.

Section 05

Quick Start: Simple and Easy Integration Steps

Using Ollixir is very simple. First, install and run Ollama, then add the dependency in mix.exs. After initializing the client, you can perform operations such as chat, text generation, and embeddings.

The client supports rich custom configurations, including custom host addresses, timeout settings, request headers, etc. The environment variables OLLAMA_HOST and OLLAMA_API_KEY provide a convenient way to configure these.
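Putting the steps together, a quick-start sketch could look like this. The dependency version, option names, and default host below are illustrative assumptions; only the `OLLAMA_HOST` and `OLLAMA_API_KEY` variable names come from the text above:

```elixir
# In mix.exs:
#   defp deps do
#     [{:ollixir, "~> 0.1"}]   # version is illustrative
#   end

client =
  Ollixir.init(
    # Explicit configuration; the option names here are assumed
    host: System.get_env("OLLAMA_HOST", "http://localhost:11434"),
    api_key: System.get_env("OLLAMA_API_KEY"),
    timeout: 60_000,
    headers: [{"x-app-name", "my-app"}]
  )

{:ok, resp} =
  Ollixir.chat(client,
    model: "llama3.2",
    messages: [%{role: "user", content: "Hello!"}]
  )

IO.puts(resp["message"]["content"])
```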

Section 06

Error Handling: Structured and Elixir-Conventional Design

Ollixir provides structured error types, including ConnectionError, RequestError, and ResponseError. This design follows Elixir's conventions, using the {:ok, result} and {:error, reason} tuple patterns, making it easy for developers to perform pattern matching and error handling.
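In practice, that pattern matching might look like the sketch below; the error-struct module paths and their fields are assumptions extrapolated from the names above:

```elixir
client = Ollixir.init()
messages = [%{role: "user", content: "Hello!"}]

case Ollixir.chat(client, model: "llama3.2", messages: messages) do
  {:ok, resp} ->
    IO.puts(resp["message"]["content"])

  {:error, %Ollixir.ConnectionError{reason: reason}} ->
    IO.puts("Could not reach the Ollama server: #{inspect(reason)}")

  {:error, %Ollixir.ResponseError{status: status}} ->
    IO.puts("Server rejected the request with status #{status}")

  {:error, other} ->
    IO.inspect(other, label: "unexpected error")
end
```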

Section 07

Conclusion: An Important AI Tool for the Elixir Ecosystem

Ollixir brings a fully-featured, elegantly designed Ollama client to the Elixir ecosystem. It not only fills a technical gap but also provides developers with a programming experience that aligns with the language's conventions by making full use of Elixir's language features (such as stream processing, pattern matching, and structured data).

Whether you are building locally run AI applications, validating a prototype, or shipping a RAG system to production, Ollixir is a choice worth considering. Its MIT license and active community also bode well for long-term maintenance. As local LLM technology continues to mature, Ollixir is well placed to become a standard part of the Elixir developer's toolkit.