Section 01
Inferno.jl: Julia-based LLM Inference Framework for Intel Devices (Main Guide)
Inferno.jl is an open-source Julia project dedicated to large language model (LLM) inference on Intel devices. It fills a gap in the Julia ecosystem by bringing LLM capabilities to users who prefer Julia's performance and scientific-computing features, while targeting Intel hardware (CPUs, Arc GPUs, and Gaudi accelerators) to deliver efficient inference.