LocalAI/docs/content/features/_index.en.md
Ettore Di Giacinto 7e0b73deaa fix(docs): fix broken references to distributed mode
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2026-04-03 09:46:06 +02:00


+++
disableToc = false
title = "Features"
weight = 8
icon = "lightbulb"
type = "chapter"
url = "/features/"
+++

LocalAI provides a comprehensive set of features for running AI models locally. This section covers all the capabilities and functionalities available in LocalAI.

## Core Features

## Advanced Features

- OpenAI Functions - Use function calling and the tools API with local models
- Realtime API - Low-latency multi-modal conversations (voice + text) over WebSocket
- Constrained Grammars - Control model output format with BNF grammars
- GPU Acceleration - Optimize performance with GPU support
- Distribution - Scale inference across multiple nodes (P2P federation or production distributed mode)
- P2P API - Monitor and manage P2P worker and federated nodes
- Model Context Protocol (MCP) - Enable agentic capabilities with MCP integration
- Agents - Autonomous AI agents with tools, knowledge base, and skills
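As an illustration of the OpenAI Functions feature above: LocalAI exposes an OpenAI-compatible Chat Completions endpoint, so a function-calling request uses the standard `tools` payload shape. A minimal sketch of such a request body (the model name and the `get_weather` function schema are hypothetical placeholders, not part of LocalAI itself):

```python
import json

# OpenAI-style "tools" request body; this is what would be POSTed
# to LocalAI's /v1/chat/completions endpoint. Model name and the
# get_weather schema are illustrative assumptions.
payload = {
    "model": "my-local-model",  # replace with an installed model name
    "messages": [
        {"role": "user", "content": "What is the weather in Rome?"}
    ],
    "tools": [{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Look up current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
    "tool_choice": "auto",
}

# Serialize and round-trip to confirm the body is valid JSON.
body = json.dumps(payload)
print(json.loads(body)["tools"][0]["function"]["name"])
```

The response, when the model decides to call the function, carries a `tool_calls` entry whose arguments your application executes before sending the result back in a follow-up message.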

## Specialized Features

- Object Detection - Detect and locate objects in images
- Reranker - Improve retrieval accuracy with cross-encoder models
- Stores - Vector similarity search for embeddings
- Model Gallery - Browse and install pre-configured models
- Backends - Learn about available backends and how to manage them
- Backend Monitor - Monitor backend status and resource usage
- Runtime Settings - Configure application settings via the web UI without restarting
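To make the Stores item above concrete: a vector store ranks saved embeddings by their similarity to a query vector. This is a toy sketch of that idea using cosine similarity (the document keys and vectors are invented for illustration; LocalAI's actual store uses embeddings produced by a model):

```python
import math

def cosine_similarity(a, b):
    # Dot product divided by the product of the vector magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query, store, k=2):
    # Rank stored (key, embedding) pairs by similarity to the query
    # and return the keys of the k closest entries.
    ranked = sorted(store, key=lambda kv: cosine_similarity(query, kv[1]),
                    reverse=True)
    return [key for key, _ in ranked[:k]]

# Hypothetical three-dimensional embeddings; real ones have hundreds
# of dimensions.
store = [
    ("doc-cats", [0.9, 0.1, 0.0]),
    ("doc-dogs", [0.8, 0.2, 0.1]),
    ("doc-cars", [0.0, 0.1, 0.9]),
]
print(top_k([1.0, 0.0, 0.0], store))  # → ['doc-cats', 'doc-dogs']
```

Retrieval-augmented setups combine this with the Reranker feature: the store narrows candidates cheaply, then a cross-encoder reorders the short list more accurately.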

## Getting Started

To start using these features, make sure LocalAI is installed and that you have downloaded some models, for example from the Model Gallery. Then explore the feature pages above to learn how to use each capability.