LocalAI/docs/content/reference/system-info.md
LocalAI [bot] 9090bca920 feat: Add documentation for undocumented API endpoints (#8852)
2026-03-08 17:59:33 +01:00

+++
disableToc = false
title = "System Info and Version"
weight = 23
url = "/reference/system-info/"
+++

LocalAI provides endpoints to inspect the running instance, including available backends, loaded models, and version information.

## System Information

- **Method**: `GET`
- **Endpoint**: `/system`

Returns available backends and currently loaded models.

### Response

| Field | Type | Description |
|-------|------|-------------|
| `backends` | array | List of available backend names (strings) |
| `loaded_models` | array | List of currently loaded models |
| `loaded_models[].id` | string | Model identifier |

### Usage

```bash
curl http://localhost:8080/system
```

### Example response

```json
{
  "backends": [
    "llama-cpp",
    "huggingface",
    "diffusers",
    "whisper"
  ],
  "loaded_models": [
    {
      "id": "my-llama-model"
    },
    {
      "id": "whisper-1"
    }
  ]
}
```
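As a quick illustration, the `/system` response can be consumed with a few lines of Python. This is a minimal sketch: `is_model_loaded` is a hypothetical helper written for this example, not part of any LocalAI client library, and the sample data is the documented response above parsed locally rather than fetched from a running server.

```python
import json

def is_model_loaded(system_info: dict, model_id: str) -> bool:
    """Return True if model_id appears in the `loaded_models` list
    of a parsed /system response (hypothetical helper)."""
    return any(m.get("id") == model_id
               for m in system_info.get("loaded_models", []))

# Sample /system response from the documentation above.
sample = json.loads("""
{
  "backends": ["llama-cpp", "huggingface", "diffusers", "whisper"],
  "loaded_models": [{"id": "my-llama-model"}, {"id": "whisper-1"}]
}
""")

print(is_model_loaded(sample, "whisper-1"))  # → True
```

Against a live instance you would obtain the same dictionary from `curl http://localhost:8080/system` (or any HTTP client) before passing it to the helper.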

## Version

- **Method**: `GET`
- **Endpoint**: `/version`

Returns the LocalAI version and build commit.

### Response

| Field | Type | Description |
|-------|------|-------------|
| `version` | string | Version string in the format `version (commit)` |

### Usage

```bash
curl http://localhost:8080/version
```

### Example response

```json
{
  "version": "2.26.0 (a1b2c3d4)"
}
```

## Error Responses

| Status Code | Description |
|-------------|-------------|
| 500 | Internal server error |