Mirror of https://github.com/mudler/LocalAI.git (synced 2026-05-16 12:38:01 -04:00)
The Ollama /api/tags handler passes a nil filter to galleryop.ListModels. When ModelsPath contains any non-skipped loose file, the function then calls filter(name, nil) and panics, which Echo surfaces to clients as "Server disconnected without sending a response" - the exact failure Home Assistant's Ollama integration reports against LocalAI.

Mirror the nil guard already present in ModelConfigLoader.GetModelConfigsByFilter so every caller is safe, and add a regression test that exercises the loose-file path with a nil filter.

Assisted-by: claude:claude-opus-4-7 [Claude Code]
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Co-authored-by: Ettore Di Giacinto <mudler@localai.io>
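The fix described above can be sketched as follows. This is a minimal, self-contained illustration of the nil-guard pattern, not LocalAI's actual code: `FilterFn` and `listModels` are hypothetical simplifications of the real gallery filter signature, but the crash mode is the same - in Go, calling a nil function value panics, so a list routine that receives a nil filter must check for nil before invoking it.

```go
package main

import "fmt"

// FilterFn is a hypothetical simplification of the gallery filter callback.
type FilterFn func(name string) bool

// listModels returns the names accepted by filter. A nil filter means
// "accept everything": without the nil check, filter(n) would panic on
// a nil function value, which is the crash the commit guards against.
func listModels(names []string, filter FilterFn) []string {
	var out []string
	for _, n := range names {
		if filter == nil || filter(n) {
			out = append(out, n)
		}
	}
	return out
}

func main() {
	models := []string{"llama", "mistral"}

	// Nil filter: previously a panic, now returns all models.
	fmt.Println(listModels(models, nil)) // [llama mistral]

	// Non-nil filter still applies normally.
	fmt.Println(listModels(models, func(n string) bool {
		return n == "llama"
	})) // [llama]
}
```

A regression test for this shape simply calls the function with a nil filter on a non-empty model list and asserts it returns rather than panicking, mirroring the loose-file test the commit adds.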