Mirror of https://github.com/mudler/LocalAI.git (synced 2026-04-01 13:42:20 -04:00)
* fix: include model name in mmproj file path to prevent model isolation issues

  This fix addresses issue #8937, where different models whose mmproj files share the same filename (e.g., mmproj-F32.gguf) would overwrite each other. By including the model name in the path (llama-cpp/mmproj/<model-name>/<filename>), each model's mmproj files are now stored in separate directories, preventing the collision that caused conversations to fail when switching between models.

  Fixes #8937

  Signed-off-by: LocalAI Bot <localai-bot@example.com>

* test: update test expectations for model name in mmproj path

  The test file had hardcoded expectations for the old mmproj path format. The test expectations now include the model name subdirectory, matching the new path structure introduced in the fix.

  Fixes CI failures on tests-apple and tests-linux.

* fix: add model name to model path for consistency with mmproj path

  This change makes the model path consistent with the mmproj path by including the model name subdirectory in both paths:

  - mmproj: llama-cpp/mmproj/<model-name>/<filename>
  - model: llama-cpp/models/<model-name>/<filename>

  This addresses the reviewer's feedback that the model config generation needs to correctly reference the mmproj file path, and fixes the inconsistency where the model path did not include the model name subdirectory while the mmproj path did.

  Signed-off-by: team-coding-agent-1 <team-coding-agent-1@localai.dev>

---------

Signed-off-by: LocalAI Bot <localai-bot@example.com>
Signed-off-by: team-coding-agent-1 <team-coding-agent-1@localai.dev>
Co-authored-by: team-coding-agent-1 <team-coding-agent-1@localai.dev>
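The per-model path layout described by these commits can be sketched as a small Go helper. This is a hypothetical illustration of the directory scheme only: the function name `modelAssetPaths` and its signature are not LocalAI's actual code, just an assumption-labeled sketch of the `llama-cpp/models/<model-name>/` and `llama-cpp/mmproj/<model-name>/` convention.

```go
package main

import (
	"fmt"
	"path/filepath"
)

// modelAssetPaths is a hypothetical helper (not LocalAI's real API) showing
// the fixed layout: each model gets its own subdirectory, so two models
// shipping an mmproj file with the same filename can no longer overwrite
// each other.
func modelAssetPaths(baseDir, modelName, modelFile, mmprojFile string) (modelPath, mmprojPath string) {
	modelPath = filepath.Join(baseDir, "llama-cpp", "models", modelName, modelFile)
	mmprojPath = filepath.Join(baseDir, "llama-cpp", "mmproj", modelName, mmprojFile)
	return modelPath, mmprojPath
}

func main() {
	// Two different models, both shipping "mmproj-F32.gguf" (model names
	// are made up for the example).
	_, p1 := modelAssetPaths("/models", "vision-model-a", "model.gguf", "mmproj-F32.gguf")
	_, p2 := modelAssetPaths("/models", "vision-model-b", "model.gguf", "mmproj-F32.gguf")
	fmt.Println(p1)
	fmt.Println(p2)
	// Same filename, distinct paths: no collision when switching models.
	fmt.Println(p1 != p2)
}
```

Under the pre-fix layout (no `<model-name>` segment) both calls would have produced the identical path `/models/llama-cpp/mmproj/mmproj-F32.gguf`, which is exactly the overwrite reported in #8937.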