mirror/LocalAI
mirror of https://github.com/mudler/LocalAI.git synced 2026-04-17 05:18:53 -04:00
Files
LocalAI/backend/go (at commit efae3fd97b028800c4a66d441d61dfc90bbe2dfa)

Latest commit: 87e6de1989 feat: wire transcription for llama.cpp, add streaming support (#9353)
Author: Ettore Di Giacinto
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Date: 2026-04-14 16:13:40 +02:00
acestep-cpp          | chore: ⬆️ Update ace-step/acestep.cpp to e0c8d75a672fca5684c88c68dbf6d12f58754258 (#9261) | 2026-04-07 00:39:24 +02:00
llm/llama            | feat: add distributed mode (#9124)                                                        | 2026-03-30 00:47:27 +02:00
local-store          | feat: add distributed mode (#9124)                                                        | 2026-03-30 00:47:27 +02:00
opus                 | feat: add distributed mode (#9124)                                                        | 2026-03-30 00:47:27 +02:00
piper                | fix(package.sh): drop redundant -a and -R                                                 | 2026-02-05 16:39:38 +01:00
qwen3-tts-cpp        | feat(qwen3tts.cpp): add new backend (#9316)                                               | 2026-04-11 23:14:26 +02:00
sam3-cpp             | feat(rocm): bump to 7.x (#9323)                                                           | 2026-04-12 08:51:30 +02:00
silero-vad           | fix(package.sh): drop redundant -a and -R                                                 | 2026-02-05 16:39:38 +01:00
stablediffusion-ggml | feat(rocm): bump to 7.x (#9323)                                                           | 2026-04-12 08:51:30 +02:00
voxtral              | feat: wire transcription for llama.cpp, add streaming support (#9353)                     | 2026-04-14 16:13:40 +02:00
whisper              | feat: wire transcription for llama.cpp, add streaming support (#9353)                     | 2026-04-14 16:13:40 +02:00