Mirror of https://github.com/mudler/LocalAI.git — synced 2026-01-22 13:22:03 -05:00
chore(model gallery): 🤖 add 1 new models via gallery agent (#7922)
chore(model gallery): 🤖 add new models via gallery agent

Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
@@ -1,4 +1,29 @@
---
- name: "liquidai.lfm2-2.6b-transcript"
  url: "github:mudler/LocalAI/gallery/virtual.yaml@master"
  urls:
    - https://huggingface.co/DevQuasar/LiquidAI.LFM2-2.6B-Transcript-GGUF
  description: |
    This is a large language model (2.6B parameters) designed for text-generation tasks. It is a quantized version of the original model `LiquidAI/LFM2-2.6B-Transcript`, optimized for efficiency while retaining strong performance. The model is built on the base model with additional optimizations for deployment and use cases such as transcription or language modeling. It is trained on large-scale text data and supports multiple languages.
  overrides:
    parameters:
      model: llama-cpp/models/LiquidAI.LFM2-2.6B-Transcript.Q4_K_M.gguf
    name: LiquidAI.LFM2-2.6B-Transcript-GGUF
    backend: llama-cpp
    template:
      use_tokenizer_template: true
    known_usecases:
      - chat
    function:
      grammar:
        disable: true
    description: Imported from https://huggingface.co/DevQuasar/LiquidAI.LFM2-2.6B-Transcript-GGUF
    options:
      - use_jinja:true
  files:
    - filename: llama-cpp/models/LiquidAI.LFM2-2.6B-Transcript.Q4_K_M.gguf
      sha256: 301a8467531781909dc7a6263318103a3d8673a375afc4641e358d4174bd15d4
      uri: https://huggingface.co/DevQuasar/LiquidAI.LFM2-2.6B-Transcript-GGUF/resolve/main/LiquidAI.LFM2-2.6B-Transcript.Q4_K_M.gguf
- name: "lfm2.5-1.2b-nova-function-calling"
  url: "github:mudler/LocalAI/gallery/virtual.yaml@master"
  urls:
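The gallery entry pins the GGUF artifact to a SHA-256 digest. As a rough sketch (not part of this PR), a downloaded file can be checked against the pinned digest before loading; the local path below is hypothetical, and the digest is the one from the entry above.

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream-hash a file in 1 MiB chunks so multi-GB model weights
    are never loaded into memory at once."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Digest pinned by the gallery entry:
EXPECTED = "301a8467531781909dc7a6263318103a3d8673a375afc4641e358d4174bd15d4"

# Hypothetical local path to the downloaded artifact:
# assert sha256_of("LiquidAI.LFM2-2.6B-Transcript.Q4_K_M.gguf") == EXPECTED
```

LocalAI performs this verification itself when installing gallery models; the sketch is only useful for checking a manually downloaded file.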