Compare commits


22 Commits

Author SHA1 Message Date
Ettore Di Giacinto
8fb95686af chore(model gallery): add ibm-granite_granite-4.0-micro (#6376)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-10-03 10:03:34 +02:00
Ettore Di Giacinto
4132085c01 chore(model gallery): add ibm-granite_granite-4.0-h-micro (#6375)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-10-03 09:32:20 +02:00
Ettore Di Giacinto
c14f1ffcfd chore(model gallery): add ibm-granite_granite-4.0-h-tiny (#6374)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-10-03 09:31:00 +02:00
Ettore Di Giacinto
07cca4b69a chore(model gallery): add ibm-granite_granite-4.0-h-small (#6373)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2025-10-03 09:28:57 +02:00
LocalAI [bot]
dd927c36f6 chore: ⬆️ Update ggml-org/llama.cpp to d64c8104f090b27b1f99e8da5995ffcfa6b726e2 (#6371)
⬆️ Update ggml-org/llama.cpp

Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2025-10-02 21:09:00 +00:00
LocalAI [bot]
052f42e926 chore: ⬆️ Update ggml-org/llama.cpp to 1fe4e38cc20af058ed320bd46cac934991190056 (#6368)
⬆️ Update ggml-org/llama.cpp

Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
Co-authored-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2025-10-02 16:29:57 +02:00
LocalAI [bot]
30d43588ab chore: ⬆️ Update ggml-org/whisper.cpp to 7849aff7a2e1f4234aa31b01a1870906d5431959 (#6367)
⬆️ Update ggml-org/whisper.cpp

Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2025-10-01 21:15:28 +00:00
LocalAI [bot]
d21ec22f74 chore: ⬆️ Update ggml-org/whisper.cpp to 8c0855fd6bb115e113c0dca6255ea05f774d35f7 (#6365)
⬆️ Update ggml-org/whisper.cpp

Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2025-10-01 12:12:27 +02:00
LocalAI [bot]
04fecd634a chore: ⬆️ Update ggml-org/llama.cpp to b2ba81dbe07b6dbea9c96b13346c66973dede32c (#6366)
⬆️ Update ggml-org/llama.cpp

Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2025-09-30 21:13:23 +00:00
LocalAI [bot]
33c14198db chore: ⬆️ Update ggml-org/llama.cpp to 5f7e166cbf7b9ca928c7fad990098ef32358ac75 (#6355)
⬆️ Update ggml-org/llama.cpp

Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2025-09-30 14:41:16 +02:00
LocalAI [bot]
967c2727e3 chore: ⬆️ Update ggml-org/whisper.cpp to 32be14f8ebfc0498c2c619182f0d7f4c822d52c4 (#6354)
⬆️ Update ggml-org/whisper.cpp

Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2025-09-30 14:40:59 +02:00
dependabot[bot]
f41f30ad92 chore(deps): bump grpcio from 1.74.0 to 1.75.1 in /backend/python/exllama2 (#6356)
chore(deps): bump grpcio in /backend/python/exllama2

Bumps [grpcio](https://github.com/grpc/grpc) from 1.74.0 to 1.75.1.
- [Release notes](https://github.com/grpc/grpc/releases)
- [Changelog](https://github.com/grpc/grpc/blob/master/doc/grpc_release_schedule.md)
- [Commits](https://github.com/grpc/grpc/compare/v1.74.0...v1.75.1)

---
updated-dependencies:
- dependency-name: grpcio
  dependency-version: 1.75.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-30 14:40:41 +02:00
dependabot[bot]
e77340e8a5 chore(deps): bump grpcio from 1.75.0 to 1.75.1 in /backend/python/transformers (#6362)
chore(deps): bump grpcio in /backend/python/transformers

Bumps [grpcio](https://github.com/grpc/grpc) from 1.75.0 to 1.75.1.
- [Release notes](https://github.com/grpc/grpc/releases)
- [Changelog](https://github.com/grpc/grpc/blob/master/doc/grpc_release_schedule.md)
- [Commits](https://github.com/grpc/grpc/compare/v1.75.0...v1.75.1)

---
updated-dependencies:
- dependency-name: grpcio
  dependency-version: 1.75.1
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-30 14:40:29 +02:00
dependabot[bot]
d51a3090f7 chore(deps): bump grpcio from 1.74.0 to 1.75.1 in /backend/python/bark (#6359)
Bumps [grpcio](https://github.com/grpc/grpc) from 1.74.0 to 1.75.1.
- [Release notes](https://github.com/grpc/grpc/releases)
- [Changelog](https://github.com/grpc/grpc/blob/master/doc/grpc_release_schedule.md)
- [Commits](https://github.com/grpc/grpc/compare/v1.74.0...v1.75.1)

---
updated-dependencies:
- dependency-name: grpcio
  dependency-version: 1.75.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-30 14:40:16 +02:00
dependabot[bot]
1bf3bc932c chore(deps): bump grpcio from 1.74.0 to 1.75.1 in /backend/python/vllm (#6357)
Bumps [grpcio](https://github.com/grpc/grpc) from 1.74.0 to 1.75.1.
- [Release notes](https://github.com/grpc/grpc/releases)
- [Changelog](https://github.com/grpc/grpc/blob/master/doc/grpc_release_schedule.md)
- [Commits](https://github.com/grpc/grpc/compare/v1.74.0...v1.75.1)

---
updated-dependencies:
- dependency-name: grpcio
  dependency-version: 1.75.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-30 14:40:02 +02:00
dependabot[bot]
564a47da4e chore(deps): bump grpcio from 1.74.0 to 1.75.1 in /backend/python/common/template (#6358)
chore(deps): bump grpcio in /backend/python/common/template

Bumps [grpcio](https://github.com/grpc/grpc) from 1.74.0 to 1.75.1.
- [Release notes](https://github.com/grpc/grpc/releases)
- [Changelog](https://github.com/grpc/grpc/blob/master/doc/grpc_release_schedule.md)
- [Commits](https://github.com/grpc/grpc/compare/v1.74.0...v1.75.1)

---
updated-dependencies:
- dependency-name: grpcio
  dependency-version: 1.75.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-30 08:52:36 +02:00
dependabot[bot]
c37ee93ff2 chore(deps): bump grpcio from 1.74.0 to 1.75.1 in /backend/python/rerankers (#6360)
chore(deps): bump grpcio in /backend/python/rerankers

Bumps [grpcio](https://github.com/grpc/grpc) from 1.74.0 to 1.75.1.
- [Release notes](https://github.com/grpc/grpc/releases)
- [Changelog](https://github.com/grpc/grpc/blob/master/doc/grpc_release_schedule.md)
- [Commits](https://github.com/grpc/grpc/compare/v1.74.0...v1.75.1)

---
updated-dependencies:
- dependency-name: grpcio
  dependency-version: 1.75.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-30 08:52:25 +02:00
dependabot[bot]
f4b65db4e7 chore(deps): bump grpcio from 1.74.0 to 1.75.1 in /backend/python/diffusers (#6361)
chore(deps): bump grpcio in /backend/python/diffusers

Bumps [grpcio](https://github.com/grpc/grpc) from 1.74.0 to 1.75.1.
- [Release notes](https://github.com/grpc/grpc/releases)
- [Changelog](https://github.com/grpc/grpc/blob/master/doc/grpc_release_schedule.md)
- [Commits](https://github.com/grpc/grpc/compare/v1.74.0...v1.75.1)

---
updated-dependencies:
- dependency-name: grpcio
  dependency-version: 1.75.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-30 08:52:11 +02:00
Ettore Di Giacinto
f5fa8e6649 Revert "chore(deps): bump transformers from 4.48.3 to 4.56.2 in /backend/python/coqui" (#6363)
Revert "chore(deps): bump transformers from 4.48.3 to 4.56.2 in /backend/pyth…"

This reverts commit 570e39bdcf.
2025-09-30 08:51:49 +02:00
dependabot[bot]
570e39bdcf chore(deps): bump transformers from 4.48.3 to 4.56.2 in /backend/python/coqui (#6330)
chore(deps): bump transformers in /backend/python/coqui

Bumps [transformers](https://github.com/huggingface/transformers) from 4.48.3 to 4.56.2.
- [Release notes](https://github.com/huggingface/transformers/releases)
- [Commits](https://github.com/huggingface/transformers/compare/v4.48.3...v4.56.2)

---
updated-dependencies:
- dependency-name: transformers
  dependency-version: 4.56.2
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-29 21:30:29 +00:00
dependabot[bot]
2ebe37b671 chore(deps): bump grpcio from 1.74.0 to 1.75.1 in /backend/python/coqui (#6353)
Bumps [grpcio](https://github.com/grpc/grpc) from 1.74.0 to 1.75.1.
- [Release notes](https://github.com/grpc/grpc/releases)
- [Changelog](https://github.com/grpc/grpc/blob/master/doc/grpc_release_schedule.md)
- [Commits](https://github.com/grpc/grpc/compare/v1.74.0...v1.75.1)

---
updated-dependencies:
- dependency-name: grpcio
  dependency-version: 1.75.1
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2025-09-29 20:11:55 +00:00
LocalAI [bot]
dca685f784 chore: ⬆️ Update ggml-org/llama.cpp to bd0af02fc96c2057726f33c0f0daf7bb8f3e462a (#6352)
⬆️ Update ggml-org/llama.cpp

Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2025-09-28 21:08:50 +00:00
19 changed files with 1294 additions and 662 deletions

View File

File diff suppressed because it is too large

View File

@@ -60,7 +60,7 @@ jobs:
runs-on: 'ubuntu-latest'
makeflags: "--jobs=3 --output-sync=target"
- build-type: 'vulkan'
-platforms: 'linux/amd64,linux/arm64'
+platforms: 'linux/amd64'
tag-latest: 'false'
tag-suffix: '-vulkan-core'
runs-on: 'ubuntu-latest'

View File

@@ -101,7 +101,7 @@ jobs:
makeflags: "--jobs=4 --output-sync=target"
aio: "-aio-gpu-nvidia-cuda-12"
- build-type: 'vulkan'
-platforms: 'linux/amd64,linux/arm64'
+platforms: 'linux/amd64'
tag-latest: 'auto'
tag-suffix: '-gpu-vulkan'
runs-on: 'ubuntu-latest'

View File

@@ -32,27 +32,15 @@ RUN <<EOT bash
if [ "${BUILD_TYPE}" = "vulkan" ] && [ "${SKIP_DRIVERS}" = "false" ]; then
apt-get update && \
apt-get install -y --no-install-recommends \
-software-properties-common pciutils sudo wget gpg-agent curl xz-utils && \
-echo "vulkan" > /run/localai/capability && \
-if [ "amd64" = "$TARGETARCH" ]; then
-wget -qO - https://packages.lunarg.com/lunarg-signing-key-pub.asc | apt-key add - && \
-wget -qO /etc/apt/sources.list.d/lunarg-vulkan-jammy.list https://packages.lunarg.com/vulkan/lunarg-vulkan-jammy.list && \
-apt-get update && \
-apt-get install -y \
-vulkan-sdk && \
-apt-get clean && \
-rm -rf /var/lib/apt/lists/*
-fi
-if [ "arm64" = "$TARGETARCH" ]; then
-# For ARM64, we need to build the Vulkan SDK manually as there are no packages available
-mkdir vulkan && cd vulkan && curl -L -o vulkan-sdk.tar.xz https://github.com/mudler/vulkan-sdk-arm/releases/download/1.4.321.1/vulkansdk-ubuntu-22.04-arm-1.4.321.1.tar.xz && \
-tar -xvf vulkan-sdk.tar.xz && \
-rm vulkan-sdk.tar.xz && \
-cd * && \
-cp -rfv aarch64/* /usr/ && \
-cd ../.. && \
-rm -rf vulkan
-fi
+software-properties-common pciutils wget gpg-agent && \
+wget -qO - https://packages.lunarg.com/lunarg-signing-key-pub.asc | apt-key add - && \
+wget -qO /etc/apt/sources.list.d/lunarg-vulkan-jammy.list https://packages.lunarg.com/vulkan/lunarg-vulkan-jammy.list && \
+apt-get update && \
+apt-get install -y \
+vulkan-sdk && \
+apt-get clean && \
+rm -rf /var/lib/apt/lists/* && \
+echo "vulkan" > /run/localai/capability
fi
EOT

View File

@@ -37,27 +37,14 @@ RUN <<EOT bash
if [ "${BUILD_TYPE}" = "vulkan" ] && [ "${SKIP_DRIVERS}" = "false" ]; then
apt-get update && \
apt-get install -y --no-install-recommends \
-software-properties-common pciutils sudo wget gpg-agent curl xz-utils && \
-echo "vulkan" > /run/localai/capability && \
-if [ "amd64" = "$TARGETARCH" ]; then
-wget -qO - https://packages.lunarg.com/lunarg-signing-key-pub.asc | apt-key add - && \
-wget -qO /etc/apt/sources.list.d/lunarg-vulkan-jammy.list https://packages.lunarg.com/vulkan/lunarg-vulkan-jammy.list && \
-apt-get update && \
-apt-get install -y \
-vulkan-sdk && \
-apt-get clean && \
-rm -rf /var/lib/apt/lists/*
-fi
-if [ "arm64" = "$TARGETARCH" ]; then
-# For ARM64, we need to build the Vulkan SDK manually as there are no packages available
-mkdir vulkan && cd vulkan && curl -L -o vulkan-sdk.tar.xz https://github.com/mudler/vulkan-sdk-arm/releases/download/1.4.321.1/vulkansdk-ubuntu-22.04-arm-1.4.321.1.tar.xz && \
-tar -xvf vulkan-sdk.tar.xz && \
-rm vulkan-sdk.tar.xz && \
-cd * && \
-cp -rfv aarch64/* /usr/ && \
-cd ../.. && \
-rm -rf vulkan
-fi
+software-properties-common pciutils wget gpg-agent && \
+wget -qO - https://packages.lunarg.com/lunarg-signing-key-pub.asc | apt-key add - && \
+wget -qO /etc/apt/sources.list.d/lunarg-vulkan-jammy.list https://packages.lunarg.com/vulkan/lunarg-vulkan-jammy.list && \
+apt-get update && \
+apt-get install -y \
+vulkan-sdk && \
+apt-get clean && \
+rm -rf /var/lib/apt/lists/*
fi
EOT

View File

@@ -85,27 +85,14 @@ RUN <<EOT bash
if [ "${BUILD_TYPE}" = "vulkan" ] && [ "${SKIP_DRIVERS}" = "false" ]; then
apt-get update && \
apt-get install -y --no-install-recommends \
-software-properties-common pciutils sudo wget gpg-agent curl xz-utils libxcb1 libx11-6 && \
-echo "vulkan" > /run/localai/capability && \
-if [ "amd64" = "$TARGETARCH" ]; then
-wget -qO - https://packages.lunarg.com/lunarg-signing-key-pub.asc | apt-key add - && \
-wget -qO /etc/apt/sources.list.d/lunarg-vulkan-jammy.list https://packages.lunarg.com/vulkan/lunarg-vulkan-jammy.list && \
-apt-get update && \
-apt-get install -y \
-vulkan-sdk && \
-apt-get clean && \
-rm -rf /var/lib/apt/lists/*
-fi
-if [ "arm64" = "$TARGETARCH" ]; then
-# For ARM64, we need to build the Vulkan SDK manually as there are no packages available
-mkdir vulkan && cd vulkan && curl -L -o vulkan-sdk.tar.xz https://github.com/mudler/vulkan-sdk-arm/releases/download/1.4.321.1/vulkansdk-ubuntu-22.04-arm-1.4.321.1.tar.xz && \
-tar -xvf vulkan-sdk.tar.xz && \
-rm vulkan-sdk.tar.xz && \
-cd * && \
-cp -rfv aarch64/* /usr/ && vulkaninfo \
-cd ../.. && \
-rm -rf vulkan
-fi
+software-properties-common pciutils wget gpg-agent && \
+wget -qO - https://packages.lunarg.com/lunarg-signing-key-pub.asc | apt-key add - && \
+wget -qO /etc/apt/sources.list.d/lunarg-vulkan-jammy.list https://packages.lunarg.com/vulkan/lunarg-vulkan-jammy.list && \
+apt-get update && \
+apt-get install -y \
+vulkan-sdk && \
+apt-get clean && \
+rm -rf /var/lib/apt/lists/*
fi
EOT

View File

@@ -45,27 +45,14 @@ RUN <<EOT bash
if [ "${BUILD_TYPE}" = "vulkan" ] && [ "${SKIP_DRIVERS}" = "false" ]; then
apt-get update && \
apt-get install -y --no-install-recommends \
-software-properties-common pciutils sudo wget gpg-agent curl xz-utils && \
-echo "vulkan" > /run/localai/capability && \
-if [ "amd64" = "$TARGETARCH" ]; then
-wget -qO - https://packages.lunarg.com/lunarg-signing-key-pub.asc | apt-key add - && \
-wget -qO /etc/apt/sources.list.d/lunarg-vulkan-jammy.list https://packages.lunarg.com/vulkan/lunarg-vulkan-jammy.list && \
-apt-get update && \
-apt-get install -y \
-vulkan-sdk && \
-apt-get clean && \
-rm -rf /var/lib/apt/lists/*
-fi
-if [ "arm64" = "$TARGETARCH" ]; then
-# For ARM64, we need to build the Vulkan SDK manually as there are no packages available
-mkdir vulkan && cd vulkan && curl -L -o vulkan-sdk.tar.xz https://github.com/mudler/vulkan-sdk-arm/releases/download/1.4.321.1/vulkansdk-ubuntu-22.04-arm-1.4.321.1.tar.xz && \
-tar -xvf vulkan-sdk.tar.xz && \
-rm vulkan-sdk.tar.xz && \
-cd * && \
-cp -rfv aarch64/* /usr/ && \
-cd ../.. && \
-rm -rf vulkan
-fi
+software-properties-common pciutils wget gpg-agent && \
+wget -qO - https://packages.lunarg.com/lunarg-signing-key-pub.asc | apt-key add - && \
+wget -qO /etc/apt/sources.list.d/lunarg-vulkan-jammy.list https://packages.lunarg.com/vulkan/lunarg-vulkan-jammy.list && \
+apt-get update && \
+apt-get install -y \
+vulkan-sdk && \
+apt-get clean && \
+rm -rf /var/lib/apt/lists/*
fi
EOT

View File

@@ -1,5 +1,5 @@
-LLAMA_VERSION?=4807e8f96a61b2adccebd5e57444c94d18de7264
+LLAMA_VERSION?=d64c8104f090b27b1f99e8da5995ffcfa6b726e2
LLAMA_REPO?=https://github.com/ggerganov/llama.cpp
CMAKE_ARGS?=

View File

@@ -8,7 +8,7 @@ JOBS?=$(shell nproc --ignore=1)
# whisper.cpp version
WHISPER_REPO?=https://github.com/ggml-org/whisper.cpp
-WHISPER_CPP_VERSION?=44fa2f647cf2a6953493b21ab83b50d5f5dbc483
+WHISPER_CPP_VERSION?=7849aff7a2e1f4234aa31b01a1870906d5431959
CMAKE_ARGS+=-DBUILD_SHARED_LIBS=OFF

View File

@@ -1,4 +1,4 @@
bark==0.1.5
-grpcio==1.74.0
+grpcio==1.75.1
protobuf
certifi

View File

@@ -1,3 +1,3 @@
-grpcio==1.74.0
+grpcio==1.75.1
protobuf
grpcio-tools

View File

@@ -1,4 +1,4 @@
-grpcio==1.74.0
+grpcio==1.75.1
protobuf
certifi
packaging==24.1

View File

@@ -1,5 +1,5 @@
setuptools
-grpcio==1.74.0
+grpcio==1.75.1
pillow
protobuf
certifi

View File

@@ -1,4 +1,4 @@
-grpcio==1.74.0
+grpcio==1.75.1
protobuf
certifi
wheel

View File

@@ -1,3 +1,3 @@
-grpcio==1.74.0
+grpcio==1.75.1
protobuf
certifi

View File

@@ -1,4 +1,4 @@
-grpcio==1.75.0
+grpcio==1.75.1
protobuf==6.32.0
certifi
setuptools

View File

@@ -1,4 +1,4 @@
-grpcio==1.74.0
+grpcio==1.75.1
protobuf
certifi
setuptools

gallery/granite4.yaml (new file, 48 lines added)
View File

@@ -0,0 +1,48 @@
---
name: "granite-3.2"
config_file: |
backend: "llama-cpp"
mmap: true
template:
chat_message: |
<|start_of_role|>{{ .RoleName }}<|end_of_role|>
{{ if .FunctionCall -}}
<tool_call>
{{ else if eq .RoleName "tool" -}}
<tool_response>
{{ end -}}
{{ if .Content -}}
{{.Content }}
{{ end -}}
{{ if eq .RoleName "tool" -}}
</tool_response>
{{ end -}}
{{ if .FunctionCall -}}
{{toJson .FunctionCall}}
</tool_call>
{{ end -}}
<|end_of_text|>
function: |
<|start_of_role|>system<|end_of_role|>
You are a helpful AI assistant with access to the following tools. When a tool is required to answer the user's query, respond with <|tool_call|> followed by a JSON list of tools used. If a tool does not exist in the provided list of tools, notify the user that you do not have the ability to fulfill the request.
Write the response to the user's input by strictly aligning with the facts in the provided documents. If the information needed to answer the question is not available in the documents, inform the user that the question cannot be answered based on the available data.
{{range .Functions}}
{'type': 'function', 'function': {'name': '{{.Name}}', 'description': '{{.Description}}', 'parameters': {{toJson .Parameters}} }}
{{end}}
For each function call return a json object with function name and arguments
{{.Input -}}
<|start_of_role|>assistant<|end_of_role|>
chat: |
{{.Input -}}
<|start_of_role|>assistant<|end_of_role|>
completion: |
{{.Input}}
context_size: 8192
f16: true
stopwords:
- '<|im_end|>'
- '<dummy32000>'
- '</s>'
- '<|end_of_text|>'

View File

@@ -1,4 +1,68 @@
---
- &granite4
url: "github:mudler/LocalAI/gallery/granite4.yaml@master"
name: "ibm-granite_granite-4.0-h-small"
license: apache-2.0
icon: https://cdn-avatars.huggingface.co/v1/production/uploads/639bcaa2445b133a4e942436/CEW-OjXkRkDNmTxSu8Egh.png
tags:
- gguf
- GPU
- CPU
- text-to-text
urls:
- https://huggingface.co/ibm-granite/granite-4.0-h-small
- https://huggingface.co/bartowski/ibm-granite_granite-4.0-h-small-GGUF
description: |
Granite-4.0-H-Small is a 32B parameter long-context instruct model finetuned from Granite-4.0-H-Small-Base using a combination of open source instruction datasets with permissive license and internally collected synthetic datasets. This model is developed using a diverse set of techniques with a structured chat format, including supervised finetuning, model alignment using reinforcement learning, and model merging. Granite 4.0 instruct models feature improved instruction following (IF) and tool-calling capabilities, making them more effective in enterprise applications.
overrides:
parameters:
model: ibm-granite_granite-4.0-h-small-Q4_K_M.gguf
files:
- filename: ibm-granite_granite-4.0-h-small-Q4_K_M.gguf
sha256: c59ce76239bd5794acdbdf88616dfc296247f4e78792a9678d4b3e24966ead69
uri: huggingface://bartowski/ibm-granite_granite-4.0-h-small-GGUF/ibm-granite_granite-4.0-h-small-Q4_K_M.gguf
- !!merge <<: *granite4
name: "ibm-granite_granite-4.0-h-tiny"
urls:
- https://huggingface.co/ibm-granite/granite-4.0-h-tiny
- https://huggingface.co/bartowski/ibm-granite_granite-4.0-h-tiny-GGUF
description: |
Granite-4.0-H-Tiny is a 7B parameter long-context instruct model finetuned from Granite-4.0-H-Tiny-Base using a combination of open source instruction datasets with permissive license and internally collected synthetic datasets. This model is developed using a diverse set of techniques with a structured chat format, including supervised finetuning, model alignment using reinforcement learning, and model merging. Granite 4.0 instruct models feature improved instruction following (IF) and tool-calling capabilities, making them more effective in enterprise applications.
overrides:
parameters:
model: ibm-granite_granite-4.0-h-tiny-Q4_K_M.gguf
files:
- filename: ibm-granite_granite-4.0-h-tiny-Q4_K_M.gguf
sha256: 33a689fe7f35b14ebab3ae599b65aaa3ed8548c393373b1b0eebee36c653146f
uri: huggingface://bartowski/ibm-granite_granite-4.0-h-tiny-GGUF/ibm-granite_granite-4.0-h-tiny-Q4_K_M.gguf
- !!merge <<: *granite4
name: "ibm-granite_granite-4.0-h-micro"
urls:
- https://huggingface.co/ibm-granite/granite-4.0-h-micro
- https://huggingface.co/bartowski/ibm-granite_granite-4.0-h-micro-GGUF
description: |
Granite-4.0-H-Micro is a 3B parameter long-context instruct model finetuned from Granite-4.0-H-Micro-Base using a combination of open source instruction datasets with permissive license and internally collected synthetic datasets. This model is developed using a diverse set of techniques with a structured chat format, including supervised finetuning, model alignment using reinforcement learning, and model merging. Granite 4.0 instruct models feature improved instruction following (IF) and tool-calling capabilities, making them more effective in enterprise applications.
overrides:
parameters:
model: ibm-granite_granite-4.0-h-micro-Q4_K_M.gguf
files:
- filename: ibm-granite_granite-4.0-h-micro-Q4_K_M.gguf
sha256: 48376d61449687a56b3811a418d92cc0e8e77b4d96ec13eb6c9d9503968c9f20
uri: huggingface://bartowski/ibm-granite_granite-4.0-h-micro-GGUF/ibm-granite_granite-4.0-h-micro-Q4_K_M.gguf
- !!merge <<: *granite4
name: "ibm-granite_granite-4.0-micro"
urls:
- https://huggingface.co/ibm-granite/granite-4.0-micro
- https://huggingface.co/bartowski/ibm-granite_granite-4.0-micro-GGUF
description: |
Granite-4.0-Micro is a 3B parameter long-context instruct model finetuned from Granite-4.0-Micro-Base using a combination of open source instruction datasets with permissive license and internally collected synthetic datasets. This model is developed using a diverse set of techniques with a structured chat format, including supervised finetuning, model alignment using reinforcement learning, and model merging. Granite 4.0 instruct models feature improved instruction following (IF) and tool-calling capabilities, making them more effective in enterprise applications.
overrides:
parameters:
model: ibm-granite_granite-4.0-micro-Q4_K_M.gguf
files:
- filename: ibm-granite_granite-4.0-micro-Q4_K_M.gguf
sha256: bd9d7b4795b9dc44e3e81aeae93bb5d8e6b891b7e823be5bf9910ed3ac060baf
uri: huggingface://bartowski/ibm-granite_granite-4.0-micro-GGUF/ibm-granite_granite-4.0-micro-Q4_K_M.gguf
- &ernie
url: "github:mudler/LocalAI/gallery/chatml.yaml@master"
name: "baidu_ernie-4.5-21b-a3b-thinking"