Compare commits


67 Commits

Author SHA1 Message Date
Ettore Di Giacinto
f41a519a2c tests: try to get logs
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-25 09:24:55 +02:00
Ettore Di Giacinto
e84b31935c feat(vulkan): add vulkan support to the llama.cpp backend (#2648)
feat(vulkan): add vulkan support to llama.cpp

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-24 20:04:58 +02:00
Ettore Di Giacinto
03b1cf51fd feat(whisper): add translate option (#2649)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-24 19:21:22 +02:00
Ettore Di Giacinto
9e6dec0bc4 fix(install.sh): not all systems have nproc
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-24 18:21:20 +02:00
Ettore Di Giacinto
04b01cd62c ci: put a cap on parallel runs
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-24 18:08:09 +02:00
Ettore Di Giacinto
a181dd0ebc refactor: gallery inconsistencies (#2647)
* refactor(gallery): move under core/

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* fix(unarchive): do not allow symlinks

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-24 17:32:12 +02:00
Ettore Di Giacinto
69206fcd4b fix(install.sh): move ARCH detection so it works also for mac (#2646)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-24 10:34:35 +02:00
Ettore Di Giacinto
2c94e15746 fix(install.sh): fix version typo (#2645)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-24 10:30:17 +02:00
Dave
12513ebae0 rf: centralize base64 image handling (#2595)
Contains simple fixes to warnings and errors, removes a broken/outdated test, runs go mod tidy, and, as the actual change, centralizes base64 image handling.

Signed-off-by: Dave Lee <dave@gray101.com>
2024-06-24 08:34:36 +02:00
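For illustration, a centralized handler of this kind typically accepts either a bare base64 payload or a `data:` URI and normalizes both to raw bytes. A minimal sketch of the idea in Go - the helper name and error messages here are illustrative, not LocalAI's actual API:

```go
package main

import (
	"encoding/base64"
	"fmt"
	"strings"
)

// getBase64Image normalizes the two encodings commonly seen in
// OpenAI-style requests: a bare base64 string, or a
// "data:image/...;base64,<payload>" URI. Hypothetical helper;
// the real function in the PR may differ.
func getBase64Image(s string) ([]byte, error) {
	if strings.HasPrefix(s, "data:") {
		// keep only the payload after the comma separator
		_, payload, found := strings.Cut(s, ",")
		if !found {
			return nil, fmt.Errorf("invalid data URI: no comma separator")
		}
		s = payload
	}
	return base64.StdEncoding.DecodeString(s)
}

func main() {
	img, err := getBase64Image("data:image/png;base64,iVBORw0KGgo=")
	if err != nil {
		panic(err)
	}
	fmt.Printf("decoded %d bytes\n", len(img))
}
```

Centralizing this means every endpoint that accepts images shares one decoding path instead of each re-implementing the data-URI split.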
LocalAI [bot]
4156a4f15f ⬆️ Update ggerganov/llama.cpp (#2632)
Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-23 22:21:38 +00:00
Ettore Di Giacinto
491bb4f174 Update hermes-2-pro-mistral.yaml
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-06-23 15:17:41 +02:00
Sertaç Özercan
5866fc8ded chore: fix go.mod module (#2635)
Signed-off-by: Sertac Ozercan <sozercan@gmail.com>
2024-06-23 08:24:36 +00:00
Ettore Di Giacinto
eb4cd78ca6 ci: run master jobs on self-hosted
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-23 10:15:53 +02:00
Ettore Di Giacinto
40ce71855a ci: disable max-parallelism on master 2024-06-22 23:28:09 +02:00
Ettore Di Giacinto
9c0d0afd09 ci: bump parallel jobs (#2633)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-22 23:24:46 +02:00
Ettore Di Giacinto
0f9aa1ef91 fix(install.sh): install CUDA toolkit only if CUDA is detected
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-22 12:21:59 +02:00
Ettore Di Giacinto
3ee5ceb9fa Update kubernetes.md
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-06-22 12:16:55 +02:00
Ettore Di Giacinto
1bd72a3be5 Update kubernetes.md
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-06-22 12:16:27 +02:00
Ettore Di Giacinto
fbd14118bf Update kubernetes.md
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-06-22 12:14:53 +02:00
Ettore Di Giacinto
515d98b978 Update model-gallery.md
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-06-22 12:10:49 +02:00
Ettore Di Giacinto
789cf6c599 Update model-gallery.md
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-06-22 12:10:27 +02:00
Ettore Di Giacinto
0bc82d7270 fix(install.sh): properly detect suse distros
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-22 12:08:48 +02:00
Ettore Di Giacinto
9a7ad75bff docs: update to include installer and update advanced YAML options (#2631)
* docs: update quickstart and advanced sections

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* docs: improvements

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* examples(kubernete): add nvidia example

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-22 12:00:38 +02:00
Ettore Di Giacinto
9fb3e4040b Update README.md
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-06-22 10:29:46 +02:00
Ettore Di Giacinto
070fd1b9da Update distributed_inferencing.md
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-06-22 10:06:09 +02:00
Ettore Di Giacinto
dda5b9f260 Update distributed_inferencing.md
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-06-22 10:05:48 +02:00
Ettore Di Giacinto
8d84dd4f88 fix(worker): use dynaload for single binaries (#2620)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-22 09:33:18 +02:00
Ettore Di Giacinto
f569237a50 feat(oci): support OCI images and Ollama models (#2628)
* Support specifying oci:// and ollama:// for model URLs

Fixes: https://github.com/mudler/LocalAI/issues/2527
Fixes: https://github.com/mudler/LocalAI/issues/1028

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Lower watcher warnings

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Allow to install ollama models from CLI

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* fixup tests

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Do not keep file ownership

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* Skip test on darwin

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-22 08:17:41 +02:00
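The feature routes model URLs by scheme, so `oci://` and `ollama://` references can be pulled alongside plain HTTP URLs. A rough sketch of scheme-based dispatch, assuming hypothetical puller functions (these stand-ins are not LocalAI's actual API):

```go
package main

import (
	"fmt"
	"strings"
)

// installModel dispatches on the URI scheme, mirroring how a model URL
// such as ollama://gemma:2b or oci://registry/image:tag could be routed
// to a dedicated puller. The pullers below are illustrative stubs.
func installModel(uri string) error {
	switch {
	case strings.HasPrefix(uri, "ollama://"):
		return pullOllama(strings.TrimPrefix(uri, "ollama://"))
	case strings.HasPrefix(uri, "oci://"):
		return pullOCI(strings.TrimPrefix(uri, "oci://"))
	default:
		return fmt.Errorf("unsupported scheme in %q", uri)
	}
}

func pullOllama(ref string) error { fmt.Println("pulling from Ollama registry:", ref); return nil }
func pullOCI(ref string) error    { fmt.Println("pulling OCI artifact:", ref); return nil }

func main() {
	_ = installModel("ollama://gemma:2b")
}
```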
LocalAI [bot]
e265a618d9 models(gallery): ⬆️ update checksum (#2630)
⬆️ Checksum updates in gallery/index.yaml

Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-22 04:45:41 +00:00
LocalAI [bot]
533343c84f ⬆️ Update ggerganov/llama.cpp (#2629)
Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-22 02:28:06 +00:00
Ettore Di Giacinto
260f2e1d94 fix(install.sh): correctly handle systemd service installation (#2627)
Fixup install.sh systemd service installation

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-21 23:56:06 +02:00
Ettore Di Giacinto
964732590d models(gallery): add hermes-2-theta-llama-3-70b (#2626)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-21 19:41:49 +02:00
LocalAI [bot]
70a2bfe82e ⬆️ Update ggerganov/llama.cpp (#2617)
Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-21 06:41:34 +00:00
Ettore Di Giacinto
ba2d969c44 models(gallery): add qwen2-1.5b-ita (#2615)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-20 20:35:53 +02:00
Ettore Di Giacinto
d3c78cf4d7 models(gallery): add magnum-72b-v1 (#2614)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-20 20:31:23 +02:00
Ettore Di Giacinto
34afd891a6 models(gallery): add llama3-8b-darkidol-1.1-iq-imatrix (#2613)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-20 20:30:47 +02:00
Ettore Di Giacinto
d3137775a1 models(gallery): add llama-3-cursedstock-v1.8-8b-iq-imatrix (#2612)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-20 20:14:48 +02:00
Ettore Di Giacinto
e1772026a1 models(gallery): add llama-3-sec-chat (#2611)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-20 20:14:03 +02:00
LocalAI [bot]
d0423254dd ⬆️ Update ggerganov/llama.cpp (#2606)
Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-20 00:58:40 +00:00
LocalAI [bot]
db0e52ae9d ⬆️ Update docs version mudler/LocalAI (#2605)
Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-20 00:05:19 +00:00
LocalAI [bot]
4f030f9cd3 models(gallery): ⬆️ update checksum (#2607)
⬆️ Checksum updates in gallery/index.yaml

Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-19 22:20:17 +02:00
Ettore Di Giacinto
60fb45eb97 models(gallery): add l3-umbral-mind-rp-v1.0-8b-iq-imatrix (#2608)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-19 22:19:40 +02:00
Rene Leonhardt
43f0688a95 feat: Upgrade to CUDA 12.5 (#2601)
Signed-off-by: Rene Leonhardt <65483435+reneleonhardt@users.noreply.github.com>
2024-06-19 17:50:49 +02:00
LocalAI [bot]
8142bdc48f ⬆️ Update ggerganov/llama.cpp (#2603)
Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-19 00:28:50 +00:00
Ettore Di Giacinto
89a11e15e7 fix(single-binary): bundle ld.so (#2602)
* debug

* fix copy command/silly muscle memory

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* remove tmate

* Debugging

* Start binary with ld.so if present in libdir

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* small refactor

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-18 22:43:43 +02:00
Ettore Di Giacinto
06de542032 feat(talk): display an informative box, better colors (#2600)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-18 15:10:01 +02:00
Ettore Di Giacinto
ecbb61cbf4 feat(sd-3): add stablediffusion 3 support (#2591)
* feat(sd-3): add stablediffusion 3 support

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* deps(diffusers): add sentencepiece

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* models(gallery): add stablediffusion-3

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-18 15:09:39 +02:00
Ettore Di Giacinto
7f13e3a783 docs(models): fixup top message
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-06-18 08:42:30 +02:00
LocalAI [bot]
c926469b9c ⬆️ Update ggerganov/llama.cpp (#2594)
Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-18 03:06:31 +00:00
LocalAI [bot]
c30b57a629 ⬆️ Update docs version mudler/LocalAI (#2593)
Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-18 01:47:04 +00:00
LocalAI [bot]
2f297979a7 ⬆️ Update ggerganov/llama.cpp (#2587)
Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-17 15:28:19 +00:00
Ettore Di Giacinto
2437a2769d models(gallery): add gemma-1.1-7b-it (#2588)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-17 14:13:27 +02:00
Ettore Di Giacinto
b58b7cad94 models(gallery): add samantha-qwen2 (#2586)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-17 10:08:29 +02:00
LocalAI [bot]
68148f2a1a ⬆️ Update ggerganov/llama.cpp (#2584)
Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-17 00:18:44 +00:00
Ettore Di Giacinto
4897eb0ba2 ci: pack less libs inside the binary (#2579)
The binary quickly grew to 1.8GB - ROCm alone adds about 800MB - so we might just want to manage the GPU libs separately.

Adds a comment listing all the libraries we depend on so far; a follow-up will likely bundle these separately.

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-16 22:10:28 +02:00
Ettore Di Giacinto
1b43966c48 Update README.md
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-06-16 20:27:37 +02:00
Ettore Di Giacinto
c5f2f11503 models(gallery): add hathor_stable-v0.2-l3-8b (#2582)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-16 20:24:36 +02:00
Ettore Di Giacinto
895443d1b5 models(gallery): add tess-v2.5-phi-3-medium-128k-14b (#2581)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-16 20:22:08 +02:00
Ettore Di Giacinto
6a0802e8e6 models(gallery): add dolphin-qwen (#2580)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-16 20:11:21 +02:00
Ettore Di Giacinto
94cfaad7f4 feat(libpath): refactor and expose functions for external library paths (#2578)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-16 13:58:28 +02:00
Ettore Di Giacinto
ac4a94dd44 feat(build): bundle libs for arm64 and x86 linux binaries (#2572)
This PR bundles further libs into the arm64 and x86_64 binaries

There is plenty of room for improvement - it's far from perfect - but in this PR I wanted to collect the required libs and establish a simple baseline to build on later. Doing this exercise with CI only is quite challenging, but it's the fastest way I see right now.

Once the initial list is built, we can improve this down the line, remove some of the technical debt left here, speed things up, and avoid getting stuck in the middle of CI cycles.

In this PR:

- The x86_64 binary now bundles the hipblas, Nvidia, and Intel libraries too, so that no dependencies need to be installed on the host
- Similarly, for arm64 we now bundle all the required assets

## What's left

We should also be able to cross-compile Nvidia support for arm64 - however I haven't succeeded so far, so I've left that open. Similarly, I might have missed some libraries, but bug reports and testing with the new binaries will tell. I've tested on my arm64 board and could finally start things up.

An open point is still shipping libraries for e.g. tts and stablediffusion. This is not done yet, but with the same methodology we should be able to extend binary support to these two backends as well.
2024-06-16 09:10:44 +02:00
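The companion trick (from the single-binary fixes above) is to ship the glibc dynamic loader next to the extracted backend libraries and start the backend through it, so the bundled libs take precedence over the host's. A simplified sketch, assuming an extraction directory layout; the function name and paths are illustrative, not the PR's exact code:

```go
package main

import (
	"os"
	"os/exec"
	"path/filepath"
)

// runWithBundledLoader starts a backend binary through a bundled ld.so
// when one was extracted alongside the libraries, so the backend resolves
// the shipped libs instead of the host's. Illustration only.
func runWithBundledLoader(libDir, backend string, args ...string) *exec.Cmd {
	ldSo := filepath.Join(libDir, "ld.so")
	if _, err := os.Stat(ldSo); err == nil {
		// invoke as: ld.so --library-path <dir> <binary> <args...>
		full := append([]string{"--library-path", libDir, backend}, args...)
		return exec.Command(ldSo, full...)
	}
	// fall back to the host loader, pointing it at the bundled libs
	cmd := exec.Command(backend, args...)
	cmd.Env = append(os.Environ(), "LD_LIBRARY_PATH="+libDir)
	return cmd
}

func main() {
	cmd := runWithBundledLoader("/tmp/localai/backend-assets/lib",
		"/tmp/localai/backend-assets/grpc/llama-cpp-avx2")
	_ = cmd // start with cmd.Run() in real use
}
```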
LocalAI [bot]
58bf8614d9 ⬆️ Update ggerganov/llama.cpp (#2575)
Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-15 23:45:10 +00:00
Ettore Di Giacinto
3764e50b35 models(gallery): add firefly-gemma-7b (#2576)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-15 23:07:20 +02:00
Nate Harris
3f464d2d9e Fix standard image latest Docker tags (#2574)
- Fix standard image latest Docker tags

Signed-off-by: Nate Harris <nwithan8@users.noreply.github.com>
2024-06-15 22:08:30 +02:00
LocalAI [bot]
5116d561e1 ⬆️ Update ggerganov/llama.cpp (#2570)
Signed-off-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
Co-authored-by: mudler <2420543+mudler@users.noreply.github.com>
2024-06-14 23:39:20 +00:00
Ettore Di Giacinto
96a7a3b59f fix(Makefile): enable STATIC on dist (#2569)
Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
2024-06-14 12:28:46 +02:00
Ettore Di Giacinto
112d0ffa45 feat(darwin): embed grpc libs (#2567)
* debug

* feat(makefile): allow to bundle libs into binary

* ci: bundle protobuf into single-binary

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* ci: tests

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* fix(assets): correctly reference extract folder

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* bundle also abseil

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

* bundle more libs

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>

---------

Signed-off-by: Ettore Di Giacinto <mudler@localai.io>
Signed-off-by: Ettore Di Giacinto <mudler@users.noreply.github.com>
2024-06-14 08:51:25 +02:00
170 changed files with 2459 additions and 1249 deletions

View File

@@ -75,7 +75,7 @@ var modelPageTemplate string = `
 <div class="container mx-auto px-4 py-4">
 <div class="flex items-center justify-between">
 <div class="flex items-center">
-<a href="/" class="text-white text-xl font-bold"><img src="https://github.com/go-skynet/LocalAI/assets/2420543/0966aa2a-166e-4f99-a3e5-6c915fc997dd" alt="LocalAI Logo" class="h-10 mr-3 border-2 border-gray-300 shadow rounded"></a>
+<a href="/" class="text-white text-xl font-bold"><img src="https://github.com/mudler/LocalAI/assets/2420543/0966aa2a-166e-4f99-a3e5-6c915fc997dd" alt="LocalAI Logo" class="h-10 mr-3 border-2 border-gray-300 shadow rounded"></a>
 <a href="/" class="text-white text-xl font-bold">LocalAI</a>
 </div>
 <!-- Menu button for small screens -->
@@ -114,12 +114,12 @@ var modelPageTemplate string = `
 <h2 class="text-center text-3xl font-semibold text-gray-100">
-🖼️ Available {{.AvailableModels}} models</i> repositories <a href="https://localai.io/models/" target="_blank" >
+🖼️ Available {{.AvailableModels}} models</i> <a href="https://localai.io/models/" target="_blank" >
 <i class="fas fa-circle-info pr-2"></i>
 </a></h2>
 <h3>
-Refer to <a href="https://localai.io/models" target=_blank> Model gallery</a> for more information on how to use the models with LocalAI.
+Refer to the Model gallery <a href="https://localai.io/models/" target="_blank" ><i class="fas fa-circle-info pr-2"></i></a> for more information on how to use the models with LocalAI.<br>
 You can install models with the CLI command <code>local-ai models install <model-name></code>. or by using the WebUI.
 </h3>

View File

@@ -32,7 +32,7 @@ jobs:
     strategy:
       # Pushing with all jobs in parallel
       # eats the bandwidth of all the nodes
-      max-parallel: ${{ github.event_name != 'pull_request' && 2 || 4 }}
+      max-parallel: ${{ github.event_name != 'pull_request' && 4 || 8 }}
       matrix:
         include:
           - build-type: ''
@@ -46,7 +46,7 @@ jobs:
             makeflags: "--jobs=3 --output-sync=target"
           - build-type: 'cublas'
            cuda-major-version: "12"
-            cuda-minor-version: "1"
+            cuda-minor-version: "5"
             platforms: 'linux/amd64'
             tag-latest: 'false'
             tag-suffix: '-cublas-cuda12-ffmpeg'
@@ -119,7 +119,7 @@ jobs:
             makeflags: "--jobs=3 --output-sync=target"
           - build-type: 'cublas'
             cuda-major-version: "12"
-            cuda-minor-version: "1"
+            cuda-minor-version: "5"
             platforms: 'linux/amd64'
             tag-latest: 'false'
             tag-suffix: '-cublas-cuda12-ffmpeg-core'
@@ -128,3 +128,12 @@ jobs:
             runs-on: 'ubuntu-latest'
             base-image: "ubuntu:22.04"
             makeflags: "--jobs=4 --output-sync=target"
+          - build-type: 'vulkan'
+            platforms: 'linux/amd64'
+            tag-latest: 'false'
+            tag-suffix: '-vulkan-ffmpeg-core'
+            ffmpeg: 'true'
+            image-type: 'core'
+            runs-on: 'ubuntu-latest'
+            base-image: "ubuntu:22.04"
+            makeflags: "--jobs=4 --output-sync=target"

View File

@@ -39,7 +39,7 @@ jobs:
     strategy:
       # Pushing with all jobs in parallel
       # eats the bandwidth of all the nodes
-      max-parallel: ${{ github.event_name != 'pull_request' && 2 || 4 }}
+      max-parallel: ${{ github.event_name != 'pull_request' && 6 || 12 }}
      matrix:
        include:
          # Extra images
@@ -64,7 +64,7 @@ jobs:
             makeflags: "--jobs=3 --output-sync=target"
           - build-type: 'cublas'
             cuda-major-version: "11"
-            cuda-minor-version: "7"
+            cuda-minor-version: "8"
             platforms: 'linux/amd64'
             tag-latest: 'false'
             tag-suffix: '-cublas-cuda11'
@@ -75,7 +75,7 @@ jobs:
             makeflags: "--jobs=3 --output-sync=target"
           - build-type: 'cublas'
             cuda-major-version: "12"
-            cuda-minor-version: "1"
+            cuda-minor-version: "5"
             platforms: 'linux/amd64'
             tag-latest: 'false'
             tag-suffix: '-cublas-cuda12'
@@ -86,7 +86,7 @@ jobs:
             makeflags: "--jobs=3 --output-sync=target"
           - build-type: 'cublas'
             cuda-major-version: "11"
-            cuda-minor-version: "7"
+            cuda-minor-version: "8"
             platforms: 'linux/amd64'
             tag-latest: 'auto'
             tag-suffix: '-cublas-cuda11-ffmpeg'
@@ -100,7 +100,7 @@ jobs:
             makeflags: "--jobs=3 --output-sync=target"
           - build-type: 'cublas'
             cuda-major-version: "12"
-            cuda-minor-version: "1"
+            cuda-minor-version: "5"
             platforms: 'linux/amd64'
             tag-latest: 'auto'
             tag-suffix: '-cublas-cuda12-ffmpeg'
@@ -266,52 +266,61 @@ jobs:
             ffmpeg: 'true'
             image-type: 'core'
             base-image: "ubuntu:22.04"
-            runs-on: 'ubuntu-latest'
+            runs-on: 'arc-runner-set'
             aio: "-aio-cpu"
             latest-image: 'latest-cpu'
             latest-image-aio: 'latest-aio-cpu'
             makeflags: "--jobs=4 --output-sync=target"
           - build-type: 'cublas'
             cuda-major-version: "11"
-            cuda-minor-version: "7"
+            cuda-minor-version: "8"
             platforms: 'linux/amd64'
             tag-latest: 'false'
             tag-suffix: '-cublas-cuda11-core'
             ffmpeg: ''
             image-type: 'core'
             base-image: "ubuntu:22.04"
-            runs-on: 'ubuntu-latest'
+            runs-on: 'arc-runner-set'
             makeflags: "--jobs=4 --output-sync=target"
           - build-type: 'cublas'
             cuda-major-version: "12"
-            cuda-minor-version: "1"
+            cuda-minor-version: "5"
             platforms: 'linux/amd64'
             tag-latest: 'false'
             tag-suffix: '-cublas-cuda12-core'
             ffmpeg: ''
             image-type: 'core'
             base-image: "ubuntu:22.04"
-            runs-on: 'ubuntu-latest'
+            runs-on: 'arc-runner-set'
             makeflags: "--jobs=4 --output-sync=target"
           - build-type: 'cublas'
             cuda-major-version: "11"
-            cuda-minor-version: "7"
+            cuda-minor-version: "8"
             platforms: 'linux/amd64'
             tag-latest: 'false'
             tag-suffix: '-cublas-cuda11-ffmpeg-core'
             ffmpeg: 'true'
             image-type: 'core'
-            runs-on: 'ubuntu-latest'
+            runs-on: 'arc-runner-set'
             base-image: "ubuntu:22.04"
             makeflags: "--jobs=4 --output-sync=target"
           - build-type: 'cublas'
             cuda-major-version: "12"
-            cuda-minor-version: "1"
+            cuda-minor-version: "5"
             platforms: 'linux/amd64'
             tag-latest: 'false'
             tag-suffix: '-cublas-cuda12-ffmpeg-core'
             ffmpeg: 'true'
             image-type: 'core'
-            runs-on: 'ubuntu-latest'
+            runs-on: 'arc-runner-set'
+            base-image: "ubuntu:22.04"
+            makeflags: "--jobs=4 --output-sync=target"
+          - build-type: 'vulkan'
+            platforms: 'linux/amd64,linux/arm64'
+            tag-latest: 'false'
+            tag-suffix: '-vulkan-ffmpeg-core'
+            ffmpeg: 'true'
+            image-type: 'core'
+            runs-on: 'arc-runner-set'
             base-image: "ubuntu:22.04"
             makeflags: "--jobs=4 --output-sync=target"

View File

@@ -19,11 +19,11 @@ on:
         type: string
       cuda-major-version:
         description: 'CUDA major version'
-        default: "11"
+        default: "12"
         type: string
       cuda-minor-version:
         description: 'CUDA minor version'
-        default: "7"
+        default: "5"
         type: string
       platforms:
         description: 'Platforms'

View File

@@ -40,7 +40,7 @@ jobs:
           sudo apt-get update
           sudo apt-get install -y cuda-cross-aarch64 cuda-nvcc-cross-aarch64-${CUDA_VERSION} libcublas-cross-aarch64-${CUDA_VERSION}
         env:
-          CUDA_VERSION: 12-4
+          CUDA_VERSION: 12-5
       - name: Cache grpc
         id: cache-grpc
         uses: actions/cache@v4
@@ -100,7 +100,14 @@
           go install google.golang.org/protobuf/cmd/protoc-gen-go@v1.34.0
           export PATH=$PATH:$GOPATH/bin
           export PATH=/usr/local/cuda/bin:$PATH
-          GO_TAGS=p2p GOOS=linux GOARCH=arm64 CMAKE_ARGS="-DProtobuf_INCLUDE_DIRS=$CROSS_STAGING_PREFIX/include -DProtobuf_DIR=$CROSS_STAGING_PREFIX/lib/cmake/protobuf -DgRPC_DIR=$CROSS_STAGING_PREFIX/lib/cmake/grpc -DCMAKE_TOOLCHAIN_FILE=$CMAKE_CROSS_TOOLCHAIN -DCMAKE_C_COMPILER=aarch64-linux-gnu-gcc -DCMAKE_CXX_COMPILER=aarch64-linux-gnu-g++" make dist-cross-linux-arm64
+          sudo rm -rf /usr/aarch64-linux-gnu/lib/libstdc++.so.6
+          sudo cp -rf /usr/aarch64-linux-gnu/lib/libstdc++.so* /usr/aarch64-linux-gnu/lib/libstdc++.so.6
+          sudo cp /usr/aarch64-linux-gnu/lib/ld-linux-aarch64.so.1 ld.so
+          GO_TAGS=p2p \
+          BACKEND_LIBS="./grpc/cmake/cross_build/third_party/re2/libre2.a ./grpc/cmake/cross_build/libgrpc.a ./grpc/cmake/cross_build/libgrpc++.a ./grpc/cmake/cross_build/third_party/protobuf/libprotobuf.a /usr/aarch64-linux-gnu/lib/libc.so.6 /usr/aarch64-linux-gnu/lib/libstdc++.so.6 /usr/aarch64-linux-gnu/lib/libgomp.so.1 /usr/aarch64-linux-gnu/lib/libm.so.6 /usr/aarch64-linux-gnu/lib/libgcc_s.so.1 /usr/aarch64-linux-gnu/lib/libdl.so.2 /usr/aarch64-linux-gnu/lib/libpthread.so.0 ./ld.so" \
+          GOOS=linux \
+          GOARCH=arm64 \
+          CMAKE_ARGS="-DProtobuf_INCLUDE_DIRS=$CROSS_STAGING_PREFIX/include -DProtobuf_DIR=$CROSS_STAGING_PREFIX/lib/cmake/protobuf -DgRPC_DIR=$CROSS_STAGING_PREFIX/lib/cmake/grpc -DCMAKE_TOOLCHAIN_FILE=$CMAKE_CROSS_TOOLCHAIN -DCMAKE_C_COMPILER=aarch64-linux-gnu-gcc -DCMAKE_CXX_COMPILER=aarch64-linux-gnu-g++" make dist-cross-linux-arm64
       - uses: actions/upload-artifact@v4
         with:
           name: LocalAI-linux-arm64
@@ -111,7 +118,13 @@
         with:
           files: |
             release/*
+      - name: Setup tmate session if tests fail
+        if: ${{ failure() }}
+        uses: mxschmitt/action-tmate@v3.18
+        with:
+          detached: true
+          connect-timeout-seconds: 180
+          limit-access-to-actor: true

   build-linux:
     runs-on: arc-runner-set
     steps:
@@ -190,6 +203,7 @@
       - name: Install gRPC
         run: |
           cd grpc && cd cmake/build && sudo make --jobs 5 --output-sync=target install
+      # BACKEND_LIBS needed for gpu-workload: /opt/intel/oneapi/*/lib/libiomp5.so /opt/intel/oneapi/*/lib/libmkl_core.so /opt/intel/oneapi/*/lib/libmkl_core.so.2 /opt/intel/oneapi/*/lib/libmkl_intel_ilp64.so /opt/intel/oneapi/*/lib/libmkl_intel_ilp64.so.2 /opt/intel/oneapi/*/lib/libmkl_sycl_blas.so /opt/intel/oneapi/*/lib/libmkl_sycl_blas.so.4 /opt/intel/oneapi/*/lib/libmkl_tbb_thread.so /opt/intel/oneapi/*/lib/libmkl_tbb_thread.so.2 /opt/intel/oneapi/*/lib/libsycl.so /opt/intel/oneapi/*/lib/libsycl.so.7 /opt/intel/oneapi/*/lib/libsycl.so.7.1.0 /opt/rocm-*/lib/libamdhip64.so /opt/rocm-*/lib/libamdhip64.so.5 /opt/rocm-*/lib/libamdhip64.so.6 /opt/rocm-*/lib/libamdhip64.so.6.1.60100 /opt/rocm-*/lib/libhipblas.so /opt/rocm-*/lib/libhipblas.so.2 /opt/rocm-*/lib/libhipblas.so.2.1.60100 /opt/rocm-*/lib/librocblas.so /opt/rocm-*/lib/librocblas.so.4 /opt/rocm-*/lib/librocblas.so.4.1.60100 /usr/lib/x86_64-linux-gnu/libstdc++.so.6 /usr/lib/x86_64-linux-gnu/libOpenCL.so.1 /usr/lib/x86_64-linux-gnu/libOpenCL.so.1.0.0 /usr/lib/x86_64-linux-gnu/libm.so.6 /usr/lib/x86_64-linux-gnu/libgcc_s.so.1 /usr/lib/x86_64-linux-gnu/libc.so.6 /usr/lib/x86_64-linux-gnu/librt.so.1 /usr/local/cuda-*/targets/x86_64-linux/lib/libcublas.so /usr/local/cuda-*/targets/x86_64-linux/lib/libcublasLt.so /usr/local/cuda-*/targets/x86_64-linux/lib/libcudart.so /usr/local/cuda-*/targets/x86_64-linux/lib/stubs/libcuda.so
       - name: Build
         id: build
         run: |
@@ -199,7 +213,10 @@
           export PATH=/usr/local/cuda/bin:$PATH
           export PATH=/opt/rocm/bin:$PATH
           source /opt/intel/oneapi/setvars.sh
-          GO_TAGS=p2p make -j4 dist
+          sudo cp /lib64/ld-linux-x86-64.so.2 ld.so
+          GO_TAGS=p2p \
+          BACKEND_LIBS="./ld.so /usr/lib/x86_64-linux-gnu/libstdc++.so.6 /usr/lib/x86_64-linux-gnu/libm.so.6 /usr/lib/x86_64-linux-gnu/libgcc_s.so.1 /usr/lib/x86_64-linux-gnu/libc.so.6 /usr/lib/x86_64-linux-gnu/libgomp.so.1" \
+          make -j4 dist
       - uses: actions/upload-artifact@v4
         with:
           name: LocalAI-linux
@@ -210,7 +227,13 @@
         with:
           files: |
             release/*
+      - name: Setup tmate session if tests fail
+        if: ${{ failure() }}
+        uses: mxschmitt/action-tmate@v3.18
+        with:
+          detached: true
+          connect-timeout-seconds: 180
+          limit-access-to-actor: true

   build-stablediffusion:
     runs-on: ubuntu-latest
     steps:
@@ -249,11 +272,6 @@
   build-macOS-arm64:
     runs-on: macos-14
     steps:
-      - name: Setup tmate session if tests fail
-        uses: mxschmitt/action-tmate@v3.18
-        with:
-          connect-timeout-seconds: 180
-          limit-access-to-actor: true
       - name: Clone
         uses: actions/checkout@v4
         with:
@@ -273,7 +291,8 @@
           export C_INCLUDE_PATH=/usr/local/include
           export CPLUS_INCLUDE_PATH=/usr/local/include
           export PATH=$PATH:$GOPATH/bin
-          GO_TAGS=p2p make dist
+          BACKEND_LIBS="$(ls /opt/homebrew/opt/grpc/lib/*.dylib /opt/homebrew/opt/re2/lib/*.dylib /opt/homebrew/opt/openssl@3/lib/*.dylib /opt/homebrew/opt/protobuf/lib/*.dylib /opt/homebrew/opt/abseil/lib/*.dylib | xargs)" GO_TAGS=p2p make dist
       - uses: actions/upload-artifact@v4
         with:
           name: LocalAI-MacOS-arm64
@@ -284,3 +303,10 @@
         with:
           files: |
             release/*
+      - name: Setup tmate session if tests fail
+        if: ${{ failure() }}
+        uses: mxschmitt/action-tmate@v3.18
+        with:
+          detached: true
+          connect-timeout-seconds: 180
+          limit-access-to-actor: true

View File

@@ -33,7 +33,7 @@ RUN curl -L -s https://go.dev/dl/go${GO_VERSION}.linux-${TARGETARCH}.tar.gz | ta
 ENV PATH $PATH:/root/go/bin:/usr/local/go/bin

 # Install grpc compilers
-RUN go install google.golang.org/protobuf/cmd/protoc-gen-go@v1.34.1 && \
+RUN go install google.golang.org/protobuf/cmd/protoc-gen-go@v1.34.2 && \
     go install google.golang.org/grpc/cmd/protoc-gen-go-grpc@1958fcbe2ca8bd93af633f11e97d44e567e945af

 COPY --chmod=644 custom-ca-certs/* /usr/local/share/ca-certificates/
@@ -98,11 +98,27 @@ RUN pip install --user grpcio-tools
 FROM requirements-${IMAGE_TYPE} AS requirements-drivers

 ARG BUILD_TYPE
-ARG CUDA_MAJOR_VERSION=11
-ARG CUDA_MINOR_VERSION=8
+ARG CUDA_MAJOR_VERSION=12
+ARG CUDA_MINOR_VERSION=5
 ENV BUILD_TYPE=${BUILD_TYPE}

+# Vulkan requirements
+RUN <<EOT bash
+    if [ "${BUILD_TYPE}" = "vulkan" ]; then
+        apt-get update && \
+        apt-get install -y --no-install-recommends \
+            software-properties-common pciutils wget gpg-agent && \
+        wget -qO - https://packages.lunarg.com/lunarg-signing-key-pub.asc | apt-key add - && \
+        wget -qO /etc/apt/sources.list.d/lunarg-vulkan-jammy.list https://packages.lunarg.com/vulkan/lunarg-vulkan-jammy.list && \
+        apt-get update && \
+        apt-get install -y \
+            vulkan-sdk && \
+        apt-get clean && \
+        rm -rf /var/lib/apt/lists/*
+    fi
+EOT
+
 # CuBLAS requirements
 RUN <<EOT bash
     if [ "${BUILD_TYPE}" = "cublas" ]; then
@@ -292,7 +308,7 @@ ENV REBUILD=false
 ENV HEALTHCHECK_ENDPOINT=http://localhost:8080/readyz
 ENV MAKEFLAGS=${MAKEFLAGS}

-ARG CUDA_MAJOR_VERSION=11
+ARG CUDA_MAJOR_VERSION=12
 ENV NVIDIA_DRIVER_CAPABILITIES=compute,utility
 ENV NVIDIA_REQUIRE_CUDA="cuda>=${CUDA_MAJOR_VERSION}.0"
 ENV NVIDIA_VISIBLE_DEVICES=all

View File

@@ -5,7 +5,7 @@ BINARY_NAME=local-ai

 # llama.cpp versions
 GOLLAMA_STABLE_VERSION?=2b57a8ae43e4699d3dc5d1496a1ccd42922993be
-CPPLLAMA_VERSION?=172c8256840ffd882ab9992ecedbb587d9b21f15
+CPPLLAMA_VERSION?=e112b610a1a75cb7fa8351e1a933e2e7a755a5ce

 # gpt4all version
 GPT4ALL_REPO?=https://github.com/nomic-ai/gpt4all
@@ -103,6 +103,10 @@ ifeq ($(BUILD_TYPE),cublas)
 	CGO_LDFLAGS_WHISPER+=-L$(CUDA_LIBPATH)/stubs/ -lcuda -lcufft
 endif

+ifeq ($(BUILD_TYPE),vulkan)
+	CMAKE_ARGS+=-DLLAMA_VULKAN=1
+endif
+
 ifeq ($(BUILD_TYPE),hipblas)
 	ROCM_HOME ?= /opt/rocm
 	ROCM_PATH ?= /opt/rocm
@@ -313,6 +317,10 @@ build: prepare backend-assets grpcs ## Build the project
 	$(info ${GREEN}I BUILD_TYPE: ${YELLOW}$(BUILD_TYPE)${RESET})
 	$(info ${GREEN}I GO_TAGS: ${YELLOW}$(GO_TAGS)${RESET})
 	$(info ${GREEN}I LD_FLAGS: ${YELLOW}$(LD_FLAGS)${RESET})
+ifneq ($(BACKEND_LIBS),)
+	$(MAKE) backend-assets/lib
+	cp $(BACKEND_LIBS) backend-assets/lib/
+endif
 	CGO_LDFLAGS="$(CGO_LDFLAGS)" $(GOCMD) build -ldflags "$(LD_FLAGS)" -tags "$(GO_TAGS)" -o $(BINARY_NAME) ./

 build-minimal:
@@ -321,8 +329,11 @@
 build-api:
 	BUILD_GRPC_FOR_BACKEND_LLAMA=true BUILD_API_ONLY=true GO_TAGS=none $(MAKE) build

+backend-assets/lib:
+	mkdir -p backend-assets/lib
+
 dist:
-	STATIC=true $(MAKE) backend-assets/grpc/llama-cpp-avx2
+	$(MAKE) backend-assets/grpc/llama-cpp-avx2
 ifeq ($(OS),Darwin)
 	$(info ${GREEN}I Skip CUDA/hipblas build on MacOS${RESET})
 else
@@ -331,7 +342,7 @@
 	$(MAKE) backend-assets/grpc/llama-cpp-sycl_f16
 	$(MAKE) backend-assets/grpc/llama-cpp-sycl_f32
 endif
-	$(MAKE) build
+	STATIC=true $(MAKE) build
 	mkdir -p release
 # if BUILD_ID is empty, then we don't append it to the binary name
 ifeq ($(BUILD_ID),)
@@ -344,7 +355,7 @@
 dist-cross-linux-arm64:
 	CMAKE_ARGS="$(CMAKE_ARGS) -DLLAMA_NATIVE=off" GRPC_BACKENDS="backend-assets/grpc/llama-cpp-fallback backend-assets/grpc/llama-cpp-grpc backend-assets/util/llama-cpp-rpc-server" \
-	$(MAKE) build
+	STATIC=true $(MAKE) build
 	mkdir -p release
 # if BUILD_ID is empty, then we don't append it to the binary name
 ifeq ($(BUILD_ID),)
@@ -393,7 +404,7 @@ prepare-e2e:
 	mkdir -p $(TEST_DIR)
 	cp -rfv $(abspath ./tests/e2e-fixtures)/gpu.yaml $(TEST_DIR)/gpu.yaml
 	test -e $(TEST_DIR)/ggllm-test-model.bin || wget -q https://huggingface.co/TheBloke/CodeLlama-7B-Instruct-GGUF/resolve/main/codellama-7b-instruct.Q2_K.gguf -O $(TEST_DIR)/ggllm-test-model.bin
-	docker build --build-arg GRPC_BACKENDS="$(GRPC_BACKENDS)" --build-arg IMAGE_TYPE=core --build-arg BUILD_TYPE=$(BUILD_TYPE) --build-arg CUDA_MAJOR_VERSION=11 --build-arg CUDA_MINOR_VERSION=7 --build-arg FFMPEG=true -t localai-tests .
+	docker build --build-arg GRPC_BACKENDS="$(GRPC_BACKENDS)" --build-arg IMAGE_TYPE=core --build-arg BUILD_TYPE=$(BUILD_TYPE) --build-arg CUDA_MAJOR_VERSION=12 --build-arg CUDA_MINOR_VERSION=5 --build-arg FFMPEG=true -t localai-tests .

 run-e2e-image:
 	ls -liah $(abspath ./tests/e2e-fixtures)
@@ -803,6 +814,17 @@ docker:
 		--build-arg BUILD_TYPE=$(BUILD_TYPE) \
 		-t $(DOCKER_IMAGE) .

+docker-cuda11:
+	docker build \
+		--build-arg CUDA_MAJOR_VERSION=11 \
+		--build-arg CUDA_MINOR_VERSION=8 \
+		--build-arg BASE_IMAGE=$(BASE_IMAGE) \
+		--build-arg IMAGE_TYPE=$(IMAGE_TYPE) \
+		--build-arg GO_TAGS="$(GO_TAGS)" \
+		--build-arg MAKEFLAGS="$(DOCKER_MAKEFLAGS)" \
+		--build-arg BUILD_TYPE=$(BUILD_TYPE) \
+		-t $(DOCKER_IMAGE)-cuda11 .
+
 docker-aio:
 	@echo "Building AIO image with base $(BASE_IMAGE) as $(DOCKER_AIO_IMAGE)"
 	docker build \

View File

@@ -48,6 +48,13 @@
 ![screen](https://github.com/mudler/LocalAI/assets/2420543/20b5ccd2-8393-44f0-aaf6-87a23806381e)

+Run the installer script:
+
+```bash
+curl https://localai.io/install.sh | sh
+```
+
+Or run with docker:
 ```bash
 docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu
 # Alternative images:
@@ -65,6 +72,7 @@ docker run -ti --name local-ai -p 8080:8080 localai/localai:latest-aio-cpu
 [Roadmap](https://github.com/mudler/LocalAI/issues?q=is%3Aissue+is%3Aopen+label%3Aroadmap)

+- 🆕 You can browse now the model gallery without LocalAI! Check out https://models.localai.io
 - 🔥🔥 Decentralized llama.cpp: https://github.com/mudler/LocalAI/pull/2343 (peer2peer llama.cpp!) 👉 Docs https://localai.io/features/distribute/
 - 🔥🔥 Openvoice: https://github.com/mudler/LocalAI/pull/2334
 - 🆕 Function calls without grammars and mixed mode: https://github.com/mudler/LocalAI/pull/2328

View File

@@ -230,6 +230,7 @@ message TranscriptRequest {
   string dst = 2;
   string language = 3;
   uint32 threads = 4;
+  bool translate = 5;
 }

 message TranscriptResult {

View File

@@ -5,7 +5,7 @@ package main
 import (
 	"flag"

-	grpc "github.com/go-skynet/LocalAI/pkg/grpc"
+	grpc "github.com/mudler/LocalAI/pkg/grpc"
 )

 var (
View File

@@ -3,9 +3,9 @@ package main
 // This is a wrapper to statisfy the GRPC service interface
 // It is meant to be used by the main executable that is the server for the specific backend type (falcon, gpt3, etc)
 import (
-	"github.com/go-skynet/LocalAI/pkg/grpc/base"
-	pb "github.com/go-skynet/LocalAI/pkg/grpc/proto"
-	"github.com/go-skynet/LocalAI/pkg/stablediffusion"
+	"github.com/mudler/LocalAI/pkg/grpc/base"
+	pb "github.com/mudler/LocalAI/pkg/grpc/proto"
+	"github.com/mudler/LocalAI/pkg/stablediffusion"
 )

 type Image struct {

View File

@@ -5,7 +5,7 @@ package main
 import (
 	"flag"

-	grpc "github.com/go-skynet/LocalAI/pkg/grpc"
+	grpc "github.com/mudler/LocalAI/pkg/grpc"
 )

 var (

View File

@@ -3,9 +3,9 @@ package main
 // This is a wrapper to statisfy the GRPC service interface
 // It is meant to be used by the main executable that is the server for the specific backend type (falcon, gpt3, etc)
 import (
-	"github.com/go-skynet/LocalAI/pkg/grpc/base"
-	pb "github.com/go-skynet/LocalAI/pkg/grpc/proto"
-	"github.com/go-skynet/LocalAI/pkg/tinydream"
+	"github.com/mudler/LocalAI/pkg/grpc/base"
+	pb "github.com/mudler/LocalAI/pkg/grpc/proto"
+	"github.com/mudler/LocalAI/pkg/tinydream"
 )

 type Image struct {

View File

@@ -5,8 +5,8 @@ package main
 import (
 	bert "github.com/go-skynet/go-bert.cpp"

-	"github.com/go-skynet/LocalAI/pkg/grpc/base"
-	pb "github.com/go-skynet/LocalAI/pkg/grpc/proto"
+	"github.com/mudler/LocalAI/pkg/grpc/base"
+	pb "github.com/mudler/LocalAI/pkg/grpc/proto"
 )

 type Embeddings struct {

View File

@@ -5,7 +5,7 @@ package main
 import (
 	"flag"

-	grpc "github.com/go-skynet/LocalAI/pkg/grpc"
+	grpc "github.com/mudler/LocalAI/pkg/grpc"
 )

 var (

View File

@@ -5,8 +5,8 @@ package main
 import (
 	"fmt"

-	"github.com/go-skynet/LocalAI/pkg/grpc/base"
-	pb "github.com/go-skynet/LocalAI/pkg/grpc/proto"
+	"github.com/mudler/LocalAI/pkg/grpc/base"
+	pb "github.com/mudler/LocalAI/pkg/grpc/proto"
 	gpt4all "github.com/nomic-ai/gpt4all/gpt4all-bindings/golang"
 )

View File

@@ -5,7 +5,7 @@ package main
 import (
 	"flag"

-	grpc "github.com/go-skynet/LocalAI/pkg/grpc"
+	grpc "github.com/mudler/LocalAI/pkg/grpc"
 )

 var (

View File

@@ -6,9 +6,9 @@ import (
 	"fmt"
 	"os"

-	"github.com/go-skynet/LocalAI/pkg/grpc/base"
-	pb "github.com/go-skynet/LocalAI/pkg/grpc/proto"
-	"github.com/go-skynet/LocalAI/pkg/langchain"
+	"github.com/mudler/LocalAI/pkg/grpc/base"
+	pb "github.com/mudler/LocalAI/pkg/grpc/proto"
+	"github.com/mudler/LocalAI/pkg/langchain"
 )

 type LLM struct {

View File

@@ -5,7 +5,7 @@ package main
 import (
 	"flag"

-	grpc "github.com/go-skynet/LocalAI/pkg/grpc"
+	grpc "github.com/mudler/LocalAI/pkg/grpc"
 )

 var (

View File

@@ -5,9 +5,9 @@ package main
 import (
 	"fmt"

-	"github.com/go-skynet/LocalAI/pkg/grpc/base"
-	pb "github.com/go-skynet/LocalAI/pkg/grpc/proto"
 	"github.com/go-skynet/go-llama.cpp"
+	"github.com/mudler/LocalAI/pkg/grpc/base"
+	pb "github.com/mudler/LocalAI/pkg/grpc/proto"
 )

 type LLM struct {

View File

@@ -3,7 +3,7 @@ package main
 import (
 	"flag"

-	grpc "github.com/go-skynet/LocalAI/pkg/grpc"
+	grpc "github.com/mudler/LocalAI/pkg/grpc"
 )

 var (

View File

@@ -6,9 +6,9 @@ import (
 	"fmt"
 	"path/filepath"

-	"github.com/go-skynet/LocalAI/pkg/grpc/base"
 	pb "github.com/go-skynet/LocalAI/pkg/grpc/proto"
 	"github.com/go-skynet/go-llama.cpp"
+	"github.com/mudler/LocalAI/pkg/grpc/base"
 )

 type LLM struct {

View File

@@ -7,7 +7,7 @@ package main
 import (
 	"flag"

-	grpc "github.com/go-skynet/LocalAI/pkg/grpc"
+	grpc "github.com/mudler/LocalAI/pkg/grpc"
 )

 var (

View File

@@ -5,7 +5,7 @@ package main
 import (
 	"flag"

-	grpc "github.com/go-skynet/LocalAI/pkg/grpc"
+	grpc "github.com/mudler/LocalAI/pkg/grpc"
 )

 var (

View File

@@ -7,8 +7,8 @@ import (
 	"path/filepath"

 	"github.com/donomii/go-rwkv.cpp"
-	"github.com/go-skynet/LocalAI/pkg/grpc/base"
-	pb "github.com/go-skynet/LocalAI/pkg/grpc/proto"
+	"github.com/mudler/LocalAI/pkg/grpc/base"
+	pb "github.com/mudler/LocalAI/pkg/grpc/proto"
 )

 const tokenizerSuffix = ".tokenizer.json"
@@ -31,7 +31,7 @@ func (llm *LLM) Load(opts *pb.ModelOptions) error {
 	model := rwkv.LoadFiles(opts.ModelFile, tokenizerPath, uint32(opts.GetThreads()))

 	if model == nil {
-		return fmt.Errorf("could not load model")
+		return fmt.Errorf("rwkv could not load model")
 	}
 	llm.rwkv = model
 	return nil

View File

@@ -6,7 +6,7 @@ import (
 	"flag"
 	"os"

-	grpc "github.com/go-skynet/LocalAI/pkg/grpc"
+	grpc "github.com/mudler/LocalAI/pkg/grpc"
 	"github.com/rs/zerolog"
 	"github.com/rs/zerolog/log"
 )

View File

@@ -8,8 +8,8 @@ import (
 	"math"
 	"slices"

-	"github.com/go-skynet/LocalAI/pkg/grpc/base"
-	pb "github.com/go-skynet/LocalAI/pkg/grpc/proto"
+	"github.com/mudler/LocalAI/pkg/grpc/base"
+	pb "github.com/mudler/LocalAI/pkg/grpc/proto"
 	"github.com/rs/zerolog/log"
 )

View File

@@ -5,7 +5,7 @@ package main
 import (
 	"flag"

-	grpc "github.com/go-skynet/LocalAI/pkg/grpc"
+	grpc "github.com/mudler/LocalAI/pkg/grpc"
 )

 var (
var ( var (

View File

@@ -8,7 +8,7 @@ import (
 	"github.com/ggerganov/whisper.cpp/bindings/go/pkg/whisper"
 	"github.com/go-audio/wav"
-	"github.com/go-skynet/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/core/schema"
 )

 func ffmpegCommand(args []string) (string, error) {
@@ -29,7 +29,7 @@ func audioToWav(src, dst string) error {
 	return nil
 }

-func Transcript(model whisper.Model, audiopath, language string, threads uint) (schema.TranscriptionResult, error) {
+func Transcript(model whisper.Model, audiopath, language string, translate bool, threads uint) (schema.TranscriptionResult, error) {
 	res := schema.TranscriptionResult{}

 	dir, err := os.MkdirTemp("", "whisper")
@@ -75,6 +75,10 @@ func Transcript(model whisper.Model, audiopath, language string, threads uint) (
 		context.SetLanguage("auto")
 	}

+	if translate {
+		context.SetTranslate(true)
+	}
+
 	if err := context.Process(data, nil, nil); err != nil {
 		return res, err
 	}

View File

@@ -4,9 +4,9 @@ package main
 // It is meant to be used by the main executable that is the server for the specific backend type (falcon, gpt3, etc)
 import (
 	"github.com/ggerganov/whisper.cpp/bindings/go/pkg/whisper"
-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/pkg/grpc/base"
-	pb "github.com/go-skynet/LocalAI/pkg/grpc/proto"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/pkg/grpc/base"
+	pb "github.com/mudler/LocalAI/pkg/grpc/proto"
 )

 type Whisper struct {
@@ -22,5 +22,5 @@ func (sd *Whisper) Load(opts *pb.ModelOptions) error {
 }

 func (sd *Whisper) AudioTranscription(opts *pb.TranscriptRequest) (schema.TranscriptionResult, error) {
-	return Transcript(sd.whisper, opts.Dst, opts.Language, uint(opts.Threads))
+	return Transcript(sd.whisper, opts.Dst, opts.Language, opts.Translate, uint(opts.Threads))
 }

View File

@@ -5,7 +5,7 @@ package main
 import (
 	"flag"

-	grpc "github.com/go-skynet/LocalAI/pkg/grpc"
+	grpc "github.com/mudler/LocalAI/pkg/grpc"
 )

 var (

View File

@@ -7,8 +7,8 @@ import (
 	"os"
 	"path/filepath"

-	"github.com/go-skynet/LocalAI/pkg/grpc/base"
-	pb "github.com/go-skynet/LocalAI/pkg/grpc/proto"
+	"github.com/mudler/LocalAI/pkg/grpc/base"
+	pb "github.com/mudler/LocalAI/pkg/grpc/proto"
 	piper "github.com/mudler/go-piper"
 )

View File

@@ -17,7 +17,7 @@ import backend_pb2_grpc
 import grpc

-from diffusers import StableDiffusionXLPipeline, StableDiffusionDepth2ImgPipeline, DPMSolverMultistepScheduler, StableDiffusionPipeline, DiffusionPipeline, EulerAncestralDiscreteScheduler
+from diffusers import StableDiffusion3Pipeline, StableDiffusionXLPipeline, StableDiffusionDepth2ImgPipeline, DPMSolverMultistepScheduler, StableDiffusionPipeline, DiffusionPipeline, EulerAncestralDiscreteScheduler
 from diffusers import StableDiffusionImg2ImgPipeline, AutoPipelineForText2Image, ControlNetModel, StableVideoDiffusionPipeline
 from diffusers.pipelines.stable_diffusion import safety_checker
 from diffusers.utils import load_image,export_to_video
@@ -225,6 +225,17 @@ class BackendServicer(backend_pb2_grpc.BackendServicer):
                     torch_dtype=torchType,
                     use_safetensors=True,
                     variant=variant)
+            elif request.PipelineType == "StableDiffusion3Pipeline":
+                if fromSingleFile:
+                    self.pipe = StableDiffusion3Pipeline.from_single_file(modelFile,
+                        torch_dtype=torchType,
+                        use_safetensors=True)
+                else:
+                    self.pipe = StableDiffusion3Pipeline.from_pretrained(
+                        request.Model,
+                        torch_dtype=torchType,
+                        use_safetensors=True,
+                        variant=variant)

             if CLIPSKIP and request.CLIPSkip != 0:
                 self.clip_skip = request.CLIPSkip
View File

@@ -5,6 +5,7 @@ grpcio==1.64.0
 opencv-python
 pillow
 protobuf
+sentencepiece
 torch
 transformers
 certifi

View File

@@ -1,9 +1,9 @@
 package core

 import (
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/services"
-	"github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/services"
+	"github.com/mudler/LocalAI/pkg/model"
 )

 // The purpose of this structure is to hold pointers to all initialized services, to make plumbing easy

View File

@@ -3,10 +3,10 @@ package backend
 import (
 	"fmt"

-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/pkg/grpc"
-	model "github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/pkg/grpc"
+	model "github.com/mudler/LocalAI/pkg/model"
 )

 func ModelEmbedding(s string, tokens []int, loader *model.ModelLoader, backendConfig config.BackendConfig, appConfig *config.ApplicationConfig) (func() ([]float32, error), error) {

View File

@@ -1,10 +1,10 @@
 package backend

 import (
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/pkg/grpc/proto"
-	model "github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/pkg/grpc/proto"
+	model "github.com/mudler/LocalAI/pkg/model"
 )

 func ImageGeneration(height, width, mode, step, seed int, positive_prompt, negative_prompt, src, dst string, loader *model.ModelLoader, backendConfig config.BackendConfig, appConfig *config.ApplicationConfig) (func() error, error) {

View File

@@ -9,14 +9,14 @@ import (
 	"sync"
 	"unicode/utf8"

-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/pkg/gallery"
-	"github.com/go-skynet/LocalAI/pkg/grpc"
-	"github.com/go-skynet/LocalAI/pkg/grpc/proto"
-	model "github.com/go-skynet/LocalAI/pkg/model"
-	"github.com/go-skynet/LocalAI/pkg/utils"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/core/gallery"
+	"github.com/mudler/LocalAI/pkg/grpc"
+	"github.com/mudler/LocalAI/pkg/grpc/proto"
+	model "github.com/mudler/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/pkg/utils"
 )

 type LLMResponse struct {

View File

@@ -5,9 +5,9 @@ import (
 	"os"
 	"path/filepath"

-	"github.com/go-skynet/LocalAI/core/config"
-	pb "github.com/go-skynet/LocalAI/pkg/grpc/proto"
-	"github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/config"
+	pb "github.com/mudler/LocalAI/pkg/grpc/proto"
+	"github.com/mudler/LocalAI/pkg/model"
 	"github.com/rs/zerolog/log"
 )

View File

@@ -4,9 +4,9 @@ import (
 	"context"
 	"fmt"

-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/pkg/grpc/proto"
-	model "github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/pkg/grpc/proto"
+	model "github.com/mudler/LocalAI/pkg/model"
 )

 func Rerank(backend, modelFile string, request *proto.RerankRequest, loader *model.ModelLoader, appConfig *config.ApplicationConfig, backendConfig config.BackendConfig) (*proto.RerankResult, error) {

View File

@@ -1,10 +1,10 @@
package backend package backend
import ( import (
"github.com/go-skynet/LocalAI/core/config" "github.com/mudler/LocalAI/core/config"
"github.com/go-skynet/LocalAI/pkg/grpc" "github.com/mudler/LocalAI/pkg/grpc"
"github.com/go-skynet/LocalAI/pkg/model" "github.com/mudler/LocalAI/pkg/model"
) )
func StoreBackend(sl *model.ModelLoader, appConfig *config.ApplicationConfig, storeName string) (grpc.Backend, error) { func StoreBackend(sl *model.ModelLoader, appConfig *config.ApplicationConfig, storeName string) (grpc.Backend, error) {

View File

@@ -4,14 +4,14 @@ import (
 	"context"
 	"fmt"

-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/pkg/grpc/proto"
-	model "github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/pkg/grpc/proto"
+	model "github.com/mudler/LocalAI/pkg/model"
 )

-func ModelTranscription(audio, language string, ml *model.ModelLoader, backendConfig config.BackendConfig, appConfig *config.ApplicationConfig) (*schema.TranscriptionResult, error) {
+func ModelTranscription(audio, language string, translate bool, ml *model.ModelLoader, backendConfig config.BackendConfig, appConfig *config.ApplicationConfig) (*schema.TranscriptionResult, error) {

 	opts := modelOpts(backendConfig, appConfig, []model.Option{
 		model.WithBackendString(model.WhisperBackend),
@@ -31,8 +31,9 @@ func ModelTranscription(audio, language string, ml *model.ModelLoader, backendCo
 	}

 	return whisperModel.AudioTranscription(context.Background(), &proto.TranscriptRequest{
-		Dst:      audio,
-		Language: language,
-		Threads:  uint32(*backendConfig.Threads),
+		Dst:       audio,
+		Language:  language,
+		Translate: translate,
+		Threads:   uint32(*backendConfig.Threads),
 	})
 }

View File

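The new `translate` flag is threaded from the API surface straight into the gRPC message sent to the whisper backend. A minimal sketch of the resulting request, with illustrative values only (the fields mirror the hunk above):

```go
// Sketch: the TranscriptRequest the backend now builds when translation is on.
req := &proto.TranscriptRequest{
	Dst:       "/tmp/audio.wav", // path of the uploaded audio file
	Language:  "it",             // source language of the audio
	Translate: true,             // ask whisper to translate the output to English
	Threads:   4,
}
```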
@@ -6,11 +6,11 @@ import (
	"os"
	"path/filepath"

-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/pkg/grpc/proto"
-	model "github.com/go-skynet/LocalAI/pkg/model"
-	"github.com/go-skynet/LocalAI/pkg/utils"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/pkg/grpc/proto"
+	model "github.com/mudler/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/pkg/utils"
)

func generateUniqueFileName(dir, baseName, ext string) string {

@@ -1,8 +1,8 @@
package cli

import (
-	cliContext "github.com/go-skynet/LocalAI/core/cli/context"
-	"github.com/go-skynet/LocalAI/core/cli/worker"
+	cliContext "github.com/mudler/LocalAI/core/cli/context"
+	"github.com/mudler/LocalAI/core/cli/worker"
)

var CLI struct {

@@ -4,10 +4,12 @@ import (
	"encoding/json"
	"fmt"

-	cliContext "github.com/go-skynet/LocalAI/core/cli/context"
-	"github.com/go-skynet/LocalAI/pkg/gallery"
-	"github.com/go-skynet/LocalAI/pkg/startup"
+	cliContext "github.com/mudler/LocalAI/core/cli/context"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/gallery"
+	"github.com/mudler/LocalAI/pkg/downloader"
+	"github.com/mudler/LocalAI/pkg/startup"
	"github.com/rs/zerolog/log"
	"github.com/schollz/progressbar/v3"
)
@@ -33,7 +35,7 @@ type ModelsCMD struct {
}

func (ml *ModelsList) Run(ctx *cliContext.Context) error {
-	var galleries []gallery.Gallery
+	var galleries []config.Gallery
	if err := json.Unmarshal([]byte(ml.Galleries), &galleries); err != nil {
		log.Error().Err(err).Msg("unable to load galleries")
	}
@@ -53,10 +55,11 @@ func (ml *ModelsList) Run(ctx *cliContext.Context) error {
}

func (mi *ModelsInstall) Run(ctx *cliContext.Context) error {
-	var galleries []gallery.Gallery
+	var galleries []config.Gallery
	if err := json.Unmarshal([]byte(mi.Galleries), &galleries); err != nil {
		log.Error().Err(err).Msg("unable to load galleries")
	}

	for _, modelName := range mi.ModelArgs {
		progressBar := progressbar.NewOptions(
@@ -78,13 +81,15 @@ func (mi *ModelsInstall) Run(ctx *cliContext.Context) error {
			return err
		}

-		model := gallery.FindModel(models, modelName, mi.ModelsPath)
-		if model == nil {
-			log.Error().Str("model", modelName).Msg("model not found")
-			return err
-		}
-		log.Info().Str("model", modelName).Str("license", model.License).Msg("installing model")
+		if !downloader.LooksLikeOCI(modelName) {
+			model := gallery.FindModel(models, modelName, mi.ModelsPath)
+			if model == nil {
+				log.Error().Str("model", modelName).Msg("model not found")
+				return err
+			}
+			log.Info().Str("model", modelName).Str("license", model.License).Msg("installing model")
+		}

		err = startup.InstallModels(galleries, "", mi.ModelsPath, progressCallback, modelName)
		if err != nil {
			return err

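With this change, `models install` only consults the gallery index for non-OCI references; anything that looks like a container-registry artifact is handed to the installer directly. The exact heuristic lives in `pkg/downloader`; the following is a hypothetical sketch of such a predicate, with scheme prefixes that are assumptions and not taken from this diff:

```go
package main

import (
	"fmt"
	"strings"
)

// Hypothetical sketch of an OCI-style reference check; the real
// downloader.LooksLikeOCI may accept different schemes.
func looksLikeOCI(ref string) bool {
	for _, prefix := range []string{"oci://", "ollama://"} {
		if strings.HasPrefix(ref, prefix) {
			return true
		}
	}
	return false
}

func main() {
	fmt.Println(looksLikeOCI("oci://registry.example.com/models/llama:latest")) // true: skip gallery lookup
	fmt.Println(looksLikeOCI("hermes-2-pro-mistral"))                           // false: resolve via gallery
}
```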
@@ -6,11 +6,11 @@ import (
	"strings"
	"time"

-	cliContext "github.com/go-skynet/LocalAI/core/cli/context"
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/http"
-	"github.com/go-skynet/LocalAI/core/p2p"
-	"github.com/go-skynet/LocalAI/core/startup"
+	cliContext "github.com/mudler/LocalAI/core/cli/context"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/http"
+	"github.com/mudler/LocalAI/core/p2p"
+	"github.com/mudler/LocalAI/core/startup"
	"github.com/rs/zerolog"
	"github.com/rs/zerolog/log"
)
@@ -43,6 +43,7 @@ type RunCMD struct {
	Address          string   `env:"LOCALAI_ADDRESS,ADDRESS" default:":8080" help:"Bind address for the API server" group:"api"`
	CORS             bool     `env:"LOCALAI_CORS,CORS" help:"" group:"api"`
	CORSAllowOrigins string   `env:"LOCALAI_CORS_ALLOW_ORIGINS,CORS_ALLOW_ORIGINS" group:"api"`
+	LibraryPath      string   `env:"LOCALAI_LIBRARY_PATH,LIBRARY_PATH" help:"Path to the library directory (for e.g. external libraries used by backends)" default:"/usr/share/local-ai/libs" group:"backends"`
	CSRF             bool     `env:"LOCALAI_CSRF" help:"Enables fiber CSRF middleware" group:"api"`
	UploadLimit      int      `env:"LOCALAI_UPLOAD_LIMIT,UPLOAD_LIMIT" default:"15" help:"Default upload-limit in MB" group:"api"`
	APIKeys          []string `env:"LOCALAI_API_KEY,API_KEY" help:"List of API Keys to enable API authentication. When this is set, all the requests must be authenticated with one of these API keys" group:"api"`
@@ -80,6 +81,7 @@ func (r *RunCMD) Run(ctx *cliContext.Context) error {
		config.WithCors(r.CORS),
		config.WithCorsAllowOrigins(r.CORSAllowOrigins),
		config.WithCsrf(r.CSRF),
+		config.WithLibPath(r.LibraryPath),
		config.WithThreads(r.Threads),
		config.WithBackendAssets(ctx.BackendAssets),
		config.WithBackendAssetsOutput(r.BackendAssetsPath),

@@ -5,10 +5,10 @@ import (
	"errors"
	"fmt"

-	"github.com/go-skynet/LocalAI/core/backend"
-	cliContext "github.com/go-skynet/LocalAI/core/cli/context"
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/backend"
+	cliContext "github.com/mudler/LocalAI/core/cli/context"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/pkg/model"
	"github.com/rs/zerolog/log"
)
@@ -18,6 +18,7 @@ type TranscriptCMD struct {
	Backend           string `short:"b" default:"whisper" help:"Backend to run the transcription model"`
	Model             string `short:"m" required:"" help:"Model name to run the TTS"`
	Language          string `short:"l" help:"Language of the audio file"`
+	Translate         bool   `short:"t" help:"Translate the transcription to english"`
	Threads           int    `short:"t" default:"1" help:"Number of threads used for parallel computation"`
	ModelsPath        string `env:"LOCALAI_MODELS_PATH,MODELS_PATH" type:"path" default:"${basepath}/models" help:"Path containing models used for inferencing" group:"storage"`
	BackendAssetsPath string `env:"LOCALAI_BACKEND_ASSETS_PATH,BACKEND_ASSETS_PATH" type:"path" default:"/tmp/localai/backend_data" help:"Path used to extract libraries that are required by some of the backends in runtime" group:"storage"`
@@ -50,7 +51,7 @@ func (t *TranscriptCMD) Run(ctx *cliContext.Context) error {
		}
	}()

-	tr, err := backend.ModelTranscription(t.Filename, t.Language, ml, c, opts)
+	tr, err := backend.ModelTranscription(t.Filename, t.Language, t.Translate, ml, c, opts)
	if err != nil {
		return err
	}

@@ -7,10 +7,10 @@ import (
	"path/filepath"
	"strings"

-	"github.com/go-skynet/LocalAI/core/backend"
-	cliContext "github.com/go-skynet/LocalAI/core/cli/context"
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/backend"
+	cliContext "github.com/mudler/LocalAI/core/cli/context"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/pkg/model"
	"github.com/rs/zerolog/log"
)

@@ -5,7 +5,7 @@ import (
	"github.com/rs/zerolog/log"

-	cliContext "github.com/go-skynet/LocalAI/core/cli/context"
+	cliContext "github.com/mudler/LocalAI/core/cli/context"
	gguf "github.com/thxcode/gguf-parser-go"
)

@@ -5,8 +5,9 @@ import (
	"os"
	"syscall"

-	cliContext "github.com/go-skynet/LocalAI/core/cli/context"
-	"github.com/go-skynet/LocalAI/pkg/assets"
+	cliContext "github.com/mudler/LocalAI/core/cli/context"
+	"github.com/mudler/LocalAI/pkg/assets"
+	"github.com/mudler/LocalAI/pkg/library"
	"github.com/rs/zerolog/log"
)
@@ -27,17 +28,18 @@ func (r *LLamaCPP) Run(ctx *cliContext.Context) error {
		return fmt.Errorf("usage: local-ai worker llama-cpp-rpc -- <llama-rpc-server-args>")
	}

+	grpcProcess := assets.ResolvePath(
+		r.BackendAssetsPath,
+		"util",
+		"llama-cpp-rpc-server",
+	)
+
+	args := os.Args[4:]
+	args, grpcProcess = library.LoadLDSO(r.BackendAssetsPath, args, grpcProcess)
+	args = append([]string{grpcProcess}, args...)
+
	return syscall.Exec(
-		assets.ResolvePath(
-			r.BackendAssetsPath,
-			"util",
-			"llama-cpp-rpc-server",
-		),
-		append([]string{
-			assets.ResolvePath(
-				r.BackendAssetsPath,
-				"util",
-				"llama-cpp-rpc-server",
-			)}, os.Args[4:]...),
+		grpcProcess,
+		args,
		os.Environ())
}

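The refactor makes the `syscall.Exec` call easier to follow: the server path is resolved once, `library.LoadLDSO` gets a chance to rewrite the argument list (and possibly the binary to execute), presumably so the dynamic loader can find bundled shared libraries, and the process path is prepended because `Exec` expects `argv[0]` to be the program itself. A reduced, self-contained illustration of that argv convention, with a placeholder binary:

```go
package main

import (
	"log"
	"os"
	"syscall"
)

func main() {
	// syscall.Exec replaces the current process image; by convention argv[0]
	// is the program path itself, which is why the refactor prepends
	// grpcProcess to the argument slice before calling Exec.
	bin := "/bin/echo"
	argv := append([]string{bin}, "hello", "from", "exec")
	if err := syscall.Exec(bin, argv, os.Environ()); err != nil {
		log.Fatal(err)
	}
}
```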
@@ -6,7 +6,7 @@ package worker
import (
	"fmt"

-	cliContext "github.com/go-skynet/LocalAI/core/cli/context"
+	cliContext "github.com/mudler/LocalAI/core/cli/context"
)

type P2P struct{}

@@ -10,9 +10,10 @@ import (
	"os/exec"
	"time"

-	cliContext "github.com/go-skynet/LocalAI/core/cli/context"
-	"github.com/go-skynet/LocalAI/core/p2p"
-	"github.com/go-skynet/LocalAI/pkg/assets"
+	cliContext "github.com/mudler/LocalAI/core/cli/context"
+	"github.com/mudler/LocalAI/core/p2p"
+	"github.com/mudler/LocalAI/pkg/assets"
+	"github.com/mudler/LocalAI/pkg/library"
	"github.com/phayes/freeport"
	"github.com/rs/zerolog/log"
)
@@ -71,13 +72,18 @@ func (r *P2P) Run(ctx *cliContext.Context) error {
	go func() {
		for {
			log.Info().Msgf("Starting llama-cpp-rpc-server on '%s:%d'", address, port)

+			grpcProcess := assets.ResolvePath(
+				r.BackendAssetsPath,
+				"util",
+				"llama-cpp-rpc-server",
+			)
+
+			args := append([]string{"--host", address, "--port", fmt.Sprint(port)}, r.ExtraLLamaCPPArgs...)
+			args, grpcProcess = library.LoadLDSO(r.BackendAssetsPath, args, grpcProcess)
+
			cmd := exec.Command(
-				assets.ResolvePath(
-					r.BackendAssetsPath,
-					"util",
-					"llama-cpp-rpc-server",
-				),
-				append([]string{"--host", address, "--port", fmt.Sprint(port)}, r.ExtraLLamaCPPArgs...)...,
+				grpcProcess, args...,
			)

			cmd.Env = os.Environ()
@@ -86,7 +92,7 @@ func (r *P2P) Run(ctx *cliContext.Context) error {
			cmd.Stdout = os.Stdout

			if err := cmd.Start(); err != nil {
-				log.Error().Err(err).Msg("Failed to start llama-cpp-rpc-server")
+				log.Error().Any("grpcProcess", grpcProcess).Any("args", args).Err(err).Msg("Failed to start llama-cpp-rpc-server")
			}

			cmd.Wait()

@@ -6,8 +6,7 @@ import (
	"encoding/json"
	"time"

-	"github.com/go-skynet/LocalAI/pkg/gallery"
-	"github.com/go-skynet/LocalAI/pkg/xsysinfo"
+	"github.com/mudler/LocalAI/pkg/xsysinfo"
	"github.com/rs/zerolog/log"
)
@@ -15,6 +14,7 @@ type ApplicationConfig struct {
	Context    context.Context
	ConfigFile string
	ModelPath  string
+	LibPath    string
	UploadLimitMB, Threads, ContextSize int
	DisableWebUI bool
	F16          bool
@@ -35,7 +35,7 @@ type ApplicationConfig struct {
	ModelLibraryURL string

-	Galleries []gallery.Gallery
+	Galleries []Gallery

	BackendAssets     embed.FS
	AssetsDestination string
@@ -101,6 +101,12 @@ func WithModelLibraryURL(url string) AppOption {
	}
}

+func WithLibPath(path string) AppOption {
+	return func(o *ApplicationConfig) {
+		o.LibPath = path
+	}
+}
+
var EnableWatchDog = func(o *ApplicationConfig) {
	o.WatchDog = true
}
@@ -173,10 +179,10 @@ func WithBackendAssets(f embed.FS) AppOption {
func WithStringGalleries(galls string) AppOption {
	return func(o *ApplicationConfig) {
		if galls == "" {
-			o.Galleries = []gallery.Gallery{}
+			o.Galleries = []Gallery{}
			return
		}
-		var galleries []gallery.Gallery
+		var galleries []Gallery
		if err := json.Unmarshal([]byte(galls), &galleries); err != nil {
			log.Error().Err(err).Msg("failed loading galleries")
		}
@@ -184,7 +190,7 @@ func WithStringGalleries(galls string) AppOption {
	}
}

-func WithGalleries(galleries []gallery.Gallery) AppOption {
+func WithGalleries(galleries []Gallery) AppOption {
	return func(o *ApplicationConfig) {
		o.Galleries = append(o.Galleries, galleries...)
	}

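`WithLibPath` follows the existing functional-options pattern: every `AppOption` is a closure that mutates the `ApplicationConfig` being assembled. A hedged sketch of how the new option composes with the gallery options from this same diff; the `NewApplicationConfig` constructor name is illustrative and stands in for whatever function applies the options:

```go
// Sketch: composing AppOptions when building the application config.
appConfig := config.NewApplicationConfig(
	config.WithLibPath("/usr/share/local-ai/libs"),
	config.WithGalleries([]config.Gallery{
		{Name: "model-gallery", URL: "https://raw.githubusercontent.com/go-skynet/model-gallery/main/index.yaml"},
	}),
)
_ = appConfig
```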
@@ -5,10 +5,10 @@ import (
	"regexp"
	"strings"

-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/pkg/downloader"
-	"github.com/go-skynet/LocalAI/pkg/functions"
-	"github.com/go-skynet/LocalAI/pkg/utils"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/pkg/downloader"
+	"github.com/mudler/LocalAI/pkg/functions"
+	"github.com/mudler/LocalAI/pkg/utils"
)

const (
@@ -390,10 +390,6 @@ func (c *BackendConfig) Validate() bool {
		}
	}

-	if c.Name == "" {
-		return false
-	}
-
	if c.Backend != "" {
		// a regex that checks that is a string name with no special characters, except '-' and '_'
		re := regexp.MustCompile(`^[a-zA-Z0-9-_]+$`)

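Dropping the empty-name check relaxes `Validate()`, while the backend-name regex still rejects anything outside letters, digits, `-` and `_`, notably path traversal attempts, which is what the updated test below exercises with `../foo-bar`. The check in isolation:

```go
package main

import (
	"fmt"
	"regexp"
)

func main() {
	// Same pattern as in BackendConfig.Validate above.
	re := regexp.MustCompile(`^[a-zA-Z0-9-_]+$`)
	fmt.Println(re.MatchString("llama-cpp"))  // true: accepted backend name
	fmt.Println(re.MatchString("../foo-bar")) // false: rejected, no path traversal
}
```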
@@ -11,9 +11,9 @@ import (
	"sync"

	"github.com/charmbracelet/glamour"
-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/pkg/downloader"
-	"github.com/go-skynet/LocalAI/pkg/utils"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/pkg/downloader"
+	"github.com/mudler/LocalAI/pkg/utils"
	"github.com/rs/zerolog/log"
	"gopkg.in/yaml.v3"
)

@@ -16,7 +16,8 @@ var _ = Describe("Test cases for config related functions", func() {
			Expect(err).To(BeNil())
			defer os.Remove(tmp.Name())
			_, err = tmp.WriteString(
-				`backend: "foo-bar"
+				`backend: "../foo-bar"
+name: "foo"
parameters:
  model: "foo-bar"`)
			Expect(err).ToNot(HaveOccurred())

core/config/gallery.go (new file, +6 lines)

@@ -0,0 +1,6 @@
+package config
+
+type Gallery struct {
+	URL  string `json:"url" yaml:"url"`
+	Name string `json:"name" yaml:"name"`
+}

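Moving `Gallery` into `core/config` removes the dependency that previously forced `ApplicationConfig` to import `pkg/gallery`. The struct is plain data with JSON and YAML tags, so the CLI's galleries flag can still be fed a JSON array; a small self-contained illustration of the same unmarshalling the CLI performs:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Mirror of the new core/config Gallery struct.
type Gallery struct {
	URL  string `json:"url" yaml:"url"`
	Name string `json:"name" yaml:"name"`
}

func main() {
	// The same shape the CLI unmarshals from its galleries flag/env var.
	raw := `[{"name":"model-gallery","url":"https://raw.githubusercontent.com/go-skynet/model-gallery/main/index.yaml"}]`
	var galleries []Gallery
	if err := json.Unmarshal([]byte(raw), &galleries); err != nil {
		panic(err)
	}
	fmt.Printf("%+v\n", galleries[0])
}
```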
@@ -5,8 +5,8 @@ import (
	"os"
	"path/filepath"

-	"github.com/go-skynet/LocalAI/pkg/downloader"
-	"github.com/go-skynet/LocalAI/pkg/utils"
+	"github.com/mudler/LocalAI/pkg/downloader"
+	"github.com/mudler/LocalAI/pkg/utils"
	"gopkg.in/yaml.v3"
)

@@ -7,19 +7,15 @@ import (
	"path/filepath"
	"strings"

-	"github.com/go-skynet/LocalAI/pkg/downloader"
	"github.com/imdario/mergo"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/pkg/downloader"
	"github.com/rs/zerolog/log"
	"gopkg.in/yaml.v2"
)

-type Gallery struct {
-	URL  string `json:"url" yaml:"url"`
-	Name string `json:"name" yaml:"name"`
-}
-
// Installs a model from the gallery
-func InstallModelFromGallery(galleries []Gallery, name string, basePath string, req GalleryModel, downloadStatus func(string, string, string, float64)) error {
+func InstallModelFromGallery(galleries []config.Gallery, name string, basePath string, req GalleryModel, downloadStatus func(string, string, string, float64)) error {

	applyModel := func(model *GalleryModel) error {
		name = strings.ReplaceAll(name, string(os.PathSeparator), "__")
@@ -117,7 +113,7 @@ func FindModel(models []*GalleryModel, name string, basePath string) *GalleryMod
// List available models
// Models galleries are a list of yaml files that are hosted on a remote server (for example github).
// Each yaml file contains a list of models that can be downloaded and optionally overrides to define a new model setting.
-func AvailableGalleryModels(galleries []Gallery, basePath string) ([]*GalleryModel, error) {
+func AvailableGalleryModels(galleries []config.Gallery, basePath string) ([]*GalleryModel, error) {
	var models []*GalleryModel

	// Get models from galleries
@@ -134,7 +130,7 @@ func AvailableGalleryModels(galleries []Gallery, basePath string) ([]*GalleryMod
func findGalleryURLFromReferenceURL(url string, basePath string) (string, error) {
	var refFile string
-	err := downloader.GetURI(url, basePath, func(url string, d []byte) error {
+	err := downloader.DownloadAndUnmarshal(url, basePath, func(url string, d []byte) error {
		refFile = string(d)
		if len(refFile) == 0 {
			return fmt.Errorf("invalid reference file at url %s: %s", url, d)
@@ -146,7 +142,7 @@ func findGalleryURLFromReferenceURL(url string, basePath string) (string, error)
	return refFile, err
}

-func getGalleryModels(gallery Gallery, basePath string) ([]*GalleryModel, error) {
+func getGalleryModels(gallery config.Gallery, basePath string) ([]*GalleryModel, error) {
	var models []*GalleryModel = []*GalleryModel{}

	if strings.HasSuffix(gallery.URL, ".ref") {
@@ -157,7 +153,7 @@ func getGalleryModels(gallery config.Gallery, basePath string) ([]*GalleryModel)
		}
	}

-	err := downloader.GetURI(gallery.URL, basePath, func(url string, d []byte) error {
+	err := downloader.DownloadAndUnmarshal(gallery.URL, basePath, func(url string, d []byte) error {
		return yaml.Unmarshal(d, &models)
	})
	if err != nil {

@@ -5,9 +5,11 @@ import (
	"os"
	"path/filepath"

-	"github.com/go-skynet/LocalAI/pkg/downloader"
-	"github.com/go-skynet/LocalAI/pkg/utils"
	"github.com/imdario/mergo"
+	lconfig "github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/pkg/downloader"
+	"github.com/mudler/LocalAI/pkg/utils"
	"github.com/rs/zerolog/log"
	"gopkg.in/yaml.v2"
)
@@ -65,7 +67,7 @@ type PromptTemplate struct {
func GetGalleryConfigFromURL(url string, basePath string) (Config, error) {
	var config Config
-	err := downloader.GetURI(url, basePath, func(url string, d []byte) error {
+	err := downloader.DownloadAndUnmarshal(url, basePath, func(url string, d []byte) error {
		return yaml.Unmarshal(d, &config)
	})
	if err != nil {
@@ -172,6 +174,15 @@ func InstallModel(basePath, nameOverride string, config *Config, configOverrides
		return fmt.Errorf("failed to marshal updated config YAML: %v", err)
	}

+	backendConfig := lconfig.BackendConfig{}
+	err = yaml.Unmarshal(updatedConfigYAML, &backendConfig)
+	if err != nil {
+		return fmt.Errorf("failed to unmarshal updated config YAML: %v", err)
+	}
+	if !backendConfig.Validate() {
+		return fmt.Errorf("failed to validate updated config YAML")
+	}
+
	err = os.WriteFile(configFilePath, updatedConfigYAML, 0600)
	if err != nil {
		return fmt.Errorf("failed to write updated config file: %v", err)

@@ -5,7 +5,8 @@ import (
	"os"
	"path/filepath"

-	. "github.com/go-skynet/LocalAI/pkg/gallery"
+	"github.com/mudler/LocalAI/core/config"
+	. "github.com/mudler/LocalAI/core/gallery"
	. "github.com/onsi/ginkgo/v2"
	. "github.com/onsi/gomega"
	"gopkg.in/yaml.v3"
@@ -54,7 +55,7 @@ var _ = Describe("Model test", func() {
		err = os.WriteFile(galleryFilePath, out, 0600)
		Expect(err).ToNot(HaveOccurred())
		Expect(filepath.IsAbs(galleryFilePath)).To(BeTrue(), galleryFilePath)
-		galleries := []Gallery{
+		galleries := []config.Gallery{
			{
				Name: "test",
				URL:  "file://" + galleryFilePath,

@@ -1,5 +1,7 @@
package gallery

+import "github.com/mudler/LocalAI/core/config"
+
type GalleryOp struct {
	Id               string
	GalleryModelName string
@@ -7,7 +9,7 @@ type GalleryOp struct {
	Delete bool

	Req       GalleryModel
-	Galleries []Gallery
+	Galleries []config.Gallery
}

type GalleryOpStatus struct {

@@ -3,6 +3,8 @@ package gallery
import (
	"fmt"
	"strings"
+
+	"github.com/mudler/LocalAI/core/config"
)

// GalleryModel is the struct used to represent a model in the gallery returned by the endpoint.
@@ -23,7 +25,7 @@ type GalleryModel struct {
	// AdditionalFiles are used to add additional files to the model
	AdditionalFiles []File `json:"files,omitempty" yaml:"files,omitempty"`
	// Gallery is a reference to the gallery which contains the model
-	Gallery Gallery `json:"gallery,omitempty" yaml:"gallery,omitempty"`
+	Gallery config.Gallery `json:"gallery,omitempty" yaml:"gallery,omitempty"`
	// Installed is used to indicate if the model is installed or not
	Installed bool `json:"installed,omitempty" yaml:"installed,omitempty"`
}

@@ -1,7 +1,7 @@
package gallery_test

import (
-	. "github.com/go-skynet/LocalAI/pkg/gallery"
+	. "github.com/mudler/LocalAI/core/gallery"
	. "github.com/onsi/ginkgo/v2"
	. "github.com/onsi/gomega"
)

@@ -6,16 +6,16 @@ import (
	"net/http"
	"strings"

-	"github.com/go-skynet/LocalAI/pkg/utils"
-	"github.com/go-skynet/LocalAI/core/http/endpoints/localai"
-	"github.com/go-skynet/LocalAI/core/http/endpoints/openai"
-	"github.com/go-skynet/LocalAI/core/http/routes"
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/core/services"
-	"github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/pkg/utils"
+	"github.com/mudler/LocalAI/core/http/endpoints/localai"
+	"github.com/mudler/LocalAI/core/http/endpoints/openai"
+	"github.com/mudler/LocalAI/core/http/routes"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/core/services"
+	"github.com/mudler/LocalAI/pkg/model"
	"github.com/gofiber/contrib/fiberzerolog"
	"github.com/gofiber/fiber/v2"

@@ -13,15 +13,15 @@ import (
	"path/filepath"
	"runtime"

-	"github.com/go-skynet/LocalAI/core/config"
-	. "github.com/go-skynet/LocalAI/core/http"
-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/core/startup"
-	"github.com/go-skynet/LocalAI/pkg/downloader"
-	"github.com/go-skynet/LocalAI/pkg/gallery"
-	"github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/config"
+	. "github.com/mudler/LocalAI/core/http"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/core/startup"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/core/gallery"
+	"github.com/mudler/LocalAI/pkg/downloader"
+	"github.com/mudler/LocalAI/pkg/model"
	. "github.com/onsi/ginkgo/v2"
	. "github.com/onsi/gomega"
	"gopkg.in/yaml.v3"
@@ -74,7 +74,7 @@ func getModelStatus(url string) (response map[string]interface{}) {
func getModels(url string) (response []gallery.GalleryModel) {
	// TODO: No tests currently seem to exercise file:// urls. Fix?
-	downloader.GetURI(url, "", func(url string, i []byte) error {
+	downloader.DownloadAndUnmarshal(url, "", func(url string, i []byte) error {
		// Unmarshal YAML data into a struct
		return json.Unmarshal(i, &response)
	})
@@ -247,7 +247,7 @@ var _ = Describe("API test", func() {
			err = os.WriteFile(filepath.Join(modelDir, "gallery_simple.yaml"), out, 0600)
			Expect(err).ToNot(HaveOccurred())

-			galleries := []gallery.Gallery{
+			galleries := []config.Gallery{
				{
					Name: "test",
					URL:  "file://" + filepath.Join(modelDir, "gallery_simple.yaml"),
@@ -603,7 +603,7 @@ var _ = Describe("API test", func() {
			c, cancel = context.WithCancel(context.Background())

-			galleries := []gallery.Gallery{
+			galleries := []config.Gallery{
				{
					Name: "model-gallery",
					URL:  "https://raw.githubusercontent.com/go-skynet/model-gallery/main/index.yaml",

@@ -4,8 +4,8 @@ import (
	"fmt"
	"strings"

-	"github.com/go-skynet/LocalAI/pkg/model"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/pkg/model"
	"github.com/rs/zerolog/log"
)

@@ -6,9 +6,9 @@ import (
	"github.com/chasefleming/elem-go"
	"github.com/chasefleming/elem-go/attrs"
-	"github.com/go-skynet/LocalAI/core/services"
-	"github.com/go-skynet/LocalAI/pkg/gallery"
-	"github.com/go-skynet/LocalAI/pkg/xsync"
+	"github.com/mudler/LocalAI/core/gallery"
+	"github.com/mudler/LocalAI/core/services"
+	"github.com/mudler/LocalAI/pkg/xsync"
)

const (

@@ -1,13 +1,13 @@
package elevenlabs

import (
-	"github.com/go-skynet/LocalAI/core/backend"
-	"github.com/go-skynet/LocalAI/core/config"
-	fiberContext "github.com/go-skynet/LocalAI/core/http/ctx"
-	"github.com/go-skynet/LocalAI/pkg/model"
-	"github.com/go-skynet/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/core/backend"
+	"github.com/mudler/LocalAI/core/config"
+	fiberContext "github.com/mudler/LocalAI/core/http/ctx"
+	"github.com/mudler/LocalAI/pkg/model"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/core/schema"
	"github.com/rs/zerolog/log"
)

@@ -1,14 +1,14 @@
package jina

import (
-	"github.com/go-skynet/LocalAI/core/backend"
-	"github.com/go-skynet/LocalAI/core/config"
-	fiberContext "github.com/go-skynet/LocalAI/core/http/ctx"
-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/pkg/grpc/proto"
-	"github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/backend"
+	"github.com/mudler/LocalAI/core/config"
	"github.com/gofiber/fiber/v2"
+	fiberContext "github.com/mudler/LocalAI/core/http/ctx"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/pkg/grpc/proto"
+	"github.com/mudler/LocalAI/pkg/model"
	"github.com/rs/zerolog/log"
)

@@ -1,9 +1,9 @@
package localai

import (
-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/core/services"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/core/services"
)

func BackendMonitorEndpoint(bm *services.BackendMonitorService) func(c *fiber.Ctx) error {

@@ -5,15 +5,16 @@ import (
	"fmt"
	"slices"

-	"github.com/go-skynet/LocalAI/core/services"
-	"github.com/go-skynet/LocalAI/pkg/gallery"
	"github.com/gofiber/fiber/v2"
	"github.com/google/uuid"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/gallery"
+	"github.com/mudler/LocalAI/core/services"
	"github.com/rs/zerolog/log"
)

type ModelGalleryEndpointService struct {
-	galleries      []gallery.Gallery
+	galleries      []config.Gallery
	modelPath      string
	galleryApplier *services.GalleryService
}
@@ -24,7 +25,7 @@ type GalleryModel struct {
	gallery.GalleryModel
}

-func CreateModelGalleryEndpointService(galleries []gallery.Gallery, modelPath string, galleryApplier *services.GalleryService) ModelGalleryEndpointService {
+func CreateModelGalleryEndpointService(galleries []config.Gallery, modelPath string, galleryApplier *services.GalleryService) ModelGalleryEndpointService {
	return ModelGalleryEndpointService{
		galleries:      galleries,
		modelPath:      modelPath,
@@ -129,12 +130,12 @@ func (mgs *ModelGalleryEndpointService) ListModelGalleriesEndpoint() func(c *fib
func (mgs *ModelGalleryEndpointService) AddModelGalleryEndpoint() func(c *fiber.Ctx) error {
	return func(c *fiber.Ctx) error {
-		input := new(gallery.Gallery)
+		input := new(config.Gallery)
		// Get input data from the request body
		if err := c.BodyParser(input); err != nil {
			return err
		}

-		if slices.ContainsFunc(mgs.galleries, func(gallery gallery.Gallery) bool {
+		if slices.ContainsFunc(mgs.galleries, func(gallery config.Gallery) bool {
			return gallery.Name == input.Name
		}) {
			return fmt.Errorf("%s already exists", input.Name)
@@ -151,17 +152,17 @@ func (mgs *ModelGalleryEndpointService) AddModelGalleryEndpoint() func(c *fiber.
func (mgs *ModelGalleryEndpointService) RemoveModelGalleryEndpoint() func(c *fiber.Ctx) error {
	return func(c *fiber.Ctx) error {
-		input := new(gallery.Gallery)
+		input := new(config.Gallery)
		// Get input data from the request body
		if err := c.BodyParser(input); err != nil {
			return err
		}

-		if !slices.ContainsFunc(mgs.galleries, func(gallery gallery.Gallery) bool {
+		if !slices.ContainsFunc(mgs.galleries, func(gallery config.Gallery) bool {
			return gallery.Name == input.Name
		}) {
			return fmt.Errorf("%s is not currently registered", input.Name)
		}

-		mgs.galleries = slices.DeleteFunc(mgs.galleries, func(gallery gallery.Gallery) bool {
+		mgs.galleries = slices.DeleteFunc(mgs.galleries, func(gallery config.Gallery) bool {
			return gallery.Name == input.Name
		})

		return c.Send(nil)

@@ -3,9 +3,9 @@ package localai
import (
	"time"

-	"github.com/go-skynet/LocalAI/core/services"
	"github.com/gofiber/fiber/v2"
	"github.com/gofiber/fiber/v2/middleware/adaptor"
+	"github.com/mudler/LocalAI/core/services"
	"github.com/prometheus/client_golang/prometheus/promhttp"
)

@@ -1,12 +1,12 @@
package localai

import (
-	"github.com/go-skynet/LocalAI/core/backend"
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/pkg/model"
-	"github.com/go-skynet/LocalAI/pkg/store"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/core/backend"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/pkg/store"
)

func StoresSetEndpoint(sl *model.ModelLoader, appConfig *config.ApplicationConfig) func(c *fiber.Ctx) error {

@@ -1,13 +1,13 @@
package localai

import (
-	"github.com/go-skynet/LocalAI/core/backend"
-	"github.com/go-skynet/LocalAI/core/config"
-	fiberContext "github.com/go-skynet/LocalAI/core/http/ctx"
-	"github.com/go-skynet/LocalAI/pkg/model"
-	"github.com/go-skynet/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/core/backend"
+	"github.com/mudler/LocalAI/core/config"
+	fiberContext "github.com/mudler/LocalAI/core/http/ctx"
+	"github.com/mudler/LocalAI/pkg/model"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/core/schema"
	"github.com/rs/zerolog/log"
)

@@ -1,11 +1,11 @@
package localai

import (
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/internal"
-	"github.com/go-skynet/LocalAI/pkg/gallery"
-	"github.com/go-skynet/LocalAI/pkg/model"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/gallery"
+	"github.com/mudler/LocalAI/internal"
+	"github.com/mudler/LocalAI/pkg/model"
)

func WelcomeEndpoint(appConfig *config.ApplicationConfig,

@@ -9,10 +9,10 @@ import (
	"sync/atomic"
	"time"

-	"github.com/go-skynet/LocalAI/core/config"
-	model "github.com/go-skynet/LocalAI/pkg/model"
-	"github.com/go-skynet/LocalAI/pkg/utils"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/core/config"
+	model "github.com/mudler/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/pkg/utils"
	"github.com/rs/zerolog/log"
)
@@ -339,7 +339,7 @@ func CreateAssistantFileEndpoint(cl *config.BackendConfigLoader, ml *model.Model
			}
		}

-		return c.Status(fiber.StatusNotFound).SendString(fmt.Sprintf("Unable to find "))
+		return c.Status(fiber.StatusNotFound).SendString(fmt.Sprintf("Unable to find %q", assistantID))
	}
}

@@ -4,7 +4,6 @@ import (
	"encoding/json"
	"fmt"
	"io"
-	"io/ioutil"
	"net/http"
	"net/http/httptest"
	"os"
@@ -13,9 +12,9 @@ import (
	"testing"
	"time"

-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/pkg/model"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/pkg/model"
	"github.com/stretchr/testify/assert"
)
@@ -183,7 +182,7 @@ func TestAssistantEndpoints(t *testing.T) {
			assert.NoError(t, err)
			assert.Equal(t, tt.expectedStatus, response.StatusCode)
			if tt.expectedStatus != fiber.StatusOK {
-				all, _ := ioutil.ReadAll(response.Body)
+				all, _ := io.ReadAll(response.Body)
				assert.Equal(t, tt.expectedStringResult, string(all))
			} else {
				var result []Assistant
@@ -279,6 +278,7 @@ func TestAssistantEndpoints(t *testing.T) {
		assert.NoError(t, err)
		var getAssistant Assistant
		err = json.NewDecoder(modifyResponse.Body).Decode(&getAssistant)
+		assert.NoError(t, err)
		t.Cleanup(cleanupAllAssistants(t, app, []string{getAssistant.ID}))
@@ -391,7 +391,10 @@ func createAssistantFile(app *fiber.App, afr AssistantFileRequest, assistantId s
	}

	var assistantFile AssistantFile
-	all, err := ioutil.ReadAll(resp.Body)
+	all, err := io.ReadAll(resp.Body)
+	if err != nil {
+		return AssistantFile{}, resp, err
+	}
	err = json.NewDecoder(strings.NewReader(string(all))).Decode(&assistantFile)
	if err != nil {
		return AssistantFile{}, resp, err
@@ -422,8 +425,7 @@ func createAssistant(app *fiber.App, ar AssistantRequest) (Assistant, *http.Resp
	var resultAssistant Assistant
	err = json.NewDecoder(strings.NewReader(string(bodyString))).Decode(&resultAssistant)
-	return resultAssistant, resp, nil
+	return resultAssistant, resp, err
}

func cleanupAllAssistants(t *testing.T, app *fiber.App, ids []string) func() {

@@ -8,13 +8,13 @@ import (
	"strings"
	"time"

-	"github.com/go-skynet/LocalAI/core/backend"
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/pkg/functions"
-	model "github.com/go-skynet/LocalAI/pkg/model"
	"github.com/gofiber/fiber/v2"
	"github.com/google/uuid"
+	"github.com/mudler/LocalAI/core/backend"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/pkg/functions"
+	model "github.com/mudler/LocalAI/pkg/model"
	"github.com/rs/zerolog/log"
	"github.com/valyala/fasthttp"
)

@@ -8,14 +8,14 @@ import (
	"fmt"
	"time"

-	"github.com/go-skynet/LocalAI/core/backend"
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/pkg/functions"
-	model "github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/backend"
+	"github.com/mudler/LocalAI/core/config"
	"github.com/gofiber/fiber/v2"
	"github.com/google/uuid"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/pkg/functions"
+	model "github.com/mudler/LocalAI/pkg/model"
	"github.com/rs/zerolog/log"
	"github.com/valyala/fasthttp"
)

@@ -5,13 +5,13 @@ import (
	"fmt"
	"time"

-	"github.com/go-skynet/LocalAI/core/backend"
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/schema"
-	model "github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/backend"
+	"github.com/mudler/LocalAI/core/config"
	"github.com/gofiber/fiber/v2"
	"github.com/google/uuid"
+	"github.com/mudler/LocalAI/core/schema"
+	model "github.com/mudler/LocalAI/pkg/model"
	"github.com/rs/zerolog/log"
)

@@ -5,12 +5,12 @@ import (
	"fmt"
	"time"

-	"github.com/go-skynet/LocalAI/core/backend"
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/pkg/model"
-	"github.com/go-skynet/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/core/backend"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/pkg/model"
	"github.com/google/uuid"
+	"github.com/mudler/LocalAI/core/schema"

	"github.com/gofiber/fiber/v2"
	"github.com/rs/zerolog/log"

@@ -8,10 +8,10 @@ import (
	"sync/atomic"
	"time"

-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/pkg/utils"
+	"github.com/mudler/LocalAI/core/config"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/pkg/utils"
)

var UploadedFiles []File

@@ -13,10 +13,10 @@ import (
	"github.com/rs/zerolog/log"

-	"github.com/go-skynet/LocalAI/core/config"
-	utils2 "github.com/go-skynet/LocalAI/pkg/utils"
+	"github.com/mudler/LocalAI/core/config"
	"github.com/gofiber/fiber/v2"
+	utils2 "github.com/mudler/LocalAI/pkg/utils"
	"github.com/stretchr/testify/assert"
	"testing"

@@ -13,14 +13,14 @@ import (
	"strings"
	"time"

-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/schema"
	"github.com/google/uuid"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/schema"

-	"github.com/go-skynet/LocalAI/core/backend"
-	model "github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/backend"
	"github.com/gofiber/fiber/v2"
+	model "github.com/mudler/LocalAI/pkg/model"
	"github.com/rs/zerolog/log"
)

@@ -1,11 +1,11 @@
package openai

import (
-	"github.com/go-skynet/LocalAI/core/backend"
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/schema"
-	model "github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/backend"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/schema"
+	model "github.com/mudler/LocalAI/pkg/model"
)

func ComputeChoices(

@@ -1,9 +1,9 @@
package openai

import (
-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/core/services"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/core/services"
)

func ListModelsEndpoint(lms *services.ListModelsService) func(ctx *fiber.Ctx) error {

@@ -2,19 +2,16 @@ package openai

import (
	"context"
-	"encoding/base64"
	"encoding/json"
	"fmt"
-	"io"
-	"net/http"
-	"strings"

-	"github.com/go-skynet/LocalAI/core/config"
-	fiberContext "github.com/go-skynet/LocalAI/core/http/ctx"
-	"github.com/go-skynet/LocalAI/core/schema"
-	"github.com/go-skynet/LocalAI/pkg/functions"
-	model "github.com/go-skynet/LocalAI/pkg/model"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/core/config"
+	fiberContext "github.com/mudler/LocalAI/core/http/ctx"
+	"github.com/mudler/LocalAI/core/schema"
+	"github.com/mudler/LocalAI/pkg/functions"
+	"github.com/mudler/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/pkg/utils"
	"github.com/rs/zerolog/log"
)
@@ -39,41 +36,6 @@ func readRequest(c *fiber.Ctx, ml *model.ModelLoader, o *config.ApplicationConfi
	return modelFile, input, err
}

-// this function check if the string is an URL, if it's an URL downloads the image in memory
-// encodes it in base64 and returns the base64 string
-func getBase64Image(s string) (string, error) {
-	if strings.HasPrefix(s, "http") {
-		// download the image
-		resp, err := http.Get(s)
-		if err != nil {
-			return "", err
-		}
-		defer resp.Body.Close()
-
-		// read the image data into memory
-		data, err := io.ReadAll(resp.Body)
-		if err != nil {
-			return "", err
-		}
-
-		// encode the image data in base64
-		encoded := base64.StdEncoding.EncodeToString(data)
-
-		// return the base64 string
-		return encoded, nil
-	}
-
-	// if the string instead is prefixed with "data:image/...;base64,", drop it
-	dropPrefix := []string{"data:image/jpeg;base64,", "data:image/png;base64,"}
-	for _, prefix := range dropPrefix {
-		if strings.HasPrefix(s, prefix) {
-			return strings.ReplaceAll(s, prefix, ""), nil
-		}
-	}
-
-	return "", fmt.Errorf("not valid string")
-}
-
func updateRequestConfig(config *config.BackendConfig, input *schema.OpenAIRequest) {
	if input.Echo {
		config.Echo = input.Echo
@@ -187,7 +149,7 @@ func updateRequestConfig(config *config.BackendConfig, input *schema.OpenAIReque
			input.Messages[i].StringContent = pp.Text
		} else if pp.Type == "image_url" {
			// Detect if pp.ImageURL is an URL, if it is download the image and encode it in base64:
-			base64, err := getBase64Image(pp.ImageURL.URL)
+			base64, err := utils.GetImageURLAsBase64(pp.ImageURL.URL)
			if err == nil {
				input.Messages[i].StringImages = append(input.Messages[i].StringImages, base64) // TODO: make sure that we only return base64 stuff
				// set a placeholder for each image
@@ -295,5 +257,9 @@ func mergeRequestWithConfig(modelFile string, input *schema.OpenAIRequest, cm *c
	// Set the parameters for the language model prediction
	updateRequestConfig(cfg, input)

+	if !cfg.Validate() {
+		return nil, nil, fmt.Errorf("failed to validate config")
+	}
+
	return cfg, input, err
}

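The base64 helper removed above now lives in `pkg/utils` as `GetImageURLAsBase64`, so HTTP handlers no longer carry their own download logic; the same hunk also makes `mergeRequestWithConfig` fail fast on configs that do not pass `Validate()`. A hedged reimplementation of the helper based on the deleted code (the centralized version may differ in details such as which data-URI prefixes it accepts):

```go
package utils

import (
	"encoding/base64"
	"fmt"
	"io"
	"net/http"
	"strings"
)

// Sketch of the centralized helper, following the removed getBase64Image.
func GetImageURLAsBase64(s string) (string, error) {
	if strings.HasPrefix(s, "http") {
		// download the image and encode it in base64
		resp, err := http.Get(s)
		if err != nil {
			return "", err
		}
		defer resp.Body.Close()

		data, err := io.ReadAll(resp.Body)
		if err != nil {
			return "", err
		}
		return base64.StdEncoding.EncodeToString(data), nil
	}

	// if the string is prefixed with "data:image/...;base64,", drop the prefix
	for _, prefix := range []string{"data:image/jpeg;base64,", "data:image/png;base64,"} {
		if strings.HasPrefix(s, prefix) {
			return strings.TrimPrefix(s, prefix), nil
		}
	}

	return "", fmt.Errorf("not valid string")
}
```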
@@ -8,9 +8,9 @@ import (
	"path"
	"path/filepath"

-	"github.com/go-skynet/LocalAI/core/backend"
-	"github.com/go-skynet/LocalAI/core/config"
-	model "github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/backend"
+	"github.com/mudler/LocalAI/core/config"
+	model "github.com/mudler/LocalAI/pkg/model"
	"github.com/gofiber/fiber/v2"
	"github.com/rs/zerolog/log"
@@ -32,7 +32,7 @@ func TranscriptEndpoint(cl *config.BackendConfigLoader, ml *model.ModelLoader, a
		config, input, err := mergeRequestWithConfig(m, input, cl, ml, appConfig.Debug, appConfig.Threads, appConfig.ContextSize, appConfig.F16)
		if err != nil {
-			return fmt.Errorf("failed reading parameters from request:%w", err)
+			return fmt.Errorf("failed reading parameters from request: %w", err)
		}

		// retrieve the file data from the request
		file, err := c.FormFile("file")
@@ -65,7 +65,7 @@ func TranscriptEndpoint(cl *config.BackendConfigLoader, ml *model.ModelLoader, a
		log.Debug().Msgf("Audio file copied to: %+v", dst)

-		tr, err := backend.ModelTranscription(dst, input.Language, ml, *config, appConfig)
+		tr, err := backend.ModelTranscription(dst, input.Language, input.Translate, ml, *config, appConfig)
		if err != nil {
			return err
		}

@@ -7,10 +7,10 @@ import (
	"net/http"

	"github.com/Masterminds/sprig/v3"
-	"github.com/go-skynet/LocalAI/core/schema"
	"github.com/gofiber/fiber/v2"
	fiberhtml "github.com/gofiber/template/html/v2"
	"github.com/microcosm-cc/bluemonday"
+	"github.com/mudler/LocalAI/core/schema"
	"github.com/russross/blackfriday"
)
@@ -21,14 +21,13 @@ func notFoundHandler(c *fiber.Ctx) error {
	// Check if the request accepts JSON
	if string(c.Context().Request.Header.ContentType()) == "application/json" || len(c.Accepts("html")) == 0 {
		// The client expects a JSON response
-		c.Status(fiber.StatusNotFound).JSON(schema.ErrorResponse{
+		return c.Status(fiber.StatusNotFound).JSON(schema.ErrorResponse{
			Error: &schema.APIError{Message: "Resource not found", Code: fiber.StatusNotFound},
		})
	} else {
		// The client expects an HTML response
-		c.Status(fiber.StatusNotFound).Render("views/404", fiber.Map{})
+		return c.Status(fiber.StatusNotFound).Render("views/404", fiber.Map{})
	}
-	return nil
}

func renderEngine() *fiberhtml.Engine {

@@ -1,10 +1,10 @@
package routes

import (
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/http/endpoints/elevenlabs"
-	"github.com/go-skynet/LocalAI/pkg/model"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/http/endpoints/elevenlabs"
+	"github.com/mudler/LocalAI/pkg/model"
)

func RegisterElevenLabsRoutes(app *fiber.App,

@@ -1,11 +1,11 @@
package routes

import (
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/http/endpoints/jina"
-	"github.com/go-skynet/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/http/endpoints/jina"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/pkg/model"
)

func RegisterJINARoutes(app *fiber.App,

@@ -1,13 +1,13 @@
package routes

import (
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/http/endpoints/localai"
-	"github.com/go-skynet/LocalAI/core/services"
-	"github.com/go-skynet/LocalAI/internal"
-	"github.com/go-skynet/LocalAI/pkg/model"
	"github.com/gofiber/fiber/v2"
	"github.com/gofiber/swagger"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/http/endpoints/localai"
+	"github.com/mudler/LocalAI/core/services"
+	"github.com/mudler/LocalAI/internal"
+	"github.com/mudler/LocalAI/pkg/model"
)

func RegisterLocalAIRoutes(app *fiber.App,

@@ -1,12 +1,12 @@
package routes

import (
-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/http/endpoints/localai"
-	"github.com/go-skynet/LocalAI/core/http/endpoints/openai"
-	"github.com/go-skynet/LocalAI/core/services"
-	"github.com/go-skynet/LocalAI/pkg/model"
	"github.com/gofiber/fiber/v2"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/http/endpoints/localai"
+	"github.com/mudler/LocalAI/core/http/endpoints/openai"
+	"github.com/mudler/LocalAI/core/services"
+	"github.com/mudler/LocalAI/pkg/model"
)

func RegisterOpenAIRoutes(app *fiber.App,

@@ -6,14 +6,14 @@ import (
	"sort"
	"strings"

-	"github.com/go-skynet/LocalAI/core/config"
-	"github.com/go-skynet/LocalAI/core/http/elements"
-	"github.com/go-skynet/LocalAI/core/http/endpoints/localai"
-	"github.com/go-skynet/LocalAI/core/services"
-	"github.com/go-skynet/LocalAI/internal"
-	"github.com/go-skynet/LocalAI/pkg/gallery"
-	"github.com/go-skynet/LocalAI/pkg/model"
-	"github.com/go-skynet/LocalAI/pkg/xsync"
+	"github.com/mudler/LocalAI/core/config"
+	"github.com/mudler/LocalAI/core/gallery"
+	"github.com/mudler/LocalAI/core/http/elements"
+	"github.com/mudler/LocalAI/core/http/endpoints/localai"
+	"github.com/mudler/LocalAI/core/services"
+	"github.com/mudler/LocalAI/internal"
+	"github.com/mudler/LocalAI/pkg/model"
+	"github.com/mudler/LocalAI/pkg/xsync"
	"github.com/rs/zerolog/log"

	"github.com/gofiber/fiber/v2"

@@ -50,6 +50,10 @@
	</div>
	<div id="loader" class="my-2 loader" style="display: none;"></div>
	<div id="statustext" class="my-2 p-2 block text-white-700 text-sm font-bold mb-2" ></div>
+	<!-- Note for recording box -->
+	<div class="text-sm mb-4 text-white-500">
+		<strong>Note:</strong> You need an LLM, an audio-transcription (whisper) and a TTS model installed in order for this to work. Select the appropriate models from the toolbox, then click the 'Talk' button to start recording. The recording will continue until you click 'Stop recording'. Make sure your microphone is set up and enabled.
+	</div>
	<div class="mb-4" >
		<label for="modelSelect" class="block text-white-700 text-sm font-bold mb-2">LLM Model:</label>
		<select id="modelSelect"
@@ -95,7 +99,7 @@
			class="bg-red-500 hover:bg-red-700 text-white font-bold py-2 px-4 rounded focus:outline-none focus:shadow-outline"
			><i class="fa-solid fa-microphone pr-2"></i>Talk</button>
		<a id="resetButton"
-			class="inline-block align-baseline font-bold text-sm text-blue-500 hover:text-blue-800"
+			class="inline-block align-baseline font-bold text-sm text-blue-500 hover:text-gray-200"
			href="#"
			>Reset conversation</a>
		<audio id="audioPlayback" controls hidden></audio>

@@ -13,8 +13,8 @@ import (
	"strings"
	"time"

-	"github.com/go-skynet/LocalAI/pkg/utils"
	"github.com/libp2p/go-libp2p/core/peer"
+	"github.com/mudler/LocalAI/pkg/utils"
	"github.com/mudler/edgevpn/pkg/node"
	"github.com/mudler/edgevpn/pkg/protocol"
	"github.com/mudler/edgevpn/pkg/types"

@@ -3,7 +3,7 @@ package schema
import (
	"context"

-	functions "github.com/go-skynet/LocalAI/pkg/functions"
+	functions "github.com/mudler/LocalAI/pkg/functions"
)

// APIError provides error information returned by the OpenAI API.

@@ -8,6 +8,9 @@ type PredictionOptions struct {
	// Also part of the OpenAI official spec
	Language string `json:"language"`

+	// Only for audio transcription
+	Translate bool `json:"translate"`
+
	// Also part of the OpenAI official spec. use it for returning multiple results
	N int `json:"n"`

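Because `PredictionOptions` is part of the request schema, the new option travels on the wire as `translate`. A small sketch of the resulting JSON shape, derived from the struct tags above; the model name is illustrative, and the transcription endpoint itself accepts these fields alongside the uploaded audio file:

```go
package main

import (
	"encoding/json"
	"fmt"
)

func main() {
	// Sketch: the wire shape implied by the struct tags above.
	opts := struct {
		Model     string `json:"model"`
		Language  string `json:"language"`
		Translate bool   `json:"translate"`
	}{Model: "whisper-1", Language: "it", Translate: true}

	body, _ := json.Marshal(opts)
	fmt.Println(string(body)) // {"model":"whisper-1","language":"it","translate":true}
}
```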
Some files were not shown because too many files have changed in this diff.