Mirror of https://github.com/mudler/LocalAI.git (synced 2025-12-30 09:59:36 -05:00)
LocalAI / backend / python / vllm at commit 949da7792d8a5bd4558d8fe73cd942a0f6b5758d
Latest commit 949da7792d by Ettore Di Giacinto (2024-01-06 13:32:28 +01:00):

deps(conda): use transformers-env with vllm,exllama(2) (#1554)

* deps(conda): use transformers with vllm
* join vllm, exllama, exllama2, split petals
| File | Last commit | Date |
| --- | --- | --- |
| backend_pb2_grpc.py | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| backend_pb2.py | feat(diffusers): update, add autopipeline, controlnet (#1432) | 2023-12-13 19:20:22 +01:00 |
| backend_vllm.py | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| Makefile | deps(conda): use transformers-env with vllm,exllama(2) (#1554) | 2024-01-06 13:32:28 +01:00 |
| README.md | refactor: move backends into the backends directory (#1279) | 2023-11-13 22:40:16 +01:00 |
| run.sh | deps(conda): use transformers-env with vllm,exllama(2) (#1554) | 2024-01-06 13:32:28 +01:00 |
| test_backend_vllm.py | feat(conda): share envs with transformer-based backends (#1465) | 2023-12-21 08:35:15 +01:00 |
| test.sh | deps(conda): use transformers-env with vllm,exllama(2) (#1554) | 2024-01-06 13:32:28 +01:00 |
README.md: Creating a separate environment for the vllm project

```sh
make vllm
```
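For context, `backend_vllm.py` serves vLLM over LocalAI's gRPC backend protocol (the generated stubs are `backend_pb2.py` and `backend_pb2_grpc.py`). A quick way to check that the environment created by `make vllm` resolves vLLM correctly is to drive vLLM's offline `LLM` API directly. The sketch below is only such a smoke test, not part of this backend, and `facebook/opt-125m` is an arbitrary small model chosen for illustration:

```python
# Smoke test for the environment built by `make vllm` (illustrative only;
# this is not the backend implementation). Loads a small model with vLLM's
# offline LLM API and prints a short completion.
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")                    # arbitrary small model
params = SamplingParams(temperature=0.8, max_tokens=32)

# generate() takes a list of prompts and returns one RequestOutput per prompt
for output in llm.generate(["Hello, my name is"], params):
    print(output.outputs[0].text)
```

The backend itself is started via `run.sh` and covered by `test.sh` / `test_backend_vllm.py` from the listing above.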