aarnphm-ec2-dev 765c1a6e5c feat: requires_gpu for specific LLM.
This flag will determine the behaviour of SUPPORTED_RESOURCES.

TODO: Support TPU

Adds support for per-model resource requirements for specific LLMs.
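A minimal sketch of what the change might look like, assuming a simple config class: `requires_gpu` and `SUPPORTED_RESOURCES` come from this commit message, but the `LLMConfig` class and the resource names are hypothetical, not OpenLLM's actual API.

```python
from dataclasses import dataclass

@dataclass
class LLMConfig:
    """Hypothetical per-model configuration (illustrative only)."""
    model_name: str
    requires_gpu: bool = False

def supported_resources(config: LLMConfig) -> tuple[str, ...]:
    # A GPU-only model advertises only GPU resources;
    # other models may also fall back to CPU.
    # TODO: Support TPU
    if config.requires_gpu:
        return ("nvidia.com/gpu",)
    return ("cpu", "nvidia.com/gpu")

# A model marked requires_gpu exposes no CPU fallback.
print(supported_resources(LLMConfig("stablelm", requires_gpu=True)))
print(supported_resources(LLMConfig("flan-t5")))
```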

Signed-off-by: aarnphm-ec2-dev <29749331+aarnphm@users.noreply.github.com>
Signed-off-by: Aaron <29749331+aarnphm@users.noreply.github.com>
2023-05-25 16:22:06 -07:00

OpenLLM


REST/gRPC API server for running any Open Large-Language Model - StableLM, Llama, Alpaca, Dolly, Flan-T5, and more
Powered by BentoML 🍱 + HuggingFace 🤗