Install the package and start the server to find out which LLM models are already at your disposal:

```
pip install .
openllm serve  # or openllm run
```

License
-------

This project is licensed under the MIT License; see the LICENSE file for details.

Acknowledgements
----------------

This project makes use of the following open-source projects:

* [bentoml/bentoml](https://github.com/bentoml/bentoml) for production-level model serving
* [blrchen/chatgpt-lite](https://github.com/blrchen/chatgpt-lite) for a fancy web chat UI
* [chujiezheng/chat_templates](https://github.com/chujiezheng/chat_templates) for chat templates

We are grateful to the developers and contributors of these projects for their hard work and dedication.