![Banner for OpenLLM](/.github/assets/main-banner.png)

# 🦑 OpenLLM Core


OpenLLM Core: Core components for OpenLLM.

## 📖 Introduction

With OpenLLM, you can run inference with any open-source large language model, deploy to the cloud or on-premises, build powerful AI apps, and more. To learn more about OpenLLM, please visit OpenLLM's README.md.

This package holds the core components of OpenLLM and is considered internal. Components include:

- Configuration generation.
- Utilities for interacting with the OpenLLM server.
- Schema and generation utilities for the OpenLLM server.

![Gif showing OpenLLM Intro](/.github/assets/output.gif)
## 📔 Citation

If you use OpenLLM in your research, we provide a [citation](../CITATION.cff) to use:

```bibtex
@software{Pham_OpenLLM_Operating_LLMs_2023,
  author = {Pham, Aaron and Yang, Chaoyu and Sheng, Sean and Zhao, Shenyang and Lee, Sauyon and Jiang, Bo and Dong, Fog and Guan, Xipeng and Ming, Frost},
  license = {Apache-2.0},
  month = jun,
  title = {{OpenLLM: Operating LLMs in production}},
  url = {https://github.com/bentoml/OpenLLM},
  year = {2023}
}
```