LangChain + BentoML + OpenLLM

Run it locally:

export SERPAPI_API_KEY="__Your_SERP_API_key__"
export BENTOML_CONFIG_OPTIONS="api_server.traffic.timeout=900 runners.traffic.timeout=900"
bentoml serve
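The BENTOML_CONFIG_OPTIONS value above is a space-separated list of dotted-key=value pairs, where each dotted key addresses a nested field in BentoML's configuration (here, raising both the API server's and the runners' traffic timeout to 900 seconds). As an illustrative sketch only (BentoML does its own parsing internally), the pairs map onto nested config keys like this:

```python
# Illustrative parser for the BENTOML_CONFIG_OPTIONS format:
# space-separated dotted-key=value pairs that override nested config fields.
# This is a sketch of the format, not BentoML's actual parsing code.

def parse_config_options(options: str) -> dict:
    """Turn 'a.b=1 c.d=2' into {'a': {'b': '1'}, 'c': {'d': '2'}}."""
    config: dict = {}
    for pair in options.split():
        dotted_key, _, value = pair.partition("=")
        node = config
        *parents, leaf = dotted_key.split(".")
        for part in parents:
            node = node.setdefault(part, {})
        node[leaf] = value
    return config

opts = "api_server.traffic.timeout=900 runners.traffic.timeout=900"
print(parse_config_options(opts))
# {'api_server': {'traffic': {'timeout': '900'}}, 'runners': {'traffic': {'timeout': '900'}}}
```

The long timeouts matter here because LLM generation can easily exceed BentoML's default request timeout.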

Build the Bento:

bentoml build

Generate a Docker image:

bentoml containerize ...
docker run \
  -e SERPAPI_API_KEY="__Your_SERP_API_key__" \
  -e BENTOML_CONFIG_OPTIONS="api_server.traffic.timeout=900 runners.traffic.timeout=900" \
  -p 3000:3000 \
  ..image_name