This PR adds an example notebook that uses liteLLM: https://github.com/BerriAI/litellm/
liteLLM is a package that simplifies calling the OpenAI, Azure, Cohere, Anthropic, and Huggingface API endpoints. TL;DR: it lets you call all LLM APIs using the chatGPT input/output format.
Here is an example of how it is used:
import os
from litellm import completion
## set ENV variables
# ENV variables can also be set in a .env file; see .env.example
os.environ["OPENAI_API_KEY"] = "openai key"
os.environ["COHERE_API_KEY"] = "cohere key"
messages = [{ "content": "Hello, how are you?","role": "user"}]
# openai call
response = completion(model="gpt-3.5-turbo", messages=messages)
# llama2 call
model_name = "replicate/llama-2-70b-chat:2c1608e18606fad2812020dc541930f2d0495ce32eee50074220b87300bc16e1"
response = completion(model=model_name, messages=messages)
# cohere call
response = completion("command-nightly", messages)
# anthropic call
response = completion(model="claude-instant-1", messages=messages)