# Petals
Petals: https://github.com/bigscience-workshop/petals
## Pre-Requisites
Ensure you have `petals` installed:

```shell
pip install git+https://github.com/bigscience-workshop/petals
```
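If the install succeeded, importing the package should work. This is a generic sanity check, not a Petals-specific requirement:

```python
# raises ModuleNotFoundError if the pip install above did not complete
import petals
```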
## Usage
Ensure you add `petals/` as a prefix for all Petals LLMs. This sets the `custom_llm_provider` to `petals`.
```python
from litellm import completion

response = completion(
    model="petals/petals-team/StableBeluga2",
    messages=[{"content": "Hello, how are you?", "role": "user"}]
)

print(response)
```
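litellm returns an OpenAI-compatible response object, so the generated text can be read from the usual `choices` structure. A minimal sketch, assuming the standard OpenAI response shape:

```python
from litellm import completion

response = completion(
    model="petals/petals-team/StableBeluga2",
    messages=[{"content": "Hello, how are you?", "role": "user"}]
)

# the reply text lives in the OpenAI-style choices list
print(response.choices[0].message.content)
```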
## Usage with Streaming
```python
from litellm import completion

response = completion(
    model="petals/petals-team/StableBeluga2",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    stream=True
)

for chunk in response:
    print(chunk)
```
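Each streamed chunk follows the OpenAI delta format, so the full reply can be assembled by concatenating the deltas. A minimal sketch, assuming the standard OpenAI streaming shape (the `content` field may be `None` on the final chunk):

```python
from litellm import completion

response = completion(
    model="petals/petals-team/StableBeluga2",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    stream=True
)

# accumulate the text deltas into the full reply
full_reply = ""
for chunk in response:
    delta = chunk.choices[0].delta.content
    if delta is not None:
        full_reply += delta

print(full_reply)
```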
## Model Details
| Model Name | Function Call |
|------------|---------------|
| petals-team/StableBeluga2 | `completion('petals/petals-team/StableBeluga2', messages)` |
| huggyllama/llama-65b | `completion('petals/huggyllama/llama-65b', messages)` |
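Standard OpenAI-style optional parameters such as `max_tokens` and `temperature` can also be passed to `completion()`. Whether each parameter is honored depends on the Petals backend, so treat this as a sketch; the values below are hypothetical, for illustration only:

```python
from litellm import completion

# hypothetical settings for illustration; litellm forwards OpenAI-style
# params, but support for each one depends on the Petals integration
response = completion(
    model="petals/huggyllama/llama-65b",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    max_tokens=64,
    temperature=0.7,
)

print(response)
```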