- Direct SDK integration - Using the LiteLLM Python SDK
- Proxy Server integration - Using LiteLLM as an OpenAI-compatible proxy
## Prerequisites
For both integration methods, you'll need an API key, which the `LiteLLM` class reads from the `LITELLM_API_KEY` environment variable if one is not passed explicitly.
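For example, the key can be exported in a POSIX shell before starting your application (the value shown is a placeholder, not a real key):

```shell
# Placeholder value; substitute a real key for your provider.
export LITELLM_API_KEY="sk-placeholder"
```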
## SDK Integration
The `LiteLLM` class provides direct integration with the LiteLLM Python SDK.
### Basic Usage
### Using Hugging Face Models
LiteLLM can also work with Hugging Face models by prefixing the model id with `huggingface/`.

## Configuration Options
The `LiteLLM` class accepts the following parameters:
| Parameter | Type | Description | Default | 
|---|---|---|---|
| id | str | Model identifier (e.g., "gpt-4o" or "huggingface/mistralai/Mistral-7B-Instruct-v0.2") | "gpt-4o" | 
| name | str | Display name for the model | "LiteLLM" | 
| provider | str | Provider name | "LiteLLM" | 
| api_key | Optional[str] | API key (falls back to LITELLM_API_KEY environment variable) | None | 
| api_base | Optional[str] | Base URL for API requests | None | 
| max_tokens | Optional[int] | Maximum tokens in the response | None | 
| temperature | float | Sampling temperature | 0.7 | 
| top_p | float | Top-p sampling value | 1.0 | 
| request_params | Optional[Dict[str, Any]] | Additional request parameters | None | 
## SDK Examples

View more examples here.