How to configure the OpenAI API proxy endpoint? #823
Comments
You can use the `--api_base` flag, or you can edit the config file. @xjspace, let me know if this solves your question.
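For anyone landing here later, a minimal Python sketch of the same idea; the proxy URL, key, and model are placeholders, and the attribute path is an assumption that varies by release (older 0.1.x versions expose `interpreter.api_base` at module level, newer ones nest the settings under `interpreter.llm`):

```python
from interpreter import interpreter

# Route all model traffic through an OpenAI-compatible proxy instead of
# api.openai.com. URL, key, and model below are placeholders.
interpreter.llm.api_base = "https://my-proxy.example.com/v1"
interpreter.llm.api_key = "sk-placeholder"
interpreter.llm.model = "gpt-4"

interpreter.chat("Print 'hello world' in Python.")
```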
@Notnaton, strangely, when I did this, "openai" still appears in the model name when running code. I have been trying to run Code Llama through Hugging Face; this is the line I'm referring to:

Model: openai/huggingface/codellama/CodeLlama-34b-Instruct-hf
This is because we add the `openai/` prefix, so litellm uses the OpenAI format to communicate with the endpoint.
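To illustrate the routing, a sketch with placeholder URL and key: with the `openai/` prefix, litellm treats whatever server `api_base` points at as OpenAI-compatible, which is why the prefix shows up in the reported model name:

```python
import litellm

# The "openai/" prefix tells litellm to speak the OpenAI chat-completions
# protocol to whatever server api_base points at, regardless of which
# model actually runs behind it. URL and key are placeholders.
response = litellm.completion(
    model="openai/huggingface/codellama/CodeLlama-34b-Instruct-hf",
    api_base="https://my-proxy.example.com/v1",
    api_key="sk-placeholder",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```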
Is your feature request related to a problem? Please describe.
Hi, how do I set an API proxy endpoint instead of the official OpenAI API address?
Could I put it in .env? What would the environment variable name be?
Describe the solution you'd like
How to configure the OpenAI API proxy endpoint.
Describe alternatives you've considered
No response
Additional context
No response
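As for the .env question in the original post: one common approach, an assumption not confirmed in this thread, is to rely on litellm's OpenAI provider picking up `OPENAI_API_BASE` from the environment. A sketch using python-dotenv:

```python
import os

from dotenv import load_dotenv  # pip install python-dotenv

# Hypothetical .env contents (placeholder URL; whether the interpreter
# honors this variable is an assumption, not confirmed in this thread):
#   OPENAI_API_BASE=https://my-proxy.example.com/v1
load_dotenv()  # copies variables from ./.env into os.environ
print(os.environ.get("OPENAI_API_BASE"))  # verify the proxy URL is set
```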