I have my LLM hosted on an EC2 instance through vLLM, and I'd like to use that LLM while evaluating through UpTrain. I currently use this LLM in my RAG service by providing its name and its base_url, but I am unable to do the same for UpTrain: despite updating the api_base, requests don't get routed to the LLM on my EC2 instance, and UpTrain keeps asking for an OpenAI API key.
**PS**
The api_base above was tweaked slightly before posting here (for security purposes).
I'm providing the exact same api_base and model to the llama_index RAG service and it works perfectly; only the UpTrain eval fails.
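For context, a vLLM server exposes an OpenAI-compatible HTTP API, so whatever client UpTrain uses internally must ultimately send a request shaped like the one below to `<api_base>/chat/completions`. This is a minimal stdlib sketch of that request; the base URL, model name, and `EMPTY` key are illustrative placeholders, not the poster's actual values:

```python
import json
import urllib.request

def build_chat_request(api_base: str, model: str, messages: list) -> urllib.request.Request:
    """Build the OpenAI-compatible chat-completions request a vLLM
    server expects. api_base and model are placeholders here."""
    url = api_base.rstrip("/") + "/chat/completions"
    body = json.dumps({"model": model, "messages": messages}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "application/json",
            # vLLM accepts any bearer token unless the server was
            # started with an --api-key, so a dummy value works.
            "Authorization": "Bearer EMPTY",
        },
        method="POST",
    )

# Example with a placeholder endpoint (not the real EC2 address):
req = build_chat_request(
    "http://localhost:8000/v1",
    "my-model",
    [{"role": "user", "content": "hello"}],
)
# urllib.request.urlopen(req) would send it to a running server.
```

The key point is that no real OpenAI API key is needed at the HTTP level; if the evaluator still demands one, the api_base is likely not being propagated to the underlying client.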