
Swap chat model provider at runtime #1245

Open
ikwattro opened this issue Jan 27, 2025 · 3 comments
Labels
question Further information is requested

Comments

ikwattro commented Jan 27, 2025

This is more of a question than an issue, since nothing about this is mentioned in the docs.

Version: 0.24.0.CR1

With the following settings:

# langchain4j
quarkus.langchain4j.log-requests=true
quarkus.langchain4j.log-responses=true
quarkus.langchain4j.chat-memory.memory-window.max-messages=20

# openai
quarkus.langchain4j.m1.chat-model.provider=${M1_CHAT_MODEL_PROVIDER:ollama}
quarkus.langchain4j.openai.m1.api-key=${OPENAI_API_KEY:''}

# ollama
quarkus.langchain4j.ollama.base-url=${OLLAMA_BASE_URL:http://localhost:11434}

I was hoping to be able to switch between ollama and openai with the same build, depending on whether I'm running locally or in a deployed environment. However, even when the M1_CHAT_MODEL_PROVIDER=openai variable is passed to the container (as a Docker environment variable), the app keeps trying to use ollama.

Did I miss something, or is the model provider constrained to be set at build time? If so, what is the recommended way to approach this?

geoand commented Jan 27, 2025

The model provider to use is indeed fixed at build time.

ikwattro (Author) commented

Thanks @geoand. What is the recommendation on how to approach this? Can I somehow register all of them under different names and choose one based on another config property?

geoand commented Jan 27, 2025

It depends on what exactly you want to achieve.

I believe @maxandersen had an example where he was doing something similar.
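
One possible shape for this, sketched here with assumed model names (`cloud`, `local`) and an assumed application-level property (`app.chat-provider`) that are not from this thread: register both providers under distinct named models at build time, then select between them at runtime via an ordinary runtime config property. The exact property layout should be verified against the quarkus-langchain4j documentation:

```properties
# Both extensions (openai + ollama) are on the classpath in the same build,
# each registered under its own model name at build time.
quarkus.langchain4j.cloud.chat-model.provider=openai
quarkus.langchain4j.openai.cloud.api-key=${OPENAI_API_KEY:}

quarkus.langchain4j.local.chat-model.provider=ollama
quarkus.langchain4j.ollama.local.base-url=${OLLAMA_BASE_URL:http://localhost:11434}

# Plain runtime property read by application code to choose a model.
app.chat-provider=${CHAT_PROVIDER:local}
```

Application code could then inject both models (e.g. via the named-model qualifier) and delegate to one of them based on `app.chat-provider`, which, unlike the provider itself, is resolvable at runtime. This is a hedged config sketch, not a verified setup.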

@geoand geoand added the question Further information is requested label Jan 28, 2025