I have done this multiple times in multiple places.
You have many options.
For example, you can install a vLLM server on a machine, then in LoLLMs select the vllm binding. After installing it, you can set the link to your server as well as parameters such as the API key, if you have one. You'll then be able to access your served models from your client PC through LoLLMs.
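As a quick sanity check before pointing LoLLMs at the server, you can query the vLLM endpoint directly, since vLLM exposes an OpenAI-compatible API. This is a minimal sketch; the host, port, model name, and key are placeholders for your own deployment:

```python
# Minimal sketch: verify an on-prem vLLM server is reachable before
# configuring the vllm binding in LoLLMs. Assumes the server was started
# with something like:
#   python -m vllm.entrypoints.openai.api_server \
#       --model mistralai/Mistral-7B-Instruct-v0.2 --api-key my-secret-key
# All values below are placeholders, not LoLLMs defaults.
from openai import OpenAI

client = OpenAI(
    base_url="http://my-vllm-host:8000/v1",  # your on-prem endpoint
    api_key="my-secret-key",                 # whatever key the server expects
)

completion = client.chat.completions.create(
    model="mistralai/Mistral-7B-Instruct-v0.2",  # model served by vLLM
    messages=[{"role": "user", "content": "Say hello"}],
)
print(completion.choices[0].message.content)
```

If this works, the same base URL and key are what you would enter in the binding's settings in LoLLMs.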
Expected Behavior
I would like to bind LoLLMs to an LLM that is deployed on-premises in our organization.
Current Behavior
It seems there is no binding that allows setting an endpoint, an engine, a version, a token, etc.
Apologies if I missed anything in the docs.