There are some code examples showing how you could create a mega-service from multiple microservices, but these examples appear incomplete in explaining how to get them working.
It appears you can pip install opea-comps, but then Python can't find the comps package.
The docs also show cloning the repo, but it downloads as GenAIComps, so my assumption is we have to reference it via the GenAIComps directory.
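A minimal sketch of the two install paths described above. It assumes the PyPI package `opea-comps` exposes its top-level module as `comps`, and that a cloned checkout lives in a directory named `GenAIComps` whose root must be on `sys.path`; the helper name is mine, not from the library.

```python
import importlib.util
import sys

def ensure_comps_importable(repo_dir="./GenAIComps"):
    """Return True if the `comps` package can be imported.

    Assumption: `pip install opea-comps` installs the module as `comps`;
    otherwise a cloned GenAIComps checkout can be put on sys.path so that
    `import comps` resolves against the repo's comps/ directory.
    """
    # Case 1: installed from PyPI -- `comps` is already importable.
    if importlib.util.find_spec("comps") is not None:
        return True
    # Case 2: cloned repo -- add its root so `comps/` inside it is found.
    if repo_dir not in sys.path:
        sys.path.insert(0, repo_dir)
    return importlib.util.find_spec("comps") is not None
```

Calling `ensure_comps_importable()` before `import comps` would cover both layouts without caring which one is present.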
We have an example megaservice class. What do you do with it? How do you run it? Do I just create an instance of the class, and it spins up the expected resources?
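To make the question concrete, here is a conceptual stand-in for what a megaservice class appears to do. This is NOT the actual opea-comps API; the class and method names below are illustrative only. It mimics the pattern the examples seem to follow: register microservices, declare the flow between them, then schedule a request through the graph.

```python
# Conceptual sketch only -- not the opea-comps API.
class MicroService:
    def __init__(self, name, handler):
        self.name = name
        self.handler = handler  # callable standing in for a remote HTTP call

class Orchestrator:
    """Toy 'megaservice': a linear pipeline of microservices."""
    def __init__(self):
        self.services = {}
        self.edges = {}  # service name -> next service name

    def add(self, svc):
        self.services[svc.name] = svc
        return self  # allow chaining

    def flow_to(self, src, dst):
        self.edges[src.name] = dst.name

    def schedule(self, first, payload):
        # Walk the flow, piping each service's output into the next.
        name = first.name
        while name is not None:
            payload = self.services[name].handler(payload)
            name = self.edges.get(name)
        return payload

# Usage: embedding -> llm, mirroring a two-stage mega-service.
embed = MicroService("embedding", lambda x: x + " [embedded]")
llm = MicroService("llm", lambda x: x + " [generated]")
orc = Orchestrator()
orc.add(embed).add(llm)
orc.flow_to(embed, llm)
result = orc.schedule(embed, "query")
```

If the real class follows this shape, instantiating it alone would only build the graph; something still has to call the schedule/run step with a request, which is the part the examples leave unexplained.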
Other Notes
TGI's documentation suggests it is only for Xeon and Gaudi, and reviewing the code doesn't suggest it can run on consumer-grade Intel hardware or GPUs.
Ollama is lacking documentation. I thought maybe I should not use TGI locally for teaching, but when I read about the LLM comp, it suggests you have to use vLLM or TGI.
Further investigation suggests that these three model servers all follow the OpenAI API schema, so they will likely be interchangeable.
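The interchangeability claim can be sketched as follows: the same OpenAI-schema request body should work against any of the three servers, with only the base URL changing. The ports below are common defaults (not verified here), and the function is illustrative rather than part of any of these projects.

```python
import json

# Illustrative default endpoints -- ports are assumptions, not verified.
BACKENDS = {
    "tgi": "http://localhost:8080/v1/chat/completions",
    "vllm": "http://localhost:8000/v1/chat/completions",
    "ollama": "http://localhost:11434/v1/chat/completions",
}

def build_request(backend, model, prompt):
    """Build an identical OpenAI-schema body; only the URL differs."""
    body = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return BACKENDS[backend], json.dumps(body)
```

If the schema claim holds, swapping `backend` here is the only change needed to move a client between TGI, vLLM, and Ollama.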
https://opea-project.github.io/latest/GenAIComps/README.html
pip install opea-comps