TXTSearchTool with ollama #89
@fre391, nice solution, though I am not yet getting it to work. I am trying to apply the text tool to a local file like so:

```python
tool_txt_search = TXTSearchTool(
    txt=path/'nonsense.txt',
    config=config,
    verbose=True
)
```

My config:

```python
config = dict(
    llm=dict(
        provider="ollama",  # Change this to your LLM provider
        config=dict(
            model="llama3.1",  # Specify the model you want to use
        ),
    ),
    embedder=dict(
        provider="ollama",  # Change this to your embedding provider
        config=dict(
            model="mxbai-embed-large",  # Specify the embedding model
        ),
    ),
)
```

I am running this from Jupyter (on a server via SSH). Unfortunately, the instantiation never seems to finish; I am not seeing any text output and nothing in
How long does it take you to instantiate the tool? Any idea why it might get stuck?
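One common reason for the tool hanging at instantiation is that it is silently waiting on an Ollama server that is not reachable from the machine running the notebook. A quick sanity check before creating the tool can rule this out. This sketch assumes Ollama's default port (11434) and uses its `/api/tags` endpoint, which lists locally available models; adjust the URL if your server runs elsewhere:

```python
import urllib.request
import urllib.error

def ollama_is_up(base_url: str = "http://localhost:11434", timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers on base_url.

    /api/tags lists the locally pulled models; any HTTP response
    (even an error status) means the server itself is reachable.
    """
    try:
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=timeout):
            return True
    except urllib.error.HTTPError:
        return True   # server responded, just with an error status
    except (urllib.error.URLError, OSError):
        return False  # connection refused / timed out / no such host

# Run this before instantiating TXTSearchTool, so a hang points elsewhere:
# if not ollama_is_up():
#     raise RuntimeError("Ollama is not reachable on localhost:11434")
```

If the check fails, start Ollama (or open an SSH tunnel to it) before retrying; if it passes, the hang is likely in the tool's indexing step instead.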
+1. Also facing the same issue.
I got everything working with this:
Note: using this will make it your default LLM, so there is no need to configure everything separately.
Sorry, not working.
It is very clear that the Ollama provider doesn't have a key called
OK, I inspected the code and found that I had to upgrade crewai to the latest version. Now it is working.
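For anyone hitting the same version mismatch, the upgrade itself is a single pip command. The package names below (`crewai` and `crewai-tools`) are the names the project publishes on PyPI; verify them against your environment:

```shell
# Upgrade crewai and its tools package to the latest released versions
pip install --upgrade crewai crewai-tools

# Confirm which version is now installed
pip show crewai
```

Restart the Jupyter kernel afterwards so the upgraded packages are actually imported.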
The search is NOT limited to the given txt file.
```python
from crewai_tools import TXTSearchTool

txt_search_tool = TXTSearchTool(
    txt="kunst.txt",
    config=dict(
        llm=dict(
            provider="ollama",
            config=dict(
                model="llama3.1",
            ),
        ),
        embedder=dict(
            provider="ollama",
            config=dict(
                model="mxbai-embed-large",
            ),
        ),
    )
)
```
There is no error message, and at the end the best result is shown, but in between (in the verbose output) it also shows snippets from other sources, e.g. from a PDF that was searched by XMLSearchTool earlier in a separate script...
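A likely explanation is that the search tools persist their embeddings in a shared local vector store on disk, so documents indexed by one tool in an earlier run can surface in another tool's results. Under that assumption, giving each tool its own store should isolate the searches. The `vectordb` block below follows embedchain's configuration schema (`provider`/`collection_name`/`dir`); whether your installed crewai_tools version forwards it is something to verify, and the names used are hypothetical:

```python
# Sketch: give each tool its own vector store so searches don't mix.
# Assumption: crewai_tools passes this `vectordb` block through to its
# underlying RAG backend (embedchain's schema) -- verify for your version.
config = dict(
    llm=dict(
        provider="ollama",
        config=dict(model="llama3.1"),
    ),
    embedder=dict(
        provider="ollama",
        config=dict(model="mxbai-embed-large"),
    ),
    vectordb=dict(
        provider="chroma",
        config=dict(
            collection_name="kunst_txt",  # hypothetical: one collection per tool
            dir="db_kunst_txt",           # hypothetical: separate on-disk store
        ),
    ),
)

# Then pass it in as before:
# txt_search_tool = TXTSearchTool(txt="kunst.txt", config=config)
```

Alternatively, deleting the default vector-store directory between runs clears any leftover embeddings from earlier scripts.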