
Run on CPU? #29

Open
dekid69 opened this issue Jan 9, 2025 · 1 comment
Labels: enhancement (New feature or request), question (Further information is requested)

Comments

dekid69 commented Jan 9, 2025

I was wondering if there's a way to run Ollama with the plugin on the CPU instead of the GPU.

Mainly, I don't want to use my GPU all the time, and the models I'm currently running are very lightweight.

dekid69 added the enhancement label Jan 9, 2025
238SAMIxD (Owner) commented
Hey! Thank you for the suggestion, but unfortunately it is not related to the bot itself. Try reaching out on the Ollama GitHub repo.

238SAMIxD added the question label Jan 9, 2025
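
For readers with the same question: this is controlled on the Ollama side, not in the bot. A minimal sketch of what that could look like, assuming a local Ollama server on the default port and using Ollama's documented `num_gpu` option to keep all layers off the GPU (the model name `llama3.2` is just an example; check the Ollama docs for how your version handles `num_gpu`):

```ts
// Sketch: ask a local Ollama server to generate a reply without GPU offload.
// Requires Node 18+ (built-in fetch) and a model that is already pulled.
async function generateOnCpu(prompt: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2",       // example model name; use whichever lightweight model you run
      prompt,
      stream: false,            // return one JSON object instead of a stream
      options: { num_gpu: 0 },  // 0 GPU layers -> CPU-only inference (per Ollama's option docs)
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.response;
}

generateOnCpu("Say hello in five words.").then(console.log).catch(console.error);
```

If you'd rather not change the request at all, the same effect can usually be had by keeping GPU support out of the Ollama install or configuration itself, which is exactly what the Ollama repo's issues and docs cover.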