Which version of the llama_cpp Python module supports which format?
GGML was deprecated on August 21st, and GGUF support is not compatible with GGML, so we need to be cognizant of which version of llama.cpp we use on the backend.
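For anyone running into this, here is a rough way to check which format a model file is in before deciding which llama-cpp-python build to load it with. This is only a sketch: the legacy GGML magic values and the version guidance in the comments are my assumptions based on older llama.cpp code, not something documented by this project, so verify against the release you actually run.

```python
# Sketch: distinguish GGUF from legacy GGML-family model files by their magic bytes.
import struct
import sys

# Legacy GGML-era magics (assumed from old llama.cpp sources), stored as
# little-endian uint32 at the start of the file: 'ggml', 'ggmf', 'ggjt'.
GGML_MAGICS = {0x67676D6C, 0x67676D66, 0x67676A74}

def detect_model_format(path: str) -> str:
    with open(path, "rb") as f:
        head = f.read(4)
    if head == b"GGUF":  # GGUF files begin with the ASCII bytes "GGUF"
        return "gguf"
    if len(head) == 4 and struct.unpack("<I", head)[0] in GGML_MAGICS:
        return "ggml"
    return "unknown"

if __name__ == "__main__":
    path = sys.argv[1]
    print(f"{path}: {detect_model_format(path)}")
    # Rough guidance (assumption, check the release notes): a GGUF model needs a
    # llama-cpp-python release built against post-GGUF llama.cpp, while an older
    # GGML model only loads with a pre-GGUF release.
```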
Edit: mentioned under issues:
#628
GGUF is not supported yet. Someone impatient will probably throw in a pull request this weekend to handle it.