Model conversion Colab issue #5811
Comments
Hi @burntail, thank you for reporting this issue. We are able to reproduce it; you can view the details in this Gist. We have also verified that other models fail in the same way on CPU, while GPU conversion works as expected. We have flagged the issue, and our team is already working on a fix. We will provide updates as soon as we have more information.
I am experiencing the same issue with CPU conversion of the Falcon 1B, StableLM 3B, and Phi 2 models. GPU conversion of these models works as expected.
We have reproduced the issue in the latest version (0.10.21). For your reference, we are attaching the Colab gist. Could you please look into the issue again? Thank you!
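For anyone cross-checking their own environment against the report above, the installed package version can be confirmed with a trivial snippet (nothing here is specific to this bug):

```python
import mediapipe as mp

# The comment above reproduces the CPU conversion failure on 0.10.21.
print(mp.__version__)
```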
https://colab.research.google.com/github/googlesamples/mediapipe/blob/main/examples/llm_inference/conversion/llm_conversion.ipynb?hl=ko#scrollTo=LSSxrLyQPofw
I tried to obtain the converted models (e.g., Falcon_cpu), but I ran into the issue shown in the attached screenshot.
Could you explain what causes this issue and how to resolve it? I did not change any other code in the Colab.
Thank you.
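For context, the conversion step in that notebook boils down to a single call into the MediaPipe GenAI converter. Below is a minimal sketch of that step; the paths are illustrative placeholders (not the exact values from the Colab), and `FALCON_RW_1B` is used as the example model type:

```python
from mediapipe.tasks.python.genai import converter

# All paths below are assumed placeholders; in the Colab the checkpoint
# is first downloaded from Hugging Face into the Colab filesystem.
config = converter.ConversionConfig(
    input_ckpt="/content/falcon-rw-1b/",           # downloaded checkpoint
    ckpt_format="safetensors",                     # checkpoint file format
    model_type="FALCON_RW_1B",                     # supported model type
    backend="cpu",                                 # "cpu" is the failing path; "gpu" works
    output_dir="/content/intermediate/falcon/",    # scratch dir for conversion
    combine_file_only=False,
    vocab_model_file="/content/falcon-rw-1b/",     # tokenizer files location
    output_tflite_file="/content/falcon_cpu.bin",  # bundle consumed by the LLM Inference API
)
converter.convert_checkpoint(config)
```

Per the comments above, the failure appears only with `backend="cpu"`; the same configuration with `backend="gpu"` completes as expected.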