
model conversion colab issue #5811

Open
burntail opened this issue Jan 9, 2025 · 4 comments
Labels: platform:python (MediaPipe Python issues), stat:awaiting response (Waiting for user response), task:LLM inference (MediaPipe LLM Inference / Gen AI setup), type:bug (Bug in the Source Code of MediaPipe Solution)

Comments

@burntail

burntail commented Jan 9, 2025

https://colab.research.google.com/github/googlesamples/mediapipe/blob/main/examples/llm_inference/conversion/llm_conversion.ipynb?hl=ko#scrollTo=LSSxrLyQPofw

I tried to download the converted models (e.g. falcon_cpu), but I ran into the error shown in the attached screenshot. Could you explain what causes this issue and how to resolve it? I didn't change any of the code in the Colab.
Thank you.

[Screenshot attached: 2025-01-09, 3:22 PM]
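For context, the conversion step in the linked notebook boils down to building a checkpoint-conversion config and selecting a backend. The sketch below is based on the MediaPipe LLM conversion guide, not on this issue's exact notebook; the paths are placeholders and parameter names may differ between MediaPipe versions:

```python
# Sketch of the MediaPipe checkpoint-conversion step (assumptions: placeholder
# paths, Falcon-RW-1B as the model; requires `pip install mediapipe` to run).

def make_conversion_params(backend: str) -> dict:
    """Build the parameter set for MediaPipe's checkpoint converter.

    `backend` selects the target: "cpu" is the path that fails in this
    issue, while "gpu" reportedly converts as expected.
    """
    return {
        "input_ckpt": "/content/falcon-rw-1b/",       # placeholder path
        "ckpt_format": "safetensors",
        "model_type": "FALCON_RW_1B",
        "backend": backend,                            # "cpu" or "gpu"
        "output_dir": "/content/intermediate/",        # placeholder path
        "combine_file_only": False,
        "vocab_model_file": "/content/falcon-rw-1b/",  # placeholder path
        "output_tflite_file": f"/content/falcon_{backend}.bin",
    }


def convert(backend: str) -> None:
    # Imported lazily so the sketch can be read without mediapipe installed.
    from mediapipe.tasks.python.genai import converter

    config = converter.ConversionConfig(**make_conversion_params(backend))
    converter.convert_checkpoint(config)


if __name__ == "__main__":
    print(make_conversion_params("cpu")["output_tflite_file"])
```

Since only the `backend` value differs between the failing (CPU) and working (GPU) runs, the bug is isolated to the CPU conversion path rather than to the checkpoint or vocabulary inputs.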
@burntail burntail added the type:others (issues not falling in bug, performance, support, build and install, or feature) label Jan 9, 2025
@kuaashish kuaashish assigned kuaashish and unassigned kalyan2789g Jan 9, 2025
@kuaashish kuaashish added type:support (General questions), task:LLM inference (MediaPipe LLM Inference / Gen AI setup), and platform:python (MediaPipe Python issues) and removed type:others (issues not falling in bug, performance, support, build and install, or feature) labels Jan 9, 2025
@kuaashish
Collaborator

Hi @burntail,

Thank you for reporting this issue. We are able to reproduce it; you can view the details here: Gist. Additionally, we have verified that other models on CPU are experiencing the same issue, while GPU conversion is working as expected. We have highlighted the issue, and our team is already working on a fix. We will provide updates as soon as we have more information.

@kuaashish kuaashish added type:bug Bug in the Source Code of MediaPipe Solution stat:awaiting googler Waiting for Google Engineer's Response and removed type:support General questions labels Jan 10, 2025
@JakobPogacnikSouvent

I am experiencing the same issue for CPU conversion of Falcon 1B, StableLM 3B and Phi 2 models. The GPU conversion of said models works as expected.

@kuaashish
Collaborator

Hi @schmidt-sebastian,

We have reproduced the issue in the latest version (0.10.21). For your reference, we are attaching the Colab gist. Could you please review the issue again?

Thank you!!

@kuaashish kuaashish removed the stat:awaiting googler Waiting for Google Engineer's Response label Feb 19, 2025
@kuaashish kuaashish assigned kuaashish and unassigned burntail Feb 19, 2025
@kuaashish
Collaborator

Hi @burntail,

This issue is now resolved. We have attached Gists for Phi2 and Falcon 1B for your reference. Could you please verify and confirm if we can close the issue?

Thank you!!

@kuaashish kuaashish added the stat:awaiting response Waiting for user response label Feb 19, 2025

5 participants