
use_flash_attention #12

Open
fourier20laplace opened this issue Jan 9, 2025 · 1 comment
@fourier20laplace

[image attachment]
Thanks for your work!
I wonder if I can load ESM-2 without flash attention. When I pass use_flash_attention=False to FAEsmForMaskedLM, it gives an error.

@pengzhangzhi (Owner)

You don't have to pass the parameter. Just run the code as-is, and flash attention will be disabled by default.
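A common way libraries implement this kind of default is an import-time availability check: flash attention is used only if the flash_attn package is actually installed, so the caller never needs to pass a flag. The sketch below is a generic, hypothetical illustration of that pattern (the names USE_FLASH_ATTENTION and attention_backend are made up here, not FAEsm's actual internals):

```python
# Hypothetical sketch of the fallback pattern the maintainer describes:
# flash attention is enabled only when the flash_attn package imports
# successfully, so no extra constructor parameter is required.
try:
    import flash_attn  # noqa: F401
    USE_FLASH_ATTENTION = True
except ImportError:
    # flash_attn not installed: silently fall back to a standard backend.
    USE_FLASH_ATTENTION = False


def attention_backend() -> str:
    """Report which attention implementation would be selected."""
    return "flash" if USE_FLASH_ATTENTION else "standard"


print(attention_backend())
```

On a machine without flash_attn installed, the import fails, the flag is False, and the model transparently uses the standard attention path, which matches the maintainer's advice to run the code unmodified.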
