
Commit 58b0d4b
update flash attention for gemma support: (axolotl-ai-cloud#1368)
winglian authored Mar 6, 2024
1 parent ed70a08 commit 58b0d4b
Showing 2 changed files with 2 additions and 2 deletions.
2 changes: 1 addition & 1 deletion requirements.txt
@@ -12,7 +12,7 @@ fire
 PyYAML>=6.0
 requests
 datasets>=2.15.0
-flash-attn==2.3.3
+flash-attn==2.5.5
 sentencepiece
 wandb
 einops
2 changes: 1 addition & 1 deletion setup.py
@@ -68,7 +68,7 @@ def parse_requirements():
     dependency_links=dependency_links,
     extras_require={
         "flash-attn": [
-            "flash-attn==2.5.0",
+            "flash-attn==2.5.5",
         ],
         "fused-dense-lib": [
             "fused-dense-lib @ git+https://github.com/Dao-AILab/[email protected]#subdirectory=csrc/fused_dense_lib",
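With this commit, both the base requirements and the optional flash-attn extra pin the same release. The following is a minimal sanity check, a sketch that is not part of the axolotl codebase; it assumes the installed distribution can be looked up under either the "flash-attn" or "flash_attn" name.

# Minimal sanity check (not part of axolotl): verify that the installed
# flash-attn distribution matches the 2.5.5 pin introduced by this commit.
from importlib.metadata import PackageNotFoundError, version


def installed_flash_attn_version() -> str:
    # The wheel may register its name with either a dash or an underscore.
    for name in ("flash-attn", "flash_attn"):
        try:
            return version(name)
        except PackageNotFoundError:
            continue
    raise SystemExit("flash-attn is not installed; install it via the project's flash-attn extra")


found = installed_flash_attn_version()
if found != "2.5.5":
    raise SystemExit(f"flash-attn {found} is installed, but this commit pins 2.5.5")
print(f"flash-attn {found} matches the pinned version")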
