This repository has been archived by the owner on Dec 20, 2024. It is now read-only.

Feature/44 make flash attention configurable #467

Annotations

1 warning

checks (3.10) / Run pytest with Python 3.10 on ubuntu-latest: succeeded Dec 19, 2024 in 1m 52s
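
The PR itself is not shown here, only its CI result. As a rough illustration of what "make flash attention configurable" could look like, below is a minimal Python sketch assuming a boolean config flag that switches between PyTorch's fused `scaled_dot_product_attention` kernel and a plain reference implementation. The names `AttentionConfig`, `use_flash_attention`, and `scaled_attention` are hypothetical and not taken from the PR.

```python
# Hypothetical sketch, not the PR's actual code: toggle flash attention
# through a config object rather than hard-coding the kernel choice.
from dataclasses import dataclass

import torch
import torch.nn.functional as F


@dataclass
class AttentionConfig:
    # Assumed flag name; the real option introduced by the PR may differ.
    use_flash_attention: bool = False


def scaled_attention(q: torch.Tensor, k: torch.Tensor, v: torch.Tensor,
                     cfg: AttentionConfig) -> torch.Tensor:
    """Dispatch between a fused attention kernel and a reference path."""
    if cfg.use_flash_attention:
        # PyTorch >= 2.0 routes this call to a fused flash/efficient
        # kernel when the inputs, dtype, and hardware support it.
        return F.scaled_dot_product_attention(q, k, v)
    # Reference path: explicit softmax(QK^T / sqrt(d)) V.
    d = q.size(-1)
    scores = q @ k.transpose(-2, -1) / d ** 0.5
    return scores.softmax(dim=-1) @ v


if __name__ == "__main__":
    q = k = v = torch.randn(1, 4, 8, 16)
    out = scaled_attention(q, k, v, AttentionConfig(use_flash_attention=False))
    print(out.shape)  # torch.Size([1, 4, 8, 16])
```

Keeping the switch in a config object lets the CI job above exercise both code paths with the same test suite, which is one plausible reason the change ships alongside a pytest run.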