Optimize TPU Flash Attention (20x XLA compilation speed-up on 32k long context) #184

Annotations

1 warning

pre-commit: succeeded Jan 7, 2025 in 31m 2s