
Feature/44 make flash attention configurable #370

Triggered via pull request on December 10, 2024, 10:05
Status: Cancelled
Total duration: 2m 53s

python-pull-request.yml

on: pull_request
quality / pre-commit-run (25s)
Matrix: checks

Annotations

8 errors and 1 warning
checks (3.10) / Run pytest with Python 3.10 on macos-latest: Process completed with exit code 1.
checks (3.11) / Run pytest with Python 3.11 on macos-latest: Process completed with exit code 1.
checks (3.10) / Run pytest with Python 3.10 on macos-latest: Process completed with exit code 1.
checks (3.9) / Run pytest with Python 3.9 on macos-latest: Process completed with exit code 1.
checks (3.11) / Run pytest with Python 3.11 on ubuntu-latest: Process completed with exit code 1.
checks (3.10) / Run pytest with Python 3.10 on ubuntu-latest: FailFast: cancelling since parallel instance has failed
checks (3.9) / Run pytest with Python 3.9 on ubuntu-latest: FailFast: cancelling since parallel instance has failed
checks (3.10) / Run pytest with Python 3.10 on ubuntu-latest: FailFast: cancelling since parallel instance has failed
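
The FailFast cancellations above are the normal behavior of a matrix strategy with fail-fast enabled (the default): once one matrix job fails, the remaining parallel jobs are cancelled rather than run to completion. A minimal sketch of what the checks matrix in python-pull-request.yml could look like; the install and test steps, action versions, and job layout below are assumptions, not taken from this run:

```yaml
# Hypothetical sketch of the "checks" matrix; the real python-pull-request.yml may differ.
name: python-pull-request

on: pull_request

jobs:
  checks:
    strategy:
      fail-fast: true              # default: one failure cancels the sibling matrix jobs
      matrix:
        python-version: ["3.9", "3.10", "3.11"]
        os: [ubuntu-latest, macos-latest]
    runs-on: ${{ matrix.os }}
    name: Run pytest with Python ${{ matrix.python-version }} on ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: ${{ matrix.python-version }}
      - name: Run pytest
        run: |
          pip install -e ".[dev]"  # assumed dev install; the project's actual extras may differ
          pytest
```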
quality / pre-commit-run: ubuntu-latest pipelines will use ubuntu-24.04 soon. For more details, see https://github.com/actions/runner-images/issues/10636
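
The single warning is GitHub's notice that the ubuntu-latest label will start resolving to ubuntu-24.04. If the workflow should keep running on the current image, the runner can be pinned to an explicit version instead of the rolling label; a small sketch under that assumption:

```yaml
# Hypothetical: pin the runner image so the job is unaffected by the ubuntu-latest rollover.
jobs:
  quality:
    runs-on: ubuntu-22.04   # explicit image instead of the moving ubuntu-latest label
```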