[FA] Add tl.assume to flash_attention.py #551

Triggered via pull request on February 21, 2025 at 15:44
Status: Success
Total duration: 1h 30m 0s
Artifacts: (none listed)
Jobs:
- Check-File-Changes (16s)
- pre-commit (code formatting) (1m 30s)
- Matrix: Integration-Tests-AMD

Annotations

1 warning
pre-commit (code formatting): Cache not found for keys: setup-python-Linux-x64-24.04-Ubuntu-python-3.12.9-pip-1ed350ddc94376925cc8071a212e8b28c56c57750ee8bc3df0bfb1c839387980, setup-python-Linux-x64-24.04-Ubuntu-python-3.12.9-pip
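
For context on the change this run validates: tl.assume is a Triton hint that tells the compiler it may assume a condition holds, which can let it simplify index and pointer arithmetic in a kernel. The sketch below is illustrative only, not the actual diff from flash_attention.py; the kernel name (load_q_block), its parameters (stride_qm, stride_qk, BLOCK_M, BLOCK_K), and the specific assumed conditions are placeholders chosen for the example.

```python
# Illustrative sketch (not the PR's actual code): how tl.assume hints are
# typically placed in a Triton flash-attention-style kernel.
import triton
import triton.language as tl


@triton.jit
def load_q_block(Q, Out, stride_qm, stride_qk,
                 BLOCK_M: tl.constexpr, BLOCK_K: tl.constexpr):
    # Hypothetical hints in the spirit of the PR title: tell the compiler
    # that the strides and the program id are non-negative so it can
    # simplify the address arithmetic below.
    tl.assume(stride_qm >= 0)
    tl.assume(stride_qk >= 0)
    pid = tl.program_id(0)
    tl.assume(pid >= 0)

    # Load one BLOCK_M x BLOCK_K tile of Q and write it back out unchanged.
    offs_m = pid * BLOCK_M + tl.arange(0, BLOCK_M)
    offs_k = tl.arange(0, BLOCK_K)
    ptrs = Q + offs_m[:, None] * stride_qm + offs_k[None, :] * stride_qk
    q = tl.load(ptrs)
    tl.store(Out + offs_m[:, None] * stride_qm + offs_k[None, :] * stride_qk, q)
```

The run's Integration-Tests-AMD matrix is presumably where such hints are exercised, since this kind of compiler information is aimed at trimming redundant bounds and sign handling from hot kernel loops.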