
[FA] Add tl.assume to flash_attention.py #553

Triggered via pull request on February 21, 2025 at 17:26
Status: Success
Total duration: 1h 25m 55s
Artifacts
Jobs:
Check-File-Changes (15s)
pre-commit (code formatting) (54s)
Matrix: Integration-Tests-AMD
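
The change being tested here adds tl.assume calls to the Triton flash-attention kernel. tl.assume passes a compiler-visible assumption (for example, that a stride passed from the host is non-negative), which can let the Triton compiler simplify index arithmetic it could not otherwise prove safe. The sketch below illustrates that pattern on a trivial kernel; it is not taken from flash_attention.py, the kernel and parameter names are hypothetical, and it assumes a Triton release recent enough to provide tl.assume.

```python
# Minimal sketch of the tl.assume pattern (hypothetical kernel, not the PR's code).
import torch
import triton
import triton.language as tl


@triton.jit
def scale_rows_kernel(x_ptr, out_ptr, stride_row, n_cols, scale,
                      BLOCK: tl.constexpr):
    row = tl.program_id(0)
    # Compiler hints: values coming from the host are non-negative.
    # Without these, the compiler must allow for negative strides/sizes.
    tl.assume(stride_row >= 0)
    tl.assume(n_cols >= 0)

    offs = tl.arange(0, BLOCK)
    mask = offs < n_cols
    ptrs = x_ptr + row * stride_row + offs
    x = tl.load(ptrs, mask=mask, other=0.0)
    tl.store(out_ptr + row * stride_row + offs, x * scale, mask=mask)


if __name__ == "__main__":
    x = torch.randn(4, 128, device="cuda")
    out = torch.empty_like(x)
    scale_rows_kernel[(x.shape[0],)](x, out, x.stride(0), x.shape[1], 2.0, BLOCK=128)
    torch.testing.assert_close(out, x * 2.0)
```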