[FA] Add tl.assume to flash_attention.py #551
Triggered via pull request, February 21, 2025 15:44
Status: Success
Total duration: 1h 30m 0s
Artifacts: –
Workflow: amd_perf_kernel_Integration_tests.yml (on: pull_request)

Jobs:
Check-File-Changes: 16s
pre-commit (code formatting): 1m 30s
Matrix: Integration-Tests-AMD
Annotations: 1 warning
pre-commit (code formatting): Cache not found for keys: setup-python-Linux-x64-24.04-Ubuntu-python-3.12.9-pip-1ed350ddc94376925cc8071a212e8b28c56c57750ee8bc3df0bfb1c839387980, setup-python-Linux-x64-24.04-Ubuntu-python-3.12.9-pip
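For context on the change this run validates: `tl.assume` is a Triton compiler hint that declares a condition the compiler may treat as always true, which can let it drop redundant sign and overflow handling in index arithmetic. The sketch below is illustrative only, not the PR's actual diff to flash_attention.py; the kernel, names, and launch parameters are hypothetical, and it assumes a Triton release that ships `tl.assume`.

```python
# Minimal sketch of the tl.assume pattern (hypothetical kernel, not the PR's code).
import torch
import triton
import triton.language as tl


@triton.jit
def scale_rows_kernel(x_ptr, out_ptr, n_cols, stride_row, BLOCK: tl.constexpr):
    pid = tl.program_id(0)
    # Hints for the optimizer: these hold by construction at launch time,
    # so the compiler can treat the pointer arithmetic below as non-negative.
    tl.assume(pid >= 0)
    tl.assume(stride_row > 0)
    offs = tl.arange(0, BLOCK)
    mask = offs < n_cols
    row = tl.load(x_ptr + pid * stride_row + offs, mask=mask)
    tl.store(out_ptr + pid * stride_row + offs, row * 2.0, mask=mask)


# Usage: one program per row.
x = torch.randn(4, 128, device="cuda")
out = torch.empty_like(x)
scale_rows_kernel[(x.shape[0],)](x, out, x.shape[1], x.stride(0), BLOCK=128)
```

The hints carry no runtime cost; they only narrow what the compiler must prove about the index expressions, which is why this kind of change is evaluated with performance integration tests like the AMD matrix job above.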