
[Feature] is flex attention compatible? #3479

Open
johnnynunez opened this issue Feb 11, 2025 · 1 comment
Assignees: zhaochenyang20
Labels: amd, help wanted

Comments

@johnnynunez

Checklist

Motivation

Context, quoted from the PyTorch FlexAttention blog post:
"In the future, we plan on extending this support to allow for quantized versions of attention or things like RadixAttention as well."

https://pytorch.org/blog/flexattention/
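
For reference, a minimal sketch of the FlexAttention API described in the linked post (exposed since PyTorch 2.5 as torch.nn.attention.flex_attention). The causal score_mod mirrors the blog's example; the tensor shapes, dtype, and device are illustrative assumptions, and whether SGLang's attention backends can consume this path is exactly what this issue asks.

```python
import torch
from torch.nn.attention.flex_attention import flex_attention

# score_mod callback (as in the blog post): mask out future positions for causal attention.
def causal(score, b, h, q_idx, kv_idx):
    return torch.where(q_idx >= kv_idx, score, float("-inf"))

# Illustrative shapes only: (batch, heads, seq_len, head_dim).
q = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)
k = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)
v = torch.randn(2, 8, 128, 64, device="cuda", dtype=torch.float16)

# Compiling fuses the score_mod into the attention kernel; eager execution also works but is slower.
flex_attention_compiled = torch.compile(flex_attention)
out = flex_attention_compiled(q, k, v, score_mod=causal)
```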

Related resources

No response

@zhaochenyang20
Collaborator

Hmmm, I personally do not know this. Maybe cc @yzh119.

zhaochenyang20 self-assigned this on Feb 11, 2025
zhaochenyang20 added the help wanted and amd labels on Feb 11, 2025