
Add support for sm_120 to NV_TARGET #3493

Closed
bernhardmgruber opened this issue Jan 22, 2025 · 0 comments · Fixed by #3550
bernhardmgruber (Contributor) commented Jan 22, 2025
10.0 Blackwell B100/B200
10.1 Blackwell Thor
10.1a Blackwell DIGITS
12.0 Blackwell RTX 50 series
??

I have an RTX 5090.
(Two screenshots attached: IMG_2517, IMG_2518)

More references:
pytorch/pytorch#145270 (which I added to PyTorch)
Dao-AILab/flash-attention#1436

Originally posted by @johnnynunez in #3166 (comment)
