Pull requests: fla-org/flash-linear-attention

[Attn] Fix cache update of SWA
#183 opened Feb 12, 2025 by Pan-Yuqi

[Misc.] Fix tests on non-CUDA devices
#176 opened Feb 8, 2025 by uniartisan

[RWKV] Update RWKV6 support
#175 opened Feb 8, 2025 by uniartisan

[Misc.] Add activations for non-CUDA backends
#174 opened Feb 8, 2025 by uniartisan

[RWKV] Add more kernels for RWKV7
#166 opened Feb 5, 2025 by uniartisan

[Misc.] Add more backends support
#163 opened Feb 5, 2025 by uniartisan