Reference: https://github.com/rishikksh20/rectified-linear-attention
Note that this is not the official repository either.
This repo contains a TensorFlow 2 Keras implementation of Sparse Attention with Linear Units. The attention module can be used like an ordinary TensorFlow 2 Keras layer. You may need to install the `einops` library first:

```bash
pip install einops
```
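
For illustration, below is a minimal, self-contained sketch of the rectified-linear-attention idea as a TensorFlow 2 Keras layer: attention scores pass through a ReLU instead of a softmax, and the output is rescaled with an RMS-style normalization. The class name `ReLASketch`, its constructor arguments, and the simplified normalization are assumptions for illustration only and do not reflect the exact layer exposed by this repo; see the source files for the real API.

```python
# Minimal sketch of rectified linear attention in TensorFlow 2 / Keras.
# Assumptions: class name, argument names, and the simplified RMS-style
# normalization (no learnable gain or gating) are illustrative only.
import tensorflow as tf
from einops import rearrange


class ReLASketch(tf.keras.layers.Layer):
    def __init__(self, dim, heads=8, **kwargs):
        super().__init__(**kwargs)
        self.heads = heads
        self.scale = (dim // heads) ** -0.5
        self.to_qkv = tf.keras.layers.Dense(dim * 3, use_bias=False)
        self.to_out = tf.keras.layers.Dense(dim)

    def call(self, x):
        # x: (batch, seq_len, dim)
        q, k, v = tf.split(self.to_qkv(x), 3, axis=-1)
        q, k, v = (rearrange(t, "b n (h d) -> b h n d", h=self.heads)
                   for t in (q, k, v))

        # ReLU-activated attention scores instead of softmax (sparse by design).
        scores = tf.nn.relu(tf.einsum("bhid,bhjd->bhij", q, k) * self.scale)
        out = tf.einsum("bhij,bhjd->bhid", scores, v)

        # RMS-style rescaling to keep the un-normalized ReLU scores stable.
        out = out * tf.math.rsqrt(
            tf.reduce_mean(tf.square(out), axis=-1, keepdims=True) + 1e-8)

        out = rearrange(out, "b h n d -> b n (h d)")
        return self.to_out(out)


# Example: apply the layer to a random batch of token embeddings.
x = tf.random.normal((2, 16, 64))      # (batch, seq_len, dim)
layer = ReLASketch(dim=64, heads=8)
print(layer(x).shape)                  # (2, 16, 64)
```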
```bibtex
@misc{zhang2021sparse,
  title={Sparse Attention with Linear Units},
  author={Biao Zhang and Ivan Titov and Rico Sennrich},
  year={2021},
  eprint={2104.07012},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```