TF2 Keras Rectified Linear Attention

Reference: https://github.com/rishikksh20/rectified-linear-attention

This is not the official repository either.

This repo contains a TensorFlow 2 Keras implementation of Sparse Attention with Linear Units (Rectified Linear Attention, ReLA).

You may need to install the "einops" library first. The layer can then be used like any ordinary TensorFlow 2 Keras layer.

Library:

pip install einops
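The core idea of ReLA, as described in the cited paper, is to replace the softmax in scaled dot-product attention with a ReLU, producing sparse, un-normalized attention weights, and to stabilize the output with RMS-style normalization. The repo's actual layer uses TensorFlow 2 and einops; the following is only a minimal NumPy sketch of that mechanism (function names and the gain-free normalization are simplifications, not the repo's API):

```python
import numpy as np

def rms_norm(x, eps=1e-8):
    # RMS normalization over the last axis; ReLA applies a (gated) RMSNorm
    # to stabilize the un-normalized ReLU attention output. This sketch
    # omits the learnable gain/gate for brevity.
    return x / np.sqrt(np.mean(x ** 2, axis=-1, keepdims=True) + eps)

def rela_attention(q, k, v):
    """Rectified Linear Attention: softmax is replaced by ReLU, so the
    attention weights are sparse (contain exact zeros) and un-normalized."""
    d = q.shape[-1]
    scores = q @ k.swapaxes(-1, -2) / np.sqrt(d)   # (..., L_q, L_k)
    weights = np.maximum(scores, 0.0)              # ReLU instead of softmax
    return rms_norm(weights @ v), weights

rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((2, 5, 8)) for _ in range(3))
out, w = rela_attention(q, k, v)
print(out.shape)   # (2, 5, 8) -- same shape as v
```

Unlike softmax attention, the rows of `w` need not sum to one, which is why the normalization on the output is needed.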

Citation:

@misc{zhang2021sparse,
      title={Sparse Attention with Linear Units}, 
      author={Biao Zhang and Ivan Titov and Rico Sennrich},
      year={2021},
      eprint={2104.07012},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
