Implementation of attention block in Manet #717
QueLastimaSenor
started this conversation in General
Hi! I've got a question about the implementation of the PAB in Manet (https://arxiv.org/pdf/2009.02130.pdf — I suppose it corresponds to the kernel attention in the article).


I can't find any resemblance between the PAB and the kernel attention described in the paper.
It seems like you've written just a plain attention mechanism. Am I wrong? To make the question concrete, see the sketch below of what I mean by "plain attention".
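This is a minimal sketch of my own (the class name `PositionAttention` and the `reduction` parameter are mine, not the repo's): standard dot-product self-attention over spatial positions, i.e. what the paper's PAB figure appears to describe.

```python
# My own sketch of position-wise self-attention, NOT the repo's actual code.
import torch
import torch.nn as nn

class PositionAttention(nn.Module):
    def __init__(self, in_channels: int, reduction: int = 8):
        super().__init__()
        # 1x1 convs project the feature map into query/key/value spaces
        self.query = nn.Conv2d(in_channels, in_channels // reduction, kernel_size=1)
        self.key = nn.Conv2d(in_channels, in_channels // reduction, kernel_size=1)
        self.value = nn.Conv2d(in_channels, in_channels, kernel_size=1)
        self.gamma = nn.Parameter(torch.zeros(1))  # learnable residual weight

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        n = h * w
        q = self.query(x).view(b, -1, n).permute(0, 2, 1)  # (B, N, C')
        k = self.key(x).view(b, -1, n)                     # (B, C', N)
        attn = torch.softmax(torch.bmm(q, k), dim=-1)      # (B, N, N) attention over positions
        v = self.value(x).view(b, c, n)                    # (B, C, N)
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        return self.gamma * out + x                        # residual connection
```

If the repo's PAB reduces to something like this, I don't see where the kernel part of the paper's formulation comes in.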