Linear complexity self-attention with 3rd order polynomials
Self-attention mechanisms and non-local blocks have become crucial building blocks for state-of-the-art neural architectures thanks to their unparalleled ability to capture long-range dependencies in the input. However, their cost is quadratic in the number of spatial positions, making their...
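To make the complexity gap concrete, the sketch below contrasts standard softmax self-attention, which materializes an N x N score matrix (O(N^2 d) cost), with a generic kernelized linear-attention reassociation that computes phi(K)^T V first (O(N d^2) cost). This is a minimal NumPy illustration of the linear-attention idea in general, not the 3rd-order-polynomial method of the article; the feature map `phi` here is an arbitrary positive map chosen for the example.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard self-attention: the N x N score matrix is what
    # makes the cost quadratic in the number of positions N.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])           # N x N
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1e-6):
    # Kernelized variant: reassociating (phi(Q) phi(K)^T) V as
    # phi(Q) (phi(K)^T V) never forms the N x N matrix, so the
    # cost is linear in N (O(N d^2)). `phi` is an illustrative
    # positive feature map, not the paper's polynomial expansion.
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                        # d x d, independent of N
    Z = Qp @ Kp.sum(axis=0)              # per-row normalizer, length N
    return (Qp @ KV) / Z[:, None]

N, d = 128, 16
rng = np.random.default_rng(0)
Q, K, V = rng.standard_normal((3, N, d))
out = linear_attention(Q, K, V)
print(out.shape)  # (128, 16)
```

Both functions map N x d inputs to an N x d output; only the order of the matrix products differs, which is exactly what turns the quadratic cost into a linear one.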
| Main Authors: | , , , , , , , |
|---|---|
| Format: | Journal article |
| Language: | English |
| Published: | IEEE, 2023 |