Linear complexity self-attention with 3rd order polynomials

Self-attention mechanisms and non-local blocks have become crucial building blocks for state-of-the-art neural architectures thanks to their unparalleled ability to capture long-range dependencies in the input. However, their cost is quadratic in the number of spatial positions, making their...
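To make the cost claim concrete, the sketch below contrasts standard self-attention, which materializes an N x N score matrix (O(N²·d)), with a linearized variant that reassociates the matrix products to avoid it (O(N·d²)). This is a generic illustration of linear-complexity attention, not the paper's third-order polynomial method; the feature map `phi` is a hypothetical choice for the example.

```python
import numpy as np

np.random.seed(0)
N, d = 256, 16  # sequence length (spatial positions) and channel dimension
Q = np.random.randn(N, d)
K = np.random.randn(N, d)
V = np.random.randn(N, d)

# Standard self-attention: the N x N score matrix makes cost O(N^2 * d).
scores = Q @ K.T                                   # (N, N)
attn = np.exp(scores - scores.max(axis=1, keepdims=True))
attn /= attn.sum(axis=1, keepdims=True)            # rows sum to 1
out_quadratic = attn @ V                           # (N, d)

# Linearized attention (illustrative, NOT the paper's polynomial method):
# with a non-negative feature map phi, reassociating phi(Q) @ (phi(K).T @ V)
# replaces the (N, N) intermediate with a (d, d) one, giving O(N * d^2).
phi = lambda X: np.maximum(X, 0.0) + 1e-6          # hypothetical feature map
num = phi(Q) @ (phi(K).T @ V)                      # (N, d)
den = phi(Q) @ phi(K).T.sum(axis=1, keepdims=True) # (N, 1) normalization
out_linear = num / den

print(out_quadratic.shape, out_linear.shape)       # both (N, d)
```

For long sequences (large N with small d), the reassociated form is the cheaper path, which is the general motivation shared by linear-attention methods such as the one this article proposes.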


Bibliographic Details
Main Authors: Babiloni, F, Marras, I, Deng, J, Kokkinos, F, Maggioni, M, Chrysos, G, Torr, P, Zafeiriou, S
Format: Journal article
Language: English
Published: IEEE 2023

Similar Items