Linear complexity self-attention with 3rd order polynomials

Self-attention mechanisms and non-local blocks have become crucial building blocks for state-of-the-art neural architectures thanks to their unparalleled ability to capture long-range dependencies in the input. However, their cost is quadratic in the number of spatial positions, making their...
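To make the quadratic-cost claim concrete, here is a minimal NumPy sketch contrasting standard softmax attention (which materializes an N x N attention matrix) with a generic kernelized linear-attention variant. This is only an illustration of the complexity gap the abstract refers to, not the paper's 3rd-order-polynomial method; the feature map `phi` (elu + 1) is an assumption borrowed from common linear-attention formulations.

```python
import numpy as np

def softmax_attention(Q, K, V):
    # Standard attention: the (N, N) score matrix makes time and
    # memory grow quadratically with the number of positions N.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])          # (N, N)
    weights = np.exp(scores - scores.max(-1, keepdims=True))
    weights /= weights.sum(-1, keepdims=True)
    return weights @ V

def linear_attention(Q, K, V):
    # Generic linearized attention (illustrative, not the paper's
    # polynomial scheme): with a non-negative feature map phi,
    # attention is rewritten as phi(Q) @ (phi(K).T @ V), so the
    # N x N matrix is never formed and cost is linear in N.
    phi = lambda X: np.where(X > 0, X + 1.0, np.exp(X))  # elu(x) + 1 > 0
    KV = phi(K).T @ V                                    # (d, d_v)
    norm = phi(Q) @ phi(K).sum(0, keepdims=True).T       # (N, 1)
    return (phi(Q) @ KV) / norm

rng = np.random.default_rng(0)
N, d = 16, 8
Q, K, V = (rng.normal(size=(N, d)) for _ in range(3))
out = linear_attention(Q, K, V)
assert out.shape == (N, d)
```

Both routines map (N, d) inputs to (N, d) outputs, but only the first allocates an N x N intermediate, which is the bottleneck the paper targets.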


Bibliographic details

Main authors: Babiloni, F, Marras, I, Deng, J, Kokkinos, F, Maggioni, M, Chrysos, G, Torr, P, Zafeiriou, S
Format: Conference item
Language: English
Published: IEEE 2023