Linear complexity self-attention with 3rd order polynomials
Self-attention mechanisms and non-local blocks have become crucial building blocks for state-of-the-art neural architectures thanks to their unparalleled ability to capture long-range dependencies in the input. However, their cost is quadratic in the number of spatial positions, hence making their...
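The quadratic cost mentioned in the abstract comes from materializing an N x N attention matrix over N positions. A minimal sketch of why reassociating the computation yields linear complexity is below; note this uses a generic kernel feature map as an illustration, not the paper's 3rd-order polynomial formulation, and the function names are hypothetical.

```python
import numpy as np

def quadratic_attention(Q, K, V):
    # Standard softmax attention: the (N, N) score matrix dominates the cost.
    scores = Q @ K.T / np.sqrt(Q.shape[-1])          # (N, N)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                                # (N, d)

def linear_attention(Q, K, V, phi=lambda x: np.maximum(x, 0.0) + 1e-6):
    # Generic linearized attention: replace softmax with a positive
    # feature map phi, then reassociate phi(Q) @ (phi(K).T @ V) so that
    # no (N, N) matrix is ever formed -- cost is O(N * d^2), linear in N.
    Qp, Kp = phi(Q), phi(K)
    KV = Kp.T @ V                                     # (d, d)
    Z = Qp @ Kp.sum(axis=0)                           # (N,) row normalizer
    return (Qp @ KV) / Z[:, None]                     # (N, d)

rng = np.random.default_rng(0)
N, d = 128, 16
Q, K, V = rng.normal(size=(3, N, d))
print(quadratic_attention(Q, K, V).shape)  # (128, 16)
print(linear_attention(Q, K, V).shape)     # (128, 16)
```

Both paths produce an output of the same shape; the difference is that the linear variant's memory and time scale with N rather than N squared, which is the property the paper's polynomial construction also targets.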
Main Authors: Babiloni, F; Marras, I; Deng, J; Kokkinos, F; Maggioni, M; Chrysos, G; Torr, P; Zafeiriou, S
Format: Journal article
Language: English
Published: IEEE, 2023
Similar Items
- Linear complexity self-attention with 3rd order polynomials
  by: Babiloni, F, et al. Published: (2023)
- Linear 3 and 5-step methods using Taylor series expansion for solving special 3rd order ODEs
  by: Rajabi, Marzieh, et al. Published: (2016)
- VFusion3D: learning scalable 3D generative models from video diffusion models
  by: Han, J, et al. Published: (2024)
- Polynomial combined first-order rewritings for linear and guarded existential rules
  by: Gottlob, G, et al. Published: (2023)
- Sub-second "temporal attention" modulates alpha rhythms. A high-resolution EEG study.
  by: Babiloni, C, et al. Published: (2004)