Give me your attention: dot-product attention considered harmful for adversarial patch robustness
Neural architectures based on attention, such as vision transformers, are revolutionizing image recognition. Their main benefit is that attention allows reasoning about all parts of a scene jointly. In this paper, we show how the global reasoning of (scaled) dot-product attention can be the source of...
| Main Authors: | , , , , |
| --- | --- |
| Format: | Conference item |
| Language: | English |
| Published: | IEEE, 2022 |