Low-Pass Image Filtering to Achieve Adversarial Robustness
In this paper, we continue our research cycle on the properties of convolutional neural network-based image recognition systems and on ways to improve their noise immunity and robustness. Adversarial attacks are currently a popular research area in artificial neural networks. The adversarial attacks...
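The title refers to low-pass filtering of input images as a defense against adversarial perturbations, which typically concentrate energy in high spatial frequencies. As a minimal illustrative sketch only (the choice of a separable Gaussian filter, the sigma value, and the pure-Python implementation are assumptions for illustration, not details taken from the paper), such a preprocessing step might look like:

```python
import math

def gaussian_kernel_1d(sigma, radius):
    # 1-D Gaussian weights, normalized so they sum to 1.
    w = [math.exp(-(x * x) / (2 * sigma * sigma)) for x in range(-radius, radius + 1)]
    s = sum(w)
    return [v / s for v in w]

def low_pass(image, sigma=1.0):
    # Separable Gaussian low-pass filter: convolve rows, then columns.
    # Border pixels are handled by clamping (edge replication).
    radius = max(1, int(3 * sigma))
    k = gaussian_kernel_1d(sigma, radius)
    h, w = len(image), len(image[0])

    def conv_rows(img):
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                acc = 0.0
                for i, kv in enumerate(k):
                    xx = min(max(x + i - radius, 0), w - 1)  # clamp at borders
                    acc += kv * img[y][xx]
                out[y][x] = acc
        return out

    def conv_cols(img):
        out = [[0.0] * w for _ in range(h)]
        for y in range(h):
            for x in range(w):
                acc = 0.0
                for i, kv in enumerate(k):
                    yy = min(max(y + i - radius, 0), h - 1)  # clamp at borders
                    acc += kv * img[yy][x]
                out[y][x] = acc
        return out

    return conv_cols(conv_rows(image))
```

In a defense pipeline of this kind, the filtered image (rather than the raw input) would be passed to the classifier, attenuating the high-frequency component an attacker relies on at some cost in clean accuracy.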
Main Authors: Vadim Ziyadinov, Maxim Tereshonok
Format: Article
Language: English
Published: MDPI AG, 2023-11-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/23/22/9032
Similar Items
- Increasing Neural-Based Pedestrian Detectors’ Robustness to Adversarial Patch Attacks Using Anomaly Localization
  by: Olga Ilina, et al. Published: (2025-01-01)
- Noise Immunity and Robustness Study of Image Recognition Using a Convolutional Neural Network
  by: Vadim Ziyadinov, et al. Published: (2022-02-01)
- Adversarial Robustness of Deep Convolutional Neural Network-based Image Recognition Models: A Review
  by: Hao SUN, et al. Published: (2021-08-01)
- Black-box adversarial attacks through speech distortion for speech emotion recognition
  by: Jinxing Gao, et al. Published: (2022-08-01)
- Adversarial Robustness of Vision Transformers Versus Convolutional Neural Networks
  by: Kazim Ali, et al. Published: (2024-01-01)