Backdoor Attack on Deep Neural Networks Triggered by Fault Injection Attack on Image Sensor Interface

A backdoor attack is an attack method that induces misclassification in a deep neural network (DNN). To trigger the backdoor, the adversary inputs an image containing a specific pattern (the adversarial mark) into the compromised DNN model (the backdoor model). In general, the adversarial mark is created...
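As a rough illustration of the triggering step described in the abstract, the sketch below superimposes a small adversarial mark onto a clean image before it is fed to a backdoored classifier. This is only a generic digital-trigger example under assumed shapes and values; it is not the authors' method, which, per the title, injects the mark through a fault injection attack on the image sensor interface rather than by editing the image in software.

```python
import numpy as np

def apply_adversarial_mark(image: np.ndarray, mark: np.ndarray,
                           top: int, left: int) -> np.ndarray:
    """Superimpose a trigger pattern (adversarial mark) onto an image.

    `image` and `mark` are H x W x C uint8 arrays; the mark simply
    overwrites the pixels in the chosen region. All names and sizes
    here are illustrative assumptions, not values from the paper.
    """
    patched = image.copy()
    h, w = mark.shape[:2]
    patched[top:top + h, left:left + w] = mark
    return patched

# Hypothetical example: a 4x4 white square placed in the corner of a 32x32 image.
clean_image = np.zeros((32, 32, 3), dtype=np.uint8)
mark = np.full((4, 4, 3), 255, dtype=np.uint8)
triggered_image = apply_adversarial_mark(clean_image, mark, top=0, left=0)

# A backdoor model would classify `clean_image` correctly but map
# `triggered_image` to the attacker's chosen target label.
```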


Bibliographic Details
Main Authors: Tatsuya Oyama, Shunsuke Okura, Kota Yoshida, Takeshi Fujino
Format: Article
Language: English
Published: MDPI AG 2023-05-01
Series: Sensors
Online Access: https://www.mdpi.com/1424-8220/23/10/4742