A method to improve the computational performance of nonlinear all-optical diffractive deep neural network model
Abstract: To further improve the computational performance of the diffractive deep neural network (D2NN) model, we use the ReLU function to limit the phase parameters, which effectively mitigates the vanishing-gradient problem that occurs in the model. We add various commonly used nonlinear...
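The core idea named in the abstract, constraining the trainable phase parameters of a diffractive layer with a ReLU function, can be sketched as follows. This is a minimal PyTorch illustration under assumed settings (layer size, plane-wave input, phase-only modulation), not the authors' implementation; the class DiffractiveLayer, its dimensions, and the non-negativity constraint shown here are assumptions for illustration only.

```python
# Minimal sketch (not the paper's code): ReLU-limited phase parameters
# in a single diffractive layer of a D2NN-style model.
import torch
import torch.nn as nn


class DiffractiveLayer(nn.Module):
    """One diffractive layer: trainable phases applied to a complex field."""

    def __init__(self, size: int = 200):
        super().__init__()
        # Raw trainable parameters; the ReLU below keeps the effective
        # phases non-negative, one simple way to "limit the phase parameters".
        self.raw_phase = nn.Parameter(torch.rand(size, size))

    def forward(self, field: torch.Tensor) -> torch.Tensor:
        # ReLU constraint on the phase values (illustrative scheme).
        phase = torch.relu(self.raw_phase)
        # Phase-only modulation of the incoming complex optical field.
        return field * torch.exp(1j * phase)


if __name__ == "__main__":
    layer = DiffractiveLayer(size=64)
    incoming = torch.ones(64, 64, dtype=torch.cfloat)  # plane-wave input (assumed)
    out = layer(incoming)
    print(out.shape)  # torch.Size([64, 64])
```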
Main Authors: Yichen Sun, Mingli Dong, Mingxin Yu, Lidan Lu, Shengjun Liang, Jiabin Xia, Lianqing Zhu
Format: Article
Language: English
Published: Taylor & Francis Group, 2023-12-01
Series: International Journal of Optomechatronics
Online Access: https://www.tandfonline.com/doi/10.1080/15599612.2023.2209624
Similar Items
- On the Generative Power of ReLU Network for Generating Similar Strings
  by: Mamoona Ghafoor, et al. Published: (2024-01-01)
- Locally linear attributes of ReLU neural networks
  by: Ben Sattelberg, et al. Published: (2023-11-01)
- RBUE: a ReLU-based uncertainty estimation method for convolutional neural networks
  by: Yufeng Xia, et al. Published: (2023-02-01)
- Deep vs. shallow networks: An approximation theory perspective
  by: Mhaskar, Hrushikesh, et al. Published: (2016)
- Nonparametric Estimation for High-Dimensional Space Models Based on a Deep Neural Network
  by: Hongxia Wang, et al. Published: (2023-09-01)