Smooth Function Approximation by Deep Neural Networks with General Activation Functions
There has been growing interest in the expressivity of deep neural networks. However, most existing work on this topic focuses only on specific activation functions such as the ReLU or sigmoid. In this paper, we investigate the approximation ability of deep neural networks with a broad class...
| Main Authors: | Ilsang Ohn, Yongdai Kim |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2019-06-01 |
| Series: | Entropy |
| Online Access: | https://www.mdpi.com/1099-4300/21/7/627 |
Similar Items

- Analytic Function Approximation by Path-Norm-Regularized Deep Neural Networks
  by: Aleksandr Beknazaryan
  Published: (2022-08-01)
- On Discrete Approximation of Analytic Functions by Shifts of the Lerch Zeta Function
  by: Audronė Rimkevičienė, et al.
  Published: (2022-12-01)
- Joint Approximation of Analytic Functions by Shifts of Lerch Zeta-Functions
  by: Antanas Laurinčikas, et al.
  Published: (2023-02-01)
- Generalized 5-Point Approximating Subdivision Scheme of Varying Arity
  by: Sardar Muhammad Hussain, et al.
  Published: (2020-03-01)
- The approximation of continuous functions by positive linear operators
  by: Devore, Ronald A.
  Published: (1972)