On the Redundancy in the Rank of Neural Network Parameters and Its Controllability
In this paper, we show both theoretically and empirically that the parameters of a neural network can be redundant in their ranks. When a neural network is viewed as a function from one space to another, this redundancy can cause feature correlation and slower training. Motivated by this, we pr...
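As a rough illustration of the notion of rank redundancy described above (not the method proposed in the paper), one can count how many singular values of a layer's weight matrix are numerically significant and compare that count with the full possible rank. The sketch below, with hypothetical names and a synthetic matrix, assumes NumPy is available.

```python
# Minimal sketch of measuring rank redundancy in a weight matrix.
# Hypothetical illustration only; not the paper's proposed method.
import numpy as np

def effective_rank(weight: np.ndarray, rel_tol: float = 1e-3) -> int:
    """Count singular values above rel_tol times the largest singular value."""
    singular_values = np.linalg.svd(weight, compute_uv=False)
    if singular_values.size == 0:
        return 0
    return int(np.sum(singular_values > rel_tol * singular_values[0]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Build a 256x256 weight whose true rank is ~32, plus small noise,
    # mimicking a layer whose parameters are redundant in rank.
    low_rank = rng.standard_normal((256, 32)) @ rng.standard_normal((32, 256))
    weight = low_rank + 1e-4 * rng.standard_normal((256, 256))
    print(f"shape: {weight.shape}, effective rank: {effective_rank(weight)}")
```

If the effective rank is far below min(rows, cols), the matrix carries fewer independent directions than its parameter count suggests, which is the kind of redundancy the abstract refers to.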
| Main Authors: | Chanhee Lee, Young-Bum Kim, Hyesung Ji, Yeonsoo Lee, Yuna Hur, Heuiseok Lim |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2021-01-01 |
| Series: | Applied Sciences |
| Online Access: | https://www.mdpi.com/2076-3417/11/2/725 |
Similar Items
- Frequency Regularization: Reducing Information Redundancy in Convolutional Neural Networks
  by: Chenqiu Zhao, et al.
  Published: (2023-01-01)
- Study on Redundancy in Robot Kinematic Parameter Identification
  by: Yue Zhang, et al.
  Published: (2022-01-01)
- Reexamining low rank matrix factorization for trace norm regularization
  by: Carlo Ciliberto, et al.
  Published: (2023-08-01)
- Kinematic Model and Redundant Space Analysis of 4-DOF Redundant Robot
  by: Yu Li, et al.
  Published: (2022-02-01)
- A Pruning Method for Deep Convolutional Network Based on Heat Map Generation Metrics
  by: Wenli Zhang, et al.
  Published: (2022-03-01)