Accelerated CNN Training through Gradient Approximation
© 2019 IEEE. Training deep convolutional neural networks such as VGG and ResNet by gradient descent is an expensive exercise requiring specialized hardware such as GPUs. Recent works have examined the possibility of approximating the gradient computation while maintaining the same convergence properties...
| Main Authors: | Harsha, NS, Wang, Z, Amarasinghe, S |
| --- | --- |
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2021 |
| Online Access: | https://hdl.handle.net/1721.1/132255 |
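The abstract describes approximating the gradient computation to cut training cost. The truncated abstract does not specify the paper's actual approximation, so the sketch below shows one plausible scheme purely for illustration: on each iteration, the weight-gradient computation of randomly chosen convolutional layers is skipped (their gradient is treated as zero), which shortens the backward pass. The model, batch, and `skip_prob` value are hypothetical placeholders, not taken from the paper.

```python
import torch
import torch.nn as nn

# Hypothetical toy model and data; stand-ins for VGG/ResNet-style training.
model = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
    nn.Flatten(),
    nn.Linear(16 * 32 * 32, 10),
)
opt = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()

convs = [m for m in model if isinstance(m, nn.Conv2d)]
skip_prob = 0.5  # assumed skip rate; not from the paper

for step in range(100):
    x = torch.randn(8, 3, 32, 32)     # stand-in batch
    y = torch.randint(0, 10, (8,))
    # Approximate the gradient: randomly freeze conv layers this iteration
    # so autograd skips their weight-gradient kernels; their parameters
    # simply receive no update this step (an implicit zero gradient).
    for conv in convs:
        frozen = bool(torch.rand(()) < skip_prob)
        conv.weight.requires_grad_(not frozen)
        if conv.bias is not None:
            conv.bias.requires_grad_(not frozen)
    opt.zero_grad()
    loss_fn(model(x), y).backward()
    opt.step()                        # params with no grad are skipped by SGD
```

Because the skipped gradients are unbiased toward zero rather than replaced with stale values, plain SGD still makes progress on the unfrozen layers each step; whether this preserves the paper's stated convergence properties depends on the specific approximation the authors use.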
Similar Items

- Accelerated CNN Training through Gradient Approximation
  by: Wang, Ziheng, et al.
  Published: (2022)
- All Analog CNN Accelerator with RRAMs for Fast Inference
  by: Chao, Minghan
  Published: (2022)
- Flexible Low Power CNN Accelerator for Edge Computing with Weight Tuning
  by: Wang, Miaorong, et al.
  Published: (2020)
- Accelerated on-line calibration of dynamic traffic assignment using distributed stochastic gradient approximation
  by: Huang, Enyang, et al.
  Published: (2013)
- A universally optimal multistage accelerated stochastic gradient method
  by: Aybat, NS, et al.
  Published: (2021)