Stochastic gradient descent with random label noises: doubly stochastic models and inference stabilizer
Random label noise (or observational noise) widely exists in practical machine learning settings. While previous studies have primarily focused on the effects of label noise on learning performance, our work investigates the implicit regularization effects of label noise under the mini-batch sampling setting of stochastic gradient descent (SGD), assuming the label noise is unbiased.
Main Authors: Haoyi Xiong, Xuhong Li, Boyang Yu, Dongrui Wu, Zhanxing Zhu, Dejing Dou
Format: Article
Language: English
Published: IOP Publishing, 2024-01-01
Series: Machine Learning: Science and Technology
Subjects: stochastic gradient descent; continuous-time analysis; dynamical systems
Online Access: https://doi.org/10.1088/2632-2153/ad13ba
author | Haoyi Xiong; Xuhong Li; Boyang Yu; Dongrui Wu; Zhanxing Zhu; Dejing Dou
collection | DOAJ |
description | Random label noise (or observational noise) widely exists in practical machine learning settings. While previous studies have primarily focused on the effects of label noise on learning performance, our work investigates the implicit regularization effects of label noise under the mini-batch sampling setting of stochastic gradient descent (SGD), assuming the label noise is unbiased. Specifically, we analyze the learning dynamics of SGD over the quadratic loss with unbiased label noise (ULN), modeling the dynamics of SGD as a stochastic differential equation with two diffusion terms (namely, a doubly stochastic model). While the first diffusion term is caused by mini-batch sampling over the (label-noiseless) loss gradients, as in many other works on SGD (Zhu et al 2019 ICML 7654–63; Wu et al 2020 Int. Conf. on Machine Learning (PMLR) pp 10367–76), our model investigates the second noise term of the SGD dynamics, caused by mini-batch sampling over the label noise, as an implicit regularizer. Our theoretical analysis finds that this implicit regularizer favors convergence points that stabilize model outputs against perturbations of the parameters (namely, inference stability). Though similar phenomena have been investigated by Blanc et al (2020 Conf. on Learning Theory (PMLR) pp 483–513), our work does not assume SGD to be an Ornstein–Uhlenbeck-like process and achieves a more general result, with convergence of the approximation proved. To validate our analysis, we design two sets of empirical studies of the implicit regularizer of SGD with unbiased random label noise: deep neural network training and linear regression. Our first experiment studies noisy self-distillation for deep learning, where student networks are trained on the outputs of well-trained teachers with additive unbiased random label noise; it shows that the implicit regularizer caused by the label noise tends to select models with improved inference stability. We also carry out experiments on SGD-based linear regression with ULN, where we plot the trajectory of the parameters learned at every step and visualize the effects of the implicit regularization. The results back up our theoretical findings. |
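As a schematic of the doubly stochastic model described above (our notation, not taken from the paper), the SGD dynamics over the quadratic loss can be written as a stochastic differential equation whose drift is the full-batch gradient and whose two independent diffusion terms come from mini-batch sampling of the noiseless gradients and from mini-batch sampling of the label noise:

```latex
% Schematic doubly stochastic SDE (notation assumed, not the paper's):
% W_t and B_t are independent Wiener processes; \Sigma_{\mathrm{batch}} and
% \Sigma_{\mathrm{label}} are the covariances induced by mini-batch sampling
% of the noiseless gradients and of the label noise, respectively.
\[
  \mathrm{d}\theta_t
    = -\nabla L(\theta_t)\,\mathrm{d}t
    + \Sigma_{\mathrm{batch}}^{1/2}(\theta_t)\,\mathrm{d}W_t
    + \Sigma_{\mathrm{label}}^{1/2}(\theta_t)\,\mathrm{d}B_t
\]
```

The linear-regression experiment sketched in the abstract is straightforward to reproduce in outline. The following Python snippet is a minimal illustration (not the authors' code; the problem sizes, learning rate, and noise level are assumptions): it runs mini-batch SGD on a quadratic loss where each sampled batch sees freshly drawn unbiased label noise, so both noise sources of the doubly stochastic model enter every update, and it records the parameter trajectory at every step for plotting.

```python
import numpy as np

# Minimal sketch of SGD-based linear regression with unbiased label noise
# (ULN). All hyperparameters below are illustrative assumptions.
rng = np.random.default_rng(0)

n, d = 200, 2                       # samples, parameter dimension
X = rng.normal(size=(n, d))
theta_star = np.array([1.0, -2.0])  # ground-truth parameters
y_clean = X @ theta_star            # noiseless labels

batch_size, lr, sigma, steps = 8, 0.05, 0.5, 500

theta = np.zeros(d)
trajectory = [theta.copy()]
for _ in range(steps):
    # Noise source 1: mini-batch sampling over the (noiseless) data.
    idx = rng.choice(n, size=batch_size, replace=False)
    # Noise source 2: fresh unbiased additive noise on the sampled labels.
    y_noisy = y_clean[idx] + sigma * rng.normal(size=batch_size)
    # Stochastic gradient of the quadratic loss on this mini-batch.
    grad = X[idx].T @ (X[idx] @ theta - y_noisy) / batch_size
    theta = theta - lr * grad
    trajectory.append(theta.copy())

trajectory = np.asarray(trajectory)  # shape (steps + 1, d)
print("final parameters:", theta)
```

Plotting the two coordinates of `trajectory` against each other visualizes how the iterates wander around the least-squares solution under the two noise sources, which is the kind of trajectory plot the abstract describes.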
format | Article |
id | doaj.art-01e1085ba69a4d02bd8f61206ab99fcb |
institution | Directory Open Access Journal |
issn | 2632-2153 |
language | English |
publishDate | 2024-01-01 |
publisher | IOP Publishing |
record_format | Article |
series | Machine Learning: Science and Technology |
affiliations | Haoyi Xiong (ORCID: 0000-0002-5451-3253), Xuhong Li, Boyang Yu, Dejing Dou: Baidu Inc., Beijing, People's Republic of China; Dongrui Wu: Huazhong University of Science and Technology, Wuhan, People's Republic of China; Zhanxing Zhu: Peking University, Beijing, People's Republic of China
citation | Machine Learning: Science and Technology 5(1), 015039 (2024-01-01), doi:10.1088/2632-2153/ad13ba
title | Stochastic gradient descent with random label noises: doubly stochastic models and inference stabilizer |
topic | stochastic gradient descent; continuous-time analysis; dynamical systems
url | https://doi.org/10.1088/2632-2153/ad13ba |