Sharper Sub-Weibull Concentrations
Constant-specified and exponential concentration inequalities play an essential role in the finite-sample theory of machine learning and high-dimensional statistics. We obtain sharper, constants-specified concentration inequalities for the sum of independent sub-Weibull random variables, which leads to a mixture of two tails: sub-Gaussian for small deviations and sub-Weibull for large deviations from the mean.
Main Authors: | Huiming Zhang, Haoyu Wei |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-06-01 |
Series: | Mathematics |
Subjects: | constants-specified concentration inequalities; exponential tail bounds; heavy-tailed random variables; sub-Weibull parameter; lower bounds on the least singular value |
Online Access: | https://www.mdpi.com/2227-7390/10/13/2252 |
_version_ | 1797434099214843904 |
author | Huiming Zhang Haoyu Wei |
author_facet | Huiming Zhang Haoyu Wei |
author_sort | Huiming Zhang |
collection | DOAJ |
description | Constant-specified and exponential concentration inequalities play an essential role in the finite-sample theory of machine learning and high-dimensional statistics. We obtain sharper, constants-specified concentration inequalities for the sum of independent sub-Weibull random variables, which leads to a mixture of two tails: sub-Gaussian for small deviations and sub-Weibull for large deviations from the mean. These bounds are new and improve existing bounds with sharper constants. In addition, a new <i>sub-Weibull parameter</i> is proposed, which enables recovering the tight concentration inequality for a random variable (vector). For statistical applications, we give an <inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><msub><mo>ℓ</mo><mn>2</mn></msub></semantics></math></inline-formula>-error bound for the estimated coefficients in negative binomial regressions when the heavy-tailed covariates are sub-Weibull distributed with sparse structures; this is a new result for negative binomial regressions. For random matrices, we derive non-asymptotic versions of Bai-Yin’s theorem for sub-Weibull entries with exponential tail bounds. Finally, by demonstrating a sub-Weibull confidence region for a log-truncated Z-estimator without the second-moment condition, we discuss and define the <i>sub-Weibull type robust estimator</i> for independent observations <inline-formula><math xmlns="http://www.w3.org/1998/Math/MathML" display="inline"><semantics><msubsup><mrow><mo>{</mo><msub><mi>X</mi><mi>i</mi></msub><mo>}</mo></mrow><mrow><mi>i</mi><mo>=</mo><mn>1</mn></mrow><mi>n</mi></msubsup></semantics></math></inline-formula> without exponential-moment conditions. |
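As background for the "mixture of two tails" described in the abstract, the standard sub-Weibull framework from the concentration literature can be sketched as follows; the constants written here (a generic θ-dependent pair c₁, c₂) are placeholders, since the paper's contribution is precisely to make such constants explicit and sharper, and its exact values are not reproduced here.

```latex
% A random variable $X$ is sub-Weibull of order $\theta > 0$ if, for some
% $K > 0$ and all $t \ge 0$,
\Pr\bigl(|X| \ge t\bigr) \le 2\exp\!\bigl(-(t/K)^{\theta}\bigr).
% For independent, centered sub-Weibull($\theta$) variables $X_1,\dots,X_n$
% with norms $b_i = \|X_i\|_{\psi_\theta}$, bounds in this literature
% typically take the two-regime form, with $c_1, c_2$ depending only on
% $\theta$:
\Pr\Bigl(\Bigl|\sum_{i=1}^{n} X_i\Bigr| \ge t\Bigr)
  \le 2\exp\Bigl(-\min\Bigl\{\frac{c_1\, t^2}{\sum_{i} b_i^2},\;
      \Bigl(\frac{c_2\, t}{\max_{i} b_i}\Bigr)^{\theta}\Bigr\}\Bigr).
% The first branch dominates for small $t$ (sub-Gaussian regime); the
% second dominates for large $t$ (sub-Weibull regime).
```

The minimum of the two exponents is what produces the advertised mixture: a Gaussian-like tail near the mean and a heavier Weibull-type tail in the large-deviation regime.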
first_indexed | 2024-03-09T10:27:25Z |
format | Article |
id | doaj.art-9f5b790bee52471db47966cc77cef37b |
institution | Directory Open Access Journal |
issn | 2227-7390 |
language | English |
last_indexed | 2024-03-09T10:27:25Z |
publishDate | 2022-06-01 |
publisher | MDPI AG |
record_format | Article |
series | Mathematics |
spelling | doaj.art-9f5b790bee52471db47966cc77cef37b (2023-12-01T21:35:18Z), eng, MDPI AG. Huiming Zhang (Department of Mathematics, Faculty of Science and Technology, University of Macau, Macau 999078, China); Haoyu Wei (Department of Statistics, North Carolina State University, Raleigh, NC 27695, USA). "Sharper Sub-Weibull Concentrations." Mathematics, ISSN 2227-7390, vol. 10, no. 13, art. 2252, 2022-06-01. doi:10.3390/math10132252. https://www.mdpi.com/2227-7390/10/13/2252 |
spellingShingle | Huiming Zhang Haoyu Wei Sharper Sub-Weibull Concentrations Mathematics constants-specified concentration inequalities exponential tail bounds heavy-tailed random variables sub-Weibull parameter lower bounds on the least singular value |
title | Sharper Sub-Weibull Concentrations |
title_full | Sharper Sub-Weibull Concentrations |
title_fullStr | Sharper Sub-Weibull Concentrations |
title_full_unstemmed | Sharper Sub-Weibull Concentrations |
title_short | Sharper Sub-Weibull Concentrations |
title_sort | sharper sub weibull concentrations |
topic | constants-specified concentration inequalities exponential tail bounds heavy-tailed random variables sub-Weibull parameter lower bounds on the least singular value |
url | https://www.mdpi.com/2227-7390/10/13/2252 |
work_keys_str_mv | AT huimingzhang sharpersubweibullconcentrations AT haoyuwei sharpersubweibullconcentrations |