Population Risk Improvement with Model Compression: An Information-Theoretic Approach

Full description

It has been reported in many recent works on deep model compression that the population risk of a compressed model can be even better than that of the original model. In this paper, an information-theoretic explanation for this population risk improvement phenomenon is provided by jointly studying the decrease in the generalization error and the increase in the empirical risk that result from model compression. It is first shown that model compression reduces an information-theoretic bound on the generalization error, which suggests that model compression can be interpreted as a regularization technique to avoid overfitting. The increase in empirical risk caused by model compression is then characterized using rate distortion theory. These results imply that the overall population risk can be improved by model compression whenever the decrease in generalization error exceeds the increase in empirical risk. A linear regression example is presented to demonstrate that such a decrease in population risk due to model compression is indeed possible. The theoretical results further suggest a way to improve a widely used model compression algorithm, Hessian-weighted K-means clustering, by regularizing the distance between the cluster centers. Experiments with neural networks are provided to validate the theoretical assertions.
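The compression algorithm named in the description, Hessian-weighted K-means clustering, quantizes network weights by clustering them with per-weight curvature importances. The following is a minimal NumPy sketch under stated assumptions: weights are treated as scalars, `h` is a stand-in diagonal-Hessian importance vector, and the `lam` shrinkage term is a simple illustrative regularizer on the centers, not necessarily the exact center-distance regularizer proposed in the paper.

```python
import numpy as np

def hessian_weighted_kmeans(w, h, k, lam=0.0, n_iter=50, seed=0):
    """Cluster scalar weights w with positive importances h (e.g., diagonal
    Hessian entries), approximately minimizing sum_i h_i * (w_i - c_{a(i)})^2.

    lam adds an illustrative shrinkage penalty lam * c_j^2 per center,
    standing in for the center regularization suggested in the paper."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(w, size=k, replace=False)
    assign = np.zeros(len(w), dtype=int)
    for _ in range(n_iter):
        # Assignment step: since each h_i > 0 is fixed per weight, the
        # h-weighted squared distance is minimized by the nearest center.
        assign = np.argmin((w[:, None] - centers[None, :]) ** 2, axis=1)
        for j in range(k):
            mask = assign == j
            if mask.any():
                # Update step: Hessian-weighted (and shrunk) centroid.
                centers[j] = (h[mask] * w[mask]).sum() / (h[mask].sum() + lam)
    return centers, assign

# Quantize a toy weight vector to 4 shared values.
rng = np.random.default_rng(1)
w = rng.normal(size=200)
h = rng.uniform(0.1, 1.0, size=200)  # stand-in curvature estimates
centers, assign = hessian_weighted_kmeans(w, h, k=4)
w_compressed = centers[assign]       # each weight snapped to its center
```

After clustering, storing the model requires only the `k` shared centers plus a per-weight index, which is the source of the compression; the importances `h` bias the centers toward weights whose perturbation would hurt the loss most.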

Bibliographic Details
Main Authors: Yuheng Bu, Weihao Gao, Shaofeng Zou, Venugopal V. Veeravalli
Format: Article
Language: English
Published: MDPI AG, 2021-09-01
Series: Entropy
Subjects: empirical risk; generalization error; K-means clustering; model compression; population risk; rate distortion theory
Online Access:https://www.mdpi.com/1099-4300/23/10/1255
Author Affiliations:
Yuheng Bu and Venugopal V. Veeravalli: Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign, Urbana, IL 61820, USA
Weihao Gao: Bytedance Inc., Bellevue, WA 98004, USA
Shaofeng Zou: Department of Electrical Engineering, University at Buffalo, The State University of New York, Buffalo, NY 14221, USA

Citation: Entropy 2021, 23(10), 1255. DOI: 10.3390/e23101255. ISSN: 1099-4300.