Distributed Model Reuse with Multiple Classifiers


Bibliographic Details
Main Authors: LI Xinchun, ZHAN Dechuan
Format: Article
Language: Chinese (zho)
Published: Journal of Computer Engineering and Applications Beijing Co., Ltd., Science Press, 2022-10-01
Series: Jisuanji kexue yu tansuo
Online Access:http://fcst.ceaj.org/fileup/1673-9418/PDF/1673-9418-16-10-2310.pdf
Description
Summary: Traditional machine learning typically adopts a centralized training strategy, but in many real-world applications transmission costs or data privacy requirements leave data distributed and isolated. Distributed learning offers an effective way to fuse data across such isolated data islands. However, because of the natural heterogeneity of real-world applications, local data distributions are not independent and identically distributed (non-IID), which poses a major challenge for distributed learning. To overcome data heterogeneity across local clients, this paper first introduces model reuse into the distributed training procedure and proposes a distributed model reuse (DMR) framework. It then shows theoretically that ensemble learning provides a general remedy for data heterogeneity, and proposes multiple-classifier-based distributed model reuse (McDMR). Finally, to reduce storage, computation, and transmission costs in practical applications, it develops two concrete instantiations: McDMR-MH, based on a multi-head classifier, and McDMR-SC, based on a stochastic classifier. Experimental results on several public datasets verify the superiority of the proposed methods.
ISSN: 1673-9418
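
The summary above describes the multi-head variant (McDMR-MH) only at a high level. The following minimal sketch, written under stated assumptions rather than taken from the paper, illustrates one plausible reading of a multi-head classifier design for non-IID clients: a shared backbone, one lightweight head per client used during local updates, and prediction-time averaging of the heads' class probabilities as the ensemble step. The class name MultiHeadClassifier, the layer sizes, and the probability-averaging rule are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a multi-head-classifier design in the spirit of McDMR-MH.
# Assumptions: a shared feature extractor, one classifier head per client, and
# ensemble-by-averaging at inference. Not the paper's reference implementation.
from typing import Optional

import torch
import torch.nn as nn


class MultiHeadClassifier(nn.Module):
    def __init__(self, in_dim: int, hidden_dim: int, num_classes: int, num_clients: int):
        super().__init__()
        # Shared representation reused across all clients.
        self.backbone = nn.Sequential(
            nn.Linear(in_dim, hidden_dim),
            nn.ReLU(),
        )
        # One lightweight head per client; only the heads differ across clients.
        self.heads = nn.ModuleList(
            [nn.Linear(hidden_dim, num_classes) for _ in range(num_clients)]
        )

    def forward(self, x: torch.Tensor, client_id: Optional[int] = None) -> torch.Tensor:
        z = self.backbone(x)
        if client_id is not None:
            # Local training: a client updates its own head (and the shared backbone).
            return self.heads[client_id](z)
        # Inference: ensemble all heads by averaging their class probabilities,
        # reflecting the ensemble-learning remedy for non-IID local data.
        probs = torch.stack([head(z).softmax(dim=-1) for head in self.heads])
        return probs.mean(dim=0)


if __name__ == "__main__":
    model = MultiHeadClassifier(in_dim=32, hidden_dim=64, num_classes=10, num_clients=5)
    x = torch.randn(8, 32)
    local_logits = model(x, client_id=2)  # used during one client's local update
    ensembled = model(x)                  # used for global prediction
    print(local_logits.shape, ensembled.shape)
```

Under these assumptions, only the small per-client heads differ across participants, which is one way the storage and transmission overhead mentioned in the summary could be kept low relative to exchanging full per-client models.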