Similarity‐based adversarial knowledge distillation using graph convolutional neural network
Abstract This letter presents an adversarial knowledge distillation method based on a graph convolutional neural network. Many existing knowledge distillation methods have the student model imitate the teacher model's output on each input individually and independently. In contrast to these approaches, our method applies a similarity matrix to capture the relationships among output vectors. The similarity matrix of the output vectors is computed and converted into a graph structure, and a generative adversarial network using a graph convolutional neural network is applied. We propose similarity‐based knowledge distillation in which the student model simultaneously imitates both the output vectors and the similarity matrix of the teacher model. We evaluate our method on ResNet, MobileNet and Wide ResNet using the CIFAR‐10 and CIFAR‐100 datasets, and it outperforms the baseline model and existing knowledge distillation methods such as KLD and DML.
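As a rough illustration of the idea summarised in the abstract, the sketch below computes a pairwise similarity matrix over a batch of output vectors and lets the student match both the teacher's outputs and that matrix. This is a minimal sketch assuming PyTorch, a cosine similarity measure, and a simple weighting factor `alpha`; the function names and weighting are illustrative assumptions, and the GCN-based adversarial discriminator described in the letter is not reproduced here.

```python
# Illustrative sketch only (not the authors' exact formulation): the student
# imitates both the teacher's output vectors and the pairwise similarity
# matrix of those outputs within a batch.
import torch.nn.functional as F

def similarity_matrix(logits):
    """Cosine-similarity matrix among the output vectors in a batch."""
    z = F.normalize(logits, dim=1)   # (B, C): L2-normalise each output vector
    return z @ z.t()                 # (B, B): pairwise cosine similarities

def similarity_kd_loss(student_logits, teacher_logits, temperature=4.0, alpha=0.5):
    # Term 1: conventional output imitation (KL divergence on softened outputs).
    kld = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * temperature ** 2
    # Term 2: match the teacher's similarity matrix, i.e. the relationships
    # among samples in the batch rather than each output in isolation.
    sim = F.mse_loss(similarity_matrix(student_logits),
                     similarity_matrix(teacher_logits))
    return alpha * kld + (1.0 - alpha) * sim

# Example usage with (batch, classes) outputs:
# loss = similarity_kd_loss(student(x), teacher(x).detach())
```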
Main Authors: | Sungjun Lee, Sejun Kim, Seong Soo Kim, Kisung Seo |
---|---|
Format: | Article |
Language: | English |
Published: | Wiley, 2022-08-01 |
Series: | Electronics Letters |
Online Access: | https://doi.org/10.1049/ell2.12543 |
collection | DOAJ |
id | doaj.art-9c39d2c5854c435ca81726a22f5637c9 |
institution | Directory Open Access Journal |
issn | 0013-5194, 1350-911X
spelling | Sungjun Lee (Department of Electronics Engineering, Seokyeong University, Seoul, Korea), Sejun Kim (Department of Electronics Engineering, Seokyeong University, Seoul, Korea), Seong Soo Kim (Department of Electrical & Electronic Engineering, Yonam Institute of Technology, Jinju‐si, Korea), Kisung Seo (Department of Electronics Engineering, Seokyeong University, Seoul, Korea). Similarity‐based adversarial knowledge distillation using graph convolutional neural network. Electronics Letters, vol. 58, no. 16, pp. 606-608, 2022-08-01. Wiley. https://doi.org/10.1049/ell2.12543