Multi-CC: A New Baseline for Faster and Better Deep Clustering

The aim of our paper is to introduce a new deep clustering model, Multi-head Cross-Attention Contrastive Clustering (Multi-CC), which seeks to enhance the performance of the existing deep clustering model CC. Our approach first augments the data to form image pairs and then uses the same backbone to extract the feature representations of these image pairs. We then perform contrastive learning, separately in the row space and the column space of the feature matrix, to jointly learn the instance and cluster representations. Our approach offers several key improvements over the existing model. First, we use a mixed strategy of strong and weak augmentation to construct image pairs. Second, we remove the pooling layer of the backbone to prevent loss of information. Finally, we introduce a multi-head cross-attention module to improve the model's performance. Together, these improvements reduce model training time by 80%. As a baseline, Multi-CC achieves the best results on CIFAR-10, ImageNet-10, and ImageNet-dogs, and it serves as a drop-in replacement for CC, allowing models based on CC to achieve better performance.

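The description above stays at a high level, so the following PyTorch sketch shows one way to read it: a shared backbone encodes a weakly and a strongly augmented view, a multi-head cross-attention block lets the two views attend to each other, an instance head produces the rows that are contrasted across views, and a cluster head produces the columns that are contrasted as cluster representations. The class name MultiCCSketch, the dimensions, the way cross-attention is wired, and the simplified column-space loss are illustrative assumptions, not the authors' released implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class MultiCCSketch(nn.Module):
    def __init__(self, backbone, feat_dim=512, proj_dim=128, n_clusters=10, n_heads=4):
        super().__init__()
        self.backbone = backbone  # shared encoder; assumed to return (B, feat_dim) features
        # Cross-attention lets each augmented view attend to the other before projection.
        self.cross_attn = nn.MultiheadAttention(feat_dim, n_heads, batch_first=True)
        # Instance head: its rows are contrasted across views (row space).
        self.instance_head = nn.Sequential(
            nn.Linear(feat_dim, feat_dim), nn.ReLU(),
            nn.Linear(feat_dim, proj_dim))
        # Cluster head: the columns of its softmax output act as cluster representations (column space).
        self.cluster_head = nn.Sequential(
            nn.Linear(feat_dim, feat_dim), nn.ReLU(),
            nn.Linear(feat_dim, n_clusters), nn.Softmax(dim=1))

    def forward(self, x_weak, x_strong):
        # One weakly and one strongly augmented view share the same backbone.
        h_w = self.backbone(x_weak)    # (B, feat_dim)
        h_s = self.backbone(x_strong)  # (B, feat_dim)
        # Treat each feature vector as a length-1 sequence so the two views can cross-attend.
        q, k = h_w.unsqueeze(1), h_s.unsqueeze(1)
        a_w, _ = self.cross_attn(q, k, k)
        a_s, _ = self.cross_attn(k, q, q)
        h_w = h_w + a_w.squeeze(1)
        h_s = h_s + a_s.squeeze(1)
        z_w, z_s = self.instance_head(h_w), self.instance_head(h_s)  # instance (row) embeddings
        c_w, c_s = self.cluster_head(h_w), self.cluster_head(h_s)    # soft cluster assignments
        return (F.normalize(z_w, dim=1), F.normalize(z_s, dim=1)), (c_w, c_s)


def cluster_contrastive_loss(c_w, c_s, temperature=1.0):
    # Simplified column-space contrast: each column of the (B, K) assignment matrix is a
    # cluster representation, and matching clusters across the two views are positives.
    # A full CC-style loss also contrasts within views and adds an entropy regularizer.
    p_w = F.normalize(c_w.t(), dim=1)     # (K, B): one row per cluster
    p_s = F.normalize(c_s.t(), dim=1)
    logits = p_w @ p_s.t() / temperature  # (K, K) cross-view cluster similarities
    targets = torch.arange(logits.size(0), device=logits.device)
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    # Toy backbone standing in for the paper's encoder (an assumption, not the authors' choice).
    backbone = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 512))
    model = MultiCCSketch(backbone)
    x_weak, x_strong = torch.randn(8, 3, 32, 32), torch.randn(8, 3, 32, 32)
    (z_w, z_s), (c_w, c_s) = model(x_weak, x_strong)
    print(z_w.shape, c_w.shape)  # torch.Size([8, 128]) torch.Size([8, 10])
    print(cluster_contrastive_loss(c_w, c_s).item())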

Bibliographic Details
Main Authors: Yulin Yao, Yu Yang, Linna Zhou, Xinsheng Guo, Gang Wang
Format: Article
Language: English
Published: MDPI AG, 2023-10-01
Series: Electronics
Subjects: clustering; deep clustering; contrastive learning
Online Access: https://www.mdpi.com/2079-9292/12/20/4204
DOI: 10.3390/electronics12204204
ISSN: 2079-9292
Author Affiliations: Yulin Yao, Yu Yang, Linna Zhou, Xinsheng Guo (School of Cyberspace Security, Beijing University of Posts and Telecommunications, Beijing 100876, China); Gang Wang (Intelligent Policing Key Laboratory of Sichuan Province, Sichuan Police College, Luzhou 646000, China)