CCDA: A Novel Method to Explore the Cross-Correlation in Dual-Attention for Multimodal Sentiment Analysis
With the development of the Internet, the content people share increasingly combines text, images, and video, and using these multimodal data for sentiment analysis has become an important area of research. Multimodal sentiment analysis aims to understand and perceive the emotions or sentiments expressed in...
Main Authors: Peicheng Wang, Shuxian Liu, Jinyan Chen
Format: Article
Language: English
Published: MDPI AG, 2024-02-01
Series: Applied Sciences
Subjects: multimodality; sentiment analysis; attention mechanism
Online Access: https://www.mdpi.com/2076-3417/14/5/1934
_version_ | 1797264853177466880 |
author | Peicheng Wang; Shuxian Liu; Jinyan Chen
author_facet | Peicheng Wang; Shuxian Liu; Jinyan Chen
author_sort | Peicheng Wang |
collection | DOAJ |
description | With the development of the Internet, the content people share increasingly combines text, images, and video, and using these multimodal data for sentiment analysis has become an important area of research. Multimodal sentiment analysis aims to understand and perceive the emotions or sentiments expressed in different types of data. The field currently faces two key challenges: (1) inefficient modeling of intramodality and intermodality dynamics and (2) ineffective fusion of multimodal features. In this paper, we propose CCDA (cross-correlation in dual-attention), a novel model for exploring the dynamics between different modalities and fusing multimodal features efficiently. We capture dynamics at the intra- and intermodal levels by using two types of attention mechanisms simultaneously, and we introduce a cross-correlation loss to capture the correlation between the two attention mechanisms. Moreover, a relevant coefficient is proposed to integrate multimodal features effectively. Extensive experiments were conducted on three publicly available datasets: CMU-MOSI, CMU-MOSEI, and CH-SIMS. The results confirm the effectiveness of the proposed method: compared with current state-of-the-art (SOTA) methods, our model shows clear advantages on most key metrics, demonstrating stronger performance in multimodal sentiment analysis.
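The abstract names the model's main components (self-attention for intramodal dynamics, cross-attention for intermodal dynamics, and a cross-correlation loss linking the two) but the record contains no implementation details. As a rough illustration only, here is a minimal PyTorch sketch of that kind of dual-attention block; all module names, dimensions, the mean-pooling step, and the Barlow Twins-style decorrelation form of the loss are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a dual-attention block with a cross-correlation
# loss, loosely following the abstract. Not the CCDA paper's actual code:
# names, dimensions, and the exact loss form are assumptions.
import torch
import torch.nn as nn


class DualAttentionBlock(nn.Module):
    def __init__(self, dim: int = 128, heads: int = 4):
        super().__init__()
        # Self-attention models intramodality dynamics within one modality.
        self.self_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        # Cross-attention models intermodality dynamics: queries come from
        # one modality, keys/values from another.
        self.cross_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x: torch.Tensor, other: torch.Tensor):
        intra, _ = self.self_attn(x, x, x)           # (B, T_x, dim)
        inter, _ = self.cross_attn(x, other, other)  # (B, T_x, dim)
        return intra, inter


def cross_correlation_loss(intra: torch.Tensor, inter: torch.Tensor,
                           lam: float = 5e-3) -> torch.Tensor:
    """One plausible form: standardize pooled features per dimension and
    push the cross-correlation matrix of the two attention outputs toward
    the identity, keeping the streams aligned without collapsing them."""
    a = intra.mean(dim=1)                        # pool over time: (B, dim)
    b = inter.mean(dim=1)
    a = (a - a.mean(0)) / (a.std(0) + 1e-6)      # per-feature standardize
    b = (b - b.mean(0)) / (b.std(0) + 1e-6)
    c = (a.T @ b) / a.shape[0]                   # (dim, dim) correlation
    on_diag = (torch.diagonal(c) - 1).pow(2).sum()
    off_diag = (c - torch.diag(torch.diagonal(c))).pow(2).sum()
    return on_diag + lam * off_diag


# Usage with dummy text/audio features of different sequence lengths:
text = torch.randn(8, 20, 128)
audio = torch.randn(8, 50, 128)
intra, inter = DualAttentionBlock()(text, audio)
loss = cross_correlation_loss(intra, inter)
```

The abstract's "relevant coefficient" fusion step is not sketched, since the record gives no detail on how it weights the modalities.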
first_indexed | 2024-04-25T00:35:30Z |
format | Article |
id | doaj.art-808687507dfb499ba25ce0edd1d1a132 |
institution | Directory Open Access Journal |
issn | 2076-3417 |
language | English |
last_indexed | 2024-04-25T00:35:30Z |
publishDate | 2024-02-01 |
publisher | MDPI AG |
record_format | Article |
series | Applied Sciences |
spelling | doaj.art-808687507dfb499ba25ce0edd1d1a132 (indexed 2024-03-12T16:39:32Z); eng; MDPI AG; Applied Sciences; ISSN 2076-3417; published 2024-02-01; Vol. 14, Issue 5, Article 1934; DOI 10.3390/app14051934; "CCDA: A Novel Method to Explore the Cross-Correlation in Dual-Attention for Multimodal Sentiment Analysis"; Peicheng Wang, Shuxian Liu, Jinyan Chen (School of Information Science and Engineering, Xinjiang University, Urumqi 830017, China); https://www.mdpi.com/2076-3417/14/5/1934; topics: multimodality; sentiment analysis; attention mechanism
spellingShingle | Peicheng Wang; Shuxian Liu; Jinyan Chen; CCDA: A Novel Method to Explore the Cross-Correlation in Dual-Attention for Multimodal Sentiment Analysis; Applied Sciences; multimodality; sentiment analysis; attention mechanism
title | CCDA: A Novel Method to Explore the Cross-Correlation in Dual-Attention for Multimodal Sentiment Analysis |
title_full | CCDA: A Novel Method to Explore the Cross-Correlation in Dual-Attention for Multimodal Sentiment Analysis |
title_fullStr | CCDA: A Novel Method to Explore the Cross-Correlation in Dual-Attention for Multimodal Sentiment Analysis |
title_full_unstemmed | CCDA: A Novel Method to Explore the Cross-Correlation in Dual-Attention for Multimodal Sentiment Analysis |
title_short | CCDA: A Novel Method to Explore the Cross-Correlation in Dual-Attention for Multimodal Sentiment Analysis |
title_sort | ccda a novel method to explore the cross correlation in dual attention for multimodal sentiment analysis |
topic | multimodality; sentiment analysis; attention mechanism
url | https://www.mdpi.com/2076-3417/14/5/1934 |
work_keys_str_mv | AT peichengwang ccdaanovelmethodtoexplorethecrosscorrelationindualattentionformultimodalsentimentanalysis AT shuxianliu ccdaanovelmethodtoexplorethecrosscorrelationindualattentionformultimodalsentimentanalysis AT jinyanchen ccdaanovelmethodtoexplorethecrosscorrelationindualattentionformultimodalsentimentanalysis |