Interpretable Multimodal Sentiment Classification Using Deep Multi-View Attentive Network of Image and Text Data
Multimodal data can convey user emotions and feelings more effectively and interactively than unimodal content. Multimodal sentiment analysis (MSA) has therefore recently gained great significance as a field of study. However, most current approaches either acquire sentimental features indepe...
Main Authors: Israa Khalaf Salman Al-Tameemi, Mohammad-Reza Feizi-Derakhshi, Saeid Pashazadeh, Mohammad Asadpour
Format: Article
Language: English
Published: IEEE, 2023-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10227255/
Similar Items
- Sentiment Analysis of Social Media via Multimodal Feature Fusion
  by: Kang Zhang, et al.
  Published: (2020-12-01)
- CCDA: A Novel Method to Explore the Cross-Correlation in Dual-Attention for Multimodal Sentiment Analysis
  by: Peicheng Wang, et al.
  Published: (2024-02-01)
- Global Local Fusion Neural Network for Multimodal Sentiment Analysis
  by: Xiaoran Hu, et al.
  Published: (2022-08-01)
- UsbVisdaNet: User Behavior Visual Distillation and Attention Network for Multimodal Sentiment Classification
  by: Shangwu Hou, et al.
  Published: (2023-05-01)
- Context-Dependent Multimodal Sentiment Analysis Based on a Complex Attention Mechanism
  by: Lujuan Deng, et al.
  Published: (2023-08-01)