Marfusion: An Attention-Based Multimodal Fusion Model for Human Activity Recognition in Real-World Scenarios
Human Activity Recognition (HAR) plays an important role in the field of ubiquitous computing and can benefit various human-centric applications such as smart homes, health monitoring, and aging-care systems. Human Activity Recognition mainly leverages smartphones and wearable devices to collect sensor...
Main Authors: Yunhan Zhao, Siqi Guo, Zeqi Chen, Qiang Shen, Zhengyuan Meng, Hao Xu
Format: Article
Language: English
Published: MDPI AG, 2022-05-01
Series: Applied Sciences
Online Access: https://www.mdpi.com/2076-3417/12/11/5408
Similar Items
- A Framework to Evaluate Fusion Methods for Multimodal Emotion Recognition
  by: Diego Pena, et al.
  Published: (2023-01-01)
- Robust Multimodal Emotion Recognition from Conversation with Transformer-Based Crossmodality Fusion
  by: Baijun Xie, et al.
  Published: (2021-07-01)
- Multimodal Emotion Recognition and Sentiment Analysis Using Masked Attention and Multimodal Interaction
  by: Tatiana Voloshina, et al.
  Published: (2023-05-01)
- Analysis of Multimodal Fusion Techniques for Audio-Visual Speech Recognition
  by: D.V. Ivanko, et al.
  Published: (2016-05-01)
- Alignment-Enhanced Interactive Fusion Model for Complete and Incomplete Multimodal Hand Gesture Recognition
  by: Shengcai Duan, et al.
  Published: (2023-01-01)