Emotion Estimation Method Based on Emoticon Image Features and Distributed Representations of Sentences
This paper proposes an emotion recognition method for tweets containing emoticons using their emoticon image and language features. Some of the existing methods register emoticons and their facial expression categories in a dictionary and use them, while other methods recognize emoticon facial expre...
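The abstract describes a feature-level fusion model: shape features extracted from the emoticon image are combined with a distributed representation of the tweet text, and the joint feature vector is classified into emotion categories. The sketch below illustrates that kind of fusion only in outline; it is not the paper's implementation, and the use of PyTorch, the vector dimensions, the concatenation-based fusion, the number of emotion classes, and the class name `EmoticonTextFusion` are all illustrative assumptions.

```python
# Minimal sketch of feature-level fusion for emotion estimation.
# NOT the authors' implementation: the embedding sizes, the concatenation
# fusion, and the number of emotion classes are illustrative assumptions.
import torch
import torch.nn as nn

class EmoticonTextFusion(nn.Module):
    def __init__(self, image_dim=128, text_dim=300, hidden_dim=256, num_emotions=6):
        super().__init__()
        # Classifier over the concatenated image + sentence feature vector.
        self.classifier = nn.Sequential(
            nn.Linear(image_dim + text_dim, hidden_dim),
            nn.ReLU(),
            nn.Dropout(0.5),
            nn.Linear(hidden_dim, num_emotions),
        )

    def forward(self, emoticon_image_feat, sentence_embedding):
        # emoticon_image_feat: shape features extracted from the emoticon image
        #   (e.g., the output of a CNN encoder), shape (batch, image_dim).
        # sentence_embedding: distributed representation of the tweet text,
        #   shape (batch, text_dim).
        fused = torch.cat([emoticon_image_feat, sentence_embedding], dim=-1)
        return self.classifier(fused)  # (batch, num_emotions) emotion logits

# Usage with random placeholder features:
model = EmoticonTextFusion()
img_feat = torch.randn(4, 128)   # stand-in for emoticon image features
txt_feat = torch.randn(4, 300)   # stand-in for sentence embeddings
logits = model(img_feat, txt_feat)
print(logits.shape)  # torch.Size([4, 6])
```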
Main Authors: | Akira Fujisawa, Kazuyuki Matsumoto, Minoru Yoshida, Kenji Kita |
---|---|
Format: | Article |
Language: | English |
Published: | MDPI AG, 2022-01-01 |
Series: | Applied Sciences |
Subjects: | emoticon; emotion estimation; multimodal information |
Online Access: | https://www.mdpi.com/2076-3417/12/3/1256 |
author | Akira Fujisawa, Kazuyuki Matsumoto, Minoru Yoshida, Kenji Kita |
collection | DOAJ |
description | This paper proposes an emotion recognition method for tweets containing emoticons using their emoticon image and language features. Some of the existing methods register emoticons and their facial expression categories in a dictionary and use them, while other methods recognize emoticon facial expressions based on the various elements of the emoticons. However, highly accurate emotion recognition cannot be performed unless the recognition is based on a combination of the features of sentences and emoticons. Therefore, we propose a model that recognizes emotions by extracting the shape features of emoticons from their image data and applying the feature vector input that combines the image features with features extracted from the text of the tweets. Based on evaluation experiments, the proposed method is confirmed to achieve high accuracy and shown to be more effective than methods that use text features only. |
format | Article |
id | doaj.art-8e292aa15bad4a27b6154124fccebdfa |
institution | Directory Open Access Journal |
issn | 2076-3417 |
language | English |
publishDate | 2022-01-01 |
publisher | MDPI AG |
series | Applied Sciences |
citation | Applied Sciences, Vol. 12, Iss. 3, Article 1256 (2022-01-01); DOI: 10.3390/app12031256 |
affiliations | Akira Fujisawa: Faculty of Software and Information Technology, Aomori University, Aomori 0300943, Japan. Kazuyuki Matsumoto, Minoru Yoshida, Kenji Kita: Graduate School of Technology, Industrial and Social Sciences, Tokushima University, Tokushima 7708506, Japan. |
title | Emotion Estimation Method Based on Emoticon Image Features and Distributed Representations of Sentences |
topic | emoticon; emotion estimation; multimodal information |
url | https://www.mdpi.com/2076-3417/12/3/1256 |