User-Personality Classification Based on the Non-Verbal Cues from Spoken Conversations
Technology that detects user personality from speech signals is needed to enhance interaction between a user and a virtual agent through a speech interface. In this study, personality patterns were automatically classified as either extroverted or introverted. The patterns were recognized from non-verbal cues such as speech rate, energy, pitch, and silent intervals, together with the patterns of their change. Through experimentation, a maximum pattern classification accuracy of 86.3% was achieved. Using the same data, a manual classification test was carried out by human judges to gauge how the automatic classification of personality traits compared; the manual test showed an accuracy of 86.6%. This indicates that automatic classification of personality traits can reach a level of performance comparable to that of humans. The silent-intervals feature contributed most to the automatic classification, while pitch was the key factor in the manual test. These findings will be useful and applicable in future studies.
Main Authors: | Soonil Kwon, Joon Yeon Choeh, Jong-Weon Lee |
---|---|
Format: | Article |
Language: | English |
Published: | Springer, 2013-08-01 |
Series: | International Journal of Computational Intelligence Systems |
Subjects: | Voice User Interface, User Personality Trait, Speech Processing, Human-Computer Interaction |
Online Access: | https://www.atlantis-press.com/article/25868419.pdf |
author | Soonil Kwon; Joon Yeon Choeh; Jong-Weon Lee |
collection | DOAJ |
description | Technology that detects user personality from speech signals is needed to enhance interaction between a user and a virtual agent through a speech interface. In this study, personality patterns were automatically classified as either extroverted or introverted. The patterns were recognized from non-verbal cues such as speech rate, energy, pitch, and silent intervals, together with the patterns of their change. Through experimentation, a maximum pattern classification accuracy of 86.3% was achieved. Using the same data, a manual classification test was carried out by human judges to gauge how the automatic classification of personality traits compared; the manual test showed an accuracy of 86.6%. This indicates that automatic classification of personality traits can reach a level of performance comparable to that of humans. The silent-intervals feature contributed most to the automatic classification, while pitch was the key factor in the manual test. These findings will be useful and applicable in future studies. |
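The abstract names speech rate, energy, pitch, and silent intervals as the non-verbal cues used for classification. The paper's actual feature extraction, thresholds, and pitch tracker are not given in this record; as a minimal illustrative sketch only, the two cues that the abstract highlights (silent intervals and pitch) can be computed from a raw waveform with short-time energy thresholding and autocorrelation. The frame sizes and the 0.01 energy threshold below are assumptions for the demo, not values from the paper.

```python
import numpy as np

def frame_signal(x, frame_len, hop):
    """Split a 1-D signal into overlapping frames of frame_len samples."""
    n = 1 + max(0, (len(x) - frame_len) // hop)
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n)])

def short_time_energy(frames):
    """Mean squared amplitude per frame."""
    return np.mean(frames ** 2, axis=1)

def silent_intervals(energy, threshold):
    """Return (start_frame, end_frame) runs where energy stays below threshold."""
    silent = energy < threshold
    runs, start = [], None
    for i, s in enumerate(silent):
        if s and start is None:
            start = i
        elif not s and start is not None:
            runs.append((start, i))
            start = None
    if start is not None:
        runs.append((start, len(silent)))
    return runs

def autocorr_pitch(frame, sr, fmin=75, fmax=400):
    """Estimate F0 of a voiced frame from the autocorrelation peak."""
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1 :]
    lo, hi = int(sr / fmax), int(sr / fmin)   # search plausible pitch lags
    lag = lo + np.argmax(ac[lo:hi])
    return sr / lag

# Demo on a synthetic utterance: 0.3 s of a 200 Hz tone, 0.2 s pause, 0.3 s tone.
sr = 16000
t = np.arange(int(0.3 * sr)) / sr
voiced = 0.5 * np.sin(2 * np.pi * 200 * t)
sig = np.concatenate([voiced, np.zeros(int(0.2 * sr)), voiced])
frames = frame_signal(sig, 400, 200)          # 25 ms frames, 12.5 ms hop
energy = short_time_energy(frames)
pauses = silent_intervals(energy, 0.01)       # one run covering the pause
f0 = autocorr_pitch(frames[0], sr)            # close to 200 Hz
```

A real system would replace the fixed energy threshold with an adaptive one and use a more robust pitch tracker, but the silent-interval statistics and per-frame F0 values produced this way are the kind of raw material such cue-based classifiers are built from.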
format | Article |
id | doaj.art-21976974f4d943bf8e656c3d36fdfe53 |
institution | Directory Open Access Journal |
issn | 1875-6883 |
language | English |
publishDate | 2013-08-01 |
publisher | Springer |
record_format | Article |
series | International Journal of Computational Intelligence Systems |
doi | 10.1080/18756891.2013.804143 |
title | User-Personality Classification Based on the Non-Verbal Cues from Spoken Conversations |
topic | Voice User Interface User Personality Trait Speech Processing Human-Computer Interaction |
url | https://www.atlantis-press.com/article/25868419.pdf |