Audio-Visual Sentiment Analysis for Learning Emotional Arcs in Movies


Bibliographic Details
Main Authors: Chu, Eric, Roy, Deb
Format: Article
Language: English
Published: IEEE 2021
Online Access: https://hdl.handle.net/1721.1/137445
description © 2017 IEEE. Stories can have tremendous power - not only are they useful for entertainment, they can also activate our interests and mobilize our actions. The degree to which a story resonates with its audience may be reflected in part in the emotional journey it takes the audience on. In this paper, we use machine learning methods to construct emotional arcs in movies, calculate families of arcs, and demonstrate the ability of certain arcs to predict audience engagement. The system is applied to Hollywood films and high-quality shorts found on the web. We begin by using deep convolutional neural networks for audio and visual sentiment analysis. These models are trained on both new and existing large-scale datasets, after which they can be used to compute separate audio and visual emotional arcs. We then crowdsource annotations for 30-second video clips extracted from highs and lows in the arcs in order to assess the micro-level precision of the system, with precision measured as agreement in polarity between the system's predictions and annotators' ratings. These annotations are also used to combine the audio and visual predictions. Next, we look at macro-level characterizations of movies by investigating whether there exist 'universal shapes' of emotional arcs. In particular, we develop a clustering approach to discover distinct classes of emotional arcs. Finally, we show on a sample corpus of short web videos that certain emotional arcs are statistically significant predictors of the number of comments a video receives. These results suggest that the emotional arcs learned by our approach successfully represent macroscopic aspects of a video story that drive audience engagement. Such machine understanding could be used to predict audience reactions to video stories, ultimately improving our ability as storytellers to communicate with one another.
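The abstract describes turning per-clip sentiment predictions into an "emotional arc" over the course of a movie. A minimal sketch of that smoothing step, assuming hypothetical per-clip sentiment scores in [-1, 1] and a moving-average window (the paper's actual CNN models and smoothing parameters are not reproduced here):

```python
# Illustrative sketch, not the authors' code: build an "emotional arc" by
# smoothing a sequence of per-clip sentiment scores with a moving average.
# `window` is a hypothetical parameter chosen for this toy example.

def emotional_arc(scores, window=3):
    """Smooth raw per-clip sentiment scores into an arc via a moving average."""
    if window < 1 or window > len(scores):
        raise ValueError("window must be between 1 and len(scores)")
    arc = []
    for i in range(len(scores) - window + 1):
        arc.append(sum(scores[i:i + window]) / window)
    return arc

# Toy sequence: sentiment rises, dips sharply, then recovers.
raw = [0.1, 0.3, 0.2, 0.6, 0.8, 0.4, -0.2, -0.5, 0.0, 0.5]
arc = emotional_arc(raw, window=3)
print(arc)
```

Highs and lows of such an arc would mark the clips sent out for crowdsourced polarity annotation; the arcs themselves (one per video) would then be the inputs to the clustering step that looks for distinct arc classes.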
id mit-1721.1/137445
institution Massachusetts Institute of Technology
citation Chu, Eric and Roy, Deb. 2017. "Audio-Visual Sentiment Analysis for Learning Emotional Arcs in Movies."
type Conference Paper (http://purl.org/eprint/type/ConferencePaper)
date_issued 2017-11
date_accessioned 2021-11-05
doi 10.1109/icdm.2017.100
license Creative Commons Attribution-Noncommercial-Share Alike (http://creativecommons.org/licenses/by-nc-sa/4.0/)
file_format application/pdf
source IEEE; MIT web domain