Acume: A New Visualization Tool for Understanding Facial Expression and Gesture Data

Bibliographic Details
Main Authors: McDuff, Daniel Jonathan, el Kaliouby, Rana, Kassam, Karim, Picard, Rosalind W.
Other Authors: Massachusetts Institute of Technology. Media Laboratory
Format: Article
Language: en_US
Published: Institute of Electrical and Electronics Engineers 2011
Online Access: http://hdl.handle.net/1721.1/67459
https://orcid.org/0000-0002-5661-0022
Description
Summary: Facial and head actions contain significant affective information. To date, these actions have mostly been studied in isolation because the space of naturalistic combinations is vast. Interactive visualization tools could enable new explorations of dynamically changing combinations of actions as people interact with natural stimuli. This paper describes a new open-source tool that enables navigation of and interaction with dynamic face and gesture data across large groups of people, making it easy to see when multiple facial actions co-occur, and how these patterns compare and cluster across groups of participants. We share two case studies that demonstrate how the tool allows researchers to quickly view an entire corpus of data for single or multiple participants, stimuli and actions. Acume yielded patterns of actions across participants and across stimuli, and helped give insight into how our automated facial analysis methods could be better designed. The results of these case studies are used to demonstrate the efficacy of the tool. The open-source code is designed to directly address the needs of the face and gesture research community, while also being extensible and flexible for accommodating other kinds of behavioral data. Source code, application and documentation are available at http://affect.media.mit.edu/acume.
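
To make the notion of facial actions "co-occurring" across frames concrete, a minimal sketch is shown below. It is an illustration of the general idea only, not Acume's actual implementation; the function name, data layout, and example values are assumptions introduced here for clarity.

```python
import numpy as np

def cooccurrence_matrix(detections: np.ndarray) -> np.ndarray:
    """Count pairwise co-occurrences of facial actions.

    detections: (n_frames, n_actions) array of binary per-frame detections
                (e.g. one column per facial action unit).
    Returns an (n_actions, n_actions) matrix whose (i, j) entry is the
    number of frames in which actions i and j are both active.
    """
    d = (detections > 0).astype(int)
    return d.T @ d

# Hypothetical example: 5 frames, 3 actions (e.g. brow raise, smile, head nod)
frames = np.array([
    [1, 1, 0],
    [1, 0, 0],
    [0, 1, 1],
    [1, 1, 0],
    [0, 0, 0],
])
print(cooccurrence_matrix(frames))
```

Aggregating such matrices per participant or per stimulus would give the kind of cross-participant, cross-stimulus comparison the abstract describes, though the tool itself exposes this interactively rather than through code like the above.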