3D Visual Tracking to Quantify Physical Contact Interactions in Human-to-Human Touch

Across a plethora of social situations, we touch others in natural and intuitive ways to share thoughts and emotions, such as tapping to get one's attention or caressing to soothe one's anxiety. A deeper understanding of these human-to-human interactions will require, in part, the precise measurement of skin-to-skin physical contact. Among prior efforts, each measurement approach exhibits certain constraints, e.g., motion trackers do not capture the precise shape of skin surfaces, while pressure sensors impede skin-to-skin contact. In contrast, this work develops an interference-free 3D visual tracking system using a depth camera to measure the contact attributes between the bare hand of a toucher and the forearm of a receiver. The toucher's hand is tracked as a posed and positioned mesh by fitting a hand model to detected 3D hand joints, whereas the receiver's forearm is extracted as a 3D surface updated upon repeated skin contact. Based on a contact model involving point clouds, the spatiotemporal changes of hand-to-forearm contact are decomposed into six high-resolution, time-series contact attributes, i.e., contact area, indentation depth, absolute velocity, and three orthogonal velocity components, together with contact duration. To examine the system's capabilities and limitations, two types of experiments were performed. First, to evaluate its ability to discern human touches, one person delivered cued social messages, e.g., happiness, anger, sympathy, to another person using their preferred gestures. The results indicated that messages and gestures, as well as the identities of the touchers, were readily discerned from their contact attributes. Second, the system's spatiotemporal accuracy was validated against measurements from independent devices, including an electromagnetic motion tracker, sensorized pressure mat, and laser displacement sensor. While validated here in the context of social communication, this system is extendable to human touch interactions such as maternal care of infants and massage therapy.
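The decomposition described above, from registered point clouds to per-frame contact attributes, can be illustrated with a short sketch. The following Python code is a minimal illustration, not the authors' implementation: the function name, contact tolerance, and per-point area are assumptions, and the published pipeline fits a posed hand-model mesh rather than thresholding raw point distances.

# A minimal sketch, assuming both clouds are already registered to a
# shared coordinate frame; all parameter values are illustrative.
import numpy as np

def contact_attributes(hand_pts, forearm_pts, prev_centroid, dt,
                       contact_tol=0.002, per_point_area=1e-6):
    """Estimate one frame of hand-to-forearm contact attributes.

    hand_pts, forearm_pts : (N, 3) and (M, 3) point clouds, in meters.
    prev_centroid         : contact-patch centroid from the previous frame.
    dt                    : inter-frame interval in seconds.
    """
    # Nearest forearm point for every hand point (brute force here; a
    # KD-tree such as scipy.spatial.cKDTree would be used at scale).
    dists = np.linalg.norm(hand_pts[:, None, :] - forearm_pts[None, :, :], axis=2)
    nearest = dists.min(axis=1)

    in_contact = nearest < contact_tol
    if not in_contact.any():
        return None  # hand is not touching the forearm this frame

    patch = hand_pts[in_contact]

    # Contact area: patch size times an assumed per-point sampling area.
    area = in_contact.sum() * per_point_area

    # Indentation depth: crude proxy, deepest penetration past the
    # tolerance shell around the resting forearm surface.
    depth = float((contact_tol - nearest[in_contact]).max())

    # Patch velocity: finite difference of the contact centroid, which can
    # then be projected onto three orthogonal axes of the forearm frame.
    centroid = patch.mean(axis=0)
    velocity = (centroid - prev_centroid) / dt

    return {"area": area, "depth": depth,
            "velocity": velocity, "speed": float(np.linalg.norm(velocity))}

Contact duration, the sixth attribute's companion measure, would then follow by counting consecutive frames in which a contact patch is found.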

Bibliographic Details
Main Authors: Shan Xu, Chang Xu, Sarah McIntyre, Håkan Olausson, Gregory J. Gerling
Format: Article
Language: English
Published: Frontiers Media S.A., 2022-06-01
Series: Frontiers in Physiology, vol. 13, article 841938
Subjects: touch; social touch; haptics; visual tracking; tactile mechanics; human performance
ISSN: 1664-042X
DOI: 10.3389/fphys.2022.841938
Online Access:https://www.frontiersin.org/articles/10.3389/fphys.2022.841938/full

Author Affiliations:
Shan Xu: School of Engineering and Applied Science, University of Virginia, Charlottesville, VA, United States
Chang Xu: School of Engineering and Applied Science, University of Virginia, Charlottesville, VA, United States
Sarah McIntyre: Center for Social and Affective Neuroscience (CSAN), Linköping University, Linköping, Sweden
Håkan Olausson: Center for Social and Affective Neuroscience (CSAN), Linköping University, Linköping, Sweden
Gregory J. Gerling: School of Engineering and Applied Science, University of Virginia, Charlottesville, VA, United States