Stream/Bounce Event Perception Reveals a Temporal Limit of Motion Correspondence Based on Surface Feature over Space and Time
We examined how stream/bounce event perception is affected by motion correspondence based on the surface features of moving objects passing behind an occluder. In the stream/bounce display, two identical objects moving across each other in a two-dimensional display can be perceived as either streaming through or bouncing off each other at coincidence. Here, surface features such as colour (Experiments 1 and 2) or luminance (Experiment 3) were switched between the two objects at coincidence. The moment of coincidence was invisible to observers because of an occluder. Additionally, we manipulated the duration for which the moving objects were presented after the feature switch at coincidence. The results revealed that a postcoincidence duration of approximately 200 ms was required for the visual system to stabilize stream/bounce judgments by determining motion correspondence between the objects across the occlusion on the basis of the surface feature. This critical duration was similar across object speeds and types of surface feature. Moreover, control experiments (Experiments 4a–4c) showed that a cognitive bias based on feature (colour/luminance) congruency across the occlusion could not fully account for the effects of surface features on stream/bounce judgments. We discuss the roles of motion correspondence, visual feature processing, and attentive tracking in stream/bounce judgments.
Main Authors: | Yousuke Kawachi, Takahiro Kawabe, Jiro Gyoba |
---|---|
Format: | Article |
Language: | English |
Published: | SAGE Publishing, 2011-06-01 |
Series: | i-Perception |
ISSN: | 2041-6695 |
Online Access: | https://doi.org/10.1068/i0399 |