Wearable Visual Robots

Research work reported in the literature in wearable visual computing has used exclusively static (or non-active) cameras, making the imagery and image measurements dependent on the wearer's posture and motions. It is assumed that the camera is pointing in a good direction to view relevant parts of the scene at best by virtue of being mounted on the wearer's head, or at worst wholly by chance. Even when pointing in roughly the correct direction, any visual processing relying on feature correspondence from a passive camera is made more difficult by the large, uncontrolled inter-image movements which occur when the wearer moves, or even breathes. This paper presents a wearable active visual sensor which is able to achieve a level of decoupling of camera movement from the wearer's posture and motions by a combination of inertial and visual sensor feedback and active control. The issues of sensor placement, robot kinematics and their relation to wearability are discussed. The performance of the prototype robot is evaluated for some essential visual tasks. The paper also discusses potential applications for this kind of wearable robot. © 2002 Springer-Verlag London Ltd.
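The decoupling the abstract describes — counter-rotating the camera against measured head motion, with visual feedback correcting residual drift — can be illustrated with a toy one-axis control loop. This is a hypothetical sketch for illustration only, not the authors' implementation; the function name, gains, and sensor model are all assumptions.

```python
# Toy sketch of inertial gaze stabilization on a single (yaw) axis:
# a gyro-style rate measurement drives feed-forward counter-rotation of
# the camera joint, and a visual error term corrects the remaining drift.
# All names and gains here are hypothetical.

def stabilize_gaze(head_yaw_rate, joint_angle, gaze_error, kp=0.5, dt=0.01):
    """One control step (angles in radians).

    feed-forward: cancel the inertially measured head rotation.
    feedback:     proportional correction of the residual gaze error
                  (as a visual tracker might report it).
    """
    feedforward = -head_yaw_rate * dt   # counter-rotate against head motion
    feedback = -kp * gaze_error         # visual correction of residual drift
    return joint_angle + feedforward + feedback

# Simulate a steady head turn at 1 rad/s for 1 s: the joint tracks the
# opposite rotation, so the world-frame gaze (head + joint) stays near zero.
head, joint = 0.0, 0.0
for _ in range(100):
    head += 1.0 * 0.01                        # head rotates by 0.01 rad/step
    joint = stabilize_gaze(1.0, joint, gaze_error=head + joint)
world_gaze = head + joint                     # small residual, not exactly 0
```

In this toy loop the residual gaze error converges to a small steady-state offset rather than exactly zero, which is why the paper's combination of inertial feed-forward with visual feedback (rather than either alone) matters.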

Bibliographic Details
Main Authors: Mayol, W, Tordoff, B, Murray, D
Format: Journal article
Language: English
Published: Springer-Verlag London Ltd, 2002
Institution: University of Oxford