A Novel Identification Methodology for the Coordinate Relationship between a 3D Vision System and a Legged Robot

Bibliographic Details
Main Authors: Xun Chai, Feng Gao, Yang Pan, Chenkun Qi, Yilin Xu
Format: Article
Language: English
Published: MDPI AG 2015-04-01
Series: Sensors
Online Access: http://www.mdpi.com/1424-8220/15/4/9519
Description
Summary: Coordinate identification between vision systems and robots is a challenging issue in intelligent robotic applications, which involve steps such as perceiving the immediate environment, building a terrain map, and planning locomotion automatically. It is now well established that current identification methods have non-negligible limitations, such as difficult feature matching, the need for external tools, and the involvement of multiple operators. In this paper, we propose a novel methodology to identify the geometric parameters of 3D vision systems mounted on robots without involving other people or additional equipment. In particular, our method focuses on legged robots, which have complex body structures and excellent locomotion ability compared to their wheeled/tracked counterparts. The parameters can be identified simply by moving the robot on relatively flat ground. Concretely, an estimation approach is provided to compute the ground plane, and the relationship between the robot and the ground is modeled. The parameters are then obtained by formulating the identification problem as an optimization problem. The methodology is integrated on a legged robot called “Octopus”, which can traverse rough terrain with high stability after the identification parameters of its mounted vision system are obtained using the proposed method. Diverse experiments in different environments demonstrate that the proposed method is accurate and robust.
ISSN:1424-8220
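
The summary describes a pipeline in which the ground plane is estimated from the vision system's data and the sensor-to-robot parameters are then recovered by optimization. As a loose illustration of the ground-plane step only, the following minimal Python sketch (hypothetical, not the authors' implementation; all function names and the synthetic setup are invented) fits a plane to camera-frame ground points by least squares and recovers the sensor's height and tilt relative to the ground:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit via SVD.

    Returns a unit normal n and offset d such that n . p + d = 0
    for points p on the plane, with d >= 0 (normal oriented toward
    the camera origin).
    """
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value of the
    # centered point cloud is the plane normal.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    n = vt[-1]
    d = -n @ centroid
    if d < 0:
        n, d = -n, -d  # orient the normal toward the camera origin
    return n, d

def rotation_from_a_to_b(a, b):
    """Rodrigues rotation matrix taking unit vector a onto unit vector b.

    Assumes a and b are not antiparallel, which holds here because the
    camera looks broadly toward the ground.
    """
    v = np.cross(a, b)
    c = float(a @ b)
    vx = np.array([[0.0, -v[2], v[1]],
                   [v[2], 0.0, -v[0]],
                   [-v[1], v[0], 0.0]])
    return np.eye(3) + vx + vx @ vx / (1.0 + c)

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Synthetic example: a camera 0.8 m above flat ground, rotated 25 deg
    # about its x-axis. R_true maps camera coordinates to world coordinates.
    pitch = np.deg2rad(25.0)
    R_true = np.array([[1.0, 0.0, 0.0],
                       [0.0, np.cos(pitch), -np.sin(pitch)],
                       [0.0, np.sin(pitch), np.cos(pitch)]])
    height = 0.8

    # Ground points in a world frame (z up), re-expressed in the camera frame.
    xy = rng.uniform(-2.0, 2.0, size=(500, 2))
    world_pts = np.column_stack([xy, np.zeros(len(xy))])
    cam_pts = (world_pts - np.array([0.0, 0.0, height])) @ R_true
    cam_pts += rng.normal(scale=0.005, size=cam_pts.shape)  # sensor noise

    n, d = fit_plane(cam_pts)
    print("estimated camera height above ground:", d)  # close to 0.8
    # Rotation that levels the camera: maps the measured ground normal to +z.
    R_level = rotation_from_a_to_b(n, np.array([0.0, 0.0, 1.0]))
```

In the paper's full setting, such per-pose plane estimates would be combined with the robot's known leg kinematics and fed into the optimization over the camera-to-robot transform; the sketch above stops at a single pose.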