VisGraB: A Benchmark for Vision-Based Grasping
We present a database and a software tool, VisGraB, for benchmarking methods for vision-based grasping of unknown objects, where no prior object knowledge is assumed. The benchmark combines a real-world and a simulated experimental setup. Stereo images of real scenes containing several objects in different c...
| Main authors | Kootstra Gert, Popović Mila, Jørgensen Jimmy Alison, Kragic Danica, Petersen Henrik Gordon, Krüger Norbert |
|---|---|
| Format | Article |
| Language | English |
| Published | De Gruyter, 2012-06-01 |
| Series | Paladyn |
| Online link | https://doi.org/10.2478/s13230-012-0020-5 |
Similar documents

- Model-Based Grasping of Unknown Objects from a Random Pile
  by: Bruno Sauvet, et al.
  Published: (2019-09-01)
- Robotic Grasping of Unknown Objects Based on Deep Learning-Based Feature Detection
  by: Kai Sherng Khor, et al.
  Published: (2024-07-01)
- Performance measures to benchmark the grasping, manipulation, and assembly of deformable objects typical to manufacturing applications
  by: Kenneth Kimble, et al.
  Published: (2022-11-01)
- Imitation Learning of Whole-Body Grasps
  by: Hsiao, Kaijen, et al.
  Published: (2005)
- An Accessible, Open-Source Dexterity Test: Evaluating the Grasping and Dexterous Manipulation Capabilities of Humans and Robots
  by: Nathan Elangovan, et al.
  Published: (2022-04-01)