The CONTACT Visuo-Motor Grasping Database
Grasping is one of the most interesting challenges in robotics today, posing
problems to the mechanical and electronic engineer, the computer
vision researcher, the control theorist and, more recently, the
neuroscientist. The study of human grasping has proved beneficial for
gaining a better understanding of the problem. In this paper we present VMGdB,
the CONTACT Visuo-Motor Grasping Database, a recording of grasping
actions performed by 20 human subjects on 7 objects using 5 ways of
grasping, under variable illumination conditions. The VMGdB consists of
5200 grasping acts organized in 260 data entries, each consisting
of 2 video sequences recorded from two colour cameras, and motor data
recorded from a sensorised glove. Labeled data are available as
standard AVI videos and a file of ASCII outputs from the glove.
The intentionally unstructured illumination conditions, and the fact that the objects have the most diverse shapes, textures, and colours, make this database a rather realistic model of the human act of grasping.
The VMGdB provides the community with a reliable and flexible testbed for tackling the problem of grasping from a humanoid/human-oriented perspective, and hopefully beyond.
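The stated totals also fix the granularity of the data: 5200 grasping acts spread over 260 entries means 20 repetitions per entry. A minimal sketch of this arithmetic (the even-split interpretation is our inference from the numbers above, not a documented property of the database):

```python
# Bookkeeping implied by the database's stated totals.
# The even split of acts across entries is an inference, not a
# documented property of VMGdB.
TOTAL_ACTS = 5200    # grasping acts in the database
NUM_ENTRIES = 260    # (subject, object, grasp) data entries

acts_per_entry, remainder = divmod(TOTAL_ACTS, NUM_ENTRIES)
assert remainder == 0  # the totals divide evenly
print(acts_per_entry)  # 20 repetitions per entry
```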
VMGdB contains recordings of grasping actions performed by
human subjects sitting in front of a desk onto which an object to grasp was placed.
Each subject was asked to grasp the object in front of him/her with the right hand, which wore a sensorised glove, and then, as the right hand returned to its resting position, to put the object back in its original position on the desk with the left hand.
The scene was illuminated by natural light. Illumination conditions were intentionally not controlled and changed over time, since the acquisition sessions were spread over a week.
The main acquisition components are two colour cameras and a sensorised glove.
Each of the 260 (subject, object, grasp) entries is associated with the following data: two video sequences recorded from the colour cameras, and the motor data recorded from the sensorised glove.
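Since the glove data are distributed as ASCII files, loading them reduces to parsing whitespace-separated sensor readings. The sketch below is a guess at that layout: the record structure (an integer frame counter followed by joint-sensor values) is our assumption and is not documented here.

```python
def parse_glove_line(line):
    """Parse one ASCII glove record into (frame_index, sensor_values).

    Assumed layout: an integer frame counter followed by
    whitespace-separated sensor readings (the layout is hypothetical).
    """
    fields = line.split()
    return int(fields[0]), [float(v) for v in fields[1:]]

# Example record (synthetic, for illustration only)
sample = "42 0.13 0.87 1.02 0.55"
frame, values = parse_glove_line(sample)
print(frame, values)  # 42 [0.13, 0.87, 1.02, 0.55]
```

Adapting this to the real files is a matter of replacing the assumed column layout with the one described in the database's own documentation.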
Click on the links to download zip files (about
200 MB each) of the grasping actions performed by each volunteer.
For hints on how to browse the data and how to use the scripts, see the accompanying documentation.
|Contacts and acknowledgements|
This is joint work between Università degli Studi di Genova (IT), IDIAP (CH) and IIT (IT).
For more information: noceti <at> disi.unige.it (Nicoletta Noceti)