Benchmarks for Understanding Grasping
Disciplines
Electrical Engineering, Electronics, Information Engineering (60%); Computer Sciences (40%)
Keywords
Part recognition,
Robot,
Benchmark,
Object,
Grasping
Grasping rigid objects has been studied extensively under a wide variety of
settings. The common measure of success is whether the robot can hold an
object for a few seconds. This is not enough. To obtain a deeper
understanding of object manipulation, we propose (1) a task-oriented, part-
based modelling of grasping and (2) BURG - our castle* of setups, tools and
metrics for community building around an objective benchmark protocol.
The idea is to boost grasping research by focusing on complete tasks. This
calls for attention to object parts, since they are essential for knowing
how and where the gripper can grasp given the manipulation constraints
imposed by the task. Moreover, parts facilitate knowledge transfer to novel
objects, across different sources (virtual/real data) and grippers,
providing for a versatile and scalable system. The part-based approach
naturally extends to deformable objects, for which the recognition of
relevant semantic parts, regardless of the object's actual deformation, is
essential to obtain a tractable manipulation problem. Finally, by focusing
on parts we can deal more easily with environmental constraints, which are
detected and used to facilitate grasping.
Regarding benchmarking of manipulation, robotics has so far suffered from
incomparable grasping and manipulation work. Datasets cover only the object
detection aspect. Object sets are difficult to obtain and not extensible,
and neither scenes nor manipulation tasks are replicable. There are no
common tools that address the basic needs of setting up replicable scenes
or reliably estimating object pose.
Hence, with the BURG benchmark we propose to focus on community building
by enabling and sharing tools for reproducible performance evaluation,
including collecting data and feedback from different laboratories to
study manipulation across different robot embodiments. We will develop a
set of repeatable scenarios spanning different levels of quantifiable
complexity, involving the choice of objects, tasks and environments.
Examples include fully quantified settings with layers of objects, adding
deformable objects and environmental constraints. The benchmark will
include metrics defined to assess the performance of both low-level
primitives (object pose, grasp point and type, collision-free motion) and
manipulation tasks (stacking, aligning, assembling, packing, handover,
folding) requiring ordering and common-sense knowledge for semantic
reasoning.
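As an illustration of a low-level primitive metric, object pose accuracy is commonly evaluated with the Average Distance of model points (ADD), a standard measure in the 6D pose estimation literature. The sketch below is a minimal, hypothetical example of such a metric, not the project's exact metric definition:

```python
import numpy as np

def add_metric(model_points, R_gt, t_gt, R_est, t_est):
    """Average Distance of model points (ADD): mean Euclidean distance
    between corresponding model points transformed by the ground-truth
    and the estimated pose."""
    pts_gt = model_points @ R_gt.T + t_gt
    pts_est = model_points @ R_est.T + t_est
    return np.linalg.norm(pts_gt - pts_est, axis=1).mean()

# Example: identity ground-truth pose vs. a 5 mm translation error
pts = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
I, z = np.eye(3), np.zeros(3)
err = add_metric(pts, I, z, I, np.array([0.005, 0.0, 0.0]))
# err is 0.005 m here; a pose is often counted correct if ADD is
# below a threshold such as 10% of the object diameter
```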
* Burg [bʊʁk] f. - German for "castle"
Results include a set of tools for creating and carrying out the BURG benchmark, as well as software tools for annotation and processing of the data.
- Technische Universität Wien - 100%
- Tatiana Tommasi, other research or development institutions - Italy
- Guillem Alenya, Agencia Estatal Consejo Superior de Investigaciones Cientificas - Spain
- Ales Leonardis, The University of Birmingham
Research Output
- 774 Citations
- 22 Publications
- 1 Policy
- 2 Methods & Materials
- 2 Datasets & models
- 1 Software
- 1 Dissemination
- 2 Scientific Awards
- 1 Funding
-
2020
Title Learn, detect, and grasp objects in real-world settings DOI 10.1007/s00502-020-00817-6 Type Journal Article Author Vincze M Journal e & i Elektrotechnik und Informationstechnik Pages 324-330 Link Publication -
2020
Title Neural Object Learning for 6D Pose Estimation Using a Few Cluttered Images DOI 10.1007/978-3-030-58548-8_38 Type Book Chapter Author Park K Publisher Springer Nature Pages 656-673 -
2020
Title Robot perception of static and dynamic objects with an autonomous floor scrubber DOI 10.1007/s11370-020-00324-9 Type Journal Article Author Yan Z Journal Intelligent Service Robotics Pages 403-417 -
2020
Title VeREFINE: Integrating Object Pose Verification With Physics-Guided Iterative Refinement DOI 10.1109/lra.2020.2996059 Type Journal Article Author Bauer D Journal IEEE Robotics and Automation Letters Pages 4289-4296 Link Publication -
2020
Title DGCM-Net: Dense Geometrical Correspondence Matching Network for Incremental Experience-Based Robotic Grasping DOI 10.3389/frobt.2020.00120 Type Journal Article Author Patten T Journal Frontiers in Robotics and AI Pages 120 Link Publication -
2020
Title Unsupervised Domain Adaptation Through Inter-Modal Rotation for RGB-D Object Recognition DOI 10.1109/lra.2020.3007092 Type Journal Article Author Loghmani M Journal IEEE Robotics and Automation Letters Pages 6631-6638 Link Publication -
2019
Title Pix2Pose: Pixel-Wise Coordinate Regression of Objects for 6D Pose Estimation DOI 10.1109/iccv.2019.00776 Type Conference Proceeding Abstract Author Park K Pages 7667-7676 Link Publication -
2019
Title EasyLabel: A Semi-Automatic Pixel-wise Object Annotation Tool for Creating Robotic RGB-D Datasets DOI 10.1109/icra.2019.8793917 Type Conference Proceeding Abstract Author Suchi M Pages 6678-6684 Link Publication -
2019
Title Pix2Pose: Pixel-Wise Coordinate Regression of Objects for 6D Pose Estimation DOI 10.48550/arxiv.1908.07433 Type Preprint Author Park K -
2022
Title SporeAgent: Reinforced Scene-level Plausibility for Object Pose Refinement DOI 10.1109/wacv51458.2022.00027 Type Conference Proceeding Abstract Author Bauer D Pages 196-204 Link Publication -
2022
Title BURG-Toolkit: Robot Grasping Experiments in Simulation and the Real World DOI 10.48550/arxiv.2205.14099 Type Preprint Author Rudorfer M -
2022
Title Where Does It Belong? Autonomous Object Mapping in Open-World Settings DOI 10.3389/frobt.2022.828732 Type Journal Article Author Langer E Journal Frontiers in Robotics and AI Pages 828732 Link Publication -
2021
Title ReAgent: Point Cloud Registration using Imitation and Reinforcement Learning DOI 10.1109/cvpr46437.2021.01435 Type Conference Proceeding Abstract Author Bauer D Pages 14581-14589 Link Publication -
2020
Title Physical Plausibility of 6D Pose Estimates in Scenes of Static Rigid Objects DOI 10.1007/978-3-030-66096-3_43 Type Book Chapter Author Bauer D Publisher Springer Nature Pages 648-662 -
2023
Title 3D-DAT: 3D-Dataset Annotation Toolkit for Robotic Vision DOI 10.1109/icra48891.2023.10160669 Type Conference Proceeding Abstract Author Neuberger B Pages 9162-9168 -
2023
Title COPE: End-to-end trainable Constant Runtime Object Pose Estimation DOI 10.1109/wacv56688.2023.00288 Type Conference Proceeding Abstract Author Patten T Pages 2859-2869 -
2023
Title Recognizing transparent objects for laboratory automation DOI 10.1007/s00502-023-01158-w Type Journal Article Author Vincze M Journal Elektrotechnik und Informationstechnik : E & I Pages 519-529 -
2023
Title BURG-Toolkit: Robot Grasping Experiments in Simulation and the Real World Type Conference Proceeding Abstract Author Markus Suchi Conference UK Robot Manipulation workshop -
2023
Title Object Change Detection for Autonomous Indoor Robots in Open-World Settings DOI 10.34726/hss.2023.111500 Type Other Author Langer E Link Publication -
2023
Title BURG-Toolkit: Robot Grasping Experiments in Simulation and the Real World DOI 10.34726/5453 Type Other Author Rudorfer M Link Publication -
2019
Title Recurrent Convolutional Fusion for RGB-D Object Recognition DOI 10.1109/lra.2019.2921506 Type Journal Article Author Loghmani M Journal IEEE Robotics and Automation Letters Pages 2878-2885 Link Publication -
2019
Title An Empirical Evaluation of Ten Depth Cameras DOI 10.1109/mra.2018.2852795 Type Journal Article Author Halmetschlager-Funek G Journal IEEE Robotics & Automation Magazine Pages 67-77 Link Publication
-
2020
Title Trying to increase the interest in STEM Type Implementation circular/rapid advice/letter to e.g. Ministry of Health
-
2022
Title Object class detection and pose within the heap Type Improvements to research infrastructure Public Access -
2021
Title Probabilistic part-based scene segmentation Type Improvements to research infrastructure Public Access
-
2023
Title 3D-DAT 3D-Dataset Annotation Toolkit for Robotic Vision Type Data analysis technique Public Access Link -
2022
Title ObChange Dataset DOI 10.48436/r2sxr-tsc12 Type Database/Collection of data Public Access Link
-
2020
Title School workshops Type Participation in an activity, workshop or similar
-
2022
Title Invitation to Robotics Lab Opening at University Bremen as keynote speaker Type Personally asked as a keynote speaker to a conference Level of Recognition Continental/International -
2021
Title Nomination for best paper award of IEEE RA-L Type Research prize Level of Recognition Continental/International
-
2020
Title Traceable Robotics Handling of Sterile Medical Products Type Research grant (including intramural programme) Start of Funding 2020 Funder European Commission