FWF — Austrian Science Fund
Benchmarks for Understanding Grasping

Markus Vincze (ORCID: 0000-0002-2799-491X)
  • Grant DOI 10.55776/I3967
  • Funding program International - Multilateral Initiatives
  • Status ended
  • Start October 1, 2019
  • End July 31, 2023
  • Funding amount € 249,354
  • Project website

Disciplines

Electrical Engineering, Electronics, Information Engineering (60%); Computer Sciences (40%)

Keywords

    Part Recognition, Robot, Benchmark, Object, Grasping

Abstract

Grasping rigid objects has been studied extensively under a wide variety of settings. The common measure of success is checking whether the robot can hold an object for a few seconds. This is not enough. To obtain a deeper understanding of object manipulation, we propose (1) task-oriented, part-based modelling of grasping and (2) BURG, our castle* of setups, tools, and metrics for community building around an objective benchmark protocol. The idea is to boost grasping research by focusing on complete tasks. This calls for attention to object parts, since they are essential for knowing how and where the gripper can grasp given the manipulation constraints imposed by the task. Moreover, parts facilitate knowledge transfer to novel objects, across different sources (virtual/real data) and grippers, providing for a versatile and scalable system. The part-based approach extends naturally to deformable objects, for which recognizing the relevant semantic parts, regardless of the object's actual deformation, is essential for a tractable manipulation problem. Finally, by focusing on parts we can deal more easily with environmental constraints, which are detected and used to facilitate grasping. Regarding the benchmarking of manipulation, robotics has so far suffered from incomparable grasping and manipulation work. Datasets cover only the object-detection aspect. Object sets are difficult to obtain and not extensible, and neither scenes nor manipulation tasks are replicable. There are no common tools for the basic needs of setting up replicable scenes or reliably estimating object pose. Hence, with the BURG benchmark we propose to focus on community building by enabling and sharing tools for reproducible performance evaluation, including collecting data and feedback from different laboratories to study manipulation across different robot embodiments.

We will develop a set of repeatable scenarios spanning different levels of quantifiable complexity that involve the choice of objects, tasks, and environments. Examples include fully quantified settings with layers of objects, adding deformable objects and environmental constraints. The benchmark will include metrics defined to assess the performance of both low-level primitives (object pose, grasp point and type, collision-free motion) and complete manipulation tasks (stacking, aligning, assembling, packing, handover, folding) that require ordering as well as common-sense knowledge for semantic reasoning.

* Burg [bʊʁk] f; German: castle, stronghold, fortress
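To make the low-level pose metric concrete: a widely used measure in the 6D pose estimation literature is the Average Distance of model points (ADD). The sketch below is an illustrative implementation of that standard metric, not code from the BURG benchmark itself; the function names and the 10%-of-diameter acceptance threshold are the common convention, assumed here for illustration.

```python
import numpy as np

def add_metric(model_points, R_est, t_est, R_gt, t_gt):
    """Average Distance of model points (ADD): mean Euclidean distance
    between the model transformed by the estimated pose (R_est, t_est)
    and by the ground-truth pose (R_gt, t_gt).

    model_points: (N, 3) array of points sampled on the object model.
    """
    est = model_points @ R_est.T + t_est
    gt = model_points @ R_gt.T + t_gt
    return float(np.linalg.norm(est - gt, axis=1).mean())

def pose_accepted(add_value, model_diameter, fraction=0.1):
    """Common acceptance criterion: ADD below 10% of the object diameter."""
    return add_value < fraction * model_diameter
```

For a pure translation error, ADD equals the size of the offset for every object, which makes the metric easy to sanity-check.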

Final report

The idea of the BURG project is to boost grasping research by focusing on complete tasks. This calls for attention to object parts, since they are essential for knowing how and where the gripper can grasp given the manipulation constraints imposed by the task. Moreover, parts facilitate knowledge transfer to novel objects, across different sources (virtual/real data) and grippers, providing for a versatile and scalable system. The part-based approach extends naturally to deformable objects, for which recognizing the relevant semantic parts, regardless of the object's actual deformation, is essential for a tractable manipulation problem. Finally, by focusing on parts we can deal more easily with environmental constraints, which are detected and used to facilitate grasping. With the BURG project we proposed to focus on community building by enabling and sharing tools for reproducible performance evaluation, including collecting data and feedback from different laboratories to study manipulation across different robot embodiments. We developed a set of repeatable scenarios spanning different levels of quantifiable complexity that involve the choice of objects, tasks, and environments. Examples include fully quantified settings with layers of objects, adding deformable objects and environmental constraints. The benchmark includes metrics defined to assess the performance of both low-level primitives (object pose, grasp point and type, collision-free motion) and complete manipulation tasks (stacking, aligning, assembling, packing, handover, folding) that require ordering as well as common-sense knowledge for semantic reasoning. The results are a set of tools for creating and carrying out the BURG benchmark, along with software tools for annotating and processing the data.
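Collecting trial results from different laboratories and robot embodiments, as described above, implies some common record format for task-level outcomes. The following is a minimal hypothetical sketch of such aggregation (the data structures and field names are assumptions for illustration, not the BURG-Toolkit's actual API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Trial:
    lab: str          # which laboratory / robot embodiment ran the trial
    task: str         # e.g. "stacking", "packing", "handover", "folding"
    complexity: int   # quantified complexity level of the scenario
    success: bool     # task-level outcome of this trial

def success_rates(trials):
    """Per-(task, complexity) success rate, pooled across laboratories,
    so that different robot embodiments are compared on the same
    replicable scenarios."""
    totals, wins = {}, {}
    for t in trials:
        key = (t.task, t.complexity)
        totals[key] = totals.get(key, 0) + 1
        wins[key] = wins.get(key, 0) + int(t.success)
    return {k: wins[k] / totals[k] for k in totals}
```

Keeping the scenario (task, complexity level) as the unit of comparison, rather than the robot, is what makes results from different embodiments commensurable.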

Research institution(s)
  • Technische Universität Wien - 100%
International project participants
  • Tatiana Tommasi, Other research or development institution - Italy
  • Guillem Alenya, Agencia Estatal Consejo Superior de Investigaciones Cientificas - Spain
  • Ales Leonardis, The University of Birmingham - United Kingdom

Research Output

  • 774 Citations
  • 22 Publications
  • 1 Policy
  • 2 Methods & Materials
  • 2 Datasets & models
  • 1 Software
  • 1 Dissemination
  • 2 Scientific Awards
  • 1 Funding
Publications
  • 2020
    Title Learn, detect, and grasp objects in real-world settings
    DOI 10.1007/s00502-020-00817-6
    Type Journal Article
    Author Vincze M
    Journal e & i Elektrotechnik und Informationstechnik
    Pages 324-330
    Link Publication
  • 2020
    Title Neural Object Learning for 6D Pose Estimation Using a Few Cluttered Images
    DOI 10.1007/978-3-030-58548-8_38
    Type Book Chapter
    Author Park K
    Publisher Springer Nature
    Pages 656-673
  • 2020
    Title Robot perception of static and dynamic objects with an autonomous floor scrubber
    DOI 10.1007/s11370-020-00324-9
    Type Journal Article
    Author Yan Z
    Journal Intelligent Service Robotics
    Pages 403-417
  • 2020
    Title VeREFINE: Integrating Object Pose Verification With Physics-Guided Iterative Refinement
    DOI 10.1109/lra.2020.2996059
    Type Journal Article
    Author Bauer D
    Journal IEEE Robotics and Automation Letters
    Pages 4289-4296
    Link Publication
  • 2020
    Title DGCM-Net: Dense Geometrical Correspondence Matching Network for Incremental Experience-Based Robotic Grasping
    DOI 10.3389/frobt.2020.00120
    Type Journal Article
    Author Patten T
    Journal Frontiers in Robotics and AI
    Pages 120
    Link Publication
  • 2020
    Title Unsupervised Domain Adaptation Through Inter-Modal Rotation for RGB-D Object Recognition
    DOI 10.1109/lra.2020.3007092
    Type Journal Article
    Author Loghmani M
    Journal IEEE Robotics and Automation Letters
    Pages 6631-6638
    Link Publication
  • 2019
    Title Pix2Pose: Pixel-Wise Coordinate Regression of Objects for 6D Pose Estimation
    DOI 10.1109/iccv.2019.00776
    Type Conference Proceeding Abstract
    Author Park K
    Pages 7667-7676
    Link Publication
  • 2019
    Title EasyLabel: A Semi-Automatic Pixel-wise Object Annotation Tool for Creating Robotic RGB-D Datasets
    DOI 10.1109/icra.2019.8793917
    Type Conference Proceeding Abstract
    Author Suchi M
    Pages 6678-6684
    Link Publication
  • 2019
    Title Pix2Pose: Pixel-Wise Coordinate Regression of Objects for 6D Pose Estimation
    DOI 10.48550/arxiv.1908.07433
    Type Preprint
    Author Park K
  • 2022
    Title SporeAgent: Reinforced Scene-level Plausibility for Object Pose Refinement
    DOI 10.1109/wacv51458.2022.00027
    Type Conference Proceeding Abstract
    Author Bauer D
    Pages 196-204
    Link Publication
  • 2022
    Title BURG-Toolkit: Robot Grasping Experiments in Simulation and the Real World
    DOI 10.48550/arxiv.2205.14099
    Type Preprint
    Author Rudorfer M
  • 2022
    Title Where Does It Belong? Autonomous Object Mapping in Open-World Settings
    DOI 10.3389/frobt.2022.828732
    Type Journal Article
    Author Langer E
    Journal Frontiers in Robotics and AI
    Pages 828732
    Link Publication
  • 2021
    Title ReAgent: Point Cloud Registration using Imitation and Reinforcement Learning
    DOI 10.1109/cvpr46437.2021.01435
    Type Conference Proceeding Abstract
    Author Bauer D
    Pages 14581-14589
    Link Publication
  • 2020
    Title Physical Plausibility of 6D Pose Estimates in Scenes of Static Rigid Objects
    DOI 10.1007/978-3-030-66096-3_43
    Type Book Chapter
    Author Bauer D
    Publisher Springer Nature
    Pages 648-662
  • 2023
    Title 3D-DAT: 3D-Dataset Annotation Toolkit for Robotic Vision
    DOI 10.1109/icra48891.2023.10160669
    Type Conference Proceeding Abstract
    Author Neuberger B
    Pages 9162-9168
  • 2023
    Title COPE: End-to-end trainable Constant Runtime Object Pose Estimation
    DOI 10.1109/wacv56688.2023.00288
    Type Conference Proceeding Abstract
    Author Patten T
    Pages 2859-2869
  • 2023
    Title [Recognizing transparent objects for laboratory automation].
    DOI 10.1007/s00502-023-01158-w
    Type Journal Article
    Author Vincze M
    Journal Elektrotechnik und Informationstechnik : E & I
    Pages 519-529
  • 2023
    Title BURG-Toolkit: Robot Grasping Experiments in Simulation and the Real World
    Type Conference Proceeding Abstract
    Author Markus Suchi
    Conference UK Robot Manipulation workshop
  • 2023
    Title Object Change Detection for Autonomous Indoor Robots in Open-World Settings
    DOI 10.34726/hss.2023.111500
    Type Other
    Author Langer E
    Link Publication
  • 2023
    Title BURG-Toolkit: Robot Grasping Experiments in Simulation and the Real World
    DOI 10.34726/5453
    Type Other
    Author Rudorfer M
    Link Publication
  • 2019
    Title Recurrent Convolutional Fusion for RGB-D Object Recognition
    DOI 10.1109/lra.2019.2921506
    Type Journal Article
    Author Loghmani M
    Journal IEEE Robotics and Automation Letters
    Pages 2878-2885
    Link Publication
  • 2019
    Title An Empirical Evaluation of Ten Depth Cameras
    DOI 10.1109/mra.2018.2852795
    Type Journal Article
    Author Halmetschlager-Funek G
    Journal IEEE Robotics & Automation Magazine
    Pages 67-77
    Link Publication
Policies
  • 2020
    Title Trying to increase the interest in STEM
    Type Implementation circular/rapid advice/letter to e.g. Ministry of Health
Methods & Materials
  • 2022
    Title Object class detection and pose within the heap
    Type Improvements to research infrastructure
    Public Access
  • 2021
    Title Probabilistic part-based scene segmentation
    Type Improvements to research infrastructure
    Public Access
Datasets & models
  • 2023
    Title 3D-DAT: 3D-Dataset Annotation Toolkit for Robotic Vision
    Type Data analysis technique
    Public Access
  • 2022
    Title ObChange Dataset
    DOI 10.48436/r2sxr-tsc12
    Type Database/Collection of data
    Public Access
Software
  • 2023
    Title 3D-DAT: 3D-Dataset Annotation Toolkit for Robotic Vision
Disseminations
  • 2020
    Title School workshops
    Type Participation in an activity, workshop or similar
Scientific Awards
  • 2022
    Title Invitation to Robotics Lab Opening at the University of Bremen as keynote speaker
    Type Personally invited as a keynote speaker at a conference
    Level of Recognition Continental/International
  • 2021
    Title Nomination for best paper award of IEEE RA-L
    Type Research prize
    Level of Recognition Continental/International
Fundings
  • 2020
    Title Traceable Robotics Handling of Sterile Medical Products
    Type Research grant (including intramural programme)
    Start of Funding 2020
    Funder European Commission
