FWF — Austrian Science Fund
Augmented Reality by Example

Denis Kalkofen (ORCID: 0000-0002-0359-206X)
  • Grant DOI 10.55776/P30694
  • Funding program Principal Investigator Projects
  • Status ended
  • Start March 1, 2018
  • End February 28, 2022
  • Funding amount € 349,473
  • Project website

Disciplines

Computer Sciences (100%)

Keywords

    Augmented Reality, Authoring

Abstract

Augmented Reality (AR) by Example is a novel approach to generating AR applications from existing video and image demonstrations. Previous techniques create AR applications through a time-consuming process that requires skills in 3D modeling and animation software, in addition to knowledge of the technical components of an AR system. While any AR application will benefit from a simple authoring process, this project mainly targets the generation of AR applications for knowledge presentation in tutorials. While images have traditionally been used to provide visual instructions, the success of video sharing platforms has additionally made a large body of video tutorials available. Both image and video tutorials effectively convey complex motions, but they are difficult to follow precisely because of their 2D nature. AR tutorials have been shown to be more effective. This research project brings the advantages of 2D and 3D instructions together by automatically creating three-dimensional AR tutorials from conventional 2D data. Unlike previous work, we will not simply overlay video, but synthesize 3D registered motion from the input data. Since the information in the resulting AR tutorial is registered to 3D objects, the user can freely change the viewpoint without degrading the AR experience. This is achieved by investigating a number of different techniques. First, we have to extract the author's 3D environment and all 3D motion from the input data. Second, we need to provide tools for editing and retargeting the resulting 3D scene to the user's current environment. Third, we will develop comprehensible visualization techniques, specific to instructions in AR environments, to effectively communicate the instructions extracted before. The research is complemented by qualitative and quantitative evaluations of the effects the investigated techniques have on users. This research project has great potential in applications concerned with crowdsourced training and teaching.
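The three investigation steps named in the abstract (extract 3D motion from 2D input, retarget it to the user's environment, render it as instructions) can be sketched in miniature. The following is a hypothetical Python outline only; every function name and data structure is invented for illustration and none of them is the project's actual API:

```python
# Illustrative stubs for the three-stage authoring pipeline described above.
# A real system would replace each stub with reconstruction, scene
# registration, and AR rendering components.

def extract_motion(frames):
    """Stage 1 stub: recover a motion path from 2D input frames."""
    return [f["position"] for f in frames]

def retarget(path, offset):
    """Stage 2 stub: shift the path into the user's coordinate frame."""
    return [(x + offset[0], y + offset[1]) for x, y in path]

def render_instructions(path):
    """Stage 3 stub: emit one textual instruction per motion segment."""
    return [f"move from {a} to {b}" for a, b in zip(path, path[1:])]

# Toy demonstration: three frames of a simple motion.
frames = [{"position": (0, 0)}, {"position": (1, 0)}, {"position": (1, 1)}]
path = retarget(extract_motion(frames), offset=(10, 10))
steps = render_instructions(path)
```

The point of the sketch is only the data flow: a 2D demonstration goes in, a registered motion path comes out, and the instructions are derived from that path rather than from the raw video.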

Final report

Augmented Reality (AR) by Example was a research project investigating novel approaches for AR applications based on video and image data. The project mainly targeted knowledge presentation in AR tutorials. Traditional image and video tutorials effectively convey complex motions, but they are difficult to follow precisely because of their 2D nature and their missing link to the user's real-world environment. Since AR tutorials have been shown to be more effective, the goal of the project was to provide novel tools to make 2D image and video data more accessible in 3D AR environments. This was achieved by investigating approaches for AR scene acquisition, sequence detection, motion visualization, and retargeting of captured 2D data to 3D AR environments. In particular, a novel approach for authoring AR assembly tutorials from video demonstrations was developed. The approach uses a 3D model and the corresponding assembly graph to generate a 3D animation, which is annotated with matching video sequences extracted from the input video. This approach was subsequently refined by also deriving the 3D model and the assembly graph from an RGB-D input stream. Furthermore, to render the modeled and the real environment coherently in AR, the project developed a series of neural networks that mimic the characteristics of the real camera used in video see-through AR applications. To avoid the need for a 3D model, we also developed a novel approach for authoring and viewing AR instructions based on an image database. To this end, approaches for guided capturing, editing, and display of light fields on mobile devices were researched. This has been demonstrated to support remote AR assistance well in complex and unprepared environments. Recent advances in free-viewpoint rendering have furthermore made it possible to extend the developed approach from images to multi-view video representations.
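As a rough illustration of the video-annotated assembly tutorials described above: given an assembly graph (which parts must be attached before which) and the video interval showing each part being assembled, a tutorial is essentially a topological order of that graph with a clip attached to every step. A minimal sketch, with invented data structures that are not the project's actual representation:

```python
# Hypothetical sketch: ordering assembly steps from a dependency graph and
# annotating each step with the video span that demonstrates it.
from dataclasses import dataclass
from graphlib import TopologicalSorter

@dataclass
class TutorialStep:
    part: str            # part attached in this step
    video_span: tuple    # (start_s, end_s) clip from the input video

def build_tutorial(assembly_deps, video_spans):
    """assembly_deps maps each part to the parts that must be attached first;
    video_spans maps each part to the video interval showing its assembly."""
    order = TopologicalSorter(assembly_deps).static_order()
    return [TutorialStep(p, video_spans[p]) for p in order]

# Toy example: the seat needs the frame, the frame needs the legs.
deps = {"leg": set(), "frame": {"leg"}, "seat": {"frame"}}
spans = {"leg": (0.0, 4.2), "frame": (4.2, 9.8), "seat": (9.8, 14.0)}
steps = build_tutorial(deps, spans)
```

The topological sort guarantees that no step is shown before its prerequisites, which is the structural property an assembly tutorial needs regardless of how the graph and clips were obtained.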
In the context of this project, a light field video capturing and rendering approach was implemented. To research the impact of 3D AR video tutorials in comparison to traditional model-based AR tutorials, we furthermore developed an approach to automatically generate interactive AR tutorials for users at different skill levels. We focused on generating guitar tutorials, as they provide a challenging application case for learning motor skills. To reduce perceptual issues while following AR tutorials, we also developed a novel video-based scene representation that supports focus cues. Our approach is based on a focal stack representation, as focal stacks naturally support defocus blur. We therefore introduce a pipeline for capturing, rendering, and blending focal stack images at real-time frame rates. We demonstrate that this representation can efficiently drive layered displays to present focus cues to an AR user.
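The focal stack idea can be illustrated with a toy (decidedly non-real-time) sketch: for every pixel, take the stack slice whose focus depth lies closest to that pixel's scene depth, so each region appears sharp. This is only an assumed simplification of the concept for illustration; the project's actual pipeline blends slices on the GPU to drive layered displays:

```python
# Hypothetical sketch: compositing a focal stack by per-pixel depth.
import numpy as np

def blend_focal_stack(stack, focus_depths, depth_map):
    """stack: (N, H, W) grayscale slices, slice i focused at focus_depths[i];
    depth_map: (H, W) per-pixel scene depth. Returns a composite image in
    which every pixel comes from the slice focused nearest to its depth."""
    stack = np.asarray(stack, dtype=float)
    d = np.asarray(focus_depths, dtype=float)[:, None, None]
    # Index of the slice whose focus depth is closest to each pixel's depth.
    nearest = np.abs(depth_map[None] - d).argmin(axis=0)   # shape (H, W)
    h, w = depth_map.shape
    rows, cols = np.mgrid[0:h, 0:w]
    return stack[nearest, rows, cols]
```

A hard nearest-slice selection is the crudest possible blend; smoothly weighting neighboring slices by depth distance would approximate the defocus blur that makes focal stacks useful as a focus-cue representation.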

Research institution(s)
  • Technische Universität Graz - 100%
International project participants
  • Vincent Lepetit, Université Paris-Est - France
  • Christian Sandor, Université Paris-Saclay - France

Research Output

  • 192 Citations
  • 14 Publications
  • 3 Scientific Awards
Publications
  • 2022
    Title Video See-Through Mixed Reality with Focus Cues
    DOI 10.1109/tvcg.2022.3150504
    Type Journal Article
    Author Ebner C
    Journal IEEE Transactions on Visualization and Computer Graphics
    Pages 2256-2266
    Link Publication
  • 2018
    Title TutAR: augmented reality tutorials for hands-only procedures
    DOI 10.1145/3284398.3284399
    Type Conference Proceeding Abstract
    Author Eckhoff D
    Conference ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry (VRCAI)
    Pages 1-3
    Link Publication
  • 2021
    Title Neural Cameras: Learning Camera Characteristics for Coherent Mixed Reality Rendering
    DOI 10.1109/ismar52148.2021.00068
    Type Conference Proceeding Abstract
    Author Mandl D
    Pages 508-516
  • 2020
    Title Mixed Reality Light Fields for Interactive Remote Assistance
    DOI 10.1145/3313831.3376289
    Type Conference Proceeding Abstract
    Author Mohr P
    Conference CHI Conference on Human Factors in Computing Systems
    Pages 1-12
    Link Publication
  • 2020
    Title Video-Annotated Augmented Reality Assembly Tutorials
    DOI 10.1145/3379337.3415819
    Type Conference Proceeding Abstract
    Author Yamaguchi M
    Conference ACM Symposium on User Interface Software and Technology (UIST)
    Pages 1010-1022
    Link Publication
  • 2020
    Title InpaintFusion: Incremental RGB-D Inpainting for 3D Scenes
    DOI 10.1109/tvcg.2020.3003768
    Type Journal Article
    Author Mori S
    Journal IEEE Transactions on Visualization and Computer Graphics
    Pages 2994-3007
  • 2020
    Title Perspective Matters: Design Implications for Motion Guidance in Mixed Reality
    DOI 10.1109/ismar50242.2020.00085
    Type Conference Proceeding Abstract
    Author Yu X
    Pages 577-587
  • 2023
    Title guitARhero: Interactive Augmented Reality Guitar Tutorials
    DOI 10.1109/tvcg.2023.3320266
    Type Journal Article
    Author Kalkofen D
    Journal IEEE Transactions on Visualization and Computer Graphics
    Pages 4676-4685
  • 2018
    Title 3D PixMix: Image Inpainting in 3D Environments
    DOI 10.1109/ismar-adjunct.2018.00020
    Type Conference Proceeding Abstract
    Author Mori S
    Pages 1-2
  • 2022
    Title AR Hero: Generating Interactive Augmented Reality Guitar Tutorials
    DOI 10.1109/vrw55335.2022.00086
    Type Conference Proceeding Abstract
    Author Skreinig L
    Pages 395-401
  • 2022
    Title Model-Free Authoring by Demonstration of Assembly Instructions in Augmented Reality
    DOI 10.1109/tvcg.2022.3203104
    Type Journal Article
    Author Stanescu A
    Journal IEEE Transactions on Visualization and Computer Graphics
    Pages 3821-3831
Scientific Awards
  • 2018
    Title Best Short Paper at VRCAI
    Type Research prize
    Level of Recognition Continental/International
  • 2022
    Title Best Journal Paper at IEEE Virtual Reality (VR) 2022
    Type Research prize
    Level of Recognition Continental/International
  • 2021
    Title Best Conference Paper at IEEE ISMAR, 2021
    Type Research prize
    Level of Recognition Continental/International

© Österreichischer Wissenschaftsfonds FWF