Mobile Collaborative Augmented Reality
Disciplines
Computer Sciences (100%)
Keywords
- Augmented Reality, Mobile Computing, 3D User Interfaces, Hybrid Tracking, Virtual Reality, Wearable Computing
Research project P 14470 Mobile Collaborative Augmented Reality
Dieter SCHMALSTIEG, 26.6.2000

Building on a successful past collaboration (the collaborative augmented reality environment Studierstube was developed by the proposers within FWF project P12074-MAT), this project aims at the development of a new mobile collaborative augmented reality system. We propose to create a mobile AR kit consisting of a wearable computer, wireless networking, and 3D display, interaction, and localization devices. The system will deliver the following key contributions:

*) A mobile AR user interface that not only allows self-contained interaction with location-independent and location-dependent 3D data, but also docking into a stationary Studierstube AR environment.

*) Support for collaborative mobile AR by establishing instant networked sessions among users in a casual meeting. Users will be able to interact with each other and with the virtual 3D objects that each user brings along into the meeting. This feature distinguishes the proposed user interface from previous approaches.

*) A new concept called qualitative AR tracking. Previous approaches (which could be termed "metric AR") require perfect 6DOF registration with the environment, which is not strictly necessary in many mobile situations. Instead, we propose to use hybrid optical/inertial/GPS tracking to obtain a body-stabilized reference system that operates precisely, but relative to one or multiple users at close range, and relies on natural landmarks for relaxed tracking in the far field.

At the end of the project, two identical, fully operational mobile environments consisting of two AR kits each will be available at the two sites in Vienna and Graz. Three scenarios are planned to demonstrate and evaluate the results: a conferencing environment with mobile users, an AR building information system, and a two-user outdoor architectural design session.
These application scenarios will clearly demonstrate the potential benefits of mobile collaborative augmented reality.
The project applies the concepts of Augmented Reality to mobile computing and to supporting the collaborative work of two or more users. Augmented Reality (AR) is a user interface technique that merges the presentation of the human-computer interface with the real world. It seeks to enrich the experience of reality and to create new forms of interaction between humans and computers. For example, a see-through display that mixes the real image with a computer-generated virtual image allows information to be superimposed on the user's perception of the real world.

Mobile computing allows information resources to be accessed at any place and at any time; combined with an AR interface, it provides a natural way to browse and interact with information based on the user's location. AR also supports collaborative work styles because it allows natural interaction between users, enriched by the computer's input. Example applications are navigation aids for use within a building or outdoors in a city environment: the path to a selected destination is projected into the user's view, which reduces the likelihood of misinterpreting the given information and following a wrong route. For outdoor environments we further developed demonstrations that allow multiple users to view information on parts of buildings or to annotate their environment with symbols and text. Such information is displayed at the location of interest, providing an instant link to the place, or is triggered by looking at the location.

A number of issues needed to be addressed to implement such applications. TU Vienna developed a software framework that supports multiple users and distributes applications between their setups, providing the distributed system necessary for collaborative work. Mobile applications require large sets of location-based data; this was addressed by implementing a dedicated software architecture that uses a general world model together with dedicated data transformations for individual applications.
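The idea of a general world model with per-application data transformations can be illustrated with a minimal sketch (all names, the object schema, and the navigation example are assumptions for illustration, not the project's actual architecture):

```python
from dataclasses import dataclass

# Hypothetical minimal world model: a flat list of located objects with
# generic attributes, maintained once and shared by all applications.
@dataclass
class WorldObject:
    name: str
    position: tuple   # (x, y, z) in a shared world coordinate system
    attributes: dict

def navigation_view(world, kind):
    """Dedicated transformation for a (hypothetical) navigation application:
    keep only objects of the requested kind, reduced to name and position."""
    return [(o.name, o.position) for o in world
            if o.attributes.get("kind") == kind]

world = [
    WorldObject("room_101_door", (4.0, 0.0, 1.2), {"kind": "door"}),
    WorldObject("fire_exit",     (9.5, 0.0, 3.0), {"kind": "door"}),
    WorldObject("info_plaque",   (4.2, 1.5, 1.2), {"kind": "annotation",
                                                   "text": "Institute office"}),
]

doors = navigation_view(world, "door")   # only the door objects, simplified
```

Because each application derives its own view from the single shared model, the location-based data is created and maintained in one place.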
Such an approach allows simple creation and maintenance of the required data. The second problem addressed was locating a user within a defined environment. Correct presentation of information requires very accurate knowledge of the user's location and gaze direction, the latter approximated by measuring the user's head orientation. TU Graz developed a novel combination of vision-based methods and inertial sensors to improve the accuracy and update rate of such a positioning system. State-of-the-art vision-based systems rely on artificial markers to simplify recognition of the environment; the system developed within this project instead uses natural features detected in the environment to measure the user's movement, and therefore requires less preparation of, and intrusion into, the environment.
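The benefit of fusing fast but drifting inertial measurements with slower, drift-free vision-based corrections can be sketched with a simple complementary filter. This is an illustrative sketch under simplifying assumptions (a 1D heading angle, a fixed blend factor), not the algorithm actually developed in the project:

```python
def fuse_heading(heading, gyro_rate, dt, vision_heading=None, alpha=0.98):
    """One update step of a 1D complementary filter (illustrative only).
    The gyro rate is integrated at every step (high update rate, but the
    estimate drifts); whenever a vision measurement is available, it pulls
    the estimate back toward the drift-free absolute heading."""
    heading += gyro_rate * dt            # dead reckoning from the inertial sensor
    if vision_heading is not None:       # vision runs at a lower update rate
        heading = alpha * heading + (1 - alpha) * vision_heading
    return heading

# A gyro with a constant bias of 0.1 rad/s, sampled at 100 Hz; vision
# reports the true heading (0.0) only every 10th step.
h_fused, h_gyro_only = 0.0, 0.0
for step in range(1000):
    vision = 0.0 if step % 10 == 0 else None
    h_fused = fuse_heading(h_fused, 0.1, 0.01, vision)
    h_gyro_only = fuse_heading(h_gyro_only, 0.1, 0.01, None)
# h_gyro_only drifts without bound; h_fused stays bounded near the truth.
```

The same principle generalizes to full 3D head orientation, where the inertial sensor provides the high update rate and the vision measurements provide long-term stability.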
- Axel Pinz, Technische Universität Graz, associated research partner
- L. Miguel Encarnacao, Fraunhofer Center for Research in Computer Graphics CRCG - USA
Research Output
- 882 Citations
- 17 Publications