ISAR 2000 - Virtual Object Manipulation on a Table-Top AR Environment

Virtual Object Manipulation on a Table-Top AR Environment

H. Kato (1), M. Billinghurst (2), I. Poupyrev (3), K. Imamoto (1), K. Tachibana (1)

(1) Faculty of Information Sciences, Hiroshima City University, 3-4-1 Ozuka-higashi, Asaminami-ku, Hiroshima 731-3194, Japan
(2) HIT Laboratory, University of Washington, Box 352-142, Seattle, WA 98195, USA
(3) ATR MIC Laboratories, ATR International, 2-2 Hikaridai, Seika-cho, Soraku-gun, Kyoto, Japan

Abstract

In this paper we address the problems of virtual object interaction and user tracking in a table-top Augmented Reality (AR) interface. This setting demands very accurate tracking and registration techniques together with an intuitive and useful interface. This is especially true in AR interfaces for supporting face-to-face collaboration, where users need to be able to cooperate easily with one another. We describe an accurate vision-based tracking method for table-top AR environments, and tangible user interface (TUI) techniques based on this method that allow users to manipulate virtual objects in a natural and intuitive manner. Our approach is robust: users can cover some of the tracking markers and the system still recovers the camera viewpoint, overcoming one of the limitations of traditional computer-vision-based systems. After describing this technique we describe its use in prototype AR applications.

1. Introduction

In the design session of the future, several architects sit around a table examining plans and pictures of a building they are about to construct. Mid-way through the design session they don lightweight see-through head-mounted displays (HMDs). Through the displays they can still see each other and their real plans and drawings. However, in the middle of the table they can now see a three-dimensional virtual image of their building. This image is exactly aligned with the real world, so the architects are free to move around the table and examine it from any viewpoint.
Each person has their own viewpoint into the model, just as if they were looking at a real object. Since the model is virtual, they are also free to interact with it in real time, adding or deleting parts of the building or scaling portions of it to examine it in greater detail. While interacting with the virtual model they can still see each other and the real world, ensuring very natural collaboration and flow of communication.

While this may seem a far-off vision of the future, a number of researchers have already developed table-top AR systems for supporting face-to-face collaboration. In Kiyokawa's work, two users are able to collaboratively design virtual scenes in an AR interface and then fly inside those scenes and experience them immersively [Kiyokawa 98]. The AR2 Hockey system of Ohshima et al. [Ohshima 98] allows two users to play virtual air hockey against each other, while the Shared Space interface supports several users around a table
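The occlusion-robustness property claimed in the abstract, that covering some tracking markers still yields camera viewpoint information, rests on a simple idea: when several markers at known positions are tracked jointly, any sufficient subset of them determines the camera pose. The sketch below illustrates this in a simplified 2D setting with a least-squares rigid fit (Kabsch-style); it is an assumption for illustration only, not the paper's actual tracking algorithm, which estimates a full 3D pose from square fiducials, and the marker layout and names are hypothetical.

```python
import numpy as np

# Hypothetical 2D marker centres in table coordinates.
MARKERS = np.array([
    [0.0, 0.0], [10.0, 0.0], [10.0, 10.0],
    [0.0, 10.0], [5.0, 5.0], [2.0, 8.0],
])

def estimate_pose(table_pts, observed_pts):
    """Recover rotation R and translation t so that
    observed ~= R @ table + t, using only the visible markers."""
    ct, co = table_pts.mean(axis=0), observed_pts.mean(axis=0)
    H = (table_pts - ct).T @ (observed_pts - co)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                       # guard against reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = co - R @ ct
    return R, t

# Simulate a camera pose: rotate the table frame by 30 degrees, then translate.
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
t_true = np.array([3.0, -1.0])
observed = MARKERS @ R_true.T + t_true

# Occlude half the markers (e.g. a hand covering them): the remaining
# subset still determines the same pose.
visible = [0, 2, 5]
R_est, t_est = estimate_pose(MARKERS[visible], observed[visible])
print(np.allclose(R_est, R_true), np.allclose(t_est, t_true))
```

The key point mirrored from the paper is that pose estimation pools all visible markers rather than depending on any single one, so tracking degrades gracefully under partial occlusion instead of failing outright.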