Olwal, A., and Wilson, A. SurfaceFusion: Unobtrusive Tracking of Everyday Objects in Tangible User Interfaces. Proceedings of GI 2008 (the 34th Canadian Graphics Interface Conference), Windsor, Canada, May 28-30, 2008, pp. 235-242.

SurfaceFusion: Unobtrusive Tracking of Everyday Objects in Tangible User Interfaces

Alex Olwal, School of Computer Science and Communication, KTH
Andrew D. Wilson, Microsoft Research

ABSTRACT

Interactive surfaces and related tangible user interfaces often involve everyday objects that are identified, tracked, and augmented with digital information. Traditional approaches for recognizing these objects typically rely on complex pattern recognition techniques, or on the addition of active electronics or fiducials that alter the visual qualities of those objects, making them less practical for real-world use. Radio Frequency Identification (RFID) technology provides an unobtrusive method of sensing the presence of nearby tagged objects and identifying them, but it has no inherent means of determining their position. Computer vision, on the other hand, is an established approach to tracking objects with a camera. While shapes and movement on an interactive surface can be determined with classic image processing techniques, object recognition tends to be complex, computationally expensive and sensitive to environmental conditions. We present a set of techniques in which movement and shape information from the computer vision system is fused with RFID events that identify which objects are in the image. By synchronizing these two complementary sensing modalities, we can associate changes in the image with events in the RFID data, in order to recover the position, shape and identity of the objects on the surface, while avoiding complex computer vision processes and exotic RFID solutions.

CR Categories and Subject Descriptors: H5.2 [Information interfaces and presentation]: User Interfaces - Graphical user interfaces.

Additional Keywords: RFID, Computer Vision, Fusion, Tangible User Interface, Tabletop, Surface Computing.

1 INTRODUCTION

As the cost of large displays and computing hardware continues to decline, we can expect an increasing variety of computing form factors arrayed throughout our everyday environment. Interactive table systems, for example, exploit large displays in combination with sensing technologies to bring new experiences and interactions to tables, desks and other horizontal surfaces. Many of these systems augment our interactions with familiar everyday physical objects, lending them unique capabilities in the virtual world.

The bridging of the physical and virtual can be achieved in numerous ways. The DigitalDesk [21] tracks and augments paper with an overhead projector and camera, with which the user can interact using pens or their hands. The metaDESK [18] uses a rear-projected surface where vision-based tracking is performed with an IR-camera under the surface. It recognizes objects by the ...
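To make the fusion idea from the abstract concrete, the following Python sketch pairs RFID events (a tag entering or leaving the reader's field) with vision events (a blob appearing on or disappearing from the surface) by temporal proximity. This is an illustrative sketch, not the authors' implementation: the event types, the fuse_events function, and the 0.5 s matching window are all assumptions for the example.

    # Hypothetical sketch of synchronizing RFID events with vision blob
    # changes; names and the matching window are illustrative, not from
    # the paper.

    from dataclasses import dataclass

    WINDOW = 0.5  # assumed max time gap (s) between an RFID event and a blob change

    @dataclass
    class RfidEvent:
        tag_id: str      # identity read by the RFID antenna
        added: bool      # True = tag entered the field, False = tag left
        timestamp: float

    @dataclass
    class BlobEvent:
        blob_id: int     # track id from the vision system
        appeared: bool   # True = new shape on the surface, False = shape removed
        centroid: tuple  # (x, y) position of the blob on the surface
        timestamp: float

    def fuse_events(rfid_events, blob_events, window=WINDOW):
        """Pair each RFID identification with the nearest-in-time blob
        change of the same kind (add/add or remove/remove), yielding
        (tag_id, blob) associations."""
        pairs = []
        unmatched = list(blob_events)
        for r in rfid_events:
            candidates = [b for b in unmatched
                          if b.appeared == r.added
                          and abs(b.timestamp - r.timestamp) <= window]
            if candidates:
                best = min(candidates,
                           key=lambda b: abs(b.timestamp - r.timestamp))
                unmatched.remove(best)
                pairs.append((r.tag_id, best))
        return pairs

    # Example: a tag read at t=10.02 s is associated with the blob that
    # appeared at t=10.10 s, giving that blob an identity and a position.
    rfid = [RfidEvent("tag-42", True, 10.02)]
    blobs = [BlobEvent(7, True, (320, 240), 10.10)]
    print(fuse_events(rfid, blobs))

The division of labor mirrors the abstract: the vision system supplies position and shape, the RFID reader supplies identity, and the temporal pairing step supplies the association between them, so neither complex visual object recognition nor position-sensing RFID hardware is needed.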