EyePhone: Activating Mobile Phones With Your Eyes

Emiliano Miluzzo, Tianyu Wang, Andrew T. Campbell
Computer Science Department, Dartmouth College, Hanover, NH, USA

ABSTRACT

As smartphones evolve, researchers are studying new techniques to ease human-mobile interaction. We propose EyePhone, a novel "hands-free" interfacing system capable of driving mobile applications/functions using only the user's eye movements and actions (e.g., a wink). EyePhone tracks the user's eye movement across the phone's display using the camera mounted on the front of the phone; more specifically, machine learning algorithms are used to: i) track the eye and infer its position on the mobile phone display as a user views a particular application; and ii) detect eye blinks that emulate mouse clicks to activate the target application under view. We present a prototype implementation of EyePhone on a Nokia N810, which is capable of tracking the position of the eye on the display and mapping this position to an application that is activated by a wink. At no time does the user have to physically touch the phone display.

Categories and Subject Descriptors

C.3 [Special-Purpose and Application-Based Systems]: Real-time and embedded systems

General Terms

Algorithms, Design, Experimentation, Human Factors, Measurement, Performance

Keywords

Human-Phone Interaction, Mobile Sensing Systems, Machine Learning, Mobile Phones

1. INTRODUCTION

Human-Computer Interaction (HCI) researchers and phone vendors are continuously searching for new approaches to reduce the effort users exert when accessing applications on limited form factor devices such as mobile phones.
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. MobiHeld 2010, August 30, 2010, New Delhi, India. Copyright 2010 ACM 978-1-4503-0197-8/10/08 ...$10.00.

The most significant innovation of the past few years is the adoption of touchscreen technology, introduced with the Apple iPhone and recently followed by all the other major vendors, such as Nokia and HTC. The touchscreen has changed the way people interact with their mobile phones because it provides an intuitive way to perform actions using the movement of one or more fingers on the display (e.g., pinching a photo to zoom in and out, or panning to move a map). Several recent research projects demonstrate new people-to-mobile-phone interaction technologies [4, 5, 6, 7, 8, 9, 10]. For example, to infer and detect gestures made by the user, phones use the on-board accelerometer [4, 7, 8], camera [11, 5], specialized headsets, dedicated sensors, or radio features. We take a different approach than that found...
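The two inference steps described in the abstract, mapping an inferred eye position to an on-screen application and treating a detected wink as a mouse click, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the grid layout, data types, and function names are all assumptions, and the actual eye tracking and wink classification (done with machine learning on camera frames in the paper) are abstracted away as already-computed inputs.

```python
# Hypothetical sketch of EyePhone's display mapping and wink-as-click logic.
# Assumes an upstream classifier has already produced, per camera frame, a
# normalized gaze position and a wink flag; those components are not shown.

from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float      # normalized eye position, 0.0 (left) to 1.0 (right)
    y: float      # normalized eye position, 0.0 (top) to 1.0 (bottom)
    wink: bool    # True if a wink was detected in this frame

def gaze_to_cell(sample: GazeSample, cols: int = 3, rows: int = 3) -> tuple[int, int]:
    """Map a normalized gaze position to a (col, row) cell in an icon grid."""
    col = min(int(sample.x * cols), cols - 1)
    row = min(int(sample.y * rows), rows - 1)
    return col, row

def process_frames(frames: list[GazeSample]) -> list[tuple[int, int]]:
    """Return the grid cells 'clicked' by a wink, emulating mouse clicks."""
    return [gaze_to_cell(f) for f in frames if f.wink]

# Example: the user fixates on the top-left icon, then winks to activate it.
frames = [GazeSample(0.10, 0.10, False), GazeSample(0.12, 0.15, True)]
print(process_frames(frames))  # [(0, 0)]
```

A grid of discrete cells is a natural fit here because front-camera gaze estimates are coarse; snapping to the nearest icon region tolerates tracking noise better than pixel-accurate pointing would.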