3DUIevalII - 3D User Interface Evaluation II (Lecture Notes)

3D User Interface Evaluation II
Lecture #15: Evaluating 3DUIs – Part II
Spring 2008
Joseph J. LaViola Jr.
CAP6938 – 3D User Interfaces for Games and Virtual Reality
©Joseph J. LaViola Jr.

Usability Evaluation in 3DUIs

Classification Shortcoming
- Does not tell you "when" a method should be applied
- Does not tell you "how" to apply more than one method
- 3DUI evaluation models:
  - testbed evaluation
  - sequential evaluation

Testbed Evaluation Framework
- Developed by Bowman and Hodges (1999)
- Empirically evaluate techniques outside of applications
- Components:
  - initial evaluation
  - taxonomy
  - outside factors
  - performance metrics
  - testbed evaluation
  - application and generalization of results

Testbed Evaluation (process)
[Figure: (1) Initial Evaluation feeds (2) a Taxonomy, (3) Outside Factors (task, users, environment, system), and (4) Performance Metrics; these drive (5) the Testbed Evaluation, which yields (6) Quantitative Performance Results and (7) Heuristics & Guidelines, applied in (8) a User-centered Application.]

Testbed Evaluation – Initial Evaluation
- Gain an intuitive understanding of generic interaction tasks and current technologies
- Experience and user observation
- Used for:
  - building the taxonomy
  - identifying outside factors
  - finding performance metrics
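The task-subtask decomposition used when building a taxonomy can be represented as a simple nested structure. The sketch below is only illustrative: the subtasks and technique components are assumed examples, loosely modeled on the selection taxonomy of Bowman and Hodges (1999), not an exhaustive taxonomy.

```python
from itertools import product

# Illustrative task -> subtask -> technique-component taxonomy.
# Subtasks/components are assumed examples, not a complete taxonomy.
taxonomy = {
    "selection": {                                   # interaction task
        "indication of object": ["ray casting", "occlusion", "virtual hand"],
        "indication to select": ["button press", "gesture", "voice command"],
        "feedback":             ["graphical", "aural", "haptic"],
    },
}

def components(task, subtask):
    """Return the interchangeable technique components for one subtask."""
    return taxonomy[task][subtask]

def techniques(task):
    """Enumerate complete techniques: one component chosen per subtask."""
    subtasks = taxonomy[task]
    return [dict(zip(subtasks, combo)) for combo in product(*subtasks.values())]
```

Enumerating one component per subtask is what lets a testbed evaluate technique components systematically rather than only whole monolithic techniques.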
Testbed Evaluation – Taxonomy
- Develop a taxonomy of interaction techniques for the interaction task in question
- Can use a task-subtask approach: Task -> Sub-task -> Technique Component

Testbed Evaluation – Outside Factors
- Cannot evaluate in a vacuum; need to take other factors into account
- Categories:
  - task characteristics
  - environment characteristics
  - user characteristics
  - system characteristics

Testbed Evaluation – Metrics
- Objective measures: speed, accuracy
- Subjective measures: ease of use, ease of learning, frustration, etc.

Testbed Evaluation – The Testbed
- Allows generic, generalizable, and reusable evaluation
- The testbed:
  - examines all aspects of a task
  - evaluates each technique component
  - considers outside influences
  - has good metrics
- Normally uses formal, factorial experimental designs

Testbed Evaluation – Results
- Produces a set of results or models that characterize an interaction technique for a given task
- Usability in terms of multiple performance metrics
- Results become part of a performance database for the task
- Results can be generalized into heuristics or guidelines
- Apply to 3D applications

Testbed Evaluation Experiments
- Travel testbed (Bowman, Davis, et al. 1999)
  - compared seven different travel techniques
  - naïve and primed search
  - 44 subjects tested
- Selection/Manipulation testbed (Bowman and Hodges 1999)
  - compared nine different interaction techniques
  - 48 subjects
- Produced unexpected and interesting results (see papers for details)

Sequential Evaluation
- Developed by Gabbard, Hix, and Swan (1999)
- Usability engineering approach
- Evolved from existing GUI/2D evaluation methods
- Addresses both design and evaluation
- Employs:
  - application-specific guidelines
  - domain-specific representative users
  - application-specific user tasks

Sequential Evaluation (process)
[Figure: (1) User Task Analysis produces (A) Task Descriptions, Sequences & Dependencies; (2) Heuristic Evaluation, guided by (B) Guidelines and Heuristics, produces (C) Streamlined User Interface Designs; (3) Formative Evaluation, using (D) Representative User Task Scenarios, produces (E) Iteratively Refined User Interface Designs; (4) Summative Evaluation then leads to a User-centered Application.]

Sequential Evaluation – Example
- Applied to the Dragon system
- Several evaluations performed in a 9-month period
  - one to three users
  - two to three evaluators
- Four cycles
- Guideline-based evaluation
- Summative evaluation: major study, four factors (2 x 2 x 3 x 2)
- See Hix et al. (1999); Hix and Gabbard (2002)
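A full factorial design like the 2 x 2 x 3 x 2 summative study above can be enumerated mechanically. The factor names and levels below are hypothetical stand-ins (the actual factors are described in Hix et al. 1999); the point is only how the cells of a full factorial multiply out:

```python
from itertools import product

# Hypothetical factors for a 2 x 2 x 3 x 2 design; names and levels are
# assumed, not the actual factors of the Dragon study (Hix et al. 1999).
factors = {
    "display": ["HMD", "workbench"],             # 2 levels (assumed)
    "input":   ["wand", "pen"],                  # 2 levels (assumed)
    "task":    ["search", "inspect", "count"],   # 3 levels (assumed)
    "stereo":  ["on", "off"],                    # 2 levels (assumed)
}

# Each combination of one level per factor is one cell of the design.
conditions = list(product(*factors.values()))    # 2*2*3*2 = 24 cells
```

Because the cell count grows multiplicatively with each added factor, partial (fractional) designs are sometimes chosen instead of full factorials.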
Comparison of Approaches
- Goals
  - Testbed – finding generic performance characteristics
  - Sequential – a better UI for a particular application
- Costs
  - Testbed – difficult experimental design; large numbers of trials and subjects
  - Sequential – multiple evaluators; significant time investment

3D Usability Evaluation: Things To Consider

Formality of Evaluation
- Formal: independent & dependent variables, statistical analysis, strict adherence to procedure, hold all other variables constant; usually done to compare multiple techniques or at the end of the design process
- Informal: looser procedure, often more qualitative, subject comments very important, looking for broad usability issues; usually done during the design process to inform redesign

What is Being Evaluated?
- Application
  - prototype – consider fidelity, scope, form
  - complete working system
  - controlled experiments are rare
- Interaction techniques / UI metaphors
  - can still evaluate a prototype
  - more generic context of use
  - formal experiments more often used
  - consider "Wizard of Oz" evaluation

Subjects / Participants
- How many?
- What backgrounds?
  - technical vs. non-technical
  - expert vs. novice VE users
  - domain experts vs. general population
- What age range?
- Recruiting: flyers, email/listservs/newsgroups, psychology dept., CS classes
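Once participants are recruited, a between-subjects study needs them split evenly across conditions. A minimal sketch of one way to do that (function and condition names are my own, not from the lecture):

```python
import random

def assign_between_subjects(participants, conditions, seed=0):
    """Shuffle participant IDs and deal them round-robin into conditions,
    giving equal-sized groups for a between-subjects design."""
    rng = random.Random(seed)          # fixed seed: reproducible assignment
    shuffled = list(participants)
    rng.shuffle(shuffled)
    groups = {c: [] for c in conditions}
    for i, p in enumerate(shuffled):
        groups[conditions[i % len(conditions)]].append(p)
    return groups

# e.g. 48 recruited subjects split over three hypothetical techniques
groups = assign_between_subjects(
    [f"P{i:02d}" for i in range(48)], ["ray-casting", "go-go", "gaze"])
```

Shuffling before dealing avoids confounding group membership with recruitment order (e.g. all early volunteers landing in one condition).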
Number of Evaluators
- Multiple evaluators are often needed for 3DUI evaluations
- Roles: cable wrangler, software controller, note taker, timer, behavior observer, ...

Procedure
- Welcome; informed consent; demographic/background questionnaire; pre-testing
- Familiarize with equipment; exploration time with the interface
- Tasks; questionnaires / post-testing; interviews
- Subject "packets" are often useful for organizing information and data
- Pilot testing should be used in most cases to "debug" your procedure and to identify variables that can be dropped from the experiment

Instructions
- How much to tell the subject about the purposes of the experiment?
- How much to tell the subject about how to use the interface?
- Always tell the subject what they should try to optimize in their behavior.
- If using a think-aloud protocol, you will have to remind them many times.
- If using trackers, you will have to help users "learn" to move their heads, feet, and bodies; it doesn't come naturally to many people.
- Remind subjects you are NOT testing them, but the interface.

Formal Experiment Issues
- Choosing independent variables
- Choosing dependent variables
- Controlling (holding constant) other variables
- Within- vs. between-subjects design
- Counterbalancing the order of conditions
- Full factorial or partial designs

Independent Variables
- Main variable of interest (e.g. interaction technique)
- Secondary variables:
  - task characteristics
  - environment characteristics
  - system characteristics
  - user characteristics

Metrics (dependent variables)
- Task performance time
- Task errors
- User comfort (subjective ratings)
- Observations of behavior (e.g. strategies)
- Spoken subject comments (e.g. preferences)
- Surveys/questionnaires
- Interviews

Data Analysis
- Averages (means) of quantitative metrics
- Counts of errors, behaviors
- Correlate data to demographics
- Analysis of variance (ANOVA)
- Post hoc analysis (t-tests)
- Visual analysis of trends (esp. learning)
- Interactions between variables are often important
- Expect high variance in 3DUI interaction studies

Analysis Tools
- SPSS, SAS, etc.
  - full statistical analysis packages
  - parametric and non-parametric tests
  - test correction mechanisms (e.g., Bonferroni)
- Excel
  - basic aggregation of data
  - correlations
  - confidence intervals
  - graphs
- Matlab, Mathematica

Next Class
- 3DUI evaluation example
- Readings: 3DUI Book – Chapter 11, pp. 367-384
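The ANOVA and post hoc steps listed under Data Analysis can be sketched even without a statistics package. This minimal example computes a one-way ANOVA F statistic and Bonferroni-corrects a set of post hoc p-values; in practice a package such as SPSS or SAS would also report the p-value for F against the F distribution.

```python
from statistics import mean

def one_way_anova(groups):
    """F statistic for a one-way ANOVA over lists of a quantitative
    metric (e.g. task completion times per interaction technique)."""
    k = len(groups)                          # number of conditions
    n = sum(len(g) for g in groups)          # total observations
    grand = mean(x for g in groups for x in g)
    # Between-group and within-group sums of squares
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    df_between, df_within = k - 1, n - k
    f = (ss_between / df_between) / (ss_within / df_within)
    return f, df_between, df_within

def bonferroni(p_values):
    """Bonferroni correction for m post hoc comparisons: p' = min(1, m*p)."""
    m = len(p_values)
    return [min(1.0, m * p) for p in p_values]
```

The Bonferroni correction guards against inflated Type I error when running several pairwise t-tests after a significant omnibus ANOVA.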

This note was uploaded on 06/13/2011 for the course CAP 6938 taught by Professor Staff during the Spring '08 term at University of Central Florida.
