Information Theory & Reaction Time

Three of the basic principles we are going to explore in this class are:

1 - selection (search) among alternatives as a way of describing tasks
2 - effects of novelty (things we don't know how to do) and of practice (things we know very well)
3 - human limitations (can we pin them down so we can design around them?)

We are studying reaction time because it provides a very clean picture of all three principles while being simple enough for us to measure things in a controlled way.

Shannon & Weaver's theory of communications started with Shannon's desire (at Bell Labs) to measure the "information" carrying capacity of communications channels rather than just their electrical characteristics. His approach is based on representing information as strings of binary digits (bits) and comparing the "same" strings at both ends of the channel. For our purposes, we are interested in finding some way to describe how people process information so we can predict which tasks will be easy (take little time to do), which will be hard, and which will be impossible.

The idea that the number of alternatives that have to be considered is related to the amount of information that must be processed in a task is common sense. What we are looking for is a good way to describe the relation between the number of alternatives and the amount of processing (which we will measure as time) that is required. The relation could be linear, RT = kN, so that it takes twice as long to process 4 alternatives as it does 2, and four times as long to process 8. It could be exponential, RT = N^k, so that task difficulty increases much more rapidly than the number of alternatives. Or it could be logarithmic, so that task difficulty increases more slowly than the number of alternatives. Shannon & Weaver's theory suggests that if we process information as they define it, difficulty should increase with the log of the number of alternatives. If we find this to be approximately true, three questions remain:

1 - Do people have a fixed channel capacity for processing information (is there an upper limit to the rate at which we can consider alternatives)?
2 - Does information theory describe human information processing in more detailed ways (predict effects of probabilities and of loss of information through "errors")?
3 - Does information theory predict human performance better than the simpler characterization of difficulty as a logarithmic function of the number of alternatives?
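The three candidate relations above can be compared numerically. The short Python sketch below is illustrative only: the constants a (baseline time) and k (time per unit of difficulty), and the function names, are assumptions made for the example, not values from these notes. It prints predicted reaction times for N = 2, 4, 8, 16 alternatives so the growth rates can be compared; log2(N) is the number of bits needed to single out one of N equally likely alternatives, which is why the logarithmic form is the one suggested by information theory.

    # Minimal sketch comparing candidate RT-vs-N relations.
    # Constants below are illustrative assumptions, not measured values.
    import math

    a = 0.20   # assumed baseline (residual) time, in seconds
    k = 0.15   # assumed processing time per "unit of difficulty", in seconds

    def rt_linear(n):
        # Linear: RT = a + k*N (doubling N doubles the added time)
        return a + k * n

    def rt_exponential(n):
        # Faster than linear: difficulty explodes as N grows
        # (scaled down so small N gives numbers comparable to the others)
        return a + (2.0 ** n) / 1000.0

    def rt_logarithmic(n):
        # Logarithmic: RT = a + k*log2(N), i.e. time per bit of information
        return a + k * math.log2(n)

    for n in (2, 4, 8, 16):
        print(f"N={n:2d}  linear={rt_linear(n):.2f}s  "
              f"exponential={rt_exponential(n):.2f}s  "
              f"logarithmic={rt_logarithmic(n):.2f}s")

Running the sketch shows the pattern the notes describe: the exponential prediction quickly becomes implausible as N grows, while the logarithmic prediction grows by a constant amount each time N doubles.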