Building, testing theories

How do we test and collect data about a concept from the real world? How would you measure something like "employee performance" in business, and what methodology would you use? This is the job of operational definitions.

Watt and Van den Berg (1995) distinguish concepts from constructs: a concept has a direct link with reality, while a construct is more of an abstraction.
• $ paid on the job → the concept INCOME
• Tall, short → the concept HEIGHT

Watt and Van den Berg (1995) mention "source credibility" in media and communication research. Well, what is that exactly? It is made up of expertise, status, and objectivity. And where do we get expertise, status, and objectivity from? Watt and Van den Berg (p. 12) suggest a construct is simply a matter of abstraction; the task is to break it down into something observable:
• Expertise → formal education, experience
• Status → job title, mode of dress, etc.

So broad constructs such as "politeness of the restaurant waitstaff" can be helpful for communicating to people what we are interested in and what we are studying. But, as we saw with the method of science, other people should understand how you are measuring these things, which allows them to measure them as well, and possibly replicate your findings. (Watt & Van den Berg, 1995)

For example, how would you define:
• Customer satisfaction with an automobile
• Quality of life in a neighborhood
• Performance of the federal government
• Student achievement
• Effectiveness of a teacher
• Quality or prestige of a college/university
• Ability or rating of a pro baseball player (ever hear of the book "Moneyball"?)

When you say age or height, people generally know what you are talking about. But when you mention any of the above, you will need to provide operational definitions. (Watt & Van den Berg, 1995)

How do you define "stress"? In a recent study, one of these cities was rated as the most stressful:

(a) Los Angeles (b) New York (c) Chicago (d) Baltimore (e) Washington, DC (f) Miami (g) St. Louis

What operational definition would you provide to measure this construct? Here is what the study used (from statistics gathered by the federal government):
• June 2009 unemployment figures
• Cost of living
• Drops in median home prices from the first quarter of 2008 to the first quarter of 2009
• Population density, based on 2008 figures
• Number of sunny and partly sunny days, based on 2007 figures
• Air quality figures, based on 2007 data

Moving from theoretical to operational definitions
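An operational definition like the one above still has to specify how the separate indicators get combined into a single value per city. The study's actual weighting scheme is not given here, so the sketch below is a hypothetical illustration of one common approach: standardize each indicator across cities (z-scores), flip the sign for indicators where more means less stress (e.g., sunny days), and sum. All numbers are invented for illustration.

```python
from statistics import mean, stdev

# Hypothetical "stress index": z-score each indicator across cities and sum.
# Indicator values below are made up; the real study's data and weights
# are not reproduced in these notes.

cities = ["Chicago", "Los Angeles", "New York"]

indicators = {
    "unemployment_rate": {"higher_is_stressful": True,  "values": [10.7, 11.3, 9.5]},
    "sunny_days":        {"higher_is_stressful": False, "values": [189, 284, 224]},
}

def z_scores(values):
    """Standardize a list of values to mean 0 and unit standard deviation."""
    m, s = mean(values), stdev(values)
    return [(v - m) / s for v in values]

stress = {city: 0.0 for city in cities}
for ind in indicators.values():
    sign = 1 if ind["higher_is_stressful"] else -1
    for city, z in zip(cities, z_scores(ind["values"])):
        stress[city] += sign * z

# Rank cities from most to least stressful under this (invented) index.
ranking = sorted(stress, key=stress.get, reverse=True)
```

Because each indicator's z-scores sum to zero across cities, the index only ranks cities relative to one another; it carries no absolute meaning, which is exactly why the operational definition must be stated for others to replicate.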
When moving from theoretical to operational definitions, three elements are needed:

• Units of measurement
• Levels of measurement
• A "mathematical or logical statement" that shows how these measurements are "made and combined to create a single value" (p. 16)

(Watt & Van den Berg, 1995)

Units of measurement, for example:

• time, in minutes
• number of instances of violence on TV
• number of books checked out of a library
• parking tickets issued
• quarterback sacks in a football game (are there also things called "pressures"?)

Levels of measurement
• Nominal: presence or absence; black or white; e.g., gender, or Democrat/Republican/Independent. No real order of magnitude (p. 86).
• Ordinal: rank ordering; a sense that one is greater than another, but not really by how much.
• Interval: scores at this level specify degrees of measurement, so you know how much better or worse someone or something is than another.
• Ratio: same as interval, only there is a true zero; when you have zero of something, you really have nothing.
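The level of measurement determines which summary statistics are meaningful. As a quick sketch (the data below are invented for illustration): a nominal variable supports only the mode, an ordinal variable supports the median, an interval variable supports the mean but not ratios, and only a ratio variable supports statements like "twice as much."

```python
from statistics import mode, median, mean

# Nominal: categories only -> the mode is the only meaningful "average".
flavors = ["vanilla", "chocolate", "vanilla", "strawberry"]
assert mode(flavors) == "vanilla"

# Ordinal: rank order is meaningful, distances are not -> use the median.
poll_positions = [1, 2, 3, 5, 25]      # e.g., college football rankings
assert median(poll_positions) == 3

# Interval: equal distances but no true zero -> means are fine, ratios are not.
temps_f = [40.0, 60.0]
assert mean(temps_f) == 50.0
# 60 F is NOT "1.5 times as hot" as 40 F; the zero point is arbitrary.

# Ratio: true zero -> ratios are meaningful.
delays_ms = [200, 400]
assert delays_ms[1] / delays_ms[0] == 2.0   # twice the downloading delay
```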
Some examples:

• Temperature → INTERVAL (Kelvin temperature is an exception: it has a true zero, making it a ratio variable)
• Downloading delay → RATIO
• Wearing a safety belt (vs. not) → NOMINAL
• College football rankings → ORDINAL
• Flavor of ice cream → NOMINAL
• Points in a football game → RATIO
• Pepsi vs. Coke → NOMINAL
• Customer satisfaction measured on a 7-point scale (Strongly Agree to Strongly Disagree) → INTERVAL (sometimes not quite as clear: what about "no satisfaction"? Would that make it ratio?)

Operational definition + system of measurement = different values of your concept/construct, i.e., something that varies. Then you've got a variable. Variables are used to test theories.

The extent to which your theoretical definition (your concept/construct) matches what goes on in the real world is called face validity. The extent to which your operational definition matches what you meant in your theoretical definition is called measurement validity. (Watt & Van den Berg, 1995)

Consider a study looking at "job dedication" for business research, theoretically defined as "how good your attendance at work is." This probably has poor face validity: it speaks to only a (small) part of what dedication to one's job really is. What could improve this construct's face validity?

How about something as simple as "communication satisfaction"? Clampitt and Downs (1993) identify eight dimensions of communication satisfaction (and examine their relation to job productivity):

• Communication Climate
• Supervisory Communication
• Organizational Integration
• Media Quality
• Coworker Communication
• Corporate Information
• Personal Feedback
• Subordinate Communication

Organizational Communication and Job Productivity (Clampitt & Downs, 1993)

Communication Climate
• Does the organization's communication motivate and inspire workers towards achieving goals, and help them identify with the organization? Do people have a positive attitude towards open communication?
Supervisory Communication
• Openness to new ideas; helping with problem solving (problems related to jobs); attentiveness and empathy.

Organizational Integration
• How often and how well do you receive news about your working environment? The 411 on your department's plans, job requirements, and news about personnel.

Media Quality
• Are meetings well organized? Communication isn't sparse or excessive. Directives are clearly and concisely written.

Coworker Communication
• "Concerns the extent to which horizontal and informal communication is accurate and free-flowing" (p. 7); also satisfaction with "the grapevine."

Corporate Information
• More basic and broad communication: mission and financial statements, policy decisions, goals articulated by the organization.

Personal Feedback
• Appraisal of performance; judgments and perceptions about employee work.

Subordinate Communication
• How well do those who work under you respond to your communication, and to what extent do they initiate communication?

Communication Satisfaction
[Figure: the eight dimensions — Communication Climate, Supervisory Communication, Personal Feedback, Media Quality, Coworker Communication, Corporate Information, Organizational Integration, Subordinate Communication — all feed into overall Communication Satisfaction.]

Communication Satisfaction and Job Productivity

Clampitt and Downs (1993) examined two organizations:
• an S&L (a service organization)
• a furniture manufacturer

For the first company, personal feedback had the greatest impact on job productivity (although this dimension was also ranked lowest; rankings were based on employee perceptions).
For the second company, personal feedback and subordinate communication were associated with the greatest influence on productivity (these two dimensions clustered together as well).

What is productivity? How would you define it? Some notions from subjects:
• Quantity: how much work gets done
• "Getting the job done"
• Quality of work (in addition to quantity)
• Pleasing customers; customer satisfaction
• Doing the best you can
• Achieving goals
• Performing the job error-free
• "Getting things done on time"
• Efficiency with time
• The value you add to the organization

(Clampitt & Downs, 1993, pp. 14-15)

Other Managerial & Strategic Approaches to Organizational Communication (Eisenberg et al.)

Business strategies:
• Lowest cost (no frills): the challenge lies in reducing operating costs (p. 312)
• Differentiation: "involves highlighting unique or special qualities of a company's product or service."
  – How would you rate the quality of coffee at Starbucks next to its competitors?
  – Grey Goose and taste tests: fancier packaging does not necessarily make for a better product.

Strategic thinking is a "bird's eye view of the total enterprise" (p. 309). In changeable, sometimes turbulent environments, communication facilitates strategic positioning and moves organizations "in line" with strategy.

How to collect, measure data

• Behavioral observation
• Obtrusive measurement
• Unobtrusive measurement
• Self-report measurement
• Surveys and interviews
• Content analysis
• Ethics

(Watt & Van den Berg, 1995)

Behavioral Observation
• A trained individual looks out for certain behaviors from subjects/participants, and should also know how to assign to the behaviors they record the values corresponding to the variables of interest.
• How would you record something subtle, like changes in facial expression? How about sportswriters who judge whether something is a hit or an error in a baseball game?

As an example of category assignment, Watt and Van den Berg (1995) looked at the Bandura et al.
study: "if the observer sees a child from the experimental group carry out an act of physical or verbal aggression identical to that originally exhibited by the adult model, the child's act is counted in the category 'Imitative Aggression' and not in the category called 'Partially Imitative Aggression'" (p. 274).

• Can you think of other examples with more nuance in a business or social environment? What about measuring behavior within teams, e.g., degrees of cooperation and conflict between or among members? (Watt & Van den Berg, 1995)

Steps to ensure reliability during behavioral observation: make the observation task simple and concrete, and give clear, extensive instructions and training to the observers (Watt & Van den Berg, p. 275).

Watt and Van den Berg also use the example of measuring "niceness" in conversation (p. 275). Simply asking someone to observe and count the "number of nice remarks made by each person in the conversation" may not yield consistent or accurate results. Just as with your operational definitions, feel free to break it down. As the authors say, instructions like "Count the number of times each person commented favorably on the clothing of the other" and "Count the number of times each person said 'thank you' to the other" are more specific and comprehensive ways of measuring niceness, likely to give more reliable results. This is the idea behind interrater reliability.

Obtrusive vs. unobtrusive measurement
• How aware is your subject (if they are aware at all) of the research situation, i.e., the fact that they are "being evaluated" (p. 275)?
• Remember volunteer effects and demand characteristics?
• Awareness can produce the following: sensitization to the experimental manipulation, enhanced memory effects, reactivity to the research setting, etc.
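The interrater reliability mentioned above can be quantified. A simple measure is percent agreement between two observers; Cohen's kappa additionally corrects for the agreement they would reach by chance. The codings below are invented for illustration — Watt and Van den Berg discuss the idea, not this particular computation.

```python
def percent_agreement(a, b):
    """Fraction of items on which two raters assigned the same code."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Agreement corrected for the agreement expected by chance alone."""
    n = len(a)
    observed = percent_agreement(a, b)
    categories = set(a) | set(b)
    # Chance agreement: product of each rater's marginal category rates.
    expected = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (observed - expected) / (1 - expected)

# Two observers coding eight remarks as nice ("N") or not nice ("X"):
rater1 = ["N", "N", "X", "N", "X", "N", "X", "X"]
rater2 = ["N", "N", "X", "X", "X", "N", "X", "N"]

agreement = percent_agreement(rater1, rater2)   # 0.75
kappa = cohens_kappa(rater1, rater2)            # 0.5
```

With these made-up codings the raters agree 75% of the time, but since chance alone would produce 50% agreement here, kappa reports a more sober 0.5. Concrete coding instructions ("count 'thank you's") push both numbers up.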
(p. 275; Watt & Van den Berg, 1995)

Artifacts!

Webster's (2nd) definition of the word "artifact": "any non-natural feature or structure accidentally introduced into something being observed or studied." Some artifacts are interactional, and some are noninteractional.

Artifacts in behavioral research, from Rosnow and Rosenthal (1997), "People studying people" (p. 15):

Noninteractional artifacts
• Observer bias: Did your observer over- or underestimate the occurrence of behaviors? Did this bias occur while recording them? This is why we have people (try to) independently replicate results.
• Interpreter bias: Were there any errors or bias while interpreting the data? This is why data should be made available to other (independent) scientists.
• Intentional bias: Well, this type is really bad, as you can imagine.

Interactional artifacts
• Biosocial effect: the sex, age, ethnicity, or race of the researcher; possible factors influencing the results and how the study is conducted.
• Psychosocial effect: the personality of the researcher; e.g., has there been prior social contact between researchers and their subjects? (p. 29)
• Situational effect: what, and where, is the setting of your research?
• Modeling effect: subjects wanting to emulate the thoughts, preferences, and behaviors of researchers.
• Experimenter-expectancy bias: when your hypothesis somehow influences your relationship to your subjects, such that the hypothesis is far more likely to be confirmed.

Unobtrusive measurement is generally preferable:

• Even so, subjects may get used to being watched and forget about it; sometimes it is best to let them know at the beginning, since they generally gravitate back towards normal behavior (this gets into issues of ethics as well).
• Census data is measurement that is generally unobtrusive.
• Remember archival data? That's really unobtrusive.
• What about data collected over the Web? What are the implications? Gmail? Facebook, etc.?

(Watt & Van den Berg, 1995)
Self-report measurement
• Usually an obtrusive measurement.
• Often done through surveys, with their associated biases (either inherent in the instrument or in the demand characteristics of the situation).
• How good is your memory? Poor recall may also hamper accurate data collection.
• It really depends on what you ask.

(Watt & Van den Berg, 1995)

Content analysis
• Even qualitative data can be turned into quantitative data.
• For example, in communication research, content analysis can be used in analyzing "media content, the transcripts of interpersonal conversations or group discussions, persuasive messages, organizational memos, and even nonverbal interchanges" (Watt & Van den Berg, p. 294).
• What is your unit of analysis? What categories are you using to file each observation under?
• There are also issues of intercoder reliability.

Ethics
• Yes, they matter.
• Don't make up data! Don't mess with or change data that you have! Don't slant (see: bias) your study so that you get the data you want!
• Report your data accurately; talk about the bad as well as the good.
• Withholding some information from subjects at the beginning of a study is sometimes necessary, but a full debriefing is in order (i.e., informing them after the study).
• The Belmont Report and IRBs serve a vital function within a university. Do no harm!
• Sadly, formal commissions (such as the one that produced the Belmont Report) had to be convened to prevent such irresponsible, overzealous research. Its core principles: beneficence, respect, and justice.
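Returning to the content-analysis step described earlier: once a unit of analysis (say, one remark in a transcript) and a coding scheme are fixed, turning qualitative material into quantitative data is just tallying. The categories and codings below are invented purely for illustration.

```python
from collections import Counter

# Toy content-analysis sketch. Unit of analysis: one remark from a group
# discussion. Each remark has been filed under exactly one hypothetical
# category by a (trained) coder.
codings = [
    "task-oriented", "social", "task-oriented", "conflict", "social",
    "task-oriented", "task-oriented", "social", "conflict", "task-oriented",
]

frequencies = Counter(codings)
proportions = {cat: n / len(codings) for cat, n in frequencies.items()}
```

In practice a second coder would code the same remarks independently, and intercoder reliability (e.g., percent agreement or Cohen's kappa) would be reported before trusting these counts.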
Spring '09 — Watt, van den Berg