JS202: Research Design, Methods and Evaluation
Week 4: Concepts, operationalisation and measurement

Example: substance abuse
- What is it? (conceptualization)
- How is it measured? (measurement)
- You need to know both how something is defined and how it is measured to reach a full understanding of the concept.

The concept of crime
- What is a crime? A concept of harm? The presence of a victim? Whatever is against the law? A specific offense or incident? A specific individual offender?
- We need to be very clear about how we define our terms.

A concept
- Is abstract
- Varies depending on our background and experience
- Concepts are words or symbols in language that we use to represent mental images
- Words can have many meanings, so we need to be specific
- Only then can we replicate our own (and others') studies

The process of conceptualization
- Conceptualization: the process of specifying what we mean by a term
- Conceptual definition: the working definition specifically assigned to a term
- Operational definition: a definition that spells out precisely how the concept will be measured
- Measurement: assigning a value to an observation

Examples
- Conceptual definitions: substance abuse, recidivism, relative/absolute poverty
- Operational definitions: how substance abuse, recidivism, and poverty will actually be measured
- Sometimes people disagree (e.g., over a non-terrorist hijacking), but that is okay as long as we are clear about explaining our definitions.

Levels of measurement
- Measurement is a systematic procedure for assigning real numbers to objects.
- A variable's categories need to be: exhaustive; mutually exclusive.
- Nominal: names, categories
- Ordinal: order matters
- Interval: order matters + equal intervals between values
- Ratio: order matters + equal intervals + a true zero

Measurement error
- Systematic: can distort one's findings (e.g., "not in file" vs. "unknown" vs. "missing")
- Random: noise; inaccuracy that varies in degree across individual instances
- With random error, check that the error is not related to another variable

Reliability and validity: the short version
- Reliability: If you did it again, would you get the same result?
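The reliability question above (would repeating the measurement give the same result?) is often checked with a test-retest correlation. Below is a minimal Python sketch with invented scores on a hypothetical scale; a correlation near 1.0 suggests the instrument is reliable:

```python
# Hypothetical sketch of test-retest reliability: administer the same
# instrument twice and correlate the two sets of scores. All data invented.

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

# Ten respondents' scores on a made-up substance-abuse scale,
# measured at time 1 and again at time 2.
time1 = [12, 18, 7, 22, 15, 9, 30, 14, 20, 11]
time2 = [13, 17, 8, 21, 16, 9, 28, 15, 19, 12]

r = pearson_r(time1, time2)
print(round(r, 2))  # close to 1.0: repeating the measurement gives similar results
```

A very low correlation here would be a warning that scores reflect random error (noise) more than the underlying concept.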
- Validity: Are you measuring what you think you're measuring?

Reliability
- Consistency or stability of measurement
- Tape measures are reliable: every time you use one, 4" will be 4"
- The theory of reliability assumes that error is:
  - Random, not systematic (fatigue, forgetfulness, transcription errors)
  - Self-compensating: it makes up for itself over many observations

Reliability problems
- Interviewing: the personal characteristics of each interviewer
- Coding open-ended questions

Solutions to reliability problems
- Test-retest
- Inter-rater reliability
- Make items clear and unambiguous
- Add more (similar) items
- Give clear instructions
- Increase the sample size

Validity
- The proportion of total variance that is common-factor variance
- More difficult to demonstrate than reliability
- Five ways to assess validity: face validity, content validity, criterion validity, construct validity, multiple measures

Chronological perspective on testing new programs
- Define the problem
- Fit the evaluation to the problem
- Where is the problem, and how big is it?

Chronological perspective on testing ongoing programs (Px = program)
- Is the Px reaching the appropriate beneficiaries?
- Is the Px being properly delivered?
- Are the funds being used appropriately?
- Can effectiveness be estimated?
- Did the Px work?
- Was the Px worth it?
- Put the findings in a larger context

Understanding programs is important to:
- Develop issues
- Formulate questions that are relevant and incisive
- Understand the data
- Interpret the evidence
- Make sound recommendations
- Report to the Px
- Use in meta-analysis
- Monitoring: a good sense of the ...

Planning the evaluation
Timing is everything! Fix a deadline and get started early.
- Identify the key questions for study
- Decide on a method (quantitative or qualitative)
- Develop measures and techniques
- Decide how to collect the data
- Operationalize the measures
- Plan an appropriate design
- Collect and analyze the data
- Write the report
- Disseminate the results
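One of the solutions to reliability problems listed earlier, inter-rater reliability, is commonly quantified with Cohen's kappa (the choice of statistic is mine; the notes only name the check). A short sketch, with hypothetical codings of open-ended answers:

```python
# Sketch of inter-rater reliability via Cohen's kappa. Kappa corrects raw
# agreement for the agreement two raters would reach by chance alone.
# The rating data below are invented for illustration.
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters' category labels."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: proportion of items both raters coded identically.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement from each rater's marginal category frequencies.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Two coders classifying the same 10 open-ended answers (hypothetical data)
a = ["harm", "law", "harm", "victim", "law", "harm", "law", "victim", "harm", "law"]
b = ["harm", "law", "harm", "law",    "law", "harm", "law", "victim", "harm", "harm"]
print(round(cohens_kappa(a, b), 2))  # → 0.68
```

Kappa of 1.0 means perfect agreement and 0 means agreement no better than chance, so low values point back to the fixes above: clearer items, better coder instructions.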
This note was uploaded on 09/08/2010 for the course SCWK 242 at San Jose State University.