Description
Survey Methodology 2nd Edition by Robert M. Groves, ISBN-13: 978-0470465462
[PDF eBook eTextbook] – Available Instantly
- Publisher: Wiley; 2nd edition (June 29, 2009)
- Language: English
- ISBN-10: 0470465468
- ISBN-13: 978-0470465462
This new edition of Survey Methodology continues to provide a state-of-the-science presentation of essential survey methodology topics and techniques. The volume’s six world-renowned authors have updated this Second Edition to present newly emerging approaches to survey research and provide more comprehensive coverage of the major considerations in designing and conducting a sample survey.
Key topics in survey methodology are clearly explained in the book’s chapters, with coverage including sampling frame evaluation, sample design, development of questionnaires, evaluation of questions, alternative modes of data collection, interviewing, nonresponse, post-collection processing of survey data, and practices for maintaining scientific integrity. Acknowledging the growing advances in research and technology, the Second Edition features:
- Updated explanations of sampling frame issues for mobile telephone and web surveys
- New scientific insight on the relationship between nonresponse rates and nonresponse errors
- Restructured discussion of ethical issues in survey research, emphasizing the growing research results on privacy, informed consent, and confidentiality issues
- The latest research findings on effective questionnaire development techniques
- The addition of 50% more exercises at the end of each chapter, illustrating basic principles of survey design
- An expanded FAQ chapter that addresses the concerns that accompany newly established methods
Providing valuable and informative perspectives on the most modern methods in the field, Survey Methodology, Second Edition is an ideal book for survey research courses at the upper-undergraduate and graduate levels. It is also an indispensable reference for practicing survey methodologists and any professional who employs survey research methods.
Table of Contents:
Front Matter
PREFACE TO THE FIRST EDITION
PREFACE TO THE SECOND EDITION
ACKNOWLEDGMENTS
CHAPTER ONE AN INTRODUCTION TO SURVEY METHODOLOGY
A Note to the Reader
1.1 Introduction
1.2 A Brief History of Survey Research
1.2.1 The Purposes of Surveys
Schuman (1997) on “Poll” Versus “Survey”
1.2.2 The Development of Standardized Questioning
1.2.3 The Development of Sampling Methods
1.2.4 The Development of Data Collection Methods
1.3 Some Examples of Ongoing Surveys
1.3.1 The National Crime Victimization Survey
Table 1.1. Example Survey: National Crime Victimization Survey (NCVS)
Figure 1.1 Percentage of U.S. households experiencing a crime by type, 1994-2005, National Crime Victimization Survey.
1.3.2 The National Survey on Drug Use and Health
Table 1.2. Example Survey: National Survey on Drug Use and Health (NSDUH)
Figure 1.2 Percentage of persons reporting illicit drug use in past month, by drug type, 2004-2006
1.3.3 The Surveys of Consumers
Table 1.3. Example Survey: Surveys of Consumers (SOC)
Figure 1.3 Consumer unemployment expectations and actual change in the U.S. unemployment rate, 1969-2009
1.3.4 The National Assessment of Educational Progress
Table 1.4. Example Survey: National Assessment of Educational Progress (NAEP)
Figure 1.4 Average scale scores on grade 12 mathematics assessment, by year by type of school.
1.3.5 The Behavioral Risk Factor Surveillance System
Table 1.5. Example Survey: Behavioral Risk Factor Surveillance System (BRFSS)
Figure 1.5a Percentage of state adults who are obese (body mass index ≥ 30) by state, 1994, BRFSS.
Figure 1.5b Percentage of state adults who are obese (body mass index ≥ 30) by state, 2001, BRFSS.
Figure 1.5c Percentage of state adults who are obese (body mass index ≥ 30) by state, 2007, BRFSS.
1.3.6 The Current Employment Statistics Program
Table 1.6. Example Survey: Current Employment Statistics (CES)
Figure 1.6 Number of employees of all nonfarm employers in thousands, annual estimates 1947-2007, Current Employment Statistics.
1.3.7 What Can We Learn From the Six Example Surveys?
1.4 What Is Survey Methodology?
1.5 The Challenge of Survey Methodology
1.6 About This Book
Keywords
For More In-Depth Reading
National Crime Victimization Survey
National Survey on Drug Use and Health
Surveys of Consumers
National Assessment of Educational Progress
Behavioral Risk Factor Surveillance System
Current Employment Statistics
Exercises
CHAPTER TWO INFERENCE AND ERROR IN SURVEYS
2.1 Introduction
Figure 2.1 Two types of survey inference.
2.2 The Life Cycle of a Survey from a Design Perspective
Figure 2.2 Survey lifecycle from a design perspective.
2.2.1 Constructs
2.2.2 Measurement
2.2.3 Response
2.2.4 Edited Response
2.2.5 The Target Population
2.2.6 The Frame Population
Illustration—Populations of Inference and Target Populations
2.2.7 The Sample
2.2.8 The Respondents
Figure 2.3 Unit and item nonresponse in a survey data file.
2.2.9 Postsurvey Adjustments
2.2.10 How Design Becomes Process
Figure 2.4 A survey from a process perspective.
2.3 The Life Cycle of a Survey from a Quality Perspective
Figure 2.5 Survey life cycle from a quality perspective.
2.3.1 The Observational Gap between Constructs and Measures
The Notion of Trials
2.3.2 Measurement Error: The Observational Gap between the Ideal Measurement and the Response Obtained
The Notion of Variance or Variable Errors
2.3.3 Processing Error: The Observational Gap between the Variable Used in Estimation and that Provided by the Respondent
2.3.4 Coverage Error: The Nonobservational Gap between the Target Population and the Sampling Frame
Figure 2.6 Coverage of a target population by a frame.
2.3.5 Sampling Error: The Nonobservational Gap between the Sampling Frame and the Sample
Figure 2.7 Samples and the sampling distribution of the mean.
2.3.6 Nonresponse Error: The Nonobservational Gap between the Sample and the Respondent Pool
2.3.7 Adjustment Error
2.4 Putting It All Together
2.5 Error Notions in Different Kinds of Statistics
2.6 Nonstatistical Notions of Survey Quality
2.7 Summary
Keywords
For More In-Depth Reading
Exercises
CHAPTER THREE TARGET POPULATIONS, SAMPLING FRAMES, AND COVERAGE ERROR
3.1 Introduction
3.2 Populations and Frames
3.3 Coverage Properties of Sampling Frames
3.3.1 Undercoverage
Mulry (2007) on U.S. Decennial Census Coverage
3.3.2 Ineligible Units
3.3.3 Clustering of Target Population Elements Within Frame Elements
Figure 3.1 Cluster of target population elements associated with one sampling frame element.
3.3.4 Duplication of Target Population Elements in Sampling Frames
Figure 3.2 Duplication of target population elements by more than one sampling frame element.
3.3.5 Complicated Mappings between Frame and Target Population Elements
Figure 3.3 Clustering and duplication of target population elements relative to sampling frame elements.
3.4 Alternative Frames for Surveys of the Target Population of Households or Persons
3.4.1 Area Frames
3.4.2 Telephone Number Frames for Households and Persons
Figure 3.4 Percentage of U.S. adults with wireless telephone service only and percentage without telephones, January 2004 to June 2008.
3.4.3 Frames for Web Surveys of General Populations
3.5 Frame Issues for Other Common Target Populations
3.5.1 Customers, Employees, or Members of an Organization
3.5.2 Organizations
3.5.3 Events
3.5.4 Rare Populations
3.6 Coverage Error
3.7 Reducing Undercoverage
3.7.1 The Half-Open Interval
Figure 3.5 Address list for area household survey block.
Figure 3.6 Sketch map for area household survey block.
3.7.2 Multiplicity Sampling
3.7.3 Multiple Frame Designs
Figure 3.7 Dual frame sample design.
3.7.4 Increasing Coverage While Including More Ineligible Elements
Tourangeau, Shapiro, Kearney, and Ernst (1997) and Martin (1999) on Household Rosters
3.8 Summary
Keywords
For More In-Depth Reading
Exercises
CHAPTER FOUR SAMPLE DESIGN AND SAMPLING ERROR
4.1 Introduction
4.2 Samples and Estimates
Figure 4.1 Unknown distribution for variable Y in frame population.
Figure 4.2 Distributions of the y variable from sample realizations and the sampling distribution of the mean.
Figure 4.3 Key notation for sample realization, frame population, and sampling distribution of sample means.
Warning
4.3 Simple Random Sampling
Comment
Comment
4.4 Cluster Sampling
Figure 4.4 A bird’s-eye view of a population of 30 households of one type and 30 of another, clustered into six city blocks, from which two blocks are selected.
Comment
4.4.1 The Design Effect and Within-Cluster Homogeneity
Table 4.1. Mean roh Values for Area Probability Surveys about Female Fertility Experiences in Five Countries by Type of Variable
Kish and Frankel (1974) on Design Effects for Regression Coefficients
4.4.2 Subsampling within Selected Clusters
4.5 Stratification and Stratified Sampling
Figure 4.5 Frame population of 20 establishments sorted alphabetically, with SRS sample realization of size n = 4.
Figure 4.6 Frame population of 20 establishments sorted by group, with stratified element sample of size n_h = 1 from each stratum.
4.5.1 Proportionate Allocation to Strata
Table 4.2. Proportionate Stratified Random Sample Results from a School Population Divided Into Three Urbanicity Strata
Cochran (1961) on How Many Strata to Use
Design Effects for the Stratified Mean
4.5.2 Disproportionate Allocation to Strata
Neyman (1934) on Stratified Random Sampling
4.6 Systematic Selection
Figure 4.7 Frame population of 20 establishments sorted by group with systematic selection; selection interval = 5 and random start = 2.
Comment
4.7 Complications in Practice
4.7.1 Two-Stage Cluster Designs with Probabilities Proportionate to Size (PPS)
Table 4.3. Block Housing Unit Counts and Cumulative Counts for a Population of Nine Blocks
4.7.2 Multistage and Other Complex Designs
4.7.3 How Complex Sample Designs Are Described: The Sample Design for the NCVS
4.8 Sampling U.S. Telephone Households
Figure 4.8 Number of 100-blocks of telephone numbers by number of listed numbers within the block, 1986 and 2008. (Source: Survey Sampling, Inc.)
4.9 Selecting Persons within Households
4.10 Summary
Keywords
For More In-Depth Reading
Exercises
CHAPTER FIVE METHODS OF DATA COLLECTION
Figure 5.1 A survey from a process perspective.
5.1 Alternative Methods of Data Collection
Figure 5.2 The evolution of survey technology.
5.1.1 Degree of Interviewer Involvement
5.1.2 Degree of Interaction with the Respondent
5.1.3 Degree of Privacy
5.1.4 Channels of Communication
5.1.5 Technology Use
5.1.6 Implications of these Dimensions
5.2 Choosing the Appropriate Method
5.3 Effects of Different Data Collection Methods on Survey Errors
5.3.1 Measuring the Marginal Effect of Mode
Table 5.1. Design Issues in Research Comparing Face-to-Face and Telephone Surveys
Hochstim (1967) on Personal Interviews Versus Telephone Interviews Versus Mail Questionnaires
The de Leeuw and van der Zouwen (1998) Meta-Analysis of Data Quality in Telephone and Face-to-Face Surveys
5.3.2 Sampling Frame and Sample Design Implications of Mode Selection
5.3.3 Coverage Implications of Mode Selection
Figure 5.3 Percentage of U.S. adults who ever use the Internet, quarter 1, 2000, to quarter 3, 2008.
5.3.4 Nonresponse Implications of Mode Selection
5.3.5 Measurement Quality Implications of Mode Selection
The Tourangeau and Smith (1996) Study of Mode Effects on Answers to Sensitive Questions
Figure 5.4 Ratio of proportion of respondents reporting illicit drug use in self-administered versus interviewer-administered questionnaires, by time period by drug.
5.3.6 Cost Implications
5.3.7 Summary on the Choice of Method
5.4 Using Multiple Modes of Data Collection
Figure 5.5 Five different types of mixed mode designs.
5.5 Summary
Keywords
For More In-Depth Reading
Exercises
CHAPTER SIX NONRESPONSE IN SAMPLE SURVEYS
6.1 Introduction
6.2 Response Rates
6.2.1 Computing Response Rates
Merkle and Edelman (2002) on How Nonresponse Rates Affect Nonresponse Error
6.2.2 Trends in Response Rates Over Time
Figure 6.1 Household nonresponse rate, household refusal rate, and person refusal rate for the National Crime Victimization Survey by year.
Figure 6.2 Nonresponse and refusal rates for the Current Population Survey by year.
Figure 6.3 Nonresponse rate and refusal rate for the Survey of Consumers by year.
Figure 6.4 Median nonresponse rate across states, Behavioral Risk Factor Surveillance System, 1987-2007.
6.3 Impact of Nonresponse on the Quality of Survey Estimates
Figure 6.5 Estimates of the absolute value of the relative nonresponse bias for 959 estimates by nonresponse rate of survey.
6.4 Thinking Causally About Survey Nonresponse Error
Figure 6.6 Alternative models for relationship between response propensity (P) and survey variable (Y), involving auxiliary variables (S, Z).
6.5 Dissecting the Nonresponse Phenomenon
6.5.1 Unit Nonresponse Due to Failure to Deliver the Survey Request
Figure 6.7 Causal influences on contact with sample household.
Figure 6.8 Percentage of eligible sample households by calls to first contact for five surveys.
Figure 6.9 Percentage of households contacted among those previously uncontacted, by call number by time of day. (National Survey of Family Growth, Cycle 6.)
Figure 6.10 Percentage nonresponse bias for estimated proportion of single-person households, by number of calls required to reach the household, for four surveys.
6.5.2 Unit Nonresponse Due to Refusals
The “I’m Not Selling Anything” Phenomenon
Figure 6.11 Two sample persons with different leverages for attributes of a survey request.
What Interviewers Say
6.5.3 Unit Nonresponse Due to the Inability to Provide the Requested Data
6.6 Design Features to Reduce Unit Nonresponse
Figure 6.12 Tools for reducing unit nonresponse rates.
What Interviewers Say about Approaching Sample Households
Berlin, Mohadjer, Waksberg, Kolstad, Kirsch, Rock, and Yamamoto (1992) on Incentives and Interviewer Productivity
Morton-Williams (1993) on Tailoring Behavior by Interviewers
6.7 Item Nonresponse
Figure 6.13 Beatty-Herrmann model of response process for item-missing data.
6.8 Are Nonresponse Propensities Related to Other Error Sources?
6.9 Summary
Keywords
For More In-Depth Reading
Exercises
CHAPTER SEVEN QUESTIONS AND ANSWERS IN SURVEYS
7.1 Alternative Methods of Survey Measurement
7.2 Cognitive Processes in Answering Questions
Figure 7.1 A simple model of the survey response process.
7.2.1 Comprehension
7.2.2 Retrieval
7.2.3 Estimation and Judgment
7.2.4 Reporting
7.2.5 Other Models of the Response Process
Comments on Response Strategies
7.3 Problems in Answering Survey Questions
7.3.1 Encoding Problems
Fowler (1992) on Unclear Terms in Questions
7.3.2 Misinterpreting the Questions
7.3.3 Forgetting and Other Memory Problems
Figure 7.2 Recall accuracy for types of personal information.
Table 7.1. Summary of Factors Affecting Recall
Neter and Waksberg (1964) on Response Errors
7.3.4 Estimation Processes for Behavioral Questions
Overreporting and Underreporting
Schwarz, Hippler, Deutsch, and Strack (1985) on Response Scale Effects
7.3.5 Judgment Processes for Attitude Questions
7.3.6 Formatting the Answer
7.3.7 Motivated Misreporting
7.3.8 Navigational Errors
Figure 7.3 Example questions from Jenkins and Dillman (1997).
7.4 Guidelines for Writing Good Questions
7.4.1 Nonsensitive Questions About Behavior
7.4.2 Sensitive Questions About Behavior
7.4.3 Attitude Questions
7.4.4 Self-Administered Questions
Figure 7.4 Illustration of use of visual contrast to highlight the response box.
7.5 Summary
Keywords
For More In-Depth Reading
Exercises
CHAPTER EIGHT EVALUATING SURVEY QUESTIONS
8.1 Introduction
8.2 Expert Reviews
8.3 Focus Groups
8.4 Cognitive Interviews
Presser and Blair (1994) on Alternative Pretesting Methods
8.5 Field Pretests and Behavior Coding
Table 8.1. Examples of Behavior Codes for Interviewer and Respondent Behaviors
8.6 Randomized or Split-Ballot Experiments
Oksenberg, Cannell, and Kalton (1991) on Probes and Behavior Coding
Percent of problems per question
8.7 Applying Question Standards
8.8 Summary of Question Evaluation Tools
Table 8.2. Studies Comparing Question Evaluation Methods
8.9 Linking Concepts of Measurement Quality to Statistical Estimates
8.9.1 Validity
Estimating Validity with Data External to the Survey.
Estimating Validity with Multiple Indicators of the Same Construct.
Figure 8.1 Path diagram representing Y_{αi} = λ_α μ_i + ε_{αi}, a measurement model for μ_i.
8.9.2 Response Bias
Using Data on Individual Target Population Elements.
Table 8.3. Percentage of Known Hospitalizations Not Reported, by Length of Stay and Time Since Discharge
Using Population Statistics Not Subject to Survey Response Error.
8.9.3 Reliability and Simple Response Variance
Repeated Interviews with the Same Respondent.
Table 8.4. Indexes of Inconsistency for Various Victimization Incident Characteristics, NCVS
Using Multiple Indicators of the Same Construct.
Table 8.5. Illustrative Intercorrelations among MHI-5 Items
O’Muircheartaigh (1991) on Reinterviews to Estimate Simple Response Variance
8.10 Summary
Keywords
For More In-Depth Reading
Exercises
CHAPTER NINE SURVEY INTERVIEWING
9.1 The Role of the Interviewer
9.2 Interviewer Bias
9.2.1 Systematic Interviewer Effects on Reporting of Socially Undesirable Attributes
9.2.2 Systematic Interviewer Effects on Topics Related to Observable Interviewer Traits
Schuman and Converse (1971) on Race of Interviewer Effects in the United States
Percentage Answering in Given Category by Race of Interviewer
9.2.3 Systematic Interviewer Effects Associated with Interviewer Experience
Table 9.1. Percentage Reporting Lifetime Use of Any Illicit Substance by Interview Order by Interviewer Experience (1998 NSDUH)
9.3 Interviewer Variance
9.3.1 Randomization Requirements for Estimating Interviewer Variance
9.3.2 Estimation of Interviewer Variance
Kish (1962) on Interviewer Variance
Statistics Showing High Values of ρ_int
9.4 Strategies for Reducing Interviewer Bias
9.4.1 The Role of the Interviewer in Motivating Respondent Behavior
9.4.2 Changing Interviewer Behavior
9.5 Strategies for Reducing Interviewer-Related Variance
9.5.1 Minimizing Questions that Require Nonstandard Interviewer Behavior
9.5.2 Professional, Task-Oriented Interviewer Behavior
9.5.3 Interviewers Reading Questions as They Are Worded
9.5.4 Interviewers Explaining the Survey Process to the Respondent
9.5.5 Interviewers Probing Nondirectively
9.5.6 Interviewers Recording Answers Exactly as Given
9.5.7 Summary on Strategies to Reduce Interviewer Variance
9.6 The Controversy About Standardized Interviewing
9.7 Interviewer Management
9.7.1 Interviewer Selection
Conrad and Schober (2000) on Standardized versus Conversational Interviewing Techniques
9.7.2 Interviewer Training
Table 9.2. Percentage of Interviewers Rated Excellent or Satisfactory for Six Criteria by Length of Interviewer Training
9.7.3 Interviewer Supervision and Monitoring
9.7.4 The Size of Interviewer Workloads
9.7.5 Interviewers and Computer Use
9.8 Validating the Work of Interviewers
Table 9.3. Percentage of Interviewers Detected Falsifying for Three Surveys Conducted by the U.S. Bureau of the Census
9.9 The Use of Recorded Voices (and Faces) in Data Collection
9.10 Summary
Keywords
For More In-Depth Reading
Exercises
CHAPTER TEN POSTCOLLECTION PROCESSING OF SURVEY DATA
10.1 Introduction
Figure 10.1 Flow of processing steps in paper surveys.
Figure 10.2 Flow of processing steps in computer-assisted surveys.
10.2 Coding
10.2.1 Practical Issues of Coding
10.2.2 Theoretical Issues in Coding Activities
Figure 10.3 Comprehension and judgment task of the coder.
10.2.3 “Field Coding”—An Intermediate Design
Table 10.1. Illustration of Field Coding in the NCVS for the Question “Where Did This Incident Happen?”
Collins and Courtenay (1985) on Field versus Office Coding
10.2.4 Standard Classification Systems
The Standard Occupational Classification (SOC).
Table 10.2. 23 Group Standard Occupational Classification
The North American Industry Classification System (NAICS).
Table 10.3. Comparison of SIC Divisions and NAICS Sectors
Table 10.4. NAICS Structure and Nomenclature
10.2.5 Other Common Coding Systems
10.2.6 Quality Indicators in Coding
Weaknesses in the Coding Structure.
Coder Variance.
Table 10.5. Coder Variance Statistics for Occupation Coding
10.2.7 Summary of Coding
10.3 Entering Numeric Data into Files
10.4 Editing
Summary of Editing.
10.5 Weighting
10.5.1 Weighting with a First-Stage Ratio Adjustment
10.5.2 Weighting for Differential Selection Probabilities
10.5.3 Weighting to Adjust for Unit Nonresponse
Table 10.6. Hypothetical Equal Allocation for Latinos, with Nonresponse Adjustments
10.5.4 Poststratification Weighting
Table 10.7. Weighted Sample Distribution and Poststratification for Hypothetical NCVS Sample by Gender, Age, and Ethnicity
10.5.5 Putting All the Weights Together
Ekholm and Laaksonen (1991) on Propensity Model Weighting to Adjust for Unit Nonresponse
10.6 Imputation for Item-Missing Data
Figure 10.4 Unit and item nonresponse in a survey data file.
Table 10.8. Illustration of Sequential Hot-Deck Imputation for Family Income, Imputed Data, and Imputation Flag Variable
10.7 Sampling Variance Estimation for Complex Samples
Taylor Series Estimation.
Balanced Repeated Replication and Jackknife Replication.
10.8 Survey Data Documentation and Metadata
Figure 10.5 Illustration of printable codebook section for the National Crime Victimization Survey.
10.9 Summary
Keywords
For More In-Depth Reading
Exercises
CHAPTER ELEVEN PRINCIPLES AND PRACTICES RELATED TO ETHICAL RESEARCH
11.1 Introduction
11.2 Standards for the Conduct of Research
Table 11.1. Key Terminology in Research Misconduct
Table 11.2. Percentage of Interviewers Detected Falsifying for Three Surveys Conducted by the U.S. Bureau of the Census
11.3 Standards for Dealing with Clients
11.4 Standards for Dealing with the Public
Table 11.3. Elements of Minimal Disclosure (AAPOR Code)
11.5 Standards for Dealing with Respondents
11.5.1 Legal Obligations to Survey Respondents
The Tuskegee Study of Syphilis
11.5.2 Ethical Obligations to Respondents
11.5.3 Informed Consent: Respect for Persons
Table 11.4. Essential Elements of Informed Consent
Project Metropolitan
11.5.4 Beneficence: Protecting Respondents from Harm
11.5.5 Efforts at Persuasion
11.6 Emerging Ethical Issues
11.7 Research About Ethical Issues in Surveys
11.7.1 Research on Informed Consent Protocols
Research on Respondents’ Reactions to Informed Consent Protocols.
Table 11.5. Self-Reports on Type of Questions Considered Offensive to the Respondent Among Respondents Saying Researchers “Had No Business Asking” Sensitive Questions
Singer (1978) on Comprehension of Informed Consent
Research on Informed Consent Complications in Methodological Studies.
Research on Written versus Oral Informed Consent.
Summary of Research on Informed Consent in Surveys.
11.7.2 Research on Confidentiality Assurances and Survey Participation
11.8 Administrative and Technical Procedures for Safeguarding Confidentiality
11.8.1 Administrative Procedures
Figure 11.1 Pledge made by research team members about respondent privacy.
Table 11.6. Principles and Practices for Protection of Sensitive Data
11.8.2 Technical Procedures
Restricting Access to the Data.
Restricting the Contents of the Survey Data That May Be Released.
11.9 Summary and Conclusions
Keywords
For More In-Depth Reading
Exercises
CHAPTER TWELVE FAQS ABOUT SURVEY METHODOLOGY
12.1 Introduction
12.2 The Questions and Their Answers
Back Matter
REFERENCES