
Methodology and Data Quality of the ANES 2008-2009 Panel Study:
Lessons for Future Internet Panels

Matthew DeBell
Stanford University


Acknowledgments

• National Science Foundation
• Stanford University & the University of Michigan
• Knowledge Networks collected data
• ANES people:
  – Jon A. Krosnick & Arthur Lupia, PIs (2005-2009)
  – Vincent Hutchings, Associate PI (2005-2009)
  – Matthew DeBell, Stanford project director
  – lots of others!

This talk

• Describe an Internet panel study: ANES 2008-2009 Panel Study

• Evaluate data quality and related factors

• Methodological lessons


3 problems to frame the talk

• Coverage error
  – Inaccurate representation
• Nonresponse bias
  – Inaccurate representation
• Attrition
  – Loss of representation
  – Loss of power

Other issues

• Conditioning
  – Loss of representation
• Measurement error
  – Reporting error, satisficing, other mode-related effects

This talk

• Describe an Internet panel study: ANES 2008-2009 Panel Study

• Evaluate data quality and related factors

• Methodological lessons

American National Election Studies

• Presidential elections in the USA
• 1948-present
• Standard mode is face-to-face
• Two-wave panel: before and after each presidential election

ANES 2008-2009 Panel Study


Design

• Sequential mixed-mode
  – Phone recruitment
  – Internet panel
• 21 waves online (25-30 minutes each)
  – 7 “ANES waves”
  – 14 “off-waves” of non-political KN data
• Internet provided to Rs without access

MSN TV 2 [photo of the device provided to respondents without Internet access]

Recruitment

• RDD recruitment
  – Landline only
• Two cohorts
  – Cohort 1 started January 2008
  – Cohort 2 started September 2008
• One person per household
• $10 per month incentive
• Recruitment-stage response rate (AAPOR RR3): 42%

Mixed-Mode Recruitment

• Internet-only recruitment fallback
  – Cases not enrolled by telephone (2,992) were mailed a letter asking them to complete the recruitment survey online. 119 did so.

Communications (first 8 months)

• D-0 invitation email
• D+3 email reminder
• D+6 email reminder #2
• D+14 phone reminder (repeated weekly)

Revised communications

• D-3 notice email
• D-0 invitation email
• D+3 email reminder
• D+6 email reminder #2
• D+11 email reminder #3
• D+13 phone reminder (repeated weekly)
• D+19 email reminder #4 (repeated every 10 days)
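Contact schedules like this are straightforward to encode as data and automate. A minimal sketch in Python, assuming a hypothetical representation (the study's actual fielding system is not described in these slides; the field names here are invented):

```python
# Hypothetical encoding of the revised contact schedule above.
# Day offsets are relative to the invitation date (D-0); negative
# offsets precede it. "repeat_days" marks contacts that recur until
# the panelist completes the wave.
REVISED_SCHEDULE = [
    {"day": -3, "mode": "email", "note": "advance notice"},
    {"day": 0,  "mode": "email", "note": "invitation"},
    {"day": 3,  "mode": "email", "note": "reminder #1"},
    {"day": 6,  "mode": "email", "note": "reminder #2"},
    {"day": 11, "mode": "email", "note": "reminder #3"},
    {"day": 13, "mode": "phone", "note": "reminder (weekly)", "repeat_days": 7},
    {"day": 19, "mode": "email", "note": "reminder #4 (every 10 days)", "repeat_days": 10},
]

def contacts_due(days_since_invite: int, completed: bool = False) -> list:
    """Return the contacts scheduled for a given day offset; none once
    the panelist has completed the wave."""
    if completed:
        return []
    due = []
    for c in REVISED_SCHEDULE:
        gap = days_since_invite - c["day"]
        rep = c.get("repeat_days")
        if gap == 0 or (rep and gap > 0 and gap % rep == 0):
            due.append(c)
    return due
```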

This talk

• Describe an Internet panel study: ANES 2008-2009 Panel Study

• Evaluate data quality and related factors

• Methodological lessons


Part 2: Quality Evaluation

• Will show:
  – Number of cases
  – Dropout recovery
  – Case validation
  – Response rate
  – Panel retention / attrition numbers
  – Attrition effects
  – Accuracy of estimates


Completed interviews (n), by cohort

Wave (date)          Cohort 1   Cohort 2    Total
Wave 1  (Jan ’08)       1,624          —    1,624
Wave 2  (Feb ’08)       1,458          —    1,458
Wave 6  (Jun ’08)       1,421          —    1,421
Wave 9  (Sep ’08)       1,488      1,106    2,594
Wave 10 (Oct ’08)       1,511      1,126    2,637
Wave 11 (Nov ’08)       1,508      1,167    2,675
Wave 13 (Jan ’09)       1,453      1,094    2,547
Wave 17 (May ’09)       1,387      1,016    2,403


Panelist Recovery Experiment

Table 1. Yield and retention percentages in Panel Study incentive groups, by month

                          Yield                              Retention
Month      Total  $30 group  $50 group   Diff.   Total  $30 group  $50 group   Diff.
June        48.5       45.0       52.0     7.0       —          —          —      —
July           —          —          —       —       —          —          —      —
August      55.5       49.0       62.0  13.0 †    77.3       68.9       84.6  15.7 *
September   13.5       11.0       16.0     5.0    21.6       17.8       25.0    7.2

Notes: — not applicable. † statistically significant at p < .10. * statistically significant at p < .05.
Initial N = 100 in the $30 group, 100 in the $50 group, and 200 overall.
The September figures reflect only the first two days of data collection.
Yield is the proportion of invited panelists who completed the survey.
Retention is the proportion of invited panelists who completed the June survey and completed a later survey.
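The slide does not say which test produced the significance markers; a two-proportion z-test is one plausible check. A sketch that reproduces the August yield comparison (49% of 100 vs. 62% of 100):

```python
from math import sqrt
from statistics import NormalDist

def two_prop_ztest(x1: int, n1: int, x2: int, n2: int):
    """Two-sided two-proportion z-test with pooled variance."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    return z, 2 * (1 - NormalDist().cdf(abs(z)))

# August yield: $50 group 62/100 vs. $30 group 49/100.
z, p = two_prop_ztest(62, 100, 49, 100)
print(f"z = {z:.2f}, p = {p:.3f}")  # p ≈ .064, between .05 and .10, matching the dagger
```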


Panelist Recovery Action

• $50 offers to all 282 panel “dropouts”
  – Dropouts had completed the Profile or Wave 1 or Wave 2, but not Waves 7-9.
• Result highlights:
  – W10: 132 completions (47 percent)
  – W11: 132 completions (47 percent)
  – W17: 129 completions (46 percent)


Case Validation

• Telephone surveys with a 10% subsample of Rs to each of the 7 planned ANES waves
• 1,482 interviews
• We found…


• 100% confirmed names and participation


But…

• Only half to two-thirds recalled the topic of the previous survey
  – Probably due to excessive delay in validation calls
• Imperfect item reliability
  – 18 instances of sex inconsistency
  – 91 instances of year-of-birth inconsistency
  – Many appear to be data entry errors
• Error rates appear consistent with face-to-face surveys


Response Rates

              RR1 (min)   RR3 (est)   RR5 (max)
Recruitment          26          42          75
Wave 1               18          29          51
Wave 2               17          26          46
Wave 6               16          25          45
Wave 11              17          27          47
Wave 17              15          24          43
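The min/est/max columns correspond to AAPOR's RR1, RR3, and RR5, which differ only in how cases of unknown eligibility are treated. A sketch of the standard formulas, with illustrative disposition counts (the study's actual dispositions are not shown on this slide):

```python
def aapor_rates(I: int, P: int, R: int, NC: int, O: int, U: int, e: float):
    """AAPOR response rates: complete interviews I, partials P, refusals R,
    non-contacts NC, other non-respondents O, unknown-eligibility cases U,
    and estimated eligibility rate e for the unknowns."""
    rr1 = I / (I + P + R + NC + O + U)      # minimum: all unknowns counted as eligible
    rr3 = I / (I + P + R + NC + O + e * U)  # estimate: a fraction e of unknowns eligible
    rr5 = I / (I + P + R + NC + O)          # maximum: no unknowns counted as eligible
    return rr1, rr3, rr5

# Illustrative counts only; these are not the ANES dispositions.
rr1, rr3, rr5 = aapor_rates(I=1000, P=50, R=900, NC=500, O=100, U=1300, e=0.4)
print([round(100 * r) for r in (rr1, rr3, rr5)])  # [26, 33, 39]
```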

Retention: ANES vs. KnowledgePanel

• ANES panel retention was better than KP
  – [graphic & numbers not for public distribution]


Retention Rates at Wave 11

• n = 2,675
• Retention
  – From Recruitment: 63 percent (2,649)
  – From Profile: 84 percent (2,439)
  – From Wave 1: 85 percent (1,381)
  – From Wave 10: 95 percent (2,500)


Completions at Wave 11 (Nov ’08)

                       Total          Cohort 1
Total                  2,675          1,508
Completed:
  Recruitment          2,649 (99%)    1,483 (98%)
  Profile              2,439 (91%)    1,328 (88%)
  Wave 1               1,381 (52%)    1,381 (92%)
  Wave 10              2,500 (93%)    1,433 (95%)
  Waves 9 & 10         2,319 (87%)    1,345 (89%)
  All ANES stages      1,058 (40%)    1,058 (70%)
  All stages (1-10)      738 (28%)      738 (49%)


Other Retention Rates

• Mean wave-to-wave retention: 91 percent
• Cohort 1 Rs who completed all ANES waves through May 2009 (1, 2, 6, 9, 10, 11, 17): 68 percent (939)
• 83% of May 2009 completers had completed Waves 9, 10, and 11
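Wave-to-wave retention is just the ratio of consecutive completion counts. A simplified sketch using the cohort 1 counts from the wave table above (only the ANES waves are listed there, and dropout recovery can push a ratio above 100%, so this does not reproduce the 91% mean, which averages over all 21 waves):

```python
# Cohort 1 completion counts from the ANES waves listed earlier.
cohort1 = {1: 1624, 2: 1458, 6: 1421, 9: 1488, 10: 1511, 11: 1508, 13: 1453, 17: 1387}

waves = sorted(cohort1)
for prev, cur in zip(waves, waves[1:]):
    print(f"Wave {prev} -> Wave {cur}: {100 * cohort1[cur] / cohort1[prev]:.0f}%")
```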


Attrition effects

• 1,623 Rs completed Wave 1
• 1,258 of these also completed Wave 17
  – 78 percent retention; 22 percent attrition
• Ran frequencies on all Wave 1 variables for these two groups


• Average difference: 1.3 points.
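A sketch of that comparison: tabulate the same Wave 1 percentages for the full Wave 1 sample and for the subset who also completed Wave 17, then average the absolute gaps. The four items below are taken from the differential-attrition slide that follows; because they are among the largest gaps, their average (about 2.8 points) exceeds the 1.3-point figure computed over all Wave 1 variables:

```python
def mean_abs_diff(full: dict, completers: dict) -> float:
    """Mean absolute percentage-point difference across matched statistics."""
    return sum(abs(full[k] - completers[k]) for k in full) / len(full)

# Figures from the differential-attrition slide below.
w1  = {"age 18-29": 19.7, "male": 47.2, "HS dropout": 10.8, "renter": 17.4}
w17 = {"age 18-29": 16.1, "male": 44.4, "HS dropout": 8.8,  "renter": 14.7}
print(f"{mean_abs_diff(w1, w17):.1f} points")  # 2.8 for these selected items
```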

Differential attrition (Wave 17)

Factor              W1     W17    Diff.
18-29 year olds   19.7    16.1     -3.6
Males             47.2    44.4     -2.8
HS dropouts       10.8     8.8     -2.0
Home renters      17.4    14.7     -2.7
Non-voters        22.7    18.2     -4.5
No Obama affect   30.5    27.4     -3.1


Accuracy of Estimates (1 of 2)

• Benchmark to CPS
• 43 statistics examined, for:
  – Age, sex, race, ethnicity, race/ethnicity, education, home tenure, household size, marital status, household income, presidential vote choice, voter turnout
• Estimates are within 5 points of benchmark for 84 percent (36 of 43) of statistics examined

Accuracy of Estimates (2 of 2)

• >5-point errors for a few statistics
  – Renters (-9.3)
  – One-person households (-5.4)
  – Married (+10.2)
  – Income >$100,000/yr (-6.6)
• Average error: 2.1 points

This talk

• Describe an Internet panel study: ANES 2008-2009 Panel Study

• Evaluate data quality and related factors

• Methodological lessons

Part 3: Lessons

Concerning:
• Data quality
• Measuring quality immediately
• Promoting quality

Data Quality

• Data quality from the telephone-recruited panel is consistent with expectations for a high-quality telephone survey
  – Good accuracy
  – Only moderate attrition

Measuring Quality Immediately

• Monitor attrition during panel
• Validation interviews
  – Should happen immediately after web completion
• Concern: respondent identity

Promoting Quality

• Fight nonresponse bias
  – high-quality recruitment
  – multiple contacts over a long period of time
  – incentives
• Fight attrition
  – pleasant content
  – incentives
  – dropout recovery with added incentives
• Fight conditioning with variety

Final Thoughts

• Other possibilities
  – Targeted incentives to combat nonresponse bias
  – Oversamples at recruitment to combat nonresponse
• Landline RDD is obsolete!
• Literacy


Thank you

www.electionstudies.org

Matthew DeBell
debell@stanford.edu
