Motivated Misreporting
How respondent incentives shape survey data quality
Motivated misreporting occurs when survey respondents give incorrect answers in order to shorten the interview. Examples include a respondent who says “no” to a filter question to avoid a block of follow-up questions, and a panel respondent who learns over time which answers trigger fewer questions. This behavior is a form of satisficing that directly undermines data quality, and it can affect any survey whose question structure creates incentives for strategic responding.
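To make the incentive concrete, here is a minimal sketch of how filter answers determine interview length. The question counts are hypothetical and not taken from any of the studies below; the point is only that each “no” skips an entire follow-up block.

```python
# Hypothetical illustration: each "yes" to a filter question triggers a
# block of follow-up questions, so "no" answers shorten the interview.

FOLLOW_UPS_PER_YES = 4  # assumed number of follow-ups behind each filter


def questions_asked(filter_answers):
    """Total questions a respondent sees: one per filter question,
    plus a follow-up block for every filter answered 'yes'."""
    n_filters = len(filter_answers)
    n_yes = sum(filter_answers)
    return n_filters + n_yes * FOLLOW_UPS_PER_YES


honest = [True, True, False, True]       # three real "yes" events
strategic = [True, False, False, False]  # underreports to save time

print(questions_asked(honest))     # 4 filters + 12 follow-ups = 16
print(questions_asked(strategic))  # 4 filters +  4 follow-ups =  8
```

A respondent who underreports two “yes” answers halves the interview, which is exactly the incentive the studies below investigate.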
My research on this topic spans over a decade and investigates the mechanisms, contexts, and consequences of motivated misreporting.
Mechanisms of misreporting
My earliest work in this area focused on understanding why misreporting happens, by linking survey responses to high-quality administrative records.
Assessing the Mechanisms of Misreporting to Filter Questions in Surveys (Eckman, Kreuter, Kirchner, Jaeckle, Tourangeau & Presser, 2014, Public Opinion Quarterly). We tested whether misreporting in filter questions is driven by respondents strategically shortening the interview or by acquiescence bias. By linking responses to administrative data, we found strong evidence for motivated underreporting: respondents in the interleafed format, where follow-ups come immediately after each filter question, learned to say “no” to avoid them.
Misreporting to Looping Questions in Surveys (Eckman & Kreuter, 2018, Survey Research Methods). Looping questions — which collect details about repeated events like jobs or accounts — create similar incentives. We tested how question format affects accuracy using a web survey linked to administrative records, finding that format shapes both the number of events reported and the quality of follow-up data.
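Looping questions create an analogous incentive: the reported event count drives how many detail blocks follow. The sketch below uses hypothetical question counts, not figures from the paper, to show why underreporting even one event skips a whole loop of follow-ups.

```python
# Hypothetical illustration: a looping question first asks for a count
# (e.g., "How many jobs did you hold last year?") and then repeats a
# detail block once per reported event, so a lower count means fewer
# follow-up questions.

DETAILS_PER_EVENT = 5  # assumed length of each detail block


def loop_questions(reported_events):
    """One count question plus one detail block per reported event."""
    return 1 + reported_events * DETAILS_PER_EVENT


print(loop_questions(3))  # reporting all three jobs: 16 questions
print(loop_questions(1))  # reporting only one job:    6 questions
```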
Consumer expenditure studies
The U.S. Consumer Expenditure Survey is a major government survey whose data inform the Consumer Price Index, the official measure of inflation. Its structure — many filter questions followed by detailed spending follow-ups — makes it a natural setting to study motivated misreporting.
Underreporting of Purchases in the U.S. Consumer Expenditure Survey (Eckman, 2021, Journal of Survey Statistics and Methodology). Using a parallel web survey and multiple imputation, I estimated that household purchases are underreported by approximately 5 percentage points in the first wave — without experimentally manipulating the survey itself.
Rotation Group Bias in Reporting of Household Purchases (Bach & Eckman, 2019, Economics Letters). We tested whether response quality changes across survey waves. Contrary to expectations, respondents actually improved over time — becoming more likely to report exact spending amounts and less likely to round.
Panel Conditioning in the U.S. Consumer Expenditure Survey (Eckman & Bach, 2021, Journal of Official Statistics). We tested whether respondents learn the questionnaire structure and underreport in later waves. Analyzing over 10,000 four-wave respondents, we found no evidence of declining data quality — panel respondents tended to give higher quality responses over time.
Mode and device effects
Survey mode and device can change the burden respondents experience, potentially amplifying misreporting.
Motivated Misreporting in Smartphone Surveys (Daikeler, Bach, Silber & Eckman, 2020, Social Science Computer Review). Smartphone surveys are more burdensome due to smaller screens and longer load times. We tested whether this increases motivated misreporting. Smartphone respondents did not trigger fewer filter questions, but they did provide lower quality follow-up data, especially in the grouped question format.
Respondent characteristics
Not all respondents are equally prone to misreporting. These papers examine who misreports and why.
Interviewer Involvement in Sample Selection Shapes the Relationship between Response Rates and Data Quality (Eckman & Koch, 2019, Public Opinion Quarterly). Using seven rounds of the European Social Survey, we showed that when interviewers help select who gets surveyed, high response rates can actually signal worse data quality — because interviewers have incentives to select cooperative (but potentially unrepresentative) respondents.
Misreporting Among Reluctant Respondents (Bach, Eckman & Daikeler, 2019, Journal of Survey Statistics and Methodology). We tested whether respondents who were hardest to recruit also provide the lowest quality data. Using data from four surveys across several countries and modes, we found only limited evidence that reluctant respondents misreport more — a reassuring finding for surveys that invest in refusal conversion.