
Please do not answer if you are reading this: respondent attention in online panels


Abstract

This paper reports on the relevance of attention checks for online panels such as MTurk, SurveyMonkey, SmartSurvey, and Qualtrics. In two SmartSurvey studies, approximately one third of respondents failed a check that instructed them to skip the question. Attention-enhancing tools reduced this rate to approximately one fifth, whereas replacing multiple-item scales with single-item measures left the failure rate unaffected. Failing the attention check relates to other indicators of inattention, and reduced attention often persists across the length of the survey. Underscoring the relevance of such checks, our empirical findings show that respondent inattentiveness systematically biases survey responses.
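The core manipulation described above is an instructed-response item: a question whose text tells attentive respondents to leave it blank, so any recorded answer flags inattention. As a minimal illustration only (not the authors' code; the data frame and column names below are hypothetical), this Python sketch shows how such failures could be flagged and a failure rate computed from panel data:

```python
# Minimal sketch of scoring an instructed-response attention check.
# Not the authors' code; "trap_item" and the toy data are hypothetical.
import pandas as pd

def flag_inattentive(df: pd.DataFrame, trap_col: str = "trap_item") -> pd.DataFrame:
    """Mark respondents who answered an item that told them to skip it.

    A respondent passes only if the trap item was left blank (NaN);
    any recorded answer counts as a failed attention check.
    """
    out = df.copy()
    out["failed_check"] = out[trap_col].notna()
    return out

# Toy data: respondents 1 and 3 answered the trap item despite the instruction.
survey = pd.DataFrame({
    "respondent_id": [1, 2, 3],
    "trap_item": [4.0, None, 2.0],  # Likert answers; None = correctly skipped
})

flagged = flag_inattentive(survey)
print(flagged["failed_check"].mean())  # share failing the check, here 2/3
```

Respondents flagged this way could then be compared on other attention indicators (e.g., completion speed or straightlining), mirroring the paper's question of whether check failure reflects broader inattentiveness.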



Acknowledgements

The authors are grateful for feedback provided by Harald van Heerde, the marketing group at Radboud University, and the School of Communication, Journalism and Marketing at Massey University. We also thank Siara Khan for assessing the use of attention checks in the marketing literature.

Author information


Corresponding author

Correspondence to Leonard J. Paas.


About this article


Cite this article

Paas, L.J., Morren, M. Please do not answer if you are reading this: respondent attention in online panels. Mark Lett 29, 13–21 (2018). https://doi.org/10.1007/s11002-018-9448-7


