
Is the List Experiment Doing its Job?

Inconclusive Evidence!

Chapter in: Einstellungen und Verhalten in der empirischen Sozialforschung

Abstract

This paper sheds new light on the unobtrusive measure known as the list experiment or unmatched count technique. Proponents of this method claim that it detects social desirability bias in responses to sensitive survey questions, and its logic is quite straightforward. After a critical overview of the theory, logic, and empirical results of this type of measure, we present the results of a series of three studies. While the first study yielded promising results, the replication of the outcome pattern in Study 2 failed completely. The third study, based on longitudinal data, provides evidence that the claimed logic of the list experiment is systematically inconsistent. First, significant mean differences between the baseline and test conditions occur even when the additional item is non-sensitive and has an agreement rate of about two percent in direct questioning. Second, test-retest reliability fluctuates depending on the sensitivity and number of items included in the experiment. Implications for theory and practice in measuring social desirability with unobtrusive measures are discussed.

This work was supported by the Volkswagen Foundation, the Freudenberg Foundation, and the Möllgaard Foundation (financial support of the project ‘Group-Focused Enmity’). The research of Peter Schmidt was supported by the Humboldt Foundation. The authors thank Uli Wagner and the members of the graduate school ‘Group-Focused Enmity’ for their helpful comments on an earlier draft of this article. The authors would also like to thank Lisa Trierweiler for the English proofreading of the manuscript.



Author information

Correspondence to Stefanie Gosen.


Copyright information

© 2019 Springer Fachmedien Wiesbaden GmbH, part of Springer Nature

About this chapter


Cite this chapter

Gosen, S., Schmidt, P., Thörner, S., Leibold, J. (2019). Is the List Experiment Doing its Job? In: Mayerl, J., Krause, T., Wahl, A., Wuketich, M. (eds) Einstellungen und Verhalten in der empirischen Sozialforschung. Springer VS, Wiesbaden. https://doi.org/10.1007/978-3-658-16348-8_8


  • DOI: https://doi.org/10.1007/978-3-658-16348-8_8

  • Published:

  • Publisher Name: Springer VS, Wiesbaden

  • Print ISBN: 978-3-658-16347-1

  • Online ISBN: 978-3-658-16348-8

  • eBook Packages: Social Science and Law (German Language)
