Abstract
Although online surveys are becoming increasingly prominent, the quality of the resulting data is still contested. One potential drawback of web surveys is the absence of an interviewer who controls the interview situation, can motivate respondents, and can keep them from satisficing, i.e. answering questions with minimal cognitive effort. While there is evidence for differences between data gathered in interviewer-administered surveys and data from self-administered questionnaires, it has not yet been studied whether the sheer presence of an interviewer affects data quality. The present article addresses this research gap. Based on a recent panel study of young unemployed adults, we compare the results from a completely self-administered web survey with those from interviews that were likewise self-administered but conducted in the presence of an interviewer. In particular, we look for differences in drop-out, speed, item non-response, and item non-differentiation. While we do find significant differences in drop-out rates, we find no evidence that the absence of an interviewer leads to less diligence in filling in the questionnaire. We thus conclude that the presence of an interviewer does not enhance data quality in self-administered questionnaires, but it does positively affect completion rates.
Acknowledgments
The study is based on data from a panel survey funded by the Austrian Federal Ministry of Labour, Social Affairs and Consumer Protection as part of the JuSAW project. Work on the paper was carried out as part of the CUPESSE project, which received funding from the European Union’s Seventh Framework Programme for research, technological development and demonstration under grant agreement no. 613257.
Appendix
| | Means treated (raw) | Means control (raw) | Mean difference (raw) | Means treated (matched) | Means control (matched) | Mean difference (matched) | Balance improvement (%) |
|---|---|---|---|---|---|---|---|
| Distance | 0.21 | 0.15 | 0.06 | 0.21 | 0.21 | 0.00 | 96.76 |
| Female | 0.50 | 0.45 | 0.04 | 0.50 | 0.48 | 0.02 | 55.10 |
| Age | | | | | | | |
| 18–20 | 0.31 | 0.34 | −0.03 | 0.31 | 0.32 | −0.01 | 67.86 |
| 21–24 | 0.40 | 0.35 | 0.05 | 0.40 | 0.39 | 0.01 | 78.93 |
| 25–28 | 0.30 | 0.31 | −0.02 | 0.30 | 0.30 | 0.00 | 100.00 |
| Education | | | | | | | |
| ISCED 0–2 | 0.34 | 0.33 | 0.00 | 0.34 | 0.36 | −0.02 | −641.26ᵃ |
| ISCED 3B | 0.25 | 0.28 | −0.03 | 0.25 | 0.24 | 0.01 | 68.79 |
| ISCED 3A–4 | 0.23 | 0.22 | 0.00 | 0.23 | 0.22 | 0.01 | −209.94ᵃ |
| ISCED 5–6 | 0.19 | 0.16 | 0.03 | 0.19 | 0.19 | 0.00 | 100.00 |
| Migration background | | | | | | | |
| None | 0.52 | 0.54 | −0.01 | 0.52 | 0.53 | −0.01 | 23.74 |
| 2nd generation | 0.18 | 0.22 | −0.04 | 0.18 | 0.18 | 0.00 | 100.00 |
| 1st generation | 0.30 | 0.25 | 0.05 | 0.30 | 0.29 | 0.01 | 80.87 |
| Diligence (self-reported) | 3.95 | 4.02 | −0.07 | 3.95 | 4.00 | −0.05 | 29.54 |
| Interview month t0 | 7.19 | 6.97 | 0.22 | 7.19 | 7.16 | 0.03 | 86.39 |
| Participation motivation | 0.86 | 0.96 | −0.10 | 0.86 | 0.86 | 0.00 | 100.00 |
| Had a job at t1 | 0.53 | 0.41 | 0.12 | 0.53 | 0.54 | −0.01 | 91.85 |
| Not living in Vienna at t1 | 0.04 | 0.01 | 0.03 | 0.04 | 0.04 | 0.00 | 100.00 |
| Interview duration t0 | 37.83 | 37.11 | 0.72 | 37.83 | 36.83 | 1.00 | −38.66 |
| Ratio don’t know t0 | 0.06 | 0.07 | −0.01 | 0.06 | 0.05 | 0.00 | 50.28 |
| Inconsistency t0 | 0.64 | 0.83 | −0.19 | 0.64 | 0.63 | 0.01 | 94.69 |
| Straightlining t0 | 0.56 | 0.59 | −0.02 | 0.56 | 0.56 | 0.00 | 83.30 |
Balance statistics calculated as described in Ho et al. (2011). ᵃThe large percentage decrease in balance for these two Education categories arises only because the difference between the means of the treatment and the control group in the raw data is extremely close to zero. Hence, even a minor increase (in this case from 0.00 to 0.02) translates into a large negative percentage in the last column.
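The percentage balance improvement in the last column is the relative reduction in the absolute mean difference after matching, as computed by Ho et al. (2011). A minimal sketch of that calculation (the helper function is ours, not from the paper):

```python
def balance_improvement(raw_diff: float, matched_diff: float) -> float:
    """Percent reduction in the absolute treated-control mean difference
    after matching: 100 * (|raw| - |matched|) / |raw|.

    Negative values indicate that matching worsened balance on that variable.
    """
    return 100 * (abs(raw_diff) - abs(matched_diff)) / abs(raw_diff)

# Rounded table values for 'Inconsistency t0': raw diff -0.19, matched 0.01.
print(round(balance_improvement(-0.19, 0.01), 1))  # → 94.7
```

Applied to the rounded table values this gives 94.7, close to the 94.69 reported, which was computed from unrounded inputs. The formula also explains the footnoted outliers: when the raw difference is nearly zero, the denominator is tiny and any post-matching difference produces an extreme percentage.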
| | Drop-out | Interview duration | Ratio ‘don’t know’ | Non-response open question | Inconsistency | Straightlining |
|---|---|---|---|---|---|---|
| No interviewer | 18.93 (1.05)*** | 0.89 (0.82) | 0.00 (0.02) | −0.35 (0.44) | −0.36 (0.12)** | 0.03 (0.02) |
| Age | | | | | | |
| 18–20 | ref. | ref. | ref. | ref. | ref. | ref. |
| 21–24 | −0.81 (0.73) | 1.01 (1.22) | −0.03 (0.03) | 0.53 (0.57) | −0.13 (0.18) | 0.03 (0.03) |
| 25–28 | −0.82 (1.20) | 0.30 (1.33) | −0.02 (0.03) | 0.45 (0.73) | −0.18 (0.22) | −0.02 (0.03) |
| Female | 0.31 (0.62) | 1.69 (0.83)* | 0.01 (0.02) | 0.24 (0.45) | −0.06 (0.13) | −0.01 (0.03) |
| Education | | | | | | |
| ISCED 0–2 | ref. | ref. | ref. | ref. | ref. | ref. |
| ISCED 3B | 0.26 (0.77) | 2.03 (1.06) | −0.04 (0.03) | 0.34 (0.65) | −0.34 (0.19) | 0.03 (0.04) |
| ISCED 3A–4 | −2.24 (0.92)* | 1.30 (1.18) | −0.03 (0.03) | 0.83 (0.62) | −0.04 (0.20) | 0.04 (0.04) |
| ISCED 5–6 | −1.04 (1.17) | 1.36 (1.41) | −0.01 (0.03) | 0.53 (0.73) | −0.44 (0.23) | 0.01 (0.04) |
| Migration background | | | | | | |
| None | ref. | ref. | ref. | ref. | ref. | ref. |
| 2nd generation | 2.46 (0.88)** | 1.13 (1.28) | 0.01 (0.03) | −0.06 (0.70) | −0.43 (0.20)* | 0.09 (0.04)* |
| 1st generation | 1.89 (0.92)* | 1.45 (1.00) | 0.03 (0.03) | 0.24 (0.63) | −0.12 (0.14) | 0.06 (0.03) |
| Diligence (self-reported) | −0.46 (0.46) | −0.16 (0.72) | −0.01 (0.02) | −0.67 (0.34)* | −0.03 (0.10) | 0.01 (0.02) |
| Interview month t0 | −0.27 (0.27) | −0.67 (0.45) | 0.00 (0.01) | 0.27 (0.25) | −0.10 (0.05) | −0.01 (0.01) |
| Participation motivation | 2.12 (1.24) | −0.09 (1.36) | −0.05 (0.04) | 0.49 (0.75) | −0.24 (0.18) | 0.03 (0.04) |
| Had a job at t1 | 0.39 (0.75) | 1.55 (0.93) | −0.06 (0.02)** | −0.49 (0.44) | 0.00 (0.12) | −0.01 (0.03) |
| Not living in Vienna at t1 | 3.12 (1.25)* | −1.64 (2.61) | −0.04 (0.04) | 0.90 (1.20) | 0.70 (0.59) | −0.02 (0.07) |
| Interview duration t0 | 0.01 (0.03) | 0.58 (0.05)*** | −0.00 (0.00) | −0.04 (0.03) | 0.01 (0.01) | −0.00 (0.00) |
| Ratio don’t know t0 | −9.54 (3.92)* | 2.75 (4.69) | 0.34 (0.13)* | 2.62 (2.36) | 1.10 (0.89) | −0.08 (0.13) |
| Inconsistency t0 | 0.54 (0.34) | 0.37 (0.44) | −0.01 (0.01) | −0.07 (0.28) | 0.08 (0.08) | −0.00 (0.01) |
| Straightlining t0 | 3.76 (1.88)* | 2.36 (2.52) | 0.12 (0.08) | 2.19 (1.77) | 0.24 (0.42) | 0.45 (0.08)*** |
| Constant | −22.20 (3.97)*** | 11.14 (4.28)* | 0.15 (0.13) | −1.72 (2.37) | 1.84 (0.66)** | 0.24 (0.12)* |
| N | 192 | 172 | 180 | 172 | 167 | 177 |
| adj. R² | – | 0.521 | 0.099 | – | 0.102 | 0.192 |
| AIC | 104 | 1084 | −172 | 178 | 419 | −133 |
Standard errors in parentheses; *p<0.05, **p<0.01, ***p<0.001. Regressions were run on the matched dataset using the weights created by the genetic matching procedure and include not only the treatment variable (No interviewer) but also all variables on which the samples were matched, following Ho et al. (2007). For the binary dependent variables (drop-out and non-response to the open question) we fitted logistic regression models; for the other dependent variables, OLS regression models.
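The estimation strategy described in the note, carrying the matching weights into the outcome regressions, can be sketched as follows. The authors worked in R (MatchIt/Matching, per Ho et al. 2011 and Sekhon 2011); this is only an illustrative NumPy version of the weighted least squares step for a continuous outcome, with entirely invented data and variable names:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200

# Invented stand-ins for the matched dataset: treatment dummy, one
# covariate, matching weights, and a continuous outcome with a true
# treatment effect of 1.
no_interviewer = rng.integers(0, 2, n).astype(float)
age = rng.uniform(18, 28, n)
w = rng.uniform(0.5, 1.5, n)                     # weights from matching
duration = 30 + 1.0 * no_interviewer + rng.normal(0, 5, n)

# Design matrix: intercept, treatment, covariate used in matching.
X = np.column_stack([np.ones(n), no_interviewer, age])
W = np.diag(w)

# Weighted least squares: beta = (X' W X)^{-1} X' W y
beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ duration)
print(beta[1])  # coefficient on the treatment dummy
```

For the binary outcomes the same weights would enter a weighted logistic regression instead; the key point is simply that the matching weights and all matching variables appear again in the outcome model, as Ho et al. (2007) recommend.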
References
Atkeson, L. R., A. N. Adams and R. M. Alvarez (2014) “Nonresponse and Mode Effects in Self- and Interviewer-Administered Surveys,” Political Analysis, 22(3):304–320. doi:10.1093/pan/mpt049
Barge, S. and H. Gehlbach (2012) “Using the Theory of Satisficing to Evaluate the Quality of Survey Data,” Research in Higher Education, 53(2):182–200. doi:10.1007/s11162-011-9251-2
Callegaro, M., K. L. Manfreda and V. Vehovar (2015) Web Survey Methodology. London: SAGE. doi:10.4135/9781529799651
Cannell, C. F., P. V. Miller and L. Oksenberg (1981) “Research on Interviewing Techniques,” Sociological Methodology, 12:389–437. doi:10.2307/270748
Chang, L. and J. A. Krosnick (2009) “National Surveys Via RDD Telephone Interviewing Versus the Internet: Comparing Sample Representativeness and Response Quality,” Public Opinion Quarterly, 73(4):641–678. doi:10.1093/poq/nfp075
Chang, L. and J. A. Krosnick (2010) “Comparing Oral Interviewing with Self-Administered Computerized Questionnaires: An Experiment,” Public Opinion Quarterly, 74(1):154–167. doi:10.1093/poq/nfp090
Couper, M. P. (2011) “The Future of Modes of Data Collection,” Public Opinion Quarterly, 75(5):889–908. doi:10.1093/poq/nfr046
Fang, J., C. Wen and V. Prybutok (2013) “The Equivalence of Internet Versus Paper-Based Surveys in IT/IS Adoption Research in Collectivistic Cultures: The Impact of Satisficing,” Behaviour & Information Technology, 32(5):480–490. doi:10.1080/0144929X.2012.751621
Fang, J., C. Wen and V. Prybutok (2014) “An Assessment of Equivalence Between Paper and Social Media Surveys: The Role of Social Desirability and Satisficing,” Computers in Human Behavior, 30:335–343. doi:10.1016/j.chb.2013.09.019
Fricker, S., M. Galesic, R. Tourangeau and T. Yan (2005) “An Experimental Comparison of Web and Telephone Surveys,” Public Opinion Quarterly, 69(3):370–392. doi:10.1093/poq/nfi027
Ganassali, S. (2008) “The Influence of the Design of Web Survey Questionnaires on the Quality of Responses,” Survey Research Methods, 2(1). Retrieved from https://ojs.ub.uni-konstanz.de/srm/article/view/598.
Gittelmann, S. H. and E. Trimarchi (2012) “The War Against Unengaged Online Respondents,” Quirk’s Marketing Research Review, December 2012. Retrieved from https://www.quirks.com/articles/the-war-against-unengaged-online-respondents.
Groves, R. M., F. J. Fowler Jr., M. P. Couper, J. M. Lepkowski, E. Singer and R. Tourangeau (2009) Survey Methodology. New York, NY: John Wiley & Sons.
Heerwegh, D. and G. Loosveldt (2008) “Face-to-Face versus Web Surveying in a High-Internet-Coverage Population: Differences in Response Quality,” Public Opinion Quarterly, 72(5):836–846. doi:10.1093/poq/nfn045
Ho, D., K. Imai, G. King and E. A. Stuart (2007) “Matching as Nonparametric Preprocessing for Reducing Model Dependence in Parametric Causal Inference,” Political Analysis, 15(3):199–236. doi:10.1093/pan/mpl013
Ho, D., K. Imai, G. King and E. A. Stuart (2011) “MatchIt: Nonparametric Preprocessing for Parametric Causal Inference,” Journal of Statistical Software, 42(8):1–28. doi:10.18637/jss.v042.i08
Holbrook, A. L., M. C. Green and J. A. Krosnick (2003) “Telephone Versus Face-to-Face Interviewing of National Probability Samples with Long Questionnaires: Comparisons of Respondent Satisficing and Social Desirability Response Bias,” Public Opinion Quarterly, 67(1):79–125. doi:10.1086/346010
Joinson, A. N., A. Woodley and U.-D. Reips (2007) “Personalization, Authentication and Self-Disclosure in Self-Administered Internet Surveys,” Computers in Human Behavior, 23(1):275–285. doi:10.1016/j.chb.2004.10.012
Keusch, F. and T. Yan (2016) “Web Versus Mobile Web,” Social Science Computer Review, 1–19. doi:10.1177/0894439316675566
Klein, J. D., C. G. Havens and R. K. Thomas (2009) “Comparing Adolescent Response Bias Between Internet and Telephone Surveys,” Journal of Adolescent Health, 44(2):S36. doi:10.1016/j.jadohealth.2008.10.100
Kreuter, F., S. Presser and R. Tourangeau (2008) “Social Desirability Bias in CATI, IVR, and Web Surveys: The Effects of Mode and Question Sensitivity,” Public Opinion Quarterly, 72(5):847–865. doi:10.1093/poq/nfn063
Krosnick, J. A. (1991) “Response Strategies for Coping with the Cognitive Demands of Attitude Measures in Surveys,” Applied Cognitive Psychology, 5(3):213–236. doi:10.1002/acp.2350050305
Lindhjem, H. and S. Navrud (2011) “Are Internet Surveys an Alternative to Face-to-Face Interviews in Contingent Valuation?” Ecological Economics, 70(9):1628–1637. doi:10.1016/j.ecolecon.2011.04.002
Lozar Manfreda, K., M. Bošnjak, J. Berzelak, I. Haas and V. Vehovar (2008) “Web Surveys Versus Other Survey Modes: A Meta-Analysis Comparing Response Rates,” International Journal of Market Research, 50(1):79–104. doi:10.1177/147078530805000107
Malakhoff, L. A. and M. Jans (2013) “Towards Usage of Avatar Interviewers in Web Surveys,” Survey Practice, 4(3). doi:10.29115/SP-2011-0015
Malhotra, N. (2008) “Completion Time and Response Order Effects in Web Surveys,” Public Opinion Quarterly, 72(5):914–934. doi:10.1093/poq/nfn050
Mavletova, A. (2013) “Data Quality in PC and Mobile Web Surveys,” Social Science Computer Review, 31(6):725–743. doi:10.1177/0894439313485201
Postoaca, A. (2006) The Anonymous Elect: Market Research Through Online Access Panels. New York: Springer Science & Business Media.
Sekhon, J. S. (2011) “Multivariate and Propensity Score Matching Software with Automated Balance Optimization: The Matching Package for R,” Journal of Statistical Software, 42(7):1–52. doi:10.18637/jss.v042.i07
Simon, H. A. (1957) Models of Man: Social and Rational. Mathematical Essays on Rational Human Behavior in a Social Setting. Hoboken: Wiley.
Sommer, J., B. Diedenhofen and J. Musch (2016) “Not to Be Considered Harmful. Mobile-Device Users Do Not Spoil Data Quality in Web Surveys,” Social Science Computer Review, 35(3):378–387. doi:10.1177/0894439316633452
Thomas, R. K. and F. M. Barlas (2014) Respondents Playing Fast and Loose? Antecedents and Consequences of Respondent Speed of Completion. Presented at the American Association for Public Opinion Research (AAPOR) 69th Annual Conference, Anaheim, California.
Tourangeau, R. and T. W. Smith (1996) “Asking Sensitive Questions. The Impact of Data Collection Mode, Question Format, and Question Context,” Public Opinion Quarterly, 60(2):275–304. doi:10.1086/297751
Tourangeau, R., M. P. Couper and D. M. Steiger (2003) “Humanizing Self-Administered Surveys: Experiments on Social Presence in Web and IVR Surveys,” Computers in Human Behavior, 19(1):1–24. doi:10.1016/S0747-5632(02)00032-8
Tourangeau, R., L. J. Rips and K. A. Rasinski (2009) The Psychology of Survey Response (10th printing). Cambridge: Cambridge University Press.
Ward, M. K. and S. B. Pond III (2015) “Using Virtual Presence and Survey Instructions to Minimize Careless Responding on Internet-Based Surveys,” Computers in Human Behavior, 48:554–568. doi:10.1016/j.chb.2015.01.070
Zhang, C. and F. Conrad (2014) “Speeding in Web Surveys: The Tendency to Answer Very Fast and Its Association with Straightlining,” Survey Research Methods, 8(2):127–135.
©2017 Walter de Gruyter GmbH, Berlin/Boston