
Mode comparison study on willingness to buy and willingness to pay for organic foods: paper-and-pencil versus computerized questionnaire

  • Published in: Electronic Commerce Research

Abstract

This contribution examines the survey mode effect using a randomized trial comparing data from a paper-and-pencil and a computerized web questionnaire on the willingness to buy (WTB) and the willingness to pay (WTP) for two food products certified as organic. The questionnaire was completed by 110 university students in each mode under thermally comfortable laboratory conditions. The design enables the study of measurement variance attributable specifically to the mode of questionnaire completion (i.e. mode effect) and to the presentation of the products (i.e. stimulus effect). While the two questionnaires were kept as similar as possible, the paper-and-pencil version involved the physical presentation of a 750 ml Tetra Pak package of organic orange juice and a 500 g paper package of organic spaghetti, whereas the computerized version presented them by video projection. Regarding differences in substantive results, the prospective consumers (the subjects) appeared more willing to buy the organic orange juice when it was presented live than when it was shown on video embedded in the computerized questionnaire, while only women were willing to pay more for the orange juice in the paper-and-pencil mode. No difference was found for the organic spaghetti. Regarding response quality, and in contrast to previous studies, respondents wrote fewer words in answer to the open-ended question in the computerized version than in the paper-and-pencil version. Overall, the study shows that using video clips as a replacement for physical product presentation when measuring WTB and WTP, two important concepts in consumer preference research, needs further testing and evaluation, as respondents may not react to the two forms of presentation in the same way.



Acknowledgements

The authors acknowledge insights gained from participation in the WEBDATANET network (COST Action IS1004, http://webdatanet.cbs.dk/). The authors also thank Dr. Nejc Berzelak, University of Ljubljana, Faculty of Social Sciences, and the three anonymous reviewers for their comments and suggestions.

Funding

This work was funded by the Engineers and Public Constructors Pension Fund, which provides periodic funding for independent research by faculty members of engineering schools in Greece (#80962, Special Account of Research Funds of Democritus University of Thrace).

Author information

Corresponding author

Correspondence to Konstantinos P. Tsagarakis.

Ethics declarations

Conflict of interest

The authors declare no conflict of interest.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (DOCX 560 kb)


About this article


Cite this article

Keramitsoglou, K.M., Lozar Manfreda, K., Anastasiou, C. et al. Mode comparison study on willingness to buy and willingness to pay for organic foods: paper-and-pencil versus computerized questionnaire. Electron Commer Res 18, 587–603 (2018). https://doi.org/10.1007/s10660-017-9274-7

