
Self-Reported Learning Gains: A Theory and Test of College Student Survey Response


Abstract

Recent studies have asserted that self-reported learning gains (SRLG) are valid measures of learning because gains in specific content areas vary across academic disciplines as theoretically predicted. In contrast, other studies find no relationship between actual and self-reported gains in learning, calling the validity of SRLG into question. I reconcile these two divergent sets of literature by proposing a theory of college student survey response that relies on the belief-sampling model of attitude formation. This theoretical approach demonstrates how students can easily construct answers to SRLG questions that result in theoretically consistent differences in gains across academic majors, while at the same time lacking the cognitive ability to accurately report their actual learning gains. Four predictions from the theory are tested using data from the 2006–2009 Wabash National Study. Contrary to previous research, I find little evidence for the construct and criterion validity of SRLG questions.

Notes

  1. All of these items, except for “contributing to the welfare of your community” and “understanding yourself”, appear on the revised version of the NSSE to be used in 2013. The quantitative item has been revised to “analyzing numerical and statistical information.” Two other SRLG items are also included that do not fit within the Holland framework (Pike 2011; Pike et al. 2011b), and are not discussed here (solving complex problems and developing a personal code of ethics).

  2. In Chapter 6 of their book, Tourangeau et al. (2000) review the evidence in favor of this model of survey response.

  3. Several majors not listed in the Dictionary were coded as follows: Pre-Medicine Studies (51.1102) as Investigative, per Smart et al. (2000); Gay/Lesbian Studies (05.0208), German Studies (05.0125), Italian Studies (05.0126) and Japanese Studies (05.0127) as Social, similar to other area/ethnic studies; Polish Language and Literature (16.0407) and English Language and Literature/Letters, Other (23.9999) as Artistic, similar to other language and literature majors; Early Childhood Education and Teaching (13.1210) as Social, similar to elementary education; and Cell/Cellular and Molecular Biology (26.0406) as Investigative, similar to biology and biochemistry. A code sketch of this supplemental mapping appears after these notes.

  4. It is not clear why previous researchers do not include these environments in their analyses. Pike (2011) states that “too few seniors were majoring in Realistic and Conventional disciplines to permit stable estimates of learning outcomes” (p. 49), but he no longer recalls how many such majors were available in the NSSE (Pike 2012). At least one Conventional major (Accounting) and two Realistic majors (Materials Engineering and Mechanical Engineering) are collected and coded by the NSSE. Given the large number of students in the NSSE response pool (approximately 416,000 in 2011), there should be enough students in these majors to include them in a statistical analysis.
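
The following is a minimal sketch of how the supplemental coding described in Note 3 could be applied in practice, assuming majors arrive as CIP codes and that a lookup table has already been built from the Gottfredson and Holland (1996) Dictionary. The function and variable names are illustrative assumptions, not taken from the study's actual data preparation.

```python
# Hypothetical illustration of the supplemental coding in Note 3.
# CIP codes absent from the Gottfredson-Holland (1996) Dictionary are
# assigned a Holland environment by analogy with similar majors.
SUPPLEMENTAL_HOLLAND_CODES = {
    "51.1102": "Investigative",  # Pre-Medicine Studies, per Smart et al. (2000)
    "05.0208": "Social",         # Gay/Lesbian Studies, like other area/ethnic studies
    "05.0125": "Social",         # German Studies
    "05.0126": "Social",         # Italian Studies
    "05.0127": "Social",         # Japanese Studies
    "16.0407": "Artistic",       # Polish Language and Literature
    "23.9999": "Artistic",       # English Language and Literature/Letters, Other
    "13.1210": "Social",         # Early Childhood Education, like elementary education
    "26.0406": "Investigative",  # Cell/Cellular and Molecular Biology
}

def holland_type(cip_code, dictionary_codes):
    """Return the Holland environment for a CIP code, or None if unmapped.

    `dictionary_codes` is assumed to be a dict built from the
    Gottfredson-Holland Dictionary; Note 3's supplemental codes fill gaps.
    """
    if cip_code in dictionary_codes:
        return dictionary_codes[cip_code]
    return SUPPLEMENTAL_HOLLAND_CODES.get(cip_code)
```

Keeping the supplemental assignments in a separate table makes the analyst's judgment calls explicit and easy to audit against Note 3, rather than burying them in the main Dictionary lookup.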

References

  • American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.

  • Astin, A. W., & Lee, J. J. (2003). How risky are one-shot cross-sectional assessments of undergraduate students? Research in Higher Education, 44(6), 657–672.

  • Biglan, A. (1973a). The characteristics of subject matter in different academic areas. Journal of Applied Psychology, 57(3), 195–203.

  • Biglan, A. (1973b). Relationships between subject matter characteristics and the structure and output of university departments. Journal of Applied Psychology, 57(3), 204–213.

  • Bloom, H. S., Hill, C. J., Black, A. R., & Lipsey, M. (2008). Performance trajectories and performance gaps as achievement effect-size benchmarks for educational interventions. Journal of Research on Educational Effectiveness, 1, 289–328.

  • Bowman, N. A. (2010a). Assessing learning and development among diverse college students. New Directions for Institutional Research, 145, 53–71.

  • Bowman, N. A. (2010b). Can 1st-year students accurately report their learning and development? American Educational Research Journal, 47, 466–496.

  • Bowman, N. A. (2011a). Examining systematic errors in predictors of college student self-reported gains. New Directions for Institutional Research, 150, 7–19.

  • Bowman, N. A. (2011b). Validity of college self-reported gains at diverse institutions. Educational Researcher, 40, 22–24.

  • Bradburn, N., Rips, L. J., & Shevell, S. (1987). Answering autobiographical questions: The impact of memory and inference on surveys. Science, 236, 157–161.

  • Campbell, C. M., & Cabrera, A. F. (2011). How sound is NSSE? Investigating the psychometric properties of NSSE at a public, research-extensive institution. Review of Higher Education, 35, 77–103.

  • Ewell, P., McClenney, K., & McCormick, A. C. (2011). Measuring engagement. Inside Higher Ed.

  • Gottfredson, G. D., & Holland, J. L. (1996). Dictionary of Holland occupational codes. Lutz, FL: Psychological Assessment Resources Inc.

  • Groves, R. M. (2006). Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly, 70, 646–675.

  • Groves, R. M., Cialdini, R. B., & Couper, M. P. (1992). Understanding the decision to participate in a survey. Public Opinion Quarterly, 56(4), 475–495.

  • Groves, R. M., & Peytcheva, E. (2008). The impact of nonresponse rates on nonresponse bias: A meta-analysis. Public Opinion Quarterly, 72, 167–189.

  • Higher Education Data Sharing Consortium. (2012a). HEDS senior survey. Downloaded June 10, 2012, from http://www.hedsconsortium.org/storage/HEDS_Senior_Survey_Sample_02-23-2012.pdf.

  • Higher Education Research Institute. (2012b). College senior survey. Downloaded June 10, 2012, from http://www.heri.ucla.edu/researchers/instruments/FUS_CSS/2012CSS.PDF.

  • Holland, J. L. (1973). Making vocational choices: A theory of vocational personalities and work environments. Upper Saddle River, NJ: Prentice-Hall.

  • Huang, Y., & Healy, C. (1997). The relations of Holland-typed majors to students’ freshman and senior work values. Research in Higher Education, 38, 455–477.

  • Keeter, S., Kennedy, C., Dimock, M., Best, J., & Craighill, P. (2006). Gauging the impact of growing nonresponse on estimates from a national RDD telephone survey. Public Opinion Quarterly, 70(5), 759–779.

  • Kuh, G. D. (2001). The national survey of student engagement: Conceptual framework and overview of psychometric properties. Bloomington, IN: Indiana University Center for Postsecondary Research.

  • Kuh, G. D., & Vesper, N. (2001). Do computers enhance or detract from student learning? Research in Higher Education, 42(1), 87–102.

  • Laird, T. F. N., Shoup, R., Kuh, G. D., & Schwarz, M. J. (2008). The effects of discipline on deep approaches to student learning and college outcomes. Research in Higher Education, 49(6), 469–494.

  • Lambert, A. D., Terenzini, P. T., & Lattuca, L. R. (2007). More than meets the eye: Curricular and programmatic effects on student learning. Research in Higher Education, 48(2), 141–168.

  • McCormick, A., Pike, G., Kuh, G., & Chen, P.-S. (2009). Comparing the utility of the 2000 and 2005 Carnegie classification systems in research on students’ college experiences and outcomes. Research in Higher Education, 50, 144–167. doi:10.1007/s11162-008-9112-9.

  • McCormick, A. C., & McClenney, K. (2012). Will these trees ever bear fruit? A response to the special issue on student engagement. Review of Higher Education, 35(2), 307–333.

  • National Survey of Student Engagement. (2012). Sample institutional report. Downloaded June 9, 2012, from http://nsse.iub.edu/_/?cid=402.

  • Pace, R. C. (1985). The credibility of student self-reports. Technical report, Center for the Study of Evaluation, University of California, Los Angeles.

  • Pascarella, E. T., Blaich, C., Martin, G. L., & Hanson, J. M. (2011). How robust are the findings of Academically Adrift? Change, 43(3), 20–24.

  • Pascarella, E. T., & Padgett, R. (2011). Using institution-level NSSE benchmarks to assess engagement in good practices: A cautionary note. Manuscript, University of Iowa.

  • Pascarella, E. T., & Wolniak, G. C. (2004). Change or not to change—Is there a question? A response to Pike. Journal of College Student Development, 45(3), 353–355.

  • Pike, G. R. (2000). The influence of fraternity or sorority membership on students’ college experiences and cognitive development. Research in Higher Education, 41(1), 117–139.

  • Pike, G. R. (2006). Students’ personality types, intended majors, and college expectations: Further evidence concerning psychological and sociological interpretations of Holland’s theory. Research in Higher Education, 47(7), 801–822.

  • Pike, G. R. (2011). Using college students’ self-reported learning outcomes in scholarly research. New Directions for Institutional Research, 150, 41–58.

  • Pike, G. R. (2012, June 28). Personal communication.

  • Pike, G. R., & Killian, T. S. (2001). Reported gains in student learning: Do academic disciplines make a difference? Research in Higher Education, 42(4), 429–454.

  • Pike, G. R., Kuh, G., McCormick, A., Ethington, C., & Smart, J. (2011a). If and when money matters: The relationships among educational expenditures, student engagement and students’ learning outcomes. Research in Higher Education, 51(1), 81–106.

  • Pike, G. R., Smart, J. C., & Ethington, C. A. (2011b). Differences in learning outcomes across academic environments: Further evidence concerning the construct validity of students’ self-reports. Paper presented at the annual conference of the Association for the Study of Higher Education, Raleigh, NC.

  • Pike, G. R., Smart, J. C., & Ethington, C. A. (2012). The mediating effects of student engagement on the relationships between academic disciplines and learning outcomes: An extension of Holland’s theory. Research in Higher Education, 53(5), 550–575.

  • Porter, S., & Umbach, P. D. (2006). Student survey response rates across institutions: Why do they vary? Research in Higher Education, 47, 229–247.

  • Porter, S. R. (2011a). Do college student surveys have any validity? Review of Higher Education, 35, 45–76.

  • Porter, S. R. (2011b). Student learning as a measure of quality in higher education. Context for Success Project. Seattle: Gates Foundation.

  • Rothstein, J. (2009). Student sorting and bias in value-added estimation: Selection on observables and unobservables. Education Finance and Policy, 4(4), 537–571.

  • Smart, J. C. (2010). Differential patterns of change and stability in student learning outcomes in Holland’s academic environments: The role of environmental consistency. Research in Higher Education, 51, 468–482.

  • Smart, J. C., Feldman, K. A., & Ethington, C. A. (2000). Academic disciplines: Holland’s theory and the study of college students and faculty. Nashville, TN: Vanderbilt University Press.

  • Tourangeau, R., Rips, L. J., & Rasinski, K. (2000). The psychology of survey response. Cambridge: Cambridge University Press.

  • Zhao, C.-M., & Kuh, G. D. (2004). Adding value: Learning communities and student engagement. Research in Higher Education, 45(2), 115–138.


Acknowledgments

I would like to thank Charlie Blaich and Ernie Pascarella for generously providing me with access to the Wabash study data; Nick Bowman, Jana Hanson and Teniell Trolian for assistance with using the data; Gary Pike for providing information about his research design; and Claire Porter and Paul Umbach for comments on the manuscript.

Author information

Correspondence to Stephen R. Porter.

About this article

Cite this article

Porter, S.R. Self-Reported Learning Gains: A Theory and Test of College Student Survey Response. Res High Educ 54, 201–226 (2013). https://doi.org/10.1007/s11162-012-9277-0
