Abstract
Science, Technology, Engineering, and Mathematics (STEM) Project-Based Learning (PBL) integrates assessment methods across different aspects of the learning experience. While STEM PBL shifts the focus from summative to formative assessment, greater attention is given to the interpersonal domain. Because STEM PBL is centered on developing real-world projects in which students apply their understanding of various concepts, authentic assessment underlies both formative and summative assessment tasks through technologies such as classroom response systems and through rubrics. Authentic assessment in STEM PBL helps students transition from authority-imposed regulation to self-regulation of their learning. Assessment in STEM PBL is therefore inextricably interwoven with pedagogy through integrated assessment methods that develop the whole person, stimulate creativity, and foster individualized group responsibility.
© 2013 Sense Publishers
About this chapter
Cite this chapter
Capraro, R.M., Corlu, M.S. (2013). Changing views on Assessment for STEM Project-Based Learning. In: Capraro, R.M., Capraro, M.M., Morgan, J.R. (eds) STEM Project-Based Learning. SensePublishers, Rotterdam. https://doi.org/10.1007/978-94-6209-143-6_12
Online ISBN: 978-94-6209-143-6
eBook Packages: Humanities, Social Sciences and Law; Education (R0)