Performance Assessment: Something Old, Something New

Chapter in: Handbook of Research on Educational Communications and Technology

Abstract

Education, industry, and the military rely on relevant and effective performance assessment to make key decisions: qualification, promotion, advancement, hiring, firing, and training decisions all depend on sound performance assessment. This chapter explores issues, tools, and techniques for ensuring that performance assessment meets the goals of decision makers. Important considerations include the task environment in which assessment takes place, the validity of the measures selected, the diversity and scope of those measures, and often the local political environment in which assessments are made. In education especially, assessment policy is a matter of intense political debate, and that debate is frequently sustained by misinformation. For example, traditional paper-and-pencil assessment techniques have come under fire from critics who feel they no longer have the relevance they once did, while simulation-based training technologies in industry and the military have placed greater emphasis on performance assessment systems with demonstrable real-world work relevance. The chapter examines these topics.

Acknowledgments

Portions of this work were supported by the US Navy. The views and opinions expressed herein are those of the authors, and are not to be construed as official or as representing the Department of Defense, or the Department of the Navy. In addition, we thank two anonymous reviewers for their helpful comments that improved this manuscript.

Author information

Correspondence to Dee H. Andrews.

Copyright information

© 2014 Springer Science+Business Media New York

Cite this chapter

Andrews, D.H., Wulfeck, W.H. (2014). Performance Assessment: Something Old, Something New. In: Spector, J., Merrill, M., Elen, J., Bishop, M. (eds) Handbook of Research on Educational Communications and Technology. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-3185-5_24
