Evaluating generalist education programs

A conceptual framework

Published in the Journal of General Internal Medicine.

Abstract

This paper provides and applies a conceptual framework and a list of guiding principles for the evaluation of generalist education programs. Programs are systematic efforts to achieve specified objectives. Evaluations gather data to improve or appraise programs and span a continuum of purposes and methods: descriptive evaluations characterize the structures, processes, and outcomes of programs, whereas research evaluations definitively assess a program's effectiveness in terms of outcomes. Intermediate outcomes are changes in the knowledge, attitudes, and skills of program participants; conclusive outcomes reflect the quality of graduates' performance in actual clinical situations. Outcomes are also affected by inputs, the qualities of the students entering the program. Guiding principles of program evaluation ensure that the data gathered are useful. The authors illustrate these principles with an actual pilot study that determined that expert pediatricians, general internists, and family practitioners could agree on key generalist competencies, and that explored evaluation designs based on those competencies. Finally, they consider the implications of undertaking generalist education evaluation.


Additional information

With the Generalist Program Evaluation Working Group: William Bithoney, MD; Linda Blank; Evan Charney, MD; Jack Ende, MD; Dona L. Harris, PhD; Paul McCarthy, MD; Steven P. Shelov, MD; David Swee, MD.

Working group members are from the Department of Medicine, Children’s Hospital, Boston, Massachusetts (WB); the American Board of Internal Medicine, Philadelphia, Pennsylvania (LB); the Department of Pediatrics, University of Massachusetts Medical School, Worcester, Massachusetts (EC); the Division of General Internal Medicine, University of Pennsylvania School of Medicine, Philadelphia, Pennsylvania (JE); Michigan State University Kalamazoo Center for Medical Studies, Family Medicine, Kalamazoo, Michigan (DLH); the Department of Pediatrics, Yale University School of Medicine, New Haven, Connecticut (PMcC); the Department of Pediatrics, Albert Einstein College of Medicine, Bronx, New York (SPS); and the Department of Family Medicine, UMDNJ-Robert Wood Johnson Medical School, New Brunswick, New Jersey (DS).


About this article

Cite this article

Rubenstein, L.V., Fink, A., Gelberg, L. et al. Evaluating generalist education programs. J Gen Intern Med 9 (Suppl 1), S64–S72 (1994). https://doi.org/10.1007/BF02598120

