Review article: Biomedical intelligence

Vol. 143 No. 49–50 (2013)

The new licencing examination for human medicine: from concept to implementation

  • Sissel Guttormsen
  • Christine Beyeler
  • Raphael Bonvin
  • Sabine Feller
  • Christian Schirlo
  • Kai Schnabel
  • Tina Schurter
  • Christoph Berendonk
DOI: https://doi.org/10.4414/smw.2013.13897
Cite this as: Swiss Med Wkly. 2013;143:w13897
Published: 01.12.2013

Summary

A new Swiss federal licencing examination for human medicine (FLE) was developed and released in 2011. This paper describes the process from concept design to the first results obtained on implementation of the new examination. The development process was based on the Federal Act on University Medical Professions and involved all national stakeholders. During this process, questions relating to assessment aims, formats and dimensions, examination content and necessary trade-offs were clarified. The aims were to create a feasible, fair, valid and psychometrically sound examination in accordance with international standards, thereby indicating the expected level of knowledge and skills at the end of undergraduate medical education. The result is a centrally managed and locally administered examination comprising a written multiple-choice element and a practical “clinical skills” test in the objective structured clinical examination (OSCE) format. The first two administrations of the new FLE show that the examination concept could be implemented as intended. The anticipated psychometric indices were achieved, and the results support the validity of the examination. Possible future changes to format and content are discussed.
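The summary does not specify which psychometric indices were targeted. As an illustration only, Cronbach's alpha is a standard internal-consistency (reliability) index for written multiple-choice examinations of this kind; the Python sketch below computes it for an examinee-by-item score matrix. The function name and all data here are invented for illustration and are not taken from the paper.

    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        """Cronbach's alpha for an (examinees x items) score matrix."""
        k = scores.shape[1]                          # number of items
        item_vars = scores.var(axis=0, ddof=1)       # per-item variance
        total_var = scores.sum(axis=1).var(ddof=1)   # variance of total scores
        return k / (k - 1) * (1 - item_vars.sum() / total_var)

    # Hypothetical data (invented): 200 examinees x 120 dichotomous MC items,
    # simulated from a simple logistic ability/difficulty model.
    rng = np.random.default_rng(0)
    ability = rng.normal(size=(200, 1))
    difficulty = rng.normal(size=(1, 120))
    p_correct = 1.0 / (1.0 + np.exp(-(ability - difficulty)))
    demo = (rng.random((200, 120)) < p_correct).astype(float)
    print(f"alpha = {cronbach_alpha(demo):.2f}")

For high-stakes written examinations, reliability coefficients in roughly the 0.8–0.9 range are commonly cited as acceptable; whether the FLE used this particular index is an assumption here, not a claim of the paper.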
