An Essential Guide to Developing, Implementing, and Evaluating Objective Structured Clinical Examination (OSCE)
eBook - ePub


  1. 224 pages
  2. English
  3. ePUB (mobile friendly)
  4. Available on iOS & Android

About This Book

The aim of this book is to provide a comprehensive and practical guide to developing and implementing an Objective Structured Clinical Examination (OSCE) for medical and health sciences educators, tutors, faculty, clinicians, and OSCE planners who are involved in the clinical teaching and assessment of students, trainees, and residents. The book starts with the essential theoretical foundation before progressing to the practical implementation steps. It balances medical education research with practical tips to give readers an easy-to-digest, yet comprehensive, guide to implementing the OSCE as an appropriate assessment tool.

Contents:

  • The Birth and Propagation of the OSCE
  • The OSCE in the Context of a Holistic Assessment
  • Value of the OSCE as an Assessment Tool
  • Selecting the Skills to be Tested in an OSCE through Blueprinting
  • Utilizing Different Formats of OSCE for Greater Efficiency
  • Writing OSCE Stations
  • Creating a Scoring Template for Assigning Marks
  • Preparing Patients for the OSCE
  • Preparing Simulators for the OSCE
  • Preparing the Groundwork for Conducting an OSCE
  • Determining Passes and Fails in an OSCE
  • Post-Assessment Quality Assurance
  • Feedback, Moderation, and Banking
  • Helping Poorly Performing Students in an OSCE
  • OSCE as a Tool for the Selection of Applicants
  • Frequently Asked Questions about the OSCE


Readership: Medical Educators, Health Science Educators, Clinicians, Tutors, Examiners, OSCE Planners, and Trainees preparing to take an OSCE.
Key Features:

  • The book is written not only to guide educators in developing and conducting OSCEs, but also in carrying out post-exam evaluation and analysis to assure quality assessment
  • The book is also unique in presenting specific topics, such as the OSCE for selecting candidates for an educational program, quality assurance, and helping students with poor performance, on which little has been written in book or guide format. For those who want a quick answer to their queries, a compilation of 'Frequently Asked Questions' is presented at the end of the book
  • The three authors, from three different countries and with affiliations to reputed educational institutions around the world (not only in their own countries), have substantial practical experience in developing, conducting, and evaluating OSCEs. In addition, they have wide international experience working as resource persons for workshops and staff development programs on the OSCE. All three hold degrees in Medical Education, are practicing physicians, and have extensive experience in clinical teaching and assessment


Information

Authors
Hamza M Abdulghani, Gominda Ponnamperuma, Zubair Amin
Publisher
WSPC
Year
2014
ISBN
9789814623544
1. THE BIRTH AND PROPAGATION OF THE OSCE
"Those who do not remember the past are condemned to repeat it."
George Santayana
In this opening chapter, our aim is to present a broad overview of the historical context of the birth of the OSCE, especially the factors that contributed to the development and propagation of the OSCE. We firmly believe that understanding the historical background is important for various reasons:
• It allows us to analyze the preceding events leading to the development of the OSCE.
• We develop a deeper understanding of the context of the OSCE in contemporary medical education and assessment.
• We learn from previous successes and mistakes, so that we can meaningfully implement the OSCE in our own settings.
• It prompts us to appreciate the enormous contributions of early doyens of medical education.
At the end of the chapter, we should be able to:
(i) Recognize how the shortcomings of traditional clinical examinations led to the birth of the OSCE.
(ii) Evaluate some of the seminal publications and events related to the development of the OSCE.
(iii) Compare and contrast early OSCE formats with contemporary OSCE formats.
The Birth
The development of the OSCE is rightly and widely credited to Professor Ronald McGregor Harden. In a seminal paper, published in the British Medical Journal in 1975, Harden et al. described a new approach to the assessment of clinical competence. However, even before the publication of this paper, there were considerable discussions in the medical literature highlighting the many shortfalls of traditional clinical examinations (Fleming et al., 1974; Harden et al., 1969; Wilson et al., 1969). In traditional clinical examinations, there are typically one or two long cases where the candidate takes a history, performs a physical examination, reviews investigations, generates a differential diagnosis, and discusses the findings, interpretation, and general management options with the examiners. The long cases are supplemented with a few short cases, where the candidate's task is more defined, with a focused history or physical examination followed by a brief discussion with the examiners.
The traditional clinical examinations have several drawbacks. Firstly, it is difficult to have certain types of real patients, such as patients with shortness of breath or acute chest pain, during the clinical examination. Secondly, the decision about a candidate's competency in clinical settings is based on a very limited number of encounters. Thirdly, the lack of clear instructions to the candidates and to the examiners renders the marking of candidates vulnerable to the examiners' moods, personal preferences, and prejudices. Fourthly, the long cases, which typically use a patient from the ward or a previously known patient who has been called back from home, do not really represent the variety of patients a graduating doctor is expected to manage in real life. They are also heavily skewed towards in-patients. Finally, using real patients for the entire duration of the examination poses daunting logistical and practical difficulties and is ethically questionable.
Harden et al. (1975) summed up the above observations, noting that traditional clinical examinations, with two examiners assessing the candidate's skills on only a few patients, are often a "luck of the draw." The lack of clear instructions and prior discussion about the patient among the examiners also creates confusion regarding what should be tested and the level of competency required to pass the examination. In short, the crucial pass/fail decision generated during traditional clinical examinations is often arbitrary and depends on multiple confounding factors, including variability among the examiners, the varying complexity of patients, and the nature of their illnesses. Simply speaking, such examinations are educationally, morally, and legally indefensible.
Harden was the Head of the Division of Clinical Medical Education at the University of Dundee, UK (interview with Harden; Vimeo), when he and his colleagues wrote the groundbreaking paper on the OSCE in 1975. To counter some of the ills of the traditional clinical examination, Harden et al. (1975) proposed an "objective structured examination." Interestingly, the term "OSCE" did not appear in the original paper. The format of the clinical examination first proposed by Harden et al. was different from what we might be familiar with now. It had 16 stations (the number 16 was chosen because it was "convenient"). The duration at each station was 5 minutes. However, the stations included a mixture of clinical skills stations, observations/inspections of clinical materials, such as colored photographs, and written questions in the form of true/false multiple choice questions (MCQs). Thus, a typical configuration would look like this (Fig. 1.1).
Fig. 1.1. Basic configuration of the early OSCE.
Each of these stations had clear instructions to the candidate, such as "Auscultate the precordium for evidence of valvular disease," after which the candidate would move on to another station, where they would answer questions relating to their findings from the previous station. The questions could be open-ended or multiple choice type, although, for convenience, true/false multiple choice-type questions were often used (Harden et al., 1975). The examiners' checklist was rather simple in early examinations, with only "yes" and "no" options for each of the tasks performed. The checklist was later modified to allow a "qualified yes." The new format of objective structured examination was piloted with a limited number of volunteer students and examiners. It received an enthusiastic response from the students and a somewhat lukewarm response from the examiners. Students viewed the format as fairer and less dependent on luck (interview with Harden; Vimeo) and supported its implementation.
We should also review another early development, which took place in North America and preceded the publication of the landmark paper by Harden et al. in 1975. This had a profound effect on the OSCE format as we practice it now. Howard Barrows was an Assistant Professor of Neurology and Stephen Abrahamson was the Director of Research in Medical Education at the University of Southern California School of Medicine, Los Angeles, CA, USA. They experimented with "programmed patients" as a way of appraising students' performance in clinical neurology (Barrows and Abrahamson, 1964). They argued, among other things, that patients are inherently prone to presenting their findings variably to students and that an ideal patient who suits the needs of the examination is often hard to come by.
Therefore, they suggested training "programmed patients," which involved "simulation of disease by a normal person who is trained to assume and present, on examination, the history and neurological findings of an actual patient in the manner of an actual patient" (Barrows and Abrahamson, 1964). Programmed patients, as we now know, have been expanded to include both "standardized patients" and "simulated patients." Barrows and Abrahamson used programmed patients to "obtain appraisal of the student's clinical performance", a feat that might seem revolutionary to some, even four decades later. Further down the road, simulation and mannequins would become an integral part of the repository of clinical materials that can be used during a clinical examination.
Programmed patients brought greater standardization to the clinical presentation of patients and reduced the variability that is inherent with the use of real patients. Harden et al. merged two emerging assessment techniques of that age: Programmed patients for standardization and multi-station examination by multiple examiners working independently in order to assess several domains at the same time (Hodges, 2003). Although these two ideas, taken separately, were neither novel nor revolutionary, marrying them together in a systematic manner to test clinical competency was unique (Hodges, 2003).
Good to Know
• Standardized patients: real patients or actors portraying findings in a standardized manner
• Simulated patients: trained actors acting as patients
• Simulation: enacting a clinical scenario
• Simulator: a mechanical device or mannequin, with or without intelligent features
OSCE as a Global Phenomenon
Acceptance of the OSCE was further bolstered by continuous research and refinements, not only regarding the OSCE, but also regarding the nature of clinical competencies, appreciation of holistic roles of physicians, and psychometrics of assessment in medical education (Khan et al., 2013). Some of the crucial evidence related to the OSCE, as a method of student assessment, will be discussed in the subsequent chapters.
In the last two decades, anecdotal observations and professional experience support the notion that the pace of adoption of the OSCE as a preferred method of student assessment has accelerated considerably across the globe, from undergraduate to postgraduate settings. Many professional bodies and national licensing examinations have endorsed and adopted the OSCE or its variations as a standard clinical assessment across the spectrum of physician training. For example, after years of reliance on knowledge assessment through MCQs, most medical schools in the USA now use the OSCE or similar examinations for clinical assessment (Turner and Dankoski, 2008). National licensing authorities, including the National Board of Medical Examiners, USA (USMLE Step 2 CS), the Medical Council of Canada (Medical Council of Canada Qualifying Examination Part II), and the General Medical Council, UK (PLAB Part 2), among others, now regularly use the OSCE or its variants in high-stakes summative decision-making.
Key milestones
• 1979: Publication of "Assessment of Clinical Competence Using an Objective Structured Clinical Examination (OSCE)" as an Association for the Study of Medical Education (ASME) guide by Harden and Gleeson
• 1985: The 1st Ottawa Conference on Assessment of Clinical Competency brought the idea of the OSCE to North America
• 1998: The Australian Medical Council conducted an OSCE for the first time for licensing foreign medical graduates to practice in Australia
• 1998: The Educational Commission for Foreign Medical Graduates, USA (ECFMG) conducted large-scale OSCEs in the USA for the first time
• 2000: Publication of the ACGME Toolbox of Assessment Methods endorsed the OSCE as one of the preferred methods of assessing clinical competency
Summary
The OSCE did not originate in a vacuum. Much of the impetus towards the development of the OSCE came from the recognition of many serious flaws associated with unstructured, unstandardized traditional clinical examinations, where the decision often depended on a sole examiner (or a few examiners) examining one or a few aspects of clinical competency. The OSCE reduced bias, provided clearer instructions to the examiners and candidates, and changed the perception that "clinical examination is often a game of luck."
References
Barrows HS, Abrahamson S. (1964) The programmed patient: A technique for appraising students' performance in clinical neurology. J Med Educ 39: 802–804.
Fleming PR, Manderson WG, Matthews MB, Sanderson PH, Stokes JF. (1974) Evolution of an examination: M.R.C.P. (U.K.). Br Med J 2(5910): 99–102.
Harden RM, Gleeson FA. (1979) Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education Booklet No. 8. Med Educ 13(1): 41–54.
H...

Table of contents

  1. Cover
  2. Half Title
  3. Title Page
  4. Copyright
  5. Foreword
  6. Acknowledgements
  7. Preface
  8. Contents
  9. 1. THE BIRTH AND PROPAGATION OF THE OSCE
  10. 2. THE OSCE IN THE CONTEXT OF A HOLISTIC ASSESSMENT
  11. 3. VALUE OF THE OSCE AS AN ASSESSMENT TOOL
  12. 4. SELECTING THE SKILLS TO BE TESTED IN AN OSCE THROUGH BLUEPRINTING
  13. 5. UTILIZING DIFFERENT FORMATS OF OSCE FOR GREATER EFFICIENCY
  14. 6. WRITING OSCE STATIONS
  15. 7. CREATING A SCORING TEMPLATE FOR ASSIGNING MARKS
  16. 8. PREPARING PATIENTS FOR THE OSCE
  17. 9. PREPARING SIMULATORS FOR THE OSCE
  18. 10. PREPARING THE GROUNDWORK FOR CONDUCTING AN OSCE
  19. 11. DETERMINING PASSES AND FAILS IN AN OSCE
  20. 12. POST-ASSESSMENT QUALITY ASSURANCE
  21. 13. FEEDBACK, MODERATION, AND BANKING
  22. 14. HELPING POORLY PERFORMING STUDENTS IN AN OSCE
  23. 15. OSCE AS A TOOL FOR THE SELECTION OF APPLICANTS
  24. APPENDIX 15.1: AN EXAMPLE OF A SELECTION OSCE STATION
  25. 16. FREQUENTLY ASKED QUESTIONS ABOUT THE OSCE
  26. Index
  27. Authors' Biographies