Print version ISSN 0043-3144
West Indian Medical Journal, Vol. 54, No. 2, Mona, March 2005
Student self-assessment in a paediatric objective structured clinical examination
RB Pierre (I); A Wiereng (II); M Barton (I); K Thame (I); JM Branday (II); CDC Christie (I)
(I) Department of Obstetrics, Gynaecology and Child Health, Faculty of Medical Sciences, The University of the West Indies, Kingston 7, Jamaica, West Indies
(II) Curricular Affairs Department, Faculty Office for Undergraduate Affairs, Dean's Office, Faculty of Medical Sciences, The University of the West Indies, Kingston 7, Jamaica, West Indies
OBJECTIVE: The objective structured clinical examination (OSCE) has been recognized not only as a useful assessment tool but also as a valuable method of promoting student learning. Student self-assessment is also seen as a means of helping students recognize their strengths and weaknesses, understand the relevance of core learning objectives and take more responsibility for each stage of their work. The authors sought to evaluate the accuracy of medical student self-assessment of their performance in the paediatric clerkship OSCE and thus obtain preliminary data for use in programme strengthening.
DESIGN AND METHODS: A self-administered questionnaire was completed by successive groups of students immediately after the OSCE at the end of each clerkship rotation. Students assessed their performance at each station, using a performance rating scale. Performance data were summarized using descriptive and non-parametric tests. Basic statistical analysis of the Likert items was conducted by calculating frequencies, means and standard deviations. Regression analysis was used to correlate self-reported rating and actual performance in each station. A p-value of < 0.05 was considered significant. Eighty-one students (92%) completed the questionnaire.
RESULTS: Fifty-eight (72%) of the students achieved greater than minimum competence in their overall scores. Significant positive correlation (p < 0.05) between student self-rating and actual score was noted at the following stations: technical skills, cardiovascular examination, assessment of dysmorphism, dermatology, communication and photographic interpretation. Students overestimated their performance in the gastrointestinal examination, radiological and arterial blood gas interpretation. Students underestimated their performance in the following: respiratory system, examination of the head, developmental and nutritional assessment.
CONCLUSIONS: The findings highlight the perceived strengths and weaknesses in clinical competence and self-assessment skills and provide direction for programme training needs.
The assessment of students' clinical competence is of paramount importance, and several means exist of evaluating student performance in medical examinations (1, 2). The objective structured clinical examination (OSCE) is an approach to student assessment in which aspects of clinical competence are evaluated in a comprehensive, consistent and structured manner, with close attention to the objectivity of the process (3). The OSCE was introduced by Harden in 1975 (4) and first described as an assessment format in paediatrics by Waterston and colleagues (5). Since its inception, the OSCE has been increasingly used to provide formative and summative assessment in various medical disciplines worldwide (6), including non-clinical disciplines (7).
The OSCE has been recognized not only as a useful evaluative tool but also as a valuable method of enhancing student learning (8). The use of feedback with participants has been viewed as critical to enhancing the learning experience (9) and studies suggest that where feedback is an integral part of the OSCE process it can significantly improve competence in the performance of criterion-based tasks (10).
Student self-assessment is also seen as a means of helping students recognize their deficiencies in learning outcomes and take more responsibility for each stage of their work (11, 12). It gives support for learning and can enhance student performance. It also encourages the development of skills that are important for lifelong learning. However, accurate self-assessment requires the recognition of overestimation or underestimation of cognitive/performance tasks. Overestimation may lead to misdiagnosis, inadequate performance and premature closure. Underestimation usually leads to overuse of diagnostic tests, excessive uncertainty and unnecessary referrals.
The Faculty of Medical Sciences (FMS), The University of the West Indies (UWI), initiated the OSCE as a formal and integrated method of assessment for the final Medicine and Therapeutics examination in Child Health, Community Health and Psychiatry and Adult Medicine during the November 2000 examinations. Students and faculty were exposed for the first time to a relatively new assessment instrument in which aspects of competence (communication, history-taking and technical skills) were assessed in a structured, formal manner.
The Section of Child Health, Mona, Jamaica, implemented the OSCE examination as an end-of-clerkship assessment for students in their fifth year, during the 1999–2000 academic year. This was felt to be timely in order to (a) direct and drive student learning in areas not previously assessed in the 'traditional' curriculum, (b) verify students' competence in fundamental paediatric clinical skills, and (c) provide a forum for feedback to students on their strengths and weaknesses in clinical skills. It was thought that it would enhance faculty and student acceptance of this new assessment tool and promote faculty training for the newly introduced final OSCE examination. Student self-assessment is currently an unexplored and underutilized evaluation tool at the FMS, UWI, Jamaica. We decided to explore student self-assessment skills to determine their overestimation or underestimation of their performance in the OSCE, as part of an evaluation of the paediatric clerkship OSCE.
We recognized that this self-assessment activity conducted at the end of the clerkship would have limited immediate impact on student performance. However, we believed that it could promote students' understanding of the teaching objectives, develop their confidence and competence to think critically and provide them with a useful skill for lifelong learning. We also believed that the exercise could help identify weaknesses in the teaching programme and hence establish programme-training needs for future students.
Each OSCE comprised a circuit of 17 stations, including four rest stations, and involved the execution of a number of tasks such as examination of a system, eliciting a focussed history, counselling or communicating a problem, performing a procedure and problem-solving oriented around patient and laboratory data, and photographic material (Table 1). All clerkship groups were uniformly assessed in six stations which included the technical, history-taking, radiological, communication, patient data and slide/picture interpretation stations. The expected tasks for the clinical (examination) stations varied in each clerkship group and included the assessment of dysmorphic features, developmental and nutritional status, and examination of the cardiovascular, respiratory, gastrointestinal and neurological systems. This assessment format allowed the controlled exposure of students to a wide variety of paediatric clinical skills within a relatively short time period. Time at each station was seven minutes, with the exception of the 14-minute history-taking station. One minute was given between stations to facilitate change and the reading of instructions. With the inclusion of strategically placed rest stations, to reduce student and patient fatigue, all students completed the circuit over a two-hour period.
A standardized technique of marking was used and student performance was assessed by criterion reference for each station. Criterion-based scoring was used, with each checklist item scored as zero (omitted, incorrect or inadequate) or one to two (correct or adequate).
Face and content validity of each checklist was established by review and consensus by a core group of senior paediatricians. Stations were first selected to represent the curricular goals and objectives and to reflect authentic clinical situations. Checklists were designed to include the features thought to be most important by the development committee. Through discussions, consensus was achieved on the checklist items and structure. The minimum competence score was determined by the modified Angoff method (13).
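The modified Angoff procedure described above can be sketched as follows: each judge estimates, for every checklist item, the probability that a borderline (minimally competent) student would earn that item's mark, and the cut score is the mean of the judges' averaged estimates. The panel ratings below are invented for illustration; the study's actual panel data are not published.

```python
def angoff_cut_score(judge_ratings):
    """Minimum competence (cut) score from an Angoff-style panel.

    judge_ratings: one list per judge, each giving the estimated
    probability (0-1) that a borderline student earns each checklist
    item's mark. Returns the cut score as a fraction of the maximum.
    """
    per_judge = [sum(items) / len(items) for items in judge_ratings]
    return sum(per_judge) / len(per_judge)

# Hypothetical panel of three senior paediatricians rating a
# five-item station checklist (illustrative values only).
panel = [
    [0.8, 0.6, 0.5, 0.9, 0.7],
    [0.7, 0.5, 0.6, 0.8, 0.6],
    [0.9, 0.6, 0.4, 0.8, 0.7],
]
cut = angoff_cut_score(panel)  # fraction of the station's maximum mark
```

Multiplying `cut` by a station's maximum mark gives that station's minimum competence score; the modified Angoff method typically iterates such estimates after group discussion, as the consensus process in the text describes.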
The study was conducted during the period July 2001 to December 2002. Five groups of students participated in the process, during their respective clerkship rotations. Student groups had at least two briefing sessions before the OSCE, which covered orientation about the OSCE (both end-of-clerkship and final MBBS), sensitization about the departmental and faculty need for evaluation of the examination and review of commonly assessed competences. At the completion of the OSCE, students assessed their performance at each station, using a performance rating scale (Table 2), a Likert scale ranging from one to seven (poor to excellent). This was part of a 32-item self-administered questionnaire (14) that was being used to evaluate the content, structure, and organization of the OSCE.
Participation in the questionnaire was on a voluntary basis and students were assured that non-responders would not be penalized. The Curricular Affairs Section of the Faculty of Medical Sciences handled the administration and analysis of the questionnaires. Ethical approval was received from the University Hospital of the West Indies/The University of the West Indies FMS Ethical Committee.
An OSCE review session was conducted with the students at the end of the clerkship (and after completion of the questionnaire) for feedback and teaching purposes. General comments were addressed to the whole group regarding their overall performance as perceived by the examiners, during this thirty-minute session. Students subsequently had individual appointments with faculty members and were then given the opportunity to review their individual performances at the respective stations and discuss their strengths and any areas of concern.
Data were collated and descriptive and non-parametric tests applied using Stata (15). Basic statistical analysis of the Likert items was conducted by calculating frequencies, means and standard deviations. Table 2 illustrates how self-assessment scores were equated with actual scores in examiners' ratings. Regression analysis (Pearson product-moment) was used to correlate self-reported rating and actual performance in each station. A p-value of < 0.05 was considered significant for the correlation coefficients. Eighty-one students responded to the questionnaire, representing 92% (81/88) of those who completed the senior paediatric clerkship.
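The per-station analysis above can be illustrated with a hand-rolled Pearson product-moment coefficient in Python (the study itself used Stata; the station data below are invented for illustration, not the study's results):

```python
import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation coefficient of two samples."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative data for one station: self-ratings on the 1-7 Likert
# scale versus actual percentage scores (hypothetical values).
self_rating = [3, 5, 4, 6, 2, 5, 7, 4]
actual_score = [48, 70, 55, 78, 40, 66, 85, 58]

r = pearson_r(self_rating, actual_score)
# Significance at p < 0.05 is conventionally checked by comparing
# t = r * sqrt((n - 2) / (1 - r^2)) against a t distribution with
# n - 2 degrees of freedom.
t = r * math.sqrt((len(self_rating) - 2) / (1 - r ** 2))
```

A positive, significant r at a station corresponds to accurate self-assessment in the sense used in the paper; systematic over- or underestimation appears as self-ratings sitting above or below the score band the rating scale equates them with (Table 2).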
Seventy-two per cent (58/81) of students achieved greater than minimum competence in their overall scores. The weakest performances (mean score less than 50%) were seen in the assessment of dysmorphism, radiological and arterial blood gas (ABG) interpretation stations. Students gave borderline performances (mean score 50–59%) in the gastrointestinal, communication, slide/picture and patient data interpretation stations.
Significant positive correlation between actual performance and self-rating was noted among the following stations: technical/procedural, cardiovascular, assessment of dysmorphism, dermatology, communication and slide interpretation (Table 3). Students overestimated their performance in the following: gastrointestinal, radiological and ABG assessment. Students underestimated their performance in the following: respiratory, developmental, nutritional assessment and examination of the head.
Surveying medical student self-assessment skills in the OSCE has highlighted students' strengths and weaknesses, and some challenges in curriculum implementation.
Students demonstrated appropriate self-evaluation skills in the technical stations. These included performing a lumbar puncture, suprapubic aspiration, blood culture sampling, nebulization with salbutamol, use of a peak expiratory flow meter and anthropometry. This is not surprising because students are usually quite motivated to participate in procedural skills and self-assessment usually correlates with frequency of performance (16) (although this was not explored in the study). One of the strengths of the undergraduate medical programme at the FMS, UWI, has been the early 'hands on' approach for acquisition of clinical skills.
Students also correctly assessed their weak/borderline performance in the gastrointestinal, assessment of dysmorphism, radiological, communication, slide/picture and patient data interpretation stations. With the exception of the gastrointestinal and assessment of dysmorphism stations, these represented competences that were until recently not formally assessed in the clerkship rotation. Non-uniform teaching and exposure in the specific areas may have contributed to students' lack of confidence and limited competence.
Students underestimated their performance in competences that are assessed infrequently. The respiratory system in paediatrics is not usually assessed in the final MBBS examination OSCE, because of the unsuitability of paediatric cases with acute respiratory related conditions. Also the exposure to developmental assessment, head examination and nutritional assessment has tended to be opportunistic, 'teacher-specific' and dependent on case availability. Students have expressed anecdotally that they are not comfortable with these areas.
Conversely, students overestimated their performance in the gastrointestinal system, radiological and arterial blood gas interpretation. The gastrointestinal system is frequently reinforced and usually part of summative assessment at the end of year five. Students may have considered this system to be 'easy' and probably devoted less time to review prior to the OSCE.
The limited research into medical students' self-assessment has shown little improvement in this skill during medical school (17). However, this may reflect limited opportunity to develop the skill or variations in self-directed learning behaviour. Yet, results suggest that self-evaluation has educational merit as a measure of non-cognitive abilities associated with clinical performance and as a stimulus to further learning and professional development (18). At FMS, UWI, students are trained for independent practice by the end of the pre-registration period. It is imperative that self-evaluation skills and attitudes are inculcated during the undergraduate period.
The value of this exercise was limited by the timing of the assessment at the end of a clerkship. Although students' strengths and areas of concern were discussed, the opportunity for remediation could only occur after the completion of the clerkship rotation. This was particularly so in the case of the twenty-eight per cent of students who achieved an overall OSCE score below minimum competence level. Future group performances and self-assessment skills could be improved by conducting mid-clerkship formative assessments in order to identify and address weaknesses in a timely fashion.
This exploratory study has highlighted the perceived strengths, weaknesses and challenges in student self-assessment skills and in the paediatric undergraduate curriculum and teaching. We believe that this study has contributed toward the goals of helping students assess their strengths and weaknesses and of fostering self-directed learning, and it has helped to clarify issues that determine confidence in achieving clinical competence. Strategies have already been initiated to restructure the curriculum in order to ensure consistent and uniform exposure to core competences. Innovative teaching methods are needed to improve student performance in pertinent competences, especially communication skills and laboratory data interpretation.
We wish to thank Dr Jerome De Lisle of the Centre for Medical Education, Faculty of Medical Sciences, Trinidad and Tobago, for professional advice and permission to use the questionnaire in this study. We also express our gratitude to the participating students, lecturers and consultants in Child Health who contributed to the implementation of the OSCE in the department.
1. Harden RM. How to assess clinical competence – an overview. Med Teach 1979; 1: 289–96.
2. Fowell SL, Bligh JG. Recent developments in assessing medical students. Postgrad Med J 1998; 74: 18–24.
3. Harden RM. What is an OSCE? Med Teach 1988; 10: 19–22.
4. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examinations. Brit Med J 1975; 1: 447–51.
5. Waterston T, Cater JI, Mitchell RG. An objective undergraduate clinical examination in child health. Arch Dis Child 1980; 55: 917–22.
6. Carraccio C, Englander R. The objective structured clinical examination: a step in the direction of competency-based evaluation. Arch Pediatr Adolesc Med 2000; 154: 736–41.
7. Harden RM, Cairncross RG. The assessment of practical skills: the objective structured practical examination (OSPE). Stud High Educ 1980; 5: 187–96.
8. Kowlowitz V, Hoole AJ, Sloane PD. Implementation of the objective structured clinical examination in a traditional medical school. Acad Med 1991; 66: 345–7.
9. Black NMI, Harden RM. Providing feedback on clinical skills by using the objective structured clinical examination. Med Educ 1986; 20: 48–52.
10. Hodder RV, Rivington RN, Calcutt LE, Hart LR. The effectiveness of immediate feedback during the objective structured clinical examination. Med Educ 1989; 23: 184–8.
11. Fallows S, Chandramohan B. Multiple approaches to assessment: reflections on use of tutor, peer and self-assessment. Teach Learn Med 2001; 6: 229–46.
12. Taras M. Using assessment for learning and learning from assessment. Assess Eval High Educ 2002; 27: 501–10.
13. Thorndike RL, ed. Educational Measurement. 2nd ed. Washington DC: American Council on Education/Oryx Press; 1971.
14. De Lisle J. Phase 2, OSCE student evaluation form. The Centre for Medical Science Education, Faculty of Medical Sciences, Mount Hope, Eric Williams Medical Sciences Complex, Trinidad; 2001.
15. StataCorp. Stata Statistical Software: Release 7.0. College Station, TX: StataCorp LP; 2001.
16. Fincher RM, Lewis LA. Learning, experience, and self-assessment of competence of third-year medical students in performing bedside procedures. Acad Med 1994; 69: 291–5.
17. Fitzgerald JT, Gruppen LD, White CB. The influence of task formats on the accuracy of medical students' self-assessments. Acad Med 2000; 75: 737–41.
18. Arnold L, Willoughby TL, Calkins EV. Self-evaluation in undergraduate medical education: a longitudinal perspective. J Med Educ 1985; 60: 21–8.
Dr RB Pierre, Department of Obstetrics, Gynaecology and Child Health, Section of Child Health, The University of the West Indies, Kingston 7, Jamaica, West Indies. Fax: (876) 927-1446, e-mail: firstname.lastname@example.org.