Kathmandu, Nepal
Two years back, I delivered a talk on "Ensuring quality of student assessment" at KIST Medical College (KISTMC), Lalitpur, Nepal. The Basic Science coordinator at KISTMC, Prof. Dr. P Ravi Shankar, specifically asked me to focus on the construction, administration and quality assurance of Short Answer Questions (SAQs). I am sharing here, in writing, the talk that I delivered there.
SAQs are not new to health professions educators, particularly in South Asia, where they have been used independently and extensively to assess the depth and breadth of knowledge in the various basic science disciplines as well as in community health sciences/community medicine. This is logical, as these disciplines are delivered independently and the concerned faculty or department want to test the knowledge imparted mostly through large-group lecture sessions. These SAQs are mostly unstructured (i.e. model answers are neither prepared nor discussed), apart from not being integrated with other disciplines and having little relevance to the clinical sciences. Consequently, the SAQ becomes a mere tool for assessing "memorization" of knowledge over time, which can be done easily and more reliably with MCQs. Similarly, very difficult SAQs also become abundant in student assessment once faculty exhaust the "factual recall of knowledge" type of questions. Both situations are problematic and indefensible, as the SAQ pass-mark does not change between exams because of the fixed criterion of a 50% cut-score (pass mark) in Nepal and South Asia. Logically, SAQs containing difficult and easy questions should have lower and higher pass-marks respectively. Fortunately, there is a remedy for these problems called "standard setting".
In addition, SAQs for high-stakes exams in Nepal and South Asia are mostly constructed/collated by a single senior faculty member. Although this recognizes the integrity, honesty and teaching and research experience of that respected/reputed faculty member, it may not fulfill validity and reliability criteria. Furthermore, when these SAQs are constructed in isolation and without an examination blueprint, they undermine the validity not only of the SAQs but of the whole test. Thus, it is recommended to construct/collate the SAQ items for any test only after discussing their content and construct with the concerned faculty, individually and in groups. The SAQs compiled through this process should be "standard set" using a suitable criterion-referenced method to produce a valid and defensible cut-off score for formative and, if possible, summative exams as well. SAQs must contain concise and precise model answers with clear marking schemes to avoid inter-rater reliability bias; such SAQs are known as structured SAQs.
It is well understood that medical students need to appreciate the integration of the various basic science disciplines so that they can see the "links" between them. These links are essential for medical relevance in terms of prevention, diagnosis and management modalities, including Behavior Change Communication (BCC) at a larger scale. Yet this integration should start right from the beginning rather than waiting for the students to learn it in the clinical sciences phase, where there is already so much to learn in terms of skills and attitude. This provides a strong rationale for using integrated SAQs in the early phase of the undergraduate medical education (MBBS) course.
Ideally, undergraduate medical education should have an integrated basic sciences curriculum with introductory clinical medicine (early clinical exposure) and public health (community health/medicine) throughout the two-year period. This then enables the introduction of appropriate and innovative teaching/learning methods that exploit opportunities to instill lifelong, self-directed learning habits among the students, which are crucial attributes of a good physician. Thus, SAQs should assess not only what was imparted but also what was learnt through self-directed learning, which in turn advocates for teaching/learning and assessment methods that promote self-directed learning among the students. A hybrid Problem-Based Learning (PBL) method therefore becomes a rational choice as the main teaching/learning method in the integrated basic sciences phase of the curriculum, as it enables the students to appreciate the teaching/learning of the various disciplines from patient and community care perspectives. Students also gain knowledge from large-group lectures delivered to cover important and difficult concepts and competencies not covered through PBL cases. On the other hand, carefully constructed and administered self-assessment tests also promote self-directed learning among the students.
Therefore, it is recommended to use an "integrated" curriculum and appropriate teaching/learning methods, and to construct structured integrated SAQs from the beginning of the undergraduate medical education (MBBS) course, as integrated SAQs are an extremely important tool for assessing short-term and long-term retention of knowledge. They also enable testing of Higher Order Thinking Skills (HOTS), which are usually difficult to assess using Multiple Choice Questions (MCQs).
Since Bloom's Taxonomy classifies knowledge into six levels of varying difficulty (see figure on LOTS & HOTS), the use of MCQs and PBQs together gives a holistic assessment of the knowledge imparted and/or acquired through the PBL tutorials and large-group lecture sessions. It is the PBL cases and sessions that provide the opportunity to assess the higher levels of knowledge, as students participate and identify the learning issues related to each PBL case and later independently prepare and present their findings in small-group tutorials facilitated by a trained faculty member.
A meta-analysis (Walker and Leary, 2009) and a meta-synthesis of meta-analyses (Strobel and van Barneveld, 2009) comparing PBL and conventional classrooms have shown that long-term retention of knowledge is better among PBL students than among students exposed mostly to large-group lecture sessions. The learning pyramid also shows that retention of knowledge is much higher when one prepares for discussion and/or teaching (see figure on learning pyramid), and students experience both in PBL tutorials.
Nonetheless, there will be a dilemma in using SAQs initially for assessing knowledge in the hybrid PBL curriculum, due to the fear of assessing the same knowledge repeatedly through MCQs and then SAQs. Thus, a logical and viable solution should be developed to ensure the quality of the SAQs for the learners, teachers and program. Such a method can be developed, and I call it the Structured Integrated Short Answer Question (SISAQ) or Problem Based Question (PBQ).
In this method, a half-day or full-day PBQ-writing workshop is organized, and all the faculty involved in teaching and facilitating the particular organ-system-based block are formally invited to participate in the process. Once they gather in an isolated area, curricular contents from each discipline are listed on a whiteboard/spreadsheet to produce the examination blueprint for the PBQs. This should be done by the concerned block director as far as possible. Then relevant clinical cases are listed to cover the various disciplines and contents laid down in the blueprint. Only a few (4-6) cases are chosen by consensus for constructing the PBQs required for the formative and summative exams, as laid down by the examination policies of the institute. It should be noted that PBQs demand nearly two minutes per mark, and more than 6 PBQs will be very demanding of the students' cognitive ability in the stipulated time. Once the PBQ case themes are selected through consensus, the workshop faculty are divided into 2-6 groups, where each group has at least one faculty member from each of the basic, community and clinical sciences disciplines to cover their contents/competencies in the PBQs. They then work in groups to construct and discuss the PBQs.
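The timing arithmetic above (roughly two minutes per mark, with 4-6 cases per exam) can be sketched in a few lines of Python. This is only an illustration of the rule of thumb; the function name and the example mark values are my own, not part of any examination standard.

```python
# Sketch of the PBQ timing rule of thumb: roughly two minutes per mark.
# The function name and example values are illustrative assumptions.

def exam_timing(pbq_marks, minutes_per_mark=2.0):
    """Estimate total marks and exam time for a set of PBQs.

    pbq_marks: list with the total marks of each PBQ (one entry per case).
    Returns (total_marks, estimated_minutes).
    """
    total_marks = sum(pbq_marks)
    estimated_minutes = total_marks * minutes_per_mark
    return total_marks, estimated_minutes

# Example: an exam with 5 PBQs of 20 marks each
marks, minutes = exam_timing([20, 20, 20, 20, 20])
print(marks, minutes)
```

For a 100-mark paper this yields about 200 minutes, which also shows why going beyond 6 PBQs quickly becomes unmanageable within a stipulated exam time.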
Before breaking into the groups, faculty must be made aware of these rules:
1) Factual recall questions are NOT allowed in the PBQs; questions should be at the application level and beyond
2) Direct questions are NOT allowed in the PBQs; questions should be constructed from the PBQ vignette, as it is desirable to test knowledge deepening and knowledge creation rather than knowledge acquisition
3) There will be at most 10 questions in each PBQ, and each question will carry a minimum of 0.5 marks
4) The total marks of a PBQ should be between 15 and 25, and the total time (in minutes) should be 1.5 - 2.0 times the total marks
5) Each item should have concise and precise model answer(s) with a clear marking scheme so that any faculty member can mark it reliably and validly
6) Each PBQ question should be labeled with the concerned discipline so that disaggregated performance can be evaluated and proper feedback given to the specific student(s)
7) Each PBQ should be standard set using a suitable criterion-referenced method so that the pass-mark changes from exam to exam, reflecting the construct (easy, moderate, difficult) rather than the content
8) Each PBQ should be handed over to the block director and must be destroyed/deleted permanently from the flip-chart and/or computer used by the group
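Rule 7's criterion-referenced standard setting can be illustrated with a minimal modified-Angoff sketch: each judge estimates, for each item, the proportion of borderline students who would earn full marks, and the cut-score is the average of the judges' expected totals. The judge names and ratings below are entirely hypothetical, and a real standard-setting exercise involves judge training and discussion that no script can replace.

```python
# Minimal modified-Angoff sketch for criterion-referenced standard setting.
# Judge names and ratings are hypothetical, for illustration only.

def angoff_cut_score(ratings, item_marks):
    """Compute a modified-Angoff cut-score in marks.

    ratings: dict mapping judge name -> list of proportions (0..1),
             one per item: the estimated chance a borderline student
             earns full marks on that item.
    item_marks: list of marks carried by each item.
    """
    per_judge_totals = [
        sum(p * m for p, m in zip(props, item_marks))
        for props in ratings.values()
    ]
    return sum(per_judge_totals) / len(per_judge_totals)

# Three hypothetical judges rating a 4-item PBQ worth 2 + 2 + 3 + 3 = 10 marks
ratings = {
    "judge_A": [0.8, 0.6, 0.5, 0.4],
    "judge_B": [0.7, 0.7, 0.4, 0.5],
    "judge_C": [0.9, 0.5, 0.6, 0.2],
}
cut = angoff_cut_score(ratings, [2, 2, 3, 3])
print(round(cut, 2))
```

Because the ratings depend on how easy or difficult the judges find each item, the cut-score moves from exam to exam with the construct, which is exactly what rule 7 demands in place of a fixed 50% pass mark.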
Once the PBQs are constructed, the concerned block director should review them for face and content validity with the basic sciences coordinator/chair. These PBQs, along with comments and suggestions, should then be submitted to the examination section immediately. The exam section should send them first to the internal and external reviewers and then to the moderation committee, with all the comments/queries/suggestions gathered so far. The moderation committee should make the final decision on each item of each PBQ and hand it back to the examination section for further administrative processing.
If done correctly, the SISAQ or PBQ will be able to assess higher levels of knowledge, which in turn will provide important information on the success of knowledge transfer, especially long-term knowledge retention and its application, among students in traditional and/or hybrid PBL curricula.
I will cover the cut-score (pass mark) determination and item analysis of the PBQs in the next blog.
Until then, happy PBQing ...
References:
Walker, A., & Leary, H. (2009). A Problem Based Learning Meta-Analysis: Differences Across Problem Types, Implementation Types, Disciplines, and Assessment Levels. Interdisciplinary Journal of Problem-based Learning, 3(1).
Strobel, J., & van Barneveld, A. (2009). When is PBL More Effective? A Meta-synthesis of Meta-analyses Comparing PBL to Conventional Classrooms. Interdisciplinary Journal of Problem-based Learning, 3(1).