Open-access, peer-reviewed article

Training Medical Students to Create and Collaboratively Review Multiple-Choice Questions: A Comprehensive Workshop

2020; Association of American Medical Colleges; Language: English

10.15766/mep_2374-8265.10986

ISSN

2374-8265

Authors

Joshua Kurtz, Beth Holman, Seetha U. Monrad

Topic(s)

Innovative Teaching and Learning Methods

Abstract

October 6, 2020

Josh Kurtz (fourth-year medical student, University of Michigan Medical School; https://orcid.org/0000-0001-7528-1722), Beth Holman, DrPH (Associate Director of Assessment and Evaluation, University of Michigan Medical School), and Seetha U. Monrad, MD (Assistant Dean for Assessment, Evaluation, and Quality Improvement; Associate Professor of Internal Medicine and Learning Health Sciences, University of Michigan Medical School; https://orcid.org/0000-0002-3374-2989)

Introduction: Multiple-choice question (MCQ) creation is an infrequently used active-learning strategy. Previous studies demonstrated that medical students find value in the process but receive minimal training, which may limit potential learning benefits. We therefore developed a question-creation process that required students to complete in-depth training in addition to collaborative question writing and editing.

Methods: We created a question-writing workshop consisting of three components: (1) training in MCQ writing using NBME online modules, a practice MCQ-writing exercise, and an in-person training session; (2) independent writing of MCQs on topics chosen from an institutionally generated blueprint; and (3) collaborative review and editing of MCQs during an in-person session. To understand students' perceptions, we held two four-student focus groups and recorded and transcribed the data. We iteratively reviewed the transcripts to generate a codebook and corresponding themes. We used the focus group data to generate a survey with Likert-scale questions, which we sent to the remaining 10 students and analyzed using Microsoft Excel.

Results: Eighteen second-year medical students participated in this workshop. Students perceived that question-writing training (3.7/5.0±0.5) and question writing (3.9/5.0±0.3) benefited their learning. Students perceived that MCQ writing required concept integration (4.1/5.0±0.6). Students described how question writing allowed them to recognize subtle distinctions between therapies and diagnoses. Each MCQ required about 1.5 hours to write and collaboratively edit.

Discussion: Our results demonstrated that students perceived question writing to benefit their learning. More importantly, students felt that question writing actively engaged them to integrate content and compare concepts; students' engagement suggests that they learned from this question-writing activity.

Educational Objectives

By the end of this activity, learners will be able to:

1. Recognize the key components of a multiple-choice question (MCQ) set in a clinical context that avoids common question-writing flaws.
2. Construct clinical vignette-based MCQs using medical school-specific content.
3. Assess peers' clinical vignette-based MCQs to recommend changes in format and content.
Introduction

In preparing for assessments during medical school, students often report using passive study strategies such as reading textbooks or notes.1 To improve students' engagement with and understanding of learned content, there has been increasing focus on incorporating active learning into medical school curricula.2–4 Engaging students in active learning is associated with numerous benefits, including increased retention of learned content, deeper understanding of material, and increased motivation to learn.5,6 Previous reports suggest that learners have generally received curricular changes involving active learning positively and with enthusiasm.2–4,7 For example, medical students preferred a flipped-classroom curriculum focused on endometrial hyperplasia and cervical dysplasia to traditional lecture-based learning.8 Other active-learning modules focused on obesity management and quality improvement have also been well received by participants.9,10

Multiple-choice questions (MCQs) are one of the primary methods used to assess medical students' knowledge,11,12 as they minimize the logistical challenges and time needed to assess multiple content areas while maintaining high levels of validity and reliability.12,13 Some medical schools have involved their students in the MCQ-writing process as a way to engage them in active learning, introduce them to a novel study strategy, and increase their appreciation for question writing.14–17 Health professions students' involvement in question writing has been associated with improved examination performance,18,19 and students who wrote MCQs reported improved confidence and engagement with the material as a result of the exercise.20–23

The training that students have received prior to question-writing initiatives has been variable; few initiatives have included nationally developed training guidelines15 or in-person, locally developed training sessions.14 The National Board of Medical Examiners (NBME) has published detailed guidelines for creating MCQs for the basic and clinical sciences.24 In a previous report, we demonstrated that students who participated in question writing felt that assessment generation was beneficial to their learning.17 Recognizing the variability of training in previously reported question-writing processes, we developed a question-writing workshop that included both self-directed and collaborative in-person training to teach medical students to write standardized MCQs.

The purpose of this workshop was to train medical students to write clinical vignette-based MCQs, use the skills they acquired to write their own MCQs, and collaboratively edit MCQs with their peers. Medical educators interested in a novel approach to engage medical students in collaborative active learning and introduce them to a unique study strategy will find this workshop particularly beneficial. We recommend that educators use this workshop with preclinical and clinical medical students who have completed, or are in the process of completing, coursework relevant to the content area(s) chosen for the MCQs.

Methods

Context

One medical student and two members of our institution's evaluation and assessment (E&A) team collaborated to create the MCQ-writing workshop. The E&A team members were one individual with a doctorate in public health who served as the associate director for E&A and one individual with an educational specialist degree who served as the assistant director of E&A.
Both E&A team members had previous formal training and experience in MCQ assessment creation. A medical student co-created and facilitated this workshop for two reasons: (1) the project was developed in response to a student-identified need, and (2) incorporating the learner perspective at all stages was crucial for successful workshop development and implementation, especially with learners who are not content experts. The medical student completed the same online training modules as the workshop participants.

Recruitment

We used a convenience sampling strategy drawing on all second-year medical students at the University of Michigan Medical School. We sent an email to all 167 medical students at the end of their first year of medical school and framed participation as an opportunity to learn about MCQ creation and review relevant course material. We recruited students during the summer between their first and second years of medical school, during which time many students had competing commitments. We therefore offered participants up to $100 as compensation for completing all aspects of the program, including the workshop evaluation described below.

Workshop Overview

This workshop consists of three primary components (totaling 6 hours), as summarized in the Figure:

1. MCQ training (3 hours): To teach students how to write a well-constructed MCQ set in a clinical context, students completed online and in-person trainings.
2. MCQ writing (2 hours): To engage students in active learning via question creation, students independently wrote two MCQs from a predefined blueprint and submitted them via Google Docs.
3. MCQ review (1 hour): To learn collaboratively from one another and improve the quality of their MCQs, students attended an in-person question-editing session in which they discussed and edited their MCQs.

MCQ Training

Prior to meeting for an in-person training session, learners completed virtual MCQ-related prework. At our institution, students independently completed two 30-minute online NBME University modules on MCQ-writing best practices; these have subsequently been modified and reduced to one comparable online module. This 1-hour online module is free to use and can be found as the "Online Interactive Item Writing Tutorial" in the NBME item-writing services portion of the main NBME website.25 These modules are derived from the NBME's Constructing Written Test Questions for the Basic and Clinical Sciences.24 Alternatively, students who preferred to learn by reading could review the chapters of Constructing Written Test Questions for the Basic and Clinical Sciences corresponding to the online module content, such as the chapters on technical item flaws (pp. 11–19),26 basics of writing one-best-answer items (pp. 29–33),27 and structuring items to fit task competencies (pp. 40–45).28

Next, students independently wrote one practice MCQ. These MCQs were not incorporated into the final MCQ bank but gave students the opportunity to practice question writing as part of their training. Students suggested edits to their peers' practice MCQs via Google Docs. To facilitate this, we divided students into five groups: three four-member groups and two three-member groups.
We provided students with several key resources to help them write their practice questions: (1) a question-formatting sheet to assist with writing MCQs in a consistent structure (Appendix A: Writing Resource 1), (2) examples of MCQ lead-ins for various types of MCQs (Appendix A: Writing Resource 2), and (3) an MCQ-writing checklist to ensure that the style, content, and format of each question were appropriate (Appendix A: Writing Resource 3).17

Subsequently, students attended a 1-hour in-person training session. The student lead reviewed the appropriate question format, common flaws to avoid in MCQ writing, and the resources available to assist students with MCQ writing (Appendix B). A member of the E&A team was available during the session to answer any questions or concerns outside the realm of the student lead's expertise. After reviewing this content, students spent 30 minutes with their group members practicing giving feedback on one another's practice MCQs. Specifically, one student first explained the edits she had recommended for her group members' questions via Google Docs. All group members then discussed their thoughts about the proposed edits, challenging each other's opinions as they deemed necessary, until they reached a consensus.

MCQ-Writing Process

We created a two-dimensional exam blueprint to guide question creation (see Appendix C: Logistical Resource 3 for the blueprint template).29 Our primary framework was the subject matter, in our case the cardiovascular system, with topics within cardiovascular medicine listed underneath. Our secondary framework was question type. The four question types we included in our blueprint were (1) diagnosis, (2) pathophysiology or mechanism, (3) treatment, and (4) basic science or physiology. The question types and topics used to generate the blueprint should be tailored to the learners' needs and the characteristics of the desired question bank. Example MCQs and lead-ins for each of these question types are included in Appendix A: Writing Resource 2. (An illustrative sketch of such a blueprint appears after the MCQ review process description below.)

Students were randomly assigned a topic and question type from the exam blueprint (see Appendix C: Logistical Resource 1 for an example of the topic and question-type combinations assigned to each student). To minimize overlap between question content, one of the workshop leaders generated a list of suggested subtopics for each topic, and students chose their subtopic from this list (see Appendix C: Logistical Resource 2 for example topics and subtopics related to the cardiovascular system). Each student remotely wrote two MCQs and suggested edits, via Google Docs, to their group members' questions before meeting for an in-person review session (see the MCQ review process below).

MCQ Review Process

After all MCQs within a group had been submitted, students reviewed their group members' questions for structure and content and proposed changes using Google Docs on their personal computers. Subsequently, they met for a 1-hour in-person editing session. We gave students three primary aims for the MCQ review sessions: (1) discuss each group member's proposed edits, with the editor explaining their rationale and the question writer responding; (2) use this discussion to reach a consensus on the best changes to incorporate into the MCQs; and (3) submit the final edited MCQs by the end of the session.
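As a concrete illustration of the two-dimensional blueprint described under MCQ-Writing Process above, the sketch below represents the blueprint as the cross-product of topics and question types and randomly distributes the resulting cells to students. The topic names, roster, and counts are hypothetical placeholders, not the authors' actual blueprint (their template is provided in Appendix C); it is intended only as a minimal starting point for educators adapting the process.

```python
import itertools
import random

# Hypothetical blueprint: primary framework = topics within the chosen subject,
# secondary framework = question type. All names and counts are illustrative only.
topics = ["Heart failure", "Valvular disease", "Arrhythmias", "Ischemic heart disease"]
question_types = [
    "diagnosis",
    "pathophysiology or mechanism",
    "treatment",
    "basic science or physiology",
]

# Every (topic, question type) cell of the blueprint becomes one assignable slot.
blueprint = list(itertools.product(topics, question_types))


def assign_slots(students, blueprint, per_student=2, seed=0):
    """Randomly distribute blueprint cells so each student receives per_student assignments."""
    rng = random.Random(seed)
    slots = list(blueprint)
    rng.shuffle(slots)
    if per_student * len(students) > len(slots):
        raise ValueError("The blueprint has fewer cells than requested assignments.")
    return {s: [slots.pop() for _ in range(per_student)] for s in students}


students = [f"Student {i + 1}" for i in range(8)]  # placeholder roster
for student, cells in assign_slots(students, blueprint).items():
    print(student, cells)
```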
A workshop leader attended the in-person editing sessions to address student concerns about the logistics of question formatting and submission. We provided students with a summary sheet containing step-by-step instructions for question writing and question editing, with references to the relevant appendices (Appendix A: Writing Resource 4).

Workshop Iterations

Depending on the desired number of MCQs as defined by the blueprint, the writing and editing portions of this workshop may be repeated iteratively (Figure). In our implementation of this activity, students repeated the writing and editing process three times in accordance with our blueprint. Each student wrote two MCQs per cycle, for a total of six MCQs each.

Figure. Process for training medical students to write and collaboratively review multiple-choice questions (MCQs), broken down by activity and the corresponding resources required.

Materials

The following materials were required to complete this workshop:

• AV equipment to project the PowerPoint presentation during the training session.
• Laptops for each student to use during the in-person editing session.

Workshop Evaluation

We measured students' perceptions of the training, writing, and review processes by survey and focus group.17 Students were randomly selected in gender-balanced pairs to participate in two focus groups, each with four participants. We selected students in pairs from the same MCQ-writing groups so that they could address questions pertaining to the collaborative elements of the writing process; groups were gender balanced to represent the study sample. We used a semi-structured interview guide (Appendix D: Evaluation Tool 1) to lead the interviews and subsequently audio-recorded and transcribed the focus group data. Two authors reviewed the transcripts iteratively to generate a codebook, challenging each other's categorizations until they reached consensus. The authors then used the codebook and corresponding qualitative data to generate themes.

We used the themes from the focus groups to create a survey asking students to evaluate various aspects of the question-writing workshop, including 5-point Likert-scale questions (1 = strongly disagree, 5 = strongly agree). Specifically, we used the qualitative information pertaining to MCQ training, question writing, and question editing to inform the Likert-scale questions. Additionally, we included the questions asked during the focus groups verbatim as open-ended questions. An individual with expertise in survey design reviewed the survey and made suggestions to improve clarity. Two students participated in concurrent cognitive interviewing, and we modified the survey to better reflect our intended meaning. We sent the survey (Appendix D: Evaluation Tool 2) via Qualtrics to the 10 participants who had not taken part in the focus groups. Two authors used the codebook to categorize the qualitative data from the open-ended survey responses; they iteratively reviewed and drew relationships between the focus group and open-ended survey data to inform the final themes. We analyzed the rating-scale questions using Microsoft Excel 2016 (Microsoft, Redmond, WA, USA) and expressed the Likert-scale data as means and standard deviations.
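For readers reproducing the analysis outside of Excel, the sketch below shows a minimal, hypothetical Python equivalent of the summary statistics, together with one reading of the descriptive-label rule detailed in the next paragraph (preferring the lower whole-number anchor when it lies within one standard deviation of the mean, and otherwise the nearest higher anchor). The data and function names are illustrative assumptions, not the authors' actual analysis.

```python
import math
import statistics

LABELS = {
    1: "strongly disagree",
    2: "disagree",
    3: "neither agree nor disagree",
    4: "agree",
    5: "strongly agree",
}


def summarize(responses):
    """Mean and sample standard deviation of 5-point Likert responses."""
    return statistics.mean(responses), statistics.stdev(responses)


def descriptive_label(mean, sd):
    """One reading of the anchoring rule: round down to the lower anchor if it is within
    one SD of the mean; otherwise round up to the higher anchor if it is within one SD."""
    lower, upper = math.floor(mean), math.ceil(mean)
    if mean - lower <= sd:
        return LABELS[lower]
    if upper - mean <= sd:
        return LABELS[upper]
    return LABELS[round(mean)]


# Examples from the text: 3.8 +/- 0.5 -> "agree"; 3.8 +/- 0.8 -> "neither agree nor disagree".
print(descriptive_label(3.8, 0.5))
print(descriptive_label(3.8, 0.8))

# Hypothetical responses for one survey item.
ratings = [4, 4, 3, 4, 5, 4, 3, 4, 4, 4]
mean, sd = summarize(ratings)
print(round(mean, 1), round(sd, 1), descriptive_label(mean, sd))
```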
For responses to the Likert-scale questions, the qualitative descriptive interpretations (i.e., agree, strongly agree) reported below were derived by rounding students' average quantitative response up to the nearest whole-number anchor if the average was within one standard deviation of that anchor. If the nearest lower numerical anchor was within one standard deviation of the average, the qualitative response was instead rounded down to the lower anchor. For example, a response of 3.8±0.5 would be reported as students agreeing, whereas a response of 3.8±0.8 would be reported as students neither agreeing nor disagreeing. We represented descriptive data as percentages.

The Institutional Review Board at the University of Michigan reviewed and approved the protocol associated with this submission (HUM 00129824).

Results

Learner Characteristics

Of the 20 second-year medical students originally enrolled, 18 (90%) completed the study. The two participants who withdrew completed the online training modules but did not attend the in-person training session, write MCQs, or attend the MCQ editing sessions. The participants identified their gender as 50% male and 50% female. On average, the participants were 26 years old (range 22–39 years, n = 18).

Quantitative Results

MCQ training

Students' perceptions of completing all components of MCQ training (i.e., completing the online NBME modules, writing one practice MCQ, and attending the in-person training session) are summarized in the Table. Students agreed that completing question-writing training was beneficial to their learning (3.7±0.5; mean ± standard deviation). Students neither agreed nor disagreed that participating in question-writing training increased their confidence in taking NBME multiple-choice examinations (3.2±0.9). Students similarly neither agreed nor disagreed that completing question-writing training changed their approach to future NBME examinations (3.7±0.9).

Table. Survey Data of Second-Year Medical Students' (N = 10) Perceptions of Various Aspects of Completing Multiple-Choice Question-Writing Training

Question writing

Students reported spending 50.5 minutes on average (range 10–90 minutes, n = 10) writing each MCQ. In our implementation of this activity, each student wrote six questions in total, repeating the process of writing and editing two MCQs three times (Figure), in accordance with our blueprint. Students spent more time on question writing in the first session (69.5 minutes, n = 10) than in the third session (39.5 minutes, n = 10). Students reported that the 2 hours allotted to write their two MCQs and review their group members' questions was "about the right amount of time" (2.0/3.0±0.5, n = 10) on a scale of 1 (not enough time) to 3 (too much time). The relative time (expressed as a percentage totaling 100% across categories) that students spent reviewing various resources in order to write their MCQs was as follows: lecture materials (44%), textbooks (19.5%), online search engines (13%), First Aid (12%), Pathoma (7%), DynaMed or UpToDate (4%), and other (1%). The student who selected other reported using the medical literature as an additional resource. Students' perceptions of writing MCQs are summarized in the Table.
Students agreed that question writing was beneficial to their learning (3.9±0.3) and required integration of multiple concepts (4.1±0.6), but not integration of multiple resources (3.5±1.0). Students neither agreed nor disagreed that the act of writing MCQs made them critically analyze distinctions between therapeutic options (3.9±0.9). Students reported their preferred study strategies as rereading lecture slides, rewriting notes, and reviewing flashcards. As previously reported,17 students perceived that MCQ writing required much more time (4.9±0.3), problem solving (4.3±0.9), integration of content (4.1±0.7), and differentiation between diagnoses (4.0±0.8) than their preferred study strategies. Additionally, we asked students to rank the components of the MCQs that were most valuable to their learning when writing. The majority of students (60%, n = 10) reported that writing the answer choices was most beneficial.

Question editing

The findings related to students' perceptions of question editing are shown in the Table. Students agreed that editing their group members' MCQs online and in person was beneficial to their learning (3.8±0.6) and that it made them critically assess differences between diagnoses in group members' questions (4.0±0.5). Students also agreed that question editing improved the quality of their own questions (4.0±0.7). Students neither agreed nor disagreed that question editing improved their ability to deliver feedback (3.7±0.8). Students reported spending approximately 9 minutes on average editing each of their group members' questions prior to the in-person question-editing sessions. Between question writing (50.5 minutes per question), remote question editing (9 minutes per question), and in-person question editing (30 minutes per question), each additional MCQ required approximately 1.5 hours (50.5 + 9 + 30 ≈ 90 minutes) to collaboratively create.

Qualitative Results

We have reported the qualitative results in a previous manuscript.17 In summary, the qualitative data showed that students gained a deeper appreciation for the difficulty of question writing through completing MCQ training; they felt that question writing required significant integration of concepts and content and that creating plausible distractors required subtle differentiation between diagnoses and treatment options. Students recognized that question writing is a time-intensive process requiring a significant knowledge base, which may limit its efficiency as a learning tool.

Discussion

We created a comprehensive workshop that rigorously trained medical students to write and collaboratively edit clinical vignette-based MCQs. Through participation in this workshop, students learned how to apply information from their medical school-specific lectures and other commonly used information sources to create questions. Students perceived that participating in all aspects of this workshop (including question-writing training, writing, and editing) was beneficial to their learning. More importantly, students perceived that participation in this workshop required significant problem solving, content integration, and critical analysis of multiple concepts, which suggested that students were actively engaged in the learning process. Through designing, executing, and analyzing this collaborative question-writing workshop, we learned a number of important lessons for those interested in incorporating MCQ writing as an active-learning strategy into medical school curricula.
First, students perceived that robust question-writing training prior to question writing and question editing was beneficial to their learning. However, it is important to note that students were least enthusiastic in their responses about the MCQ training component of the workshop. Students may have been less engaged in this portion of the workshop because of its technical nature (i.e., discussing many different rules of MCQ writing), the comparatively passive learning (online modules) relative to the remainder of the workshop, and the amount of time required to complete training. Educators should weigh students' relative lack of enthusiasm for this portion of the training, and the time it requires, against the benefits of creating consistently formatted MCQs for use by future students. Strategies to provide rigorous training in MCQ writing using more active pedagogies (e.g., interactive online modules or gamification techniques) may be considered.

Second, in light of the multiple demands on medical students' time, it was essential to weigh the ideal number and type of questions students were asked to create against the estimated 1.5-hour additional time requirement for each new MCQ. Students perceived that the amount of time allotted to write each MCQ was appropriate, which may in part have been because students became more efficient as they wrote additional questions. Students also reported, however, that question writing took significantly more time than their preferred study strategies. The significant time burden required to write each question may have been one of the most important drivers of students' relative lack of enthusiasm for the training portion of this workshop. In our implementation of this activity, each student wrote six questions in total. Tailoring the number of iterations of the question-writing and question-editing components of this workshop (Figure) to the institution's specific schedule and needs is crucial for optimizing students' receptivity. Additionally, it was important to communicate time expectations clearly up front and, ideally, to dedicate protected time for students to engage with this activity.

Third, while the primary purpose of this workshop was to actively engage students in question writing, it had the additional benefit of creating a question bank for potential use by future students. If one of the educator's goals is to use student-generated MCQs for purposes other than promoting learning in the question writers (e.g., practice questions for other students or incorporation into high-stakes assessments), validity evidence would be needed.30 Faculty or other content experts (potentially including senior learners) should review questions for accuracy and authenticity. Faculty-vetted, student-generated questions have been shown to be of similar difficulty and quality to questions written by faculty alone, and the majority of faculty reviewers felt the review process was worth their time and effort.16,31 Additionally, administering a student-generated question bank to other students would generate important psychometric data (e.g., discrimination indices and student performance) and allow for association with other variables (e.g., performance on other assessments).

We recognized limitations in our approach.
It is possible that our results represent only the views of students in our institution's 1-year preclinical curriculum, given that we were unable to involve multiple institutions in our study design. We find this unlikely, however, given the congruence between our findings about the critical thinking and extensive time required to write questions and those of previous studies.20,22 Additionally, students who participated in this question-writing workshop did so voluntarily, which may have contributed to their positive perceptions of the activity. Despite their voluntary participation, we were reassured of students' honesty by their comfort with expressing neutral sentiments on numerous survey questions.

Given the logistical and time requirements of running this workshop, it was important to consider how many participants were appropriate to include. In its current form, the workshop may be best suited to relatively small groups of students (15–30 total), which may limit broader incorporation into medical schools' curricula. Focusing students' time on the portion of the activity from which they felt they learned the most (i.e., writing plausible distractors) may help reduce time and administrative burden while improving student receptivity. Doing so may allow educators to scale this workshop and implement it more broadly. Additionally, focusing students' attention on creating the distractors may improve the efficiency of question writing as a study strategy.

The generalizability of our results was limited by our small sample size and by the survey data representing slightly over half of the participants. However, the survey questions were constructed from focus group discussions with the eight students who did not complete the survey, adding validity to our findings. Further study assessing a larger pool of students' perspectives on this question-writing workshop will be important to strengthen the quantitative findings, which are underpowered because of the small sample size. Additionally, we did not directly measure learning in the evaluation of this workshop; rather, we assessed student engagement, which is associated with learning.32,33 To further assess students' learning, it may be worthwhile to incorporate a pretest and posttest in future iterations of this workshop. Educators may also consider comparing question writers' performance on subsequent MCQ examinations covering the topics they wrote about with that of their non-question-writing peers; student involvement in question creation has previously been associated with improved performance on subsequent MCQ examinations.19 Lastly, although we used cognitive interviewing and expert review to develop and refine the survey, additional validity evidence could be gathered.34

Our question-writing process engaged students in an active-learning activity that they perceived to be beneficial to their learning and that required significant conceptual integration. Further study directly assessing how student involvement in this workshop affects performance on subsequent MCQ examinations will be beneficial. Additionally, exploring strategies to involve faculty in the question review and editing process may be beneficial for student learning and question quality.

References

1. Hilliard RI. How do medical students learn: medical student learning styles and factors that affect these learning styles. Teach Learn Med. 1995;7(4):201–210. https://doi.org/10.1080/10401339509539745
2. Moore GT, Block SD, Style CB, Mitchell R. The influence of the new pathway curriculum on Harvard medical students. Acad Med. 1994;69(12):983–989. https://doi.org/10.1097/00001888-199412000-00017
3. Schwartz PL, Loten EG. Effects of a revised pre-clinical curriculum on students' perceptions of their cognitive behaviours, attitudes to social issues in medicine, and the learning environment. Teach Learn Med. 2003;15(2):76–83. https://doi.org/10.1207/S15328015TLM1502_01
4. Graffam B. Active learning in medical education: strategies for beginning implementation. Med Teach. 2007;29(1):38–42. https://doi.org/10.1080/01421590601176398
5. Prince M. Does active learning work? A review of the research. J Eng Educ. 2004;93(3):223–231. https://doi.org/10.1002/j.2168-9830.2004.tb00809.x
6. McKeachie WJ, Hofer BK. McKeachie's Teaching Tips: Strategies, Research, and Theory for College and University Teachers. 11th ed. Houghton Mifflin; 2002.
7. White C, Bradley E, Martindale J, et al. Why are medical students 'checking out' of active learning in a new curriculum? Med Educ. 2014;48(3):315–324. https://doi.org/10.1111/medu.12356
8. Morgan H, Hammoud M, McLean K, Yousuf A, Chapman C. Endometrial hyperplasia and cervical dysplasia: a flipped classroom curriculum. MedEdPORTAL. 2015;11:10142. https://doi.org/10.15766/mep_2374-8265.10142
9. Pasarica M, Harris DM, Simms-Cendan J, Gorman AL. Collaborative learning activity utilizing evidence-based medicine to improve medical student learning of the lifestyle management of obesity. MedEdPORTAL. 2016;12:10426. https://doi.org/10.15766/mep_2374-8265.10426
10. Dumenco L, Monteiro K, George P, McNicoll L, Warrier S, Dollase R. An interactive quality improvement and patient safety workshop for first-year medical students. MedEdPORTAL. 2018;14:10734. https://doi.org/10.15766/mep_2374-8265.10734
11. Mavis BE, Cole BL, Hoppe RB. A survey of student assessment in US medical schools: the balance of breadth versus fidelity. Teach Learn Med. 2001;13(2):37–41. https://doi.org/10.1207/S15328015TLM1302_1
12. Epstein RM. Assessment in medical education. N Engl J Med. 2007;356:387–396. https://doi.org/10.1056/NEJMra054784
13. Hays R. Assessment in medical education: roles for clinical teachers. Clin Teach. 2008;5:23–27. https://doi.org/10.1111/j.1743-498X.2007.00165.x
14. Gooi ACC, Sommerfeld CS. Medical school 2.0: how we developed a student-generated question bank using small group learning. Med Teach. 2015;37(10):892–896. https://doi.org/10.3109/0142159X.2014.970624
15. Harris BHL, Walsh JL, Tayyaba S, Harris DA, Wilson DJ, Smith PE. A novel student-led approach to multiple-choice question generation and online database creation, with targeted clinician input. Teach Learn Med. 2015;27(2):182–188. https://doi.org/10.1080/10401334.2015.1011651
16. Grainger R, Dai W, Osborne E, Kenwright D. Medical students create multiple-choice questions for learning in pathology education: a pilot study. BMC Med Educ. 2018;18:201–208. https://doi.org/10.1186/s12909-018-1312-1
17. Kurtz JB, Lourie MA, Holman B, Grob K, Monrad SU. Creating assessments as an active learning strategy: what are students' perceptions? A mixed methods study. Med Educ Online. 2019;24:1630239. https://doi.org/10.1080/10872981.2019.1630239
18. Rajendiren S, Dhiman P, Rajendiren S, et al. Making concepts of medical biochemistry by formulating distractors of multiple choice questions: growing mighty oaks from small acorns. J Contemp Med Educ. 2014;2(2):123–127. https://doi.org/10.5455/jcme.20140625074836
19. Walsh J, Harris B, Tayyaba S, Harris D, Smith P. Student-written single-best answer questions predict performance in finals. Clin Teach. 2016;13:352–356.
20. Pittenger AL, Lounsbery JL. Student-generated questions to assess learning in an online orientation to pharmacy course. Am J Pharm Educ. 2011;75(5):1–9. https://doi.org/10.5688/ajpe75594
21. Musbahi O, Nawab F, Dwan NI, Hoffer AJ, Ung JF, Suleman MT. Near-peer question writing and teaching programme. Clin Teach. 2018;15(5):387–392. https://doi.org/10.1111/tct.12704
22. Gonzalez-Cabezas C, Anderson O, Wright M. Association between dental student-developed exam questions and learning at higher cognitive levels. J Dent Educ. 2015;79(11):1295–1304. https://doi.org/10.1002/j.0022-0337.2015.79.11.tb06025.x
23. Baerheim A, Meland E. Medical students proposing questions for their own written final examination: evaluation of an educational project. Med Educ. 2003;37(8):734–738. https://doi.org/10.1046/j.1365-2923.2003.01578.x
24. Paniagua MA, Swygert KA, eds. Constructing Written Test Questions for the Basic and Clinical Sciences. National Board of Medical Examiners; 2016.
25. National Board of Medical Examiners. NBME item writing services–online interactive item writing tutorial. Accessed October 30, 2019. https://www.nbme.org/IWTutorial/eIWW_4/index.html
26. Technical item flaws. In: Paniagua MA, Swygert KA, eds. Constructing Written Test Questions for the Basic and Clinical Sciences. National Board of Medical Examiners; 2016:11–19.
27. Basic rules for writing one-best-answer items. In: Paniagua MA, Swygert KA, eds. Constructing Written Test Questions for the Basic and Clinical Sciences. National Board of Medical Examiners; 2016:29–33.
28. Testing application of foundational (basic) and clinical knowledge: structuring items to fit task competencies. In: Paniagua MA, Swygert KA, eds. Constructing Written Test Questions for the Basic and Clinical Sciences. National Board of Medical Examiners; 2016:40–45.
29. Coderre S, Woloschuk W, McLaughlin K. Twelve tips for blueprinting. Med Teach. 2009;31(4):322–324. https://doi.org/10.1080/01421590802225770
30. Downing SM. Validity: on the meaningful interpretation of assessment data. Med Educ. 2003;37(9):830–837. https://doi.org/10.1046/j.1365-2923.2003.01594.x
31. Schullo-Feulner A, Janke KK, Chapman SA, et al. Student-generated, faculty-vetted multiple-choice questions: value, participant satisfaction, and workload. Curr Pharm Teach Learn. 2014;6(1):15–21. https://doi.org/10.1016/j.cptl.2013.09.019
32. Casuso-Holgado MJ, Cuesta-Vargas A, Moreno-Morales N, Labajos-Manzanares MT, Baron-Lopez F, Vega-Cuesta M. The association between academic engagement and achievement in health sciences students. BMC Med Educ. 2013;13:33–40. https://doi.org/10.1186/1472-6920-13-33
33. Fink LD. Beyond small groups: harnessing the extraordinary power of learning teams. In: Michaelsen LK, Knight AB, Fink LD, eds. Team-Based Learning: A Transformative Use of Small Groups in College Teaching. Praeger; 2002.
34. Artino AR, La Rochelle JS, Dezee KJ, Gehlbach H. Developing questionnaires for educational research: AMEE Guide No. 87. Med Teach. 2014;36:463–474. https://doi.org/10.3109/0142159X.2014.889814

Appendices

A. Question-Writing Resources for Students.docx
B. Multiple-Choice Question Writing Training Session.pptx
C. Logistical Resources for Workshop Leaders.xlsx
D. Workshop Evaluation Tools.docx

All appendices are peer reviewed as integral parts of the Original Publication.

Copyright & Permissions
© 2020 Kurtz et al. This is an open-access publication distributed under the terms of the Creative Commons Attribution-NonCommercial license.

Keywords
Active Learning, Student Peer-Review, Question Bank, Assessment, Cardiovascular Medicine, Single Best Answer, Flipped Classroom, Quantitative Research, Self-Regulated Learning, Multiple-Choice Questions, Curriculum Development, Student-Generated Questions

Acknowledgments
The authors would like to acknowledge Drs. Caren Stalburg and Paula Ross for their excellent review of this workshop; Dr. Peter Batra for his assistance with survey design; and Ms. Karri Grob for her assistance with the in-person question-writing training.

Disclosures
None to report.

Funding/Support
This study was funded by the University of Michigan Whitaker Fund Grant and the University of Michigan Medical School Summer Biological Research Program Grant.

Ethical Approval
The Institutional Review Board at the University of Michigan reviewed and approved the protocol associated with this submission (HUM 00129824) on June 26, 2017.

Disclaimer
Some of the materials used or referenced in this workshop are associated with the NBME. The NBME was not involved in the creation of this question-writing workshop and does not endorse use of this workshop for educational purposes. The NBME has granted permission to use all related materials.
