Peer-reviewed article

Simulation Devices in Interventional Radiology: Validation Pending

2009; Elsevier BV; Volume: 20; Issue: 7; Language: English

10.1016/j.jvir.2009.04.014

ISSN

1535-7732

Authors

Derek Gould, Jim A. Reekers, David Kessel, N. Chalmers, Marc Sapoval, Aalpen A. Patel, Gary J. Becker, Mick J. Lee, L. Stockx

Topic(s)

Radiology practices and education

Abstract

MEDICAL simulation offers a tantalizing breadth and depth of potential for training and assessment in interventional radiology. It promises to provide solutions to many of the shortcomings of our traditional "apprenticeship" training. Mandatory restrictions on in-hospital work hours of resident trainees limit the time available for training and the breadth of case material to which the individual trainee is exposed. At the same time, advances in noninvasive imaging have reduced trainee exposure to invasive procedures. Thus, today's resident/trainee has limited opportunity to acquire the basic gateway skills (eg, selective diagnostic catheter angiography) upon which more advanced interventional skills are based (1-3). Medical simulators engineered for interventional radiology training have the potential to address these gaps.

Simulators introduce a novel capability not only to train but also to establish objective evidence of technical competence during and after training (4-6). Although there is growing evidence for their effectiveness, few medical procedural simulations have demonstrated predictive validity. In other words, in very few instances has proficiency with a medical procedural simulator been proved to transfer to the clinical situation. This transfer of trained skills to patients has now been shown for simulations of laparoscopic surgery, colonoscopy, and anesthesia (7-9), but at the time of writing, similar evidence is still being sought for endovascular simulators.
In the future, it is likely that simulations will be incorporated into certification examinations for interventional radiology (6). Although it may be intuitive that skills learned on simulators should effectively transfer to the clinical interventional radiology environment, intuition is not evidence. A great deal of work on the development and validation of interventional radiology procedural simulations must be completed before the inclusion of simulations in board and other statutory certification examinations can be endorsed. Ideally, the development and validation of the critical measures of performance (metrics) and test items to be used in simulators should be accomplished through a joint effort of professional societies and the certifying bodies. Only in this way will we ensure that the test instrument is compatible with the educational curriculum and that the desired competencies are being assessed. Input will be required from psychologists and experts in the subject matter, who will analyze knowledge and task performance, breaking them down into their key components (10,11). Metrics must be identified and used specifically for assessment of the learner. By design, this can be made to occur automatically (by a simulator) within the context of a simulation. The subject experts involved in test development must be appointed with complete transparency by the certifying authorities. They must faithfully represent a robust interventional radiology curriculum, with all facets of content, skill, and even geographic diversity.
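To make the idea of automatic, in-simulation metric capture concrete, the following is a minimal sketch only, not a description of any existing simulator: the metric names (fluoroscopy time, contrast volume, vessel wall contacts, total procedure time) and the proficiency thresholds are invented placeholders standing in for the expert-derived measures described above.

```python
# Hypothetical sketch of automatic metric capture inside a simulated procedure.
# Metric names and thresholds are illustrative assumptions, not validated values.
from dataclasses import dataclass, field


@dataclass
class ProcedureMetrics:
    """Raw measurements a simulator could record without examiner input."""
    fluoroscopy_time_s: float = 0.0   # simulated fluoroscopy pedal time
    contrast_volume_ml: float = 0.0   # simulated contrast volume injected
    vessel_wall_hits: int = 0         # wire/catheter tip contacts with the vessel wall
    total_time_s: float = 0.0         # time from puncture to target cannulation
    events: list = field(default_factory=list)

    def log_event(self, name: str) -> None:
        """Called by the simulation engine whenever a tracked event occurs."""
        self.events.append(name)
        if name == "wall_hit":
            self.vessel_wall_hits += 1


# Invented proficiency thresholds standing in for expert-derived benchmarks.
THRESHOLDS = {
    "fluoroscopy_time_s": 120.0,
    "contrast_volume_ml": 30.0,
    "vessel_wall_hits": 3,
    "total_time_s": 600.0,
}


def assess(m: ProcedureMetrics) -> dict:
    """Return a pass/fail flag per metric; a real test would weight and combine them."""
    return {name: getattr(m, name) <= limit for name, limit in THRESHOLDS.items()}


if __name__ == "__main__":
    run = ProcedureMetrics(fluoroscopy_time_s=95.0, contrast_volume_ml=22.0,
                           total_time_s=540.0)
    run.log_event("wall_hit")
    print(assess(run))  # e.g. {'fluoroscopy_time_s': True, ..., 'total_time_s': True}
```

The appeal of such a structure is that every trainee attempt is scored in exactly the same way, without an observer, which is what makes simulator-based assessment attractive to certifying bodies.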
Ideally, a single, comprehensive set of procedures, skill sets, and metrics should be defined and provided as open source for incorporation into academic and commercial simulator models. Each set of metrics based on the test items must in turn be validated for its stated purpose. The diversity of training environments represented within and across radiology societies provides an excellent opportunity for careful test validation. The use of trainees in transfer-of-training studies will show whether skills acquired through simulation indeed translate into performance in patients. Subsequent review would demonstrate whether that performance is maintained.
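As an illustration of how such a transfer-of-training study might be analyzed, the sketch below, which is not drawn from the article, compares blinded in-patient performance ratings between a hypothetical simulator-trained group and a conventionally trained control group; the data, group sizes, and choice of a Welch t test are assumptions made purely for illustration.

```python
# Hypothetical transfer-of-training analysis: do simulator-trained trainees perform
# better on their first supervised patient procedures than conventionally trained
# controls?  All ratings below are invented for illustration.
from scipy import stats

# Blinded global rating scale scores (1-5) assigned by supervising interventionalists
# for each trainee's first in-patient diagnostic angiogram.
simulator_trained = [4.1, 3.8, 4.4, 3.9, 4.2, 4.0, 3.7, 4.3]
conventional = [3.2, 3.6, 3.1, 3.8, 3.4, 3.0, 3.5, 3.3]

t_stat, p_value = stats.ttest_ind(simulator_trained, conventional, equal_var=False)
print(f"Welch t = {t_stat:.2f}, p = {p_value:.4f}")

# A small p-value would support transfer of simulator-acquired skill to patients;
# showing that the skill is maintained would require repeating the blinded ratings
# at a later time point.
```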
Funding and support for this work should derive jointly from industry, the various specialty societies, and government agencies in the form of public-private partnerships.

The Cardiovascular and Interventional Radiological Society of Europe (CIRSE), the Society of Interventional Radiology (SIR), and the Radiological Society of North America (RSNA) have established individual medical simulation task forces and a joint task force with the objective of advising on the use of simulators for training in interventional radiology, including standards for test development, certification, and validation of simulator models. Their joint recommendations, which are also supported by the British Society of Interventional Radiologists (BSIR), are as follows:

1. Current-generation simulator models may be suitable for gaining certain aspects of procedural experience, such as learning the correct sequence of procedural steps and selection of appropriate tools. Such learning may well be beneficial prior to performing procedures on patients. Although there is growing evidence for their effectiveness in some areas, the utility of simulators for other aspects of training is currently unproven. In particular, there is no existing evidence that catheter manipulation skills acquired on the simulator are transferable to actual clinical practice. Therefore, experience on a simulator cannot yet be regarded as equivalent to training involving performance of actual endovascular procedures in patients. Moreover, it should be self-evident that even a valid simulation that predicts transfer of a specific skill to the procedural setting has limits. It cannot supplant the experience, judgment, and wisdom (12) gained by managing real patients with serious conditions through their diagnoses, treatments, and longitudinal follow-up. Therefore, we should remember that as training hours shorten, diagnostic work-ups become increasingly noninvasive, and trainees are exposed to dwindling numbers of actual clinical cases, we must still resist the temptation to consider procedural simulations and clinical experiences interchangeable. They are not. Simulation training may become a prerequisite for certification or credentialing, but it can never be a sufficient condition for either.

2. Training and assessment methods that use simulation should be developed and validated in close association with the statutory authorities responsible for certification:
   a. Procedural tasks that require simulation should be carefully analyzed by psychologists working with acknowledged subject matter experts to define metrics and critical performance indicators. The statutory bodies will ensure that these are relevant to their curricula and practice. These data should be made freely available.
   b. Test validation should include content, construct, concurrent, and predictive validation, with the objective of demonstrating transfer of trained skills to procedures in patients (a worked sketch of such checks follows this list).
   c. Because advances in technology have the potential to outpace the validation effort, validation may have to be performed in a staggered, parallel fashion.
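A rough sketch, again not taken from the article, of what the quantitative side of these validation steps could look like: construct validity is probed by asking whether composite simulator scores separate experienced operators from novices, while concurrent and predictive validity are probed by correlating simulator scores with an independent criterion, such as blinded ratings of the same trainees' performance in patients. All scores below are invented, and the specific statistics are one reasonable choice among several.

```python
# Hypothetical validity checks for a simulator-based test; all data are invented.
from scipy import stats

# Construct validity: do composite simulator scores (0-100) distinguish
# experienced operators from novices?
expert_scores = [82, 88, 79, 91, 85, 87]
novice_scores = [61, 58, 70, 64, 55, 66]
u_stat, p_construct = stats.mannwhitneyu(expert_scores, novice_scores,
                                         alternative="greater")
print(f"Construct validity: U = {u_stat:.1f}, p = {p_construct:.4f}")

# Concurrent/predictive validity: do simulator scores track an independent
# criterion, here blinded checklist ratings of the same trainees in patients?
sim_scores = [62, 70, 75, 80, 68, 74, 79, 85]
patient_ratings = [3.1, 3.4, 3.6, 4.0, 3.2, 3.5, 3.9, 4.3]
r, p_predictive = stats.pearsonr(sim_scores, patient_ratings)
print(f"Predictive validity: r = {r:.2f}, p = {p_predictive:.4f}")
```

The Mann-Whitney test is used here because expert-novice comparisons typically involve small groups and ordinal composite scores; a parametric comparison would be an equally defensible assumption.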
There is the potential for simulation to provide robust, high-quality training and objective assessment of competence. Patients will be reassured that interventional radiologists have demonstrated a defined level of experience and that they will be spared the early learning curve of novices. The use of simulation as a component of objective certification of skills by statutory bodies is a laudable and achievable objective. It will require collaboration between the statutory organizations and the simulation industry. We plan for the above-named task forces to continue stimulating informed discussion regarding the role of simulators in interventional radiology training and to catalyze exciting developments in this field.

Membership of the Task Forces

The CIRSE Task Force members are Derek A. Gould, MD (UK), Gary J. Becker, MD (USA), Nick C. Chalmers, MD (UK), David O. Kessel, MD (UK), Mick J. Lee, MD (Ireland), Aalpen A. Patel, MD (USA), Jim A. Reekers, MD (Netherlands), Marc Sapoval, MD, PhD (France), and Luc Stockx, MD (Belgium).

The SIR Task Force members are Aalpen A. Patel, MD, John F. Cardella, MD, Buddy Connors, MD, Steve Dawson, MD, Craig Glaiberman, MD, Derek Gould, MD, Robert Hurst, MD, Barry Katzen, MD, John Kaufman, MD, Jeanne LaBerge, MD, Albert Nemcek, Jr, MD, Jim Reekers, MD, John Rundback, MD, David Sacks, MD, Mark Scerbo, PhD, Stephen Solomon, MD, Richard Towbin, MD, and William Rilling, MD.

The SIR/RSNA Joint Workgroup members are Gary J. Becker, MD, Steven L. Dawson, MD, Robert R. Hattery, MD, Barry T. Katzen, MD, Jeanne M. LaBerge, MD, John A. Kaufman, MD, Theresa C. McLoud, MD, Aalpen A. Patel, MD, and Stephen B. Solomon, MD.

Reference(s)

1. Bridges M, Diamond DL. The financial impact of training surgical residents in the operating room. Am J Surg 1999; 177:28-32.
2. Crofts TJ, Griffiths JM, Sharma S, et al. Surgical training: an objective assessment of recent changes for a single health board. BMJ 1997; 314:891-895.
3. European Working Time Directive website. http://www.incomesdata.co.uk/information/worktimedirective.htm
4. Bakker NH, Tanase D, Reekers JA, Grimbergen CA. Evaluation of vascular and interventional procedures with time-action analysis: a pilot study. J Vasc Intervent Radiol 2002; 13:483-488.
5. European Association of Endoscopic Surgeons. Training and assessment of competence. Surg Endosc 1994; 8:721-722.
6. Dankelman J, Wentink M, Grimbergen CA, et al. Does virtual reality training make sense in interventional radiology? Training skill-, rule- and knowledge-based behavior. Cardiovasc Intervent Radiol 2004; 27:417-421.
7. Sedlack R, Kolars J. Computer simulator training enhances the competency of gastroenterology fellows at colonoscopy: results of a pilot study. Am J Gastroenterol 2004; 99:33-37.
8. Rowe R, Cohen R. An evaluation of a virtual reality airway simulator. Anesth Analg 2002; 95:62-66.
9. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance: results of a randomized, double-blinded study. Ann Surg 2002; 236:458-463 (discussion 463-464).
10. Johnson SJ, Healey AE, Evans JC, et al. Physical and cognitive task analysis in interventional radiology. Clin Radiol 2006; 61:97-103.
11. Grunwald T, Clark D, Fisher SS, et al. Using cognitive task analysis to facilitate collaboration in development of simulators to accelerate surgical training. In: Westwood JD, ed. Medicine Meets Virtual Reality 12. IOS Press, 2004:114-120.
12. Skills for the new millennium: report of the societal needs working group. CanMEDS 2000 Project, September 1996.