Open-access, peer-reviewed article

PROTOCOL: Evidence‐informed practice versus evidence‐based practice educational interventions for improving knowledge, attitudes, understanding, and behavior toward the application of evidence into practice: A comprehensive systematic review of undergraduate students

2019; The Campbell Collaboration; Volume: 15; Issue: 1-2; Language: English

10.1002/cl2.1015

ISSN

1891-1803

Authors

Elizabeth Adjoa Kumah, Robert McSherry, Josette Bettany‐Saltikov, Sharon Hamilton, Julie Hogg, Vicki Whittaker, Paul van Schaik

Topic(s)

Health Policy Implementation Science

Abstract

Evidence-informed practice versus evidence-based practice educational interventions for improving knowledge, attitudes, understanding, and behavior toward the application of evidence into practice: A comprehensive systematic review of undergraduate health and social care students. Over the past three decades, there has been increasing attention on improving healthcare quality, reliability, and, ultimately, patient outcomes through the provision of healthcare that is influenced by the best available evidence and devoid of rituals and tradition (Andre, Aune, & Brænd, 2016; Melnyk, Gallagher-Ford, Long, & Fineout-Overholt, 2014; Sackett, Rosenberg, Gray, Haynes, & Richardson, 1996). Professional regulators such as the Nursing and Midwifery Council, United Kingdom (NMC, 2015) and the Health and Care Professions Council (HCPC, 2012) expect professionals, as part of their accountability, to apply the best available evidence to inform their clinical decision-making, roles, and responsibilities. This is imperative for several reasons. Firstly, it enhances the delivery of healthcare and improves efficiency. Secondly, it produces better intervention outcomes and promotes transparency. Thirdly, it enhances co-operation and knowledge sharing among professionals and service users; ultimately, it improves patient outcomes and enhances job satisfaction. Indeed, the need to guide healthcare practice with evidence has been emphasized by several authors, including Kelly, Heath, Howick, & Greenhalgh, 2015; Nevo & Slonim-Nevo, 2011; Scott & McSherry, 2009; Shlonsky & Stern, 2007; Smith & Rennie, 2014; Straus, Glasziou, Richardson, & Haynes, 2011; Tickle-Degnen & Bedell, 2003; and Sackett et al., 1996. According to these authors, the effective and consistent application of evidence into healthcare practice helps practitioners to deliver the best care for their patients and patients' relatives. 
Nevertheless, the application of evidence into healthcare practice is often ineffective and inconsistent (McSherry, 2007; Melnyk, 2017; Nevo & Slonim-Nevo, 2011). The two main concepts that have been associated with the application of evidence into healthcare practice are "evidence-based practice" and "evidence-informed practice". Whilst evidence-based practice has been considered the gold standard for effective healthcare delivery, a large majority of healthcare practitioners continue to encounter multiple difficulties, which inhibit the rapid application of evidence into practice (Epstein, 2009; Glasziou, 2005; Greenhalgh, Howick, & Maskrey, 2014; McSherry, 2007; McSherry, Simmons, & Pearce, 2002; Melnyk, 2017; Nevo & Slonim-Nevo, 2011). Nevo & Slonim-Nevo, 2011 believe the application of evidence into practice should, in principle, be "informed by" evidence and not necessarily "based on" evidence. This suggests that decision-making in healthcare practice "might be enriched by prior research but not limited to it" (Epstein, 2009, p. 9). Similarly, McSherry, 2007 considers the application of evidence into practice (evidence-informed practice) to be a systems-based approach (i.e., made up of an input, a throughput, and an output), which contains, as part of its elements, the principles of evidence-based practice. McSherry, 2007 believes evidence-based practice is the awareness, as well as the implementation, of the relevant "research evidence" in practice. Hence, the author argues that the principles of evidence-based practice are contained in the "research awareness" element of the evidence-informed practice model (see Figure 1 for McSherry's 2007 evidence-informed practice model). Currently, there is an ongoing debate in the literature as to which of these two concepts best facilitates the effective and consistent application of evidence into practice. 
Researchers such as Melnyk, 2017; Melnyk & Newhouse, 2014; and Gambrill, 2010 believe that knowledge and skills in evidence-based practice help the healthcare professional to apply evidence into practice effectively. Conversely, Epstein, 2009; Nevo & Slonim-Nevo, 2011; and McSherry, 2007 have argued the need to equip healthcare professionals with the necessary knowledge and skills of evidence-informed practice in order to facilitate the effective and consistent application of evidence into practice. Moreover, whilst some authors, including Cardoso et al., 2017 and Glasziou, 2005, have used the two terms interchangeably, other researchers (such as Epstein, 2007; McSherry, 2007; Nevo & Slonim-Nevo, 2011; and McSherry et al., 2002) have identified significant differences between the two concepts. These differences are described in the ensuing section. It is imperative that healthcare training institutions produce graduates who are equipped with the knowledge and skills necessary for the effective and consistent application of evidence into practice (Dawes et al., 2005; Frenk et al., 2010; Melnyk, 2017). Hence, it is necessary for healthcare training institutions to include the principles involved in the application of evidence into practice in undergraduate health and social care curricula. However, the question that arises is: which of the two concepts best facilitates the application of evidence into practice? While Melnyk, Fineout-Overholt, Stillwell, & Williamson, 2010 have suggested a seven-step approach to the application of evidence into practice (termed the "evidence-based practice model"), McSherry, 2007 has argued that the application of evidence into practice is a systems-based process, with an input, a throughput, and an output (termed the "evidence-informed practice model"). 
The main purpose of this systematic review is to determine the differences and similarities, if any, between evidence-informed practice and evidence-based practice educational interventions, as well as the role each concept plays in the application of evidence into practice. In addition, the present systematic review aims to determine whether the two concepts act together, or individually, to facilitate the effective application of evidence into practice. These aims will be achieved by exploring the effectiveness of evidence-informed practice educational interventions versus evidence-based practice educational interventions in improving the knowledge, attitudes, understanding, and behavior required for the effective application of evidence into practice among undergraduate pre-registration health and social care students. The gap between evidence and healthcare practice is well acknowledged (Lau et al., 2014; Melnyk, 2017; Straus, Tetroe, & Graham, 2009). Difficulties in using evidence to make decisions in healthcare practice are evident across all groups of decision-makers, including healthcare providers, policy makers, managers, informal caregivers, patients, and patients' relatives (Straus et al., 2009). Consequently, several interventions have been developed to improve the implementation of evidence into healthcare practice and policy. Specifically, evidence-based practice educational interventions are widely used and have been extensively evaluated (for example, Callister, Matsumura, Lookinland, Mangum, & Loucks, 2005; Dawley, Bloch, Suplee, McKeever, & Scherzer, 2011; Heye & Stevens, 2009; Schoonees, Rohwer, & Young, 2017; and Goodfellow, 2004). Evidence-informed practice educational interventions have also been used (for example, Almost et al., 2013), although to a much smaller extent. 
Conducting a systematic review of currently available research offers a rigorous process for evaluating the comparative effectiveness of both evidence-informed practice and evidence-based practice educational interventions. Dawes et al., 2005 and Tilson et al., 2011 have each reported on the Sicily statements concerning the need to develop educational interventions on evidence-based practice in healthcare. The statements were made separately at the "Evidence-Based Healthcare Teachers and Developers" conferences held in 2003 (Dawes et al., 2005) and 2009 (Tilson et al., 2011). The statements provide suggestions for evidence-based practice competencies, curricula, and evaluation tools for educational interventions. All health and social care students and professionals are required to understand the principles of evidence-based practice, to have a desirable attitude towards evidence-based practice, and to implement evidence-based practice effectively (Dawes et al., 2005). In order to instill a culture of evidence-based practice among health and social care students, Melnyk, 2017 believes undergraduate health and social care research modules need to be based on the seven-step model of evidence-based practice that was developed by Melnyk et al., 2010. In addition, the curricula should include learning across the four components of evidence-based practice, namely knowledge, attitudes, behavior, and practice (Haggman-Laitila, Mattila, & Melender, 2016). Tilson et al., 2011 identified major principles for the design of evidence-based practice evaluation tools for learners. The identified categories for evaluating evidence-based practice educational interventions include the learner's knowledge of, and attitudes towards, evidence-based practice; the learner's reaction to the educational experience; behavior congruent with evidence-based practice as part of patient care; and skills in implementing evidence-based practice. 
The frameworks used in assessing the effectiveness of evidence-based practice interventions need to reflect the aims of the research module. The aims also need to correspond to the needs and characteristics of learners. For example, students may be expected to perform the seven steps of evidence-based practice, while health practitioners may be required to acquire skills in applying evidence into practice. In addition, the setting where learning, teaching, and the application of evidence-based practice occur must be considered (Tilson et al., 2011). Evidence-informed practice, on the other hand, extends beyond the initial definitions of evidence-based practice (LoBiondo-Wood, Haber, Cameron, & Singh, 2013), and is more inclusive than evidence-based practice (Epstein, 2009). This is due to the following reasons. Firstly, evidence-informed practice recognises practitioners as critical thinkers and encourages them to be knowledgeable about findings from all types of research (including systematic reviews, randomised controlled trials, qualitative research, quantitative research, and mixed methods), and to utilize them in an integrative manner. Secondly, evidence-informed practice considers the best available research evidence, practitioner knowledge and experience, client preferences and values, and the clinical state and circumstances (Nevo & Slonim-Nevo, 2011). However, Melnyk & Newhouse, 2014 (p. 347) disagreed with this assertion as a difference between the two concepts. According to the authors, like evidence-informed practice, evidence-based practice has broadened to "integrate the best evidence from well-designed studies and evidence-based theories (i.e., external evidence) with a clinician's expertise, which includes internal evidence gathered from a thorough patient assessment and patient data, and a patient's preferences and values". 
Although this statement may be true, the existing evidence-based practice models (for example, DiCenso, Ciliska, & Cullum, 2005; Dufault, 2004; Greenhalgh, Robert, & Bate, 2005; Melnyk et al., 2010; Titler, Kleiber, & Steelman, 2001) place too much emphasis on "scientific evidence" when making clinical decisions, and pay little or no attention to other forms of evidence, such as the clinical context, patient values and preferences, and the practitioner's knowledge and experiences (McTavish, 2017; Miles & Loughlin, 2011). Inasmuch as scientific evidence plays a major role in clinical decision-making, the decision-making process must be productive and adaptable enough to meet the continually changing condition and needs of the patient, as well as the knowledge and experiences of the health practitioner (LoBiondo-Wood et al., 2013; Nevo & Slonim-Nevo, 2011). To this end, researchers, including Nevo & Slonim-Nevo, 2011 and McSherry, 2007, have advocated a creative and flexible model of applying evidence into practice, in which healthcare practitioners are not limited to following a series of steps (as advocated in evidence-based practice) in order to apply evidence into practice. Thirdly, unlike evidence-informed practice, evidence-based practice uses a formal hierarchy of evidence, which ranks certain forms of evidence (for example, systematic reviews and randomised controlled trials) higher than others (such as qualitative research and observational studies). Instead of the hierarchy of evidence, proponents of evidence-informed practice support an integrative model of practice that considers all forms of studies and prefers the evidence that provides the best answer to the clinical question (Epstein, 2009). In place of the hierarchy of evidence, Epstein, 2011 suggested a "wheel of evidence," where "all forms of research, information gathering, and interpretations would be critically assessed but equally valued" (p. 225). 
This is to ensure that all forms of evidence are considered during decision-making in healthcare practice. Evidence-informed practice does not follow a stepwise approach to applying evidence into practice. Rather, it is a systems-based approach, which comprises an input, a throughput, and an output (McSherry, 2007). McSherry, 2007 believes the actual process of applying evidence into practice occurs in a cyclical manner (termed the evidence-informed cycle), not stepwise. Evidence-informed practice is adaptable and considers the complexities of health and healthcare delivery. Healthcare professionals live and work in a complex system. In fact, the clinical environment, as well as healthcare delivery itself, is a complex system made up of many interdependent parts (Sturmberg & Lanham, 2014). Hence, as previously stated, evidence-informed practice considers several factors in clinical decision-making, including the culture and context of patient care, the experiences of the healthcare professional, patient preferences and values, as well as factors that influence research utilization (such as workload, lack of organizational support, and time) (LoBiondo-Wood et al., 2013; McSherry, 2007; Nevo & Slonim-Nevo, 2011). Thus, an evidence-informed practice educational intervention needs to recognise the learner as a critical thinker who is expected to consider various types of evidence in clinical decision-making (Almost et al., 2013; McSherry et al., 2002). One is not expected to be a researcher in order to implement evidence-informed practice effectively. According to McSherry et al., 2002, the healthcare professional must be aware of the various types of evidence (such as the context of care, patient preferences and experience, as well as the clinician's skills and expertise), not just research evidence, in order to deliver person-centred care. 
Table 1 presents a summary of the differences and similarities between evidence-informed practice and evidence-based practice. For the purposes of this systematic review, the following operational definitions will apply. Evidence-informed practice educational interventions refer to any formal educational program that facilitates the application of the principles of the evidence-informed practice model developed by McSherry, 2007. The evidence-informed practice model (Figure 1) is a systems-based model comprising an input (for example, the roles and responsibilities of the health practitioner), a throughput (i.e., research awareness, application of knowledge, informed decision-making, evaluation), and an output, which is an empowered professional who is a critical thinker and doer (McSherry, 2007). Evidence-based practice educational interventions refer to any formal educational program that enhances the application of the principles of the evidence-based practice model developed by Melnyk et al., 2010. The evidence-based practice model developed by Melnyk et al., 2010 comprises a seven-step approach to the application of evidence into practice: (1) cultivate a spirit of inquiry; (2) ask a clinical question; (3) search for the best evidence to answer the question; (4) critically appraise the evidence; (5) integrate the appraised evidence with one's own clinical expertise and the patient's preferences and values; (6) evaluate the outcomes of the practice decisions or changes based on evidence; and (7) disseminate evidence-based practice results (Melnyk et al., 2010). In this systematic review, eligible studies are not required to specifically mention Melnyk et al.'s (2010) model of evidence-based practice or McSherry's (2007) model of evidence-informed practice as the basis for the development of their educational program. 
However, the content of the educational program in each of the included studies must cover some, if not all, of the elements and/or principles of the aforementioned models. In addition, definitions of "knowledge", "attitudes", "understanding", and "behavior" will be based on the Classification Rubric for Evidence-Based Practice Assessment Tools in Education (CREATE) developed by Tilson et al., 2011, as follows. Knowledge: knowledge refers to learners' retention of facts and concepts about evidence-informed practice and evidence-based practice. Hence, assessments of evidence-informed practice and evidence-based practice knowledge might assess a learner's ability to define evidence-based practice and evidence-informed practice concepts, list their basic principles, or describe levels of evidence. Attitudes: attitudes refer to the values ascribed by the learner to the importance and usefulness of evidence-informed practice and evidence-based practice in informing clinical decision-making. Understanding: understanding refers to learners' comprehension of facts and concepts about evidence-based practice and evidence-informed practice. Behavior: behavior refers to what learners actually do in practice. It is inclusive of all the processes that a learner would use in the implementation of evidence-informed practice and evidence-based practice, such as assessing patient circumstances, values, preferences, and goals, along with identifying the learner's own competence relative to the patient's needs in order to determine the focus of an answerable question. The mode of delivery of the educational program could take the form of workshops, seminars, conferences, journal clubs, or lectures (both face-to-face and online). The content, manner of delivery, and length of the educational program may differ across the included studies, as there is no standard evidence-informed practice/evidence-based practice educational program. 
In this systematic review, evidence-informed practice and evidence-based practice educational interventions targeted at health and social care postgraduate students or registered health and social care practitioners will be excluded. Comparison conditions will include educational interventions that do not advance the teaching of the principles and processes of evidence-informed practice and/or evidence-based practice in healthcare, or no intervention. Most efforts to apply evidence into healthcare practice have been either unsuccessful or only partially successful (Christie, Hamill, & Powers, 2012; Eccles, Grimshaw, Walker, Johnston, & Pitts, 2005; Grimshaw, Eccles, & Tetroe, 2004; Lechasseur, Lazure, & Guilbert, 2011; McTavish, 2017). The resultant effects include poor patient outcomes, reduced patient safety, reduced job satisfaction, and increased staff turnover (Adams, 2009; Fielding & Briss, 2006; Huston, 2010; Knops, Vermeulen, Legemate, & Ubbink, 2009; Melnyk & Fineout-Overholt, 2005; Schmidt & Brown, 2007). Hence, much emphasis has been placed on teaching evidence-based practice skills (Masters, 2009; Melnyk, 2017; Scherer & Smith, 2002; Straus, Ball, Balcombe, Sheldon, & McAlister, 2005) and/or evidence-informed practice (Epstein, 2009; McSherry, 2007; McSherry et al., 2002; Nevo & Slonim-Nevo, 2011) in undergraduate health and social care curricula. However, the exact components of an evidence-based practice/evidence-informed practice educational intervention remain unclear. Consequently, healthcare instructors continue to encounter challenges in finding the most efficient approach to preparing health and social care students for the application of evidence into practice (Almost et al., 2013; Flores-Mateo & Argimon, 2007; Oh et al., 2010; Straus et al., 2005). 
This has resulted in an increase in the number of studies investigating educational interventions for enhancing knowledge, attitudes, and skills towards, especially, evidence-based practice (Phillips et al., 2013). There is also empirical evidence (primary studies) to support a direct link between evidence-based practice/evidence-informed practice educational interventions and knowledge, attitudes, understanding, and behavior, which in turn may affect the application of evidence into practice. However, participants in most of the studies reviewed were nursing students. Ashtorab, Pashaeypoor, Rassouli, & Majd, 2014 developed an evidence-based practice educational intervention for nursing students and assessed its effectiveness, based on Rogers' diffusion of innovations model (Rogers, 2003). The authors concluded that evidence-based practice education grounded in Rogers' model leads to improved attitudes, knowledge, and adoption of evidence-based practice. According to the authors, Rogers' diffusion of innovations model contains all the important steps that need to be applied in the teaching of evidence-based practice. Heye & Stevens, 2009 developed an evidence-based practice educational intervention and assessed its effectiveness on seventy-four (74) undergraduate nursing students using the ACE Star Model of Knowledge Transformation (Stevens, 2004). According to the authors, the Star Model describes how evidence is progressively applied into healthcare practice by transforming the evidence through various stages (discovery, summary, translation, integration, and evaluation). It was concluded that the students who participated in the educational program gained research appraisal skills and knowledge in the use of evidence to design improvements in healthcare practice. 
In addition, the authors reported that the undergraduate nursing students included in the study acquired evidence-based practice competencies and skills that are required for the work environment. Several other studies have reported on the effectiveness of evidence-based practice educational interventions and their underpinning theoretical foundations: self-directed learning strategies (Fernandez, Tran, & Ramjan, 2014; Kruszewski, Brough, & Killeen, 2009; Zhang, Zeng, Chen, & Li, 2012), the constructivist model of learning (Fernandez et al., 2014), Bandura's self-efficacy theory (Kim, Brown, Fields, & Stichler, 2009), as well as the Iowa model of evidence-based practice (Kruszewski et al., 2009). However, research in the area of evidence-informed practice educational interventions has been limited. Almost et al., 2013 developed an educational intervention aimed at supporting nurses in the application of evidence-informed practice. Prior to developing the intervention, the authors conducted interviews to examine the scope of practice, contextual setting, and learning needs of participants. A Delphi survey was then conducted to rank the learning needs identified by the interview participants, in order to select the key priorities for the intervention. The authors then conducted pre- and post-surveys, before the intervention and six months after it, respectively, to assess the impact of the intervention. Thus, the development of the intervention was learner-directed, which reaffirms McSherry's (2007) description of the evidence-informed practitioner as a critical thinker and doer. Unlike in evidence-based practice, practice knowledge and intervention decisions in evidence-informed practice are enriched by prior research but not limited to it. In this way, evidence-informed practice is more inclusive than evidence-based practice (Epstein, 2009, p. 9). 
Nevo & Slonim-Nevo, 2011 argue that rather than focusing educational interventions on the research-evidence-dominated steps of evidence-based practice, research findings should be included in the intervention process; however, the process itself must be creative and flexible enough to meet the continually changing needs, conditions, experiences, and preferences of patients and health professionals. A logic model is presented in Figure 2 below to indicate the connection between evidence-based practice/evidence-informed practice educational interventions and outcomes. Despite the seeming confusion surrounding the terms "evidence-informed practice" and "evidence-based practice", together with the ongoing debate in the literature as to which concept leads to better patient outcomes, no study, to the best of the researchers' knowledge, has compared, through a systematic review, the effects of the two concepts on the effective implementation of evidence into practice. A review of the literature reveals several systematic reviews conducted on evidence-based practice educational interventions and the effects of such interventions. Examples of such systematic reviews are described below. Young, Rohwer, Volmink, & Clarke, 2014 conducted an overview of systematic reviews that evaluated interventions for teaching evidence-based practice to healthcare professionals (undergraduate students, interns, residents, and practicing healthcare professionals). Comparison interventions in the study were no intervention or different strategies. The authors included 15 published and 1 unpublished systematic reviews. The outcome criteria included evidence-based practice knowledge, critical appraisal skills, attitudes, practices, and health outcomes. In many of the included studies, however, the focus was on critical appraisal skills. 
The systematic reviews that were reviewed used a number of different educational interventions of varying formats (for example, lectures, online teaching, and journal clubs), content, and duration to teach the various components of evidence-based practice in a range of settings. The results of the study indicated that multifaceted, clinically integrated interventions (for example, lectures, online teaching, and journal clubs), with assessment, led to improved attitudes, knowledge, and skills towards evidence-based practice. The majority of the included systematic reviews reported the findings from the source studies poorly, without reference to significance tests or effect sizes. Besides, the outcome criteria (for example, knowledge, skills, attitudes, practices, and health outcomes) were described narratively as improved or not, with the use of vote counting. Coomarasamy & Khan, 2004 conducted a systematic review to evaluate the effects of standalone versus clinically integrated teaching of evidence-based medicine on postgraduate healthcare students' knowledge, critical appraisal skills, attitudes, and behavior. The results indicated that standalone teaching improved knowledge, but not skills, attitudes, or behavior. Clinically integrated teaching, however, improved knowledge, skills, attitudes, and behavior. A similar systematic review by Flores-Mateo & Argimon, 2007 identified a small but significant improvement in postgraduate healthcare students' skills, knowledge, behavior, and attitudes after participating in evidence-based practice interventions. Furthermore, a systematic review of the literature has been conducted to identify the effectiveness of evidence-based practice training programs and their components for allied health professionals (Dizon, Grimmer-Somers, & Kumar, 2012). 
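The shortcoming noted above, describing outcomes only as "improved or not" via vote counting rather than reporting effect sizes, can be illustrated with a short, hypothetical calculation. The sketch below is not part of the protocol; the study names and score data are invented for illustration. It computes a standardized mean difference (Cohen's d with a pooled standard deviation) for two fictitious studies: vote counting labels both simply as positive, while the effect sizes reveal that the magnitudes of improvement differ substantially.

```python
import math

def cohens_d(mean_treat, sd_treat, n_treat, mean_ctrl, sd_ctrl, n_ctrl):
    """Standardized mean difference (Cohen's d) using the pooled SD."""
    pooled_var = ((n_treat - 1) * sd_treat**2 + (n_ctrl - 1) * sd_ctrl**2) / (
        n_treat + n_ctrl - 2
    )
    return (mean_treat - mean_ctrl) / math.sqrt(pooled_var)

# Hypothetical post-test knowledge scores (mean, SD, n) per arm.
studies = [
    {"name": "Study A", "d": cohens_d(78.0, 10.0, 40, 70.0, 10.0, 40)},  # d = 0.8 (large)
    {"name": "Study B", "d": cohens_d(72.0, 10.0, 40, 70.0, 10.0, 40)},  # d = 0.2 (small)
]

# Vote counting collapses both studies to the same verdict ("improved"),
# discarding the information that Study A's effect is four times larger.
votes = sum(1 for s in studies if s["d"] > 0)  # both count as one positive "vote"
```

Under these invented numbers, both studies cast an identical positive vote, yet their standardized mean differences (0.8 versus 0.2) tell very different stories about intervention strength, which is why the review criticizes narrative vote counting.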
The researchers reported that, irrespective of the allied health discipline, there was consistent evidence of significant changes in knowledge and skills among health practitioners after participating in an evidence-based practice educational program. In addition, a systematic review has recently been conducted by Rohwer, Motaze, Rehfuess, & Young, 2017 to assess the effectiveness of e-learning of evidence-based practice in increasing evidence-based practice competencies in healthcare professionals (i.e., medical doctors, nurses, physiotherapists, physician assistants, and athletic trainers). The results indicated that pure e-learning, compared to no learning, improved both knowledge of, and attitudes towards, evidence-based practice among the various professional groups. Yet, according to a comprehensive literature review, no specific systematic review has been conducted on evidence-informed practice educational interventions and the effects of such interventions on the knowledge, attitudes, understanding, and behavior of undergraduate health and social care students. Two reviews (namely, McCormack, Rycroft-Malone, DeCorby, & Hutchinson, 2013; and Yost et al., 2015) conducted on evidence-informed practice interventions focused on "change agency" and "knowledge translation" as interventions for improving evidence-informed practice. For example, McCormack et al., 2013 conducted a realist review of strategies and interventions to promote evidence-informed practice, but the authors focused only on "change agency" as an intervention aimed at improving the efficiency of the application of evidence. Also, a sys
