WHAT IS EVIDENCE-BASED PRACTICE?
The most common definition of EBP is taken from Dr. David Sackett, a pioneer in evidence-based practice. EBP is "the conscientious, explicit and judicious use of current best evidence in making decisions about the care of the individual patient. It means integrating individual clinical expertise with the best available external clinical evidence from systematic research." (Sackett D, 1996)
EBP is the integration of clinical expertise, patient values, and the best research evidence into the decision-making process for patient care. Clinical expertise refers to the clinician's accumulated experience, education and clinical skills. The patient brings to the encounter his or her own personal and unique concerns, expectations, and values. The best evidence is usually found in clinically relevant research that has been conducted using sound methodology. (Sackett D, 2002)
The COVID-19 pandemic represents the largest public health crisis of the past century. Faced with a global threat, public health officials, professional societies, clinicians and patients have appropriately sought strategies to prevent SARS-CoV-2 transmission, reduce progression to severe and critical illness, and mitigate short-term and long-term sequelae. Efforts have extended to drug therapies, non-pharmacological interventions (vaccination, ventilation strategies in the critically ill) and system-level policies (masking, vaccination, quarantining, isolation, physical distancing, and remote work and study).
First reports of cases of pneumonia originating from Wuhan, China, emerged on 31 December 2019. A PubMed search conducted from 1 January 2020 to date using keywords related to coronavirus, COVID-19, SARS-CoV-2 and novel coronavirus-2 yields over 439 000 hits, representing a daily publication rate of over 260 articles. A living systematic review of registered clinical trials for COVID-19 identified 15 624 registrations up to 2023; now, over five years into the pandemic, researchers continue to...
Evidence-based practice improves healthcare and patient outcomes by providing a framework for integrating research into clinical practice. Evidence-based practice is considered a core competency in medical education, and its core steps are to:
Formulate a research question
Find best available research
Critically appraise research findings
Evaluate strength/certainty of evidence
Although these competencies are part of curricula for medical and health education programmes in Sweden, there is no consensus on which methods best support learning of EBM.
Objective: To assess the effects of digital patient decision-support tools for atrial fibrillation (AF) treatment decisions in adults with AF.
Design: Systematic review and meta-analysis.
Eligibility criteria: Randomised controlled trials (RCTs), cluster RCTs and quasi-experimental studies evaluating digital patient decision-support tools for AF treatment decisions in adults with AF.
Information sources: We searched MEDLINE, EMBASE and Scopus from 2005 to 2023.
Risk-of-bias (RoB) assessment: We assessed RoB using the Cochrane Risk of Bias Tool 2 for RCTs and the cluster RCT, and the ROBINS-I tool for quasi-experimental studies.
Synthesis of results: We used random-effects meta-analysis to synthesise decisional conflict and patient knowledge outcomes reported in RCTs. We performed narrative synthesis for all outcomes. The main outcomes of interest were decisional conflict and patient knowledge.
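To make the pooling step concrete, the sketch below shows how a random-effects pooled estimate, its 95% CI, tau-squared and I2 can be computed from study-level effect estimates using the DerSimonian-Laird estimator. This is one common estimator for random-effects models; the review does not state which estimator was used, and the input numbers below are hypothetical, not data from the review.

```python
# Illustrative sketch (not the review's analysis code): DerSimonian-Laird
# random-effects pooling of study-level effect estimates (e.g. SMDs).
import numpy as np

def random_effects_pool(effects, std_errs):
    """Pool effect sizes with a DerSimonian-Laird random-effects model.

    effects  : study effect estimates (e.g. standardised mean differences)
    std_errs : their standard errors
    Returns (pooled estimate, 95% CI lower, 95% CI upper, tau^2, I^2 %).
    """
    y = np.asarray(effects, dtype=float)
    v = np.asarray(std_errs, dtype=float) ** 2      # within-study variances
    w = 1.0 / v                                      # fixed-effect weights
    y_fixed = np.sum(w * y) / np.sum(w)
    q = np.sum(w * (y - y_fixed) ** 2)               # Cochran's Q
    df = len(y) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance
    w_star = 1.0 / (v + tau2)                        # random-effects weights
    pooled = np.sum(w_star * y) / np.sum(w_star)
    se = np.sqrt(1.0 / np.sum(w_star))
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, tau2, i2

# Hypothetical example with four studies (values are made up):
print(random_effects_pool([-0.25, -0.10, -0.30, -0.15], [0.08, 0.10, 0.12, 0.09]))
```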
Results: 13 articles, reporting on 11 studies (4 RCTs, 1 cluster RCT and 6 quasi-experimental), met the inclusion criteria. There were 2714 participants across all studies (2372 in RCTs), of whom 26% were women; the mean age was 71 years. Socioeconomically disadvantaged groups were poorly represented in the included studies. Seven studies (n=2508) focused on non-valvular AF; the mean CHA2DS2-VASc score across studies was 3.2 and the mean HAS-BLED score was 1.9. All tools focused on decisions regarding thromboembolic stroke prevention, and most enabled calculation of individualised stroke risk. Tools were heterogeneous in features and functions; four tools were patient decision aids. The readability of content was reported in one study. Meta-analyses showed a reduction in decisional conflict (4 RCTs (n=2167); standardised mean difference –0.19; 95% CI –0.30 to –0.08; p=0.001; I2=26.5%; moderate certainty evidence), corresponding to a decrease of 12.4 units on a scale of 0 to 100 (95% CI –19.5 to –5.2), and an improvement in patient knowledge (2 RCTs (n=1057); risk difference 0.72, 95% CI 0.68 to 0.76, p<0.001; I2=0%; low certainty evidence), favouring digital patient decision-support tools compared with usual care. Four of the 11 tools were publicly available and 3 had been implemented in healthcare delivery.
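As background to the individualised stroke-risk calculation mentioned above, the sketch below shows the standard CHA2DS2-VASc scoring logic that such tools typically implement (HAS-BLED is computed analogously for bleeding risk). It is not taken from any of the included tools; the function and parameter names are assumptions made for this example.

```python
# Illustrative sketch of CHA2DS2-VASc scoring, the kind of individualised
# stroke-risk calculation embedded in AF decision-support tools.
def cha2ds2_vasc(age, female, heart_failure, hypertension, diabetes,
                 prior_stroke_tia, vascular_disease):
    """Return the CHA2DS2-VASc score (0-9) for an adult with AF."""
    score = 0
    score += 1 if heart_failure else 0      # C: congestive heart failure
    score += 1 if hypertension else 0       # H: hypertension
    if age >= 75:                            # A2: age >= 75 scores 2 points
        score += 2
    elif age >= 65:                          # A: age 65-74 scores 1 point
        score += 1
    score += 1 if diabetes else 0            # D: diabetes mellitus
    score += 2 if prior_stroke_tia else 0    # S2: prior stroke/TIA/thromboembolism
    score += 1 if vascular_disease else 0    # V: vascular disease
    score += 1 if female else 0              # Sc: sex category (female)
    return score

# Example: a 72-year-old woman with hypertension and diabetes scores 4.
print(cha2ds2_vasc(age=72, female=True, heart_failure=False, hypertension=True,
                   diabetes=True, prior_stroke_tia=False, vascular_disease=False))
```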
Conclusions: In the context of stroke prevention in AF, digital patient decision-support tools likely reduce decisional conflict and may improve patient knowledge, compared with usual care. Future studies should leverage digital capabilities for increased personalisation and interactivity of the tools, with better consideration of health literacy and equity aspects. Additional robust trials and implementation studies are warranted.
PROSPERO registration number: CRD42020218025