Real-world evidence is growing in importance as a source of information that can help support clinical decision-making when evaluated properly.
It is accepted that randomized controlled trials (RCTs) are the gold standard in establishing the efficacy of a clinical intervention. However, because such studies often have strict inclusion criteria and stringent requirements for patient follow-up, they often have greater patient adherence than that seen in the clinic and can exclude some of the types of patients routinely seen in day-to-day practice.1,2
In addition, RCTs can be extremely expensive to conduct, and study enrollment can be relatively slow compared with nonrandomized studies.3 This is where data collected from routine clinical practice come in. Such data enable the generation of real-world evidence (RWE), which can complement information from RCTs by providing insights regarding the safety and effectiveness of an intervention in broader patient populations under routine care conditions.1
This evidence can be derived from real-world data from various sources, including electronic health or medical records, patient registries, medical claims databases, retrospective or prospective observational studies, or clinical audits (Table 1).1,4-10 RWE is increasingly being used to support drug approvals11 and in health technology assessments to support reimbursement.12
In that respect, it is already guiding clinical practice. With a better understanding of how to assess the quality of real-world studies, clinicians can use this evidence appropriately to inform their own clinical decision-making at both the practice and patient levels.
In ophthalmology, a growing global evidence base describes the use of intravitreal anti-vascular endothelial growth factor (anti-VEGF) agents in routine clinical practice.5,6,9,10 Such studies have shown that, compared with RCTs, routine practice is marked by a lack of adherence to and undertreatment with anti-VEGF therapy, including nonpersistence, whereby patients do not continue therapy in the medium-to-long term.2
This remains a significant barrier to optimizing real-world outcomes for patients with chronic, progressive retinal conditions, such as neovascular age-related macular degeneration (nAMD). Certainly, more RWE for effective strategies that can be employed at a clinic/practice level to improve adherence and persistence to anti-VEGF agents would be welcomed.
In addition, there are inherent limitations to data collection in real-world clinical practice; as a result, RWE may vary in quality.13 This is partly because of heterogeneity in data gained from different sources and variability in both the application of RWE methodologies and the interpretation of the resulting evidence. Furthermore, depending on the RWE source, data may be inconsistently collected, misclassified, or missed. This is known as information bias.
A recent systematic review analyzed 64 real-world ophthalmology data sources from 16 countries for completeness of data relating to different outcomes, and only 10 scored highly.14 Most of these sources provided information on baseline status, clinical outcomes, and treatment, but few collected data on economic and patient-reported burden.
Another issue within ophthalmology is how the treatment regimens are described, particularly for anti-VEGF agents. Intravitreal aflibercept, brolucizumab, pegaptanib, and ranibizumab have indications for retinal diseases. However, in some European countries, bevacizumab is used and reimbursed off-label, despite jurisdictional ambiguity and, often, a legal risk for prescribing physicians.15
Sometimes there is a lack of clarity as to whether patients were treated with fixed dosing, pro re nata therapy, or treat-and-extend dosing, as well as the degree of adherence to those schedules.14 This makes understanding treatment effectiveness and comparing studies particularly challenging.
Although information bias may be one of the easiest types to identify and quantify, RWE is also prone to other biases. Therapies may be prescribed differently depending on patient and disease characteristics, leading to selection and channeling biases (for example, older patients or those with more advanced disease tending to receive one therapy over another).
In addition, patients or caregivers may be more likely to report only the most recent or impactful events, leading to recall bias. In some cases, events are more likely to be captured in one treatment group than another, resulting in detection biases.16 It is important that such limitations are recognized and RWE is interpreted in this context.
Introducing a novel RWE quality assessment tool
Throughout medical school and early in their medical careers, clinicians are taught how to assess the robustness of clinical studies, learning about randomization, methods of blinding, and different types of controls.17 It is similarly important to understand the quality of RWE.
RWE is becoming an increasingly important component of the overall evidence-based treatment of retinal diseases. However, clear guidance on how to assess the rigor of real-world studies and the conclusions and recommendations they generate in the field of ophthalmology is lacking.
As shown in Figure 1, an RWE steering committee (a coalition of leading retinal specialists and methodological experts) recently developed a user-friendly framework assessing the quality of available RWE for retinal diseases, including nAMD, diabetic macular edema, and retinal vascular occlusion.13 The goal was to assist ophthalmologists in independently drawing relevant and reliable conclusions from RWE and understanding its applicability to their practice.
Building on a validated framework
The Good Research for Comparative Effectiveness (GRACE) checklist was selected as the basis of the RWE quality assessment tool. This is an 11-item screening tool that evaluates methodology and reporting to identify high-quality observational comparative effectiveness research.18 The checklist has been extensively validated, has demonstrated strong sensitivity and specificity, and can be successfully applied by a wide variety of users with different training backgrounds.
Although the checklist was developed specifically for comparative effectiveness research, many of the items were also considered applicable to noncomparative studies. It was adapted for the retinal diseases field by omitting items that were not considered relevant to a practical, clinically focused ophthalmology audience, and adapting or combining some items for easier application.
Considerations when assessing quality of RWE
Adapting the GRACE checklist to make it more specific and relevant to ophthalmologists resulted in the retinal diseases RWE quality assessment tool.
This addresses treatment details, how outcomes were assessed/quantified, descriptors of the study population, and possible sources of bias. More details and examples of what to consider can be found in the full publication.13
The tool is designed to assess the quality of RWE generated by an individual study in a population of patients with retinal disease. Caution should be exercised when comparing RWE across different retinal disease studies because of the previously mentioned heterogeneity in methodologies, as well as heterogeneity in patient populations, disease characteristics, and treatment regimens.
When evaluated properly, RWE is an increasingly important source of information to support clinical decision-making. Regulators also increasingly accept that RWE may be needed in appraising postmarketing value. The retinal diseases RWE quality assessment tool can help clinicians understand which findings from real-world studies are most robust and applicable to their practice.
Robert P. Finger, MD, PhD
Finger is a member of the Department of Ophthalmology at the University of Bonn in Germany. He was assisted by Vincent Daien, MD, PhD, FEBO; James S. Talks; Taiji Sakamoto, MD, PhD, FARVO; Bora M. Eldem, MD, FEBO; Monica Lövestam-Adrian, MD; and Jean-François Korobelnik, MD, in the development of this report. The authors have no commercial interests in relation to the article. Medical writing and editorial support for preparation of the article, under the guidance of the authors, was provided by ApotheCom, which was funded by Bayer Consumer Health.
1. Talks J, Daien V, Finger RP, et al. The use of real-world evidence for evaluating anti-vascular endothelial growth factor treatment of neovascular age-related macular degeneration. Surv Ophthalmol. 2019;64(5):707-719. doi:10.1016/j.survophthal.2019.02.008
2. Okada M, Mitchell P, Finger RP, et al. Nonadherence or nonpersistence to intravitreal injection therapy for neovascular age-related macular degeneration: a mixed-methods systematic review. Ophthalmology. 2021;128(2):234-247. doi:10.1016/j.ophtha.2020.07.060
3. Lauer MS, D’Agostino RB Sr. The randomized registry trial – the next disruptive technology in clinical research? N Engl J Med. 2013;369(17):1579-1581. doi:10.1056/NEJMp1310102
4. Real-world data (RWD) and real-world evidence (RWE) are playing an increasing role in health care decisions. United States Food and Drug Administration. September 30, 2021. Accessed November 1, 2021. https://www.fda.gov/science-research/science-and-research-special-topics/real-world-evidence
5. Egan C, Zhu H, Lee A, et al. The United Kingdom Diabetic Retinopathy Electronic Medical Record Users Group, Report 1: baseline characteristics and visual acuity outcomes in eyes treated with intravitreal injections of ranibizumab for diabetic macular oedema. Br J Ophthalmol. 2017;101(1):75-80. doi:10.1136/bjophthalmol-2016-309313
6. Bhandari S, Nguyen V, Fraser-Bell S, et al. Ranibizumab or aflibercept for diabetic macular edema: comparison of 1-year outcomes from the Fight Retinal Blindness! registry. Ophthalmology. 2020;127(5):608-615. doi:10.1016/j.ophtha.2019.11.018
7. Rayess N, Vail D, Mruthyunjaya P. Rates of reoperation in 10 114 patients with epiretinal membranes treated by vitrectomy with or without inner limiting membrane peeling. Ophthalmol Retina. 2021;5(7):664-669. doi:10.1016/j.oret.2020.10.013
8. Faramawi MF, Delhey LM, Chancellor JR, Sallam AB. The influence of diabetes status on the rate of cataract surgery following pars plana vitrectomy. Ophthalmol Retina. 2020;4(5):486-493. doi:10.1016/j.oret.2019.09.011
9. Van Calster J, Jacob J, Wirix M, et al. A Belgian retrospective study in patients treated for macular edema secondary to CRVO: 1-year real-world intravitreal aflibercept data. Presented at: EURETINA Congress 2018; September 20-23, 2018; Vienna, Austria.
10. Korobelnik JF, Daien V, Faure C, et al. Real-world outcomes following 12 months of intravitreal aflibercept monotherapy in patients with diabetic macular edema in France: results from the APOLLON study. Graefes Arch Clin Exp Ophthalmol. 2020;258(3):521-528. doi:10.1007/s00417-019-04592-9
11. Bolislis WR, Fay M, Kühler TC. Use of real-world data for new drug applications and line extensions. Clin Ther. 2020;42(5):926-938. doi:10.1016/j.clinthera.2020.03.006
12. Leahy TP, Ramagopalan S, Sammon C. The use of UK primary care databases in health technology assessments carried out by the National Institute for Health and Care Excellence (NICE). BMC Health Serv Res. 2020;20(1):675. doi:10.1186/s12913-020-05529-3
13. Finger RP, Daien V, Talks JS, et al. A novel tool to assess the quality of RWE to guide the management of retinal disease. Acta Ophthalmol. 2021;99(6):604-610. doi:10.1111/aos.14698
14. Daien V, Eldem BM, Talks JS, et al. Real-world data in retinal diseases treated with anti-vascular endothelial growth factor (anti-VEGF) therapy - a systematic approach to identify and characterize data sources. BMC Ophthalmol. 2019;19(1):206. doi:10.1186/s12886-019-1208-9
15. Bro T, Derebecka M, Jørstad ØK, Grzybowski A. Off-label use of bevacizumab for wet age-related macular degeneration in Europe. Graefes Arch Clin Exp Ophthalmol. 2020;258(3):503-511. doi:10.1007/s00417-019-04569-8
16. Blonde L, Khunti K, Harris SB, Meizinger C, Skolnik NS. Interpretation and impact of real-world clinical data for the practicing clinician. Adv Ther. 2018;35(11):1763-1774. doi:10.1007/s12325-018-0805-y
17. Moher D, Jadad AR, Nichol G, Penman M, Tugwell P, Walsh S. Assessing the quality of randomized controlled trials: an annotated bibliography of scales and checklists. Control Clin Trials. 1995;16(1):62-73. doi:10.1016/0197-2456(94)00031-w
18. Dreyer NA, Bryant A, Velentgas P. The GRACE checklist: a validated assessment tool for high quality observational studies of comparative effectiveness. J Manag Care Spec Pharm. 2016;22(10):1107-1113. doi:10.18553/jmcp.2016.22.10.1107