Assessing the quality of real-world evidence in retinal diseases

Ophthalmology Times Europe, November 2021, Volume 17, Issue 09

A recently developed tool can help ophthalmologists to assess the quality of findings from real-world studies in retinal diseases. This will help them decide which results are most robust and applicable to their practice.

By Prof. Robert P. Finger, Prof. Vincent Daien, Mr James S. Talks, Prof. Taiji Sakamoto, Prof. Bora M. Eldem, Dr Monica Lövestam-Adrian and Prof. Jean-François Korobelnik

It is well accepted that randomised controlled trials (RCTs) are the gold standard in establishing the efficacy of a clinical intervention. However, because such studies often have strict inclusion criteria and stringent requirements for patient follow-up, they typically achieve greater patient adherence than is seen in the clinic and can exclude some of the types of patients routinely seen in day-to-day practice.1,2

In addition, RCTs can be extremely expensive to conduct and study enrolment can be relatively slow compared with non-randomised studies.3 This is where data collected from routine clinical practice come in. Such data enable the generation of real-world evidence (RWE), which can complement information from RCTs by providing insights regarding the safety and effectiveness of an intervention in broader patient populations under routine care conditions.1

This evidence can be derived from real-world data from various sources, including electronic health or medical records (also referred to as EHR or EMR), patient registries, medical claims databases, retrospective or prospective observational studies or clinical audits (Table 1).1,4-11 RWE is increasingly being used to support drug approvals12 and in health technology assessments to support reimbursement.13

In that respect, it is already guiding clinical practice. With a better understanding as to how to assess the quality of real-world studies, clinicians can use this evidence appropriately to inform their own clinical decision-making at both practice and patient level.

In ophthalmology, there is an increasing global evidence base that describes the use of intravitreal anti-vascular endothelial growth factor (anti-VEGF) agents in routine clinical practice.5,6,9,10 Such studies have shown that, in routine practice compared with RCTs, there is both a lack of adherence to, and undertreatment with, anti-VEGF therapy, including non-persistence, whereby patients do not continue therapy in the medium to long term.2

This remains a significant barrier to optimising real-world outcomes for patients with chronic, progressive retinal conditions such as neovascular age-related macular degeneration (nAMD). Certainly, more RWE on effective strategies that can be employed at the clinic/practice level to improve adherence and persistence with anti-VEGF agents would be welcomed.

In addition, there are inherent limitations to data collection in real-world clinical practice; as a result, RWE may vary in quality.14 This is partly because of heterogeneity in data gained from different sources and variability in both the application of RWE methodologies and the interpretation of the resulting evidence. Furthermore, depending on the RWE source, data may be inconsistently collected, misclassified or missed; this is known as information bias.

A recent systematic review analysed 64 real-world ophthalmology data sources from 16 countries for completeness of data relating to different outcomes and only ten scored highly.15 Most of these sources provided information on baseline status, clinical outcomes and treatment, but few collected data on economic and patient-reported burden.

Another issue within ophthalmology is how the treatment regimens, particularly for anti-VEGF agents, are described. Intravitreal aflibercept, brolucizumab, pegaptanib and ranibizumab have indications for retinal diseases, but in some European countries, bevacizumab is used and reimbursed off-label, despite jurisdictional ambiguity and, often, a legal risk for prescribing physicians.16

Sometimes, it is unclear whether patients were treated with fixed dosing, pro re nata therapy or treat-and-extend dosing, and to what degree those schedules were adhered to.15 This makes understanding treatment effectiveness and comparing studies particularly challenging.

Although information bias may be one of the easiest types to identify and quantify, RWE is also prone to other biases. Therapies may be differently prescribed depending on patient and disease characteristics, leading to selection and channelling biases (for example, older patients or those with more advanced disease tending to receive one therapy over another).

In addition, patients or caregivers may be more likely to report only the most recent or impactful events, leading to recall bias; and, in some cases, events are more likely to be captured in one treatment group than another, resulting in detection biases.17 It is important that such limitations are recognised and RWE is interpreted in this context.

Introducing a novel RWE quality assessment tool

Throughout medical school and early in their medical careers, clinicians are taught how to assess the robustness of clinical studies, learning about randomisation, methods of blinding and different types of controls.18 It is similarly important to understand the quality of RWE.

RWE is becoming an increasingly important component of the overall evidence-based treatment of retinal diseases. However, in the field of ophthalmology, clear guidance is lacking on how to assess the rigour of real-world studies and the conclusions and recommendations they generate.

As shown in Figure 1, an RWE steering committee (a coalition of leading retinal specialists and methodological experts) recently developed a user-friendly framework for assessing the quality of available RWE for retinal diseases, including nAMD, diabetic macular oedema and retinal vascular occlusion.14 The goal was to assist ophthalmologists in independently drawing relevant and reliable conclusions from RWE and understanding its applicability to their practice.

Building on a validated framework

The Good Research for Comparative Effectiveness (GRACE) checklist was selected as the basis of the RWE quality assessment tool. This is an 11-item screening tool that evaluates methodology and reporting to identify high-quality observational comparative effectiveness research.19 The checklist has been extensively validated, has demonstrated strong sensitivity and specificity, and can be successfully applied by a wide variety of users with different training backgrounds.

Although the checklist was developed specifically for comparative effectiveness research, many of the items were also considered applicable to non-comparative studies. It was adapted for the retinal diseases field by omitting items that were not considered relevant to a practical, clinically focused ophthalmology audience, and adapting or combining some items for easier application.

Considerations when assessing quality of RWE

The adaptation of the GRACE checklist to be more specific to, and relevant for, ophthalmologists has resulted in the development of the retinal diseases RWE quality assessment tool. Table 2 summarises the retinal disease-specific aspects of the checklist.

This addresses treatment details, how outcomes were assessed/quantified, descriptors of the study population and possible sources of bias. More details and examples of what to consider can be found in the full publication.14

The purpose of the tool is to specifically assess the quality of RWE generated by a specific study in a population of patients with retinal disease. Caution should be exercised when comparing RWE across different retinal disease studies because of the heterogeneity mentioned previously regarding the different methodologies employed, as well as heterogeneity in patient populations, disease characteristics and treatment regimens.

Summary

RWE is growing in importance as a source of information that, when evaluated appropriately, can help to support clinical decision-making. Regulators also increasingly accept that RWE may be needed in appraising post-marketing value. The retinal diseases RWE quality assessment tool can help clinicians to understand which findings from real-world studies are most robust and applicable to their practice.

--

Robert P. Finger, Vincent Daien, James S. Talks, Taiji Sakamoto, Bora M. Eldem, Monica Lövestam-Adrian and Jean-François Korobelnik
E: Robert.Finger@ukbonn.de
Prof. Finger is based at the Department of Ophthalmology, University of Bonn, Bonn, Germany; Prof. Daien at the Department of Ophthalmology, Gui De Chauliac Hospital, Montpellier, France and The Save Sight Institute, Sydney Medical School, The University of Sydney, Sydney, Australia; Mr Talks at the Department of Ophthalmology, Royal Victoria Infirmary, Newcastle upon Tyne, UK; Prof. Sakamoto at the Department of Ophthalmology, Kagoshima University Graduate School of Medical and Dental Sciences, Kagoshima and J-CREST, Japan; Prof. Eldem at the Faculty of Medicine, Ophthalmology Department, Hacettepe University, Ankara, Turkey; Dr Lövestam-Adrian at the Department of Ophthalmology, Lund University Hospital, SUS Lund, Sweden; and Prof. Korobelnik at CHU Bordeaux, Service d’Ophtalmologie, Bordeaux, France and the University of Bordeaux, INSERM, BPH, U1219, Bordeaux, France.
The authors have no commercial interests in relation to the article. Medical writing and editorial support for preparation of the article, under the guidance of the authors, was provided by ApotheCom, which was funded by Bayer Consumer Health.

--

References
1. Talks J, Daien V, Finger RP, et al. The use of real-world evidence for evaluating anti-vascular endothelial growth factor treatment of neovascular age-related macular degeneration. Surv Ophthalmol. 2019;64:707-719.
2. Okada M, Mitchell P, Finger RP, et al. Non-adherence or non-persistence to intravitreal injection therapy for neovascular age-related macular degeneration: a mixed-methods systematic review. Ophthalmology. 2021;128:234-247.
3. Lauer MS, D’Agostino RB Sr. The randomized registry trial – the next disruptive technology in clinical research? N Engl J Med. 2013;369:1579-1581.
4. United States Food and Drug Administration. Real-world data (RWD) and real-world evidence (RWE) are playing an increasing role in health care decisions. 2020; Available from: https://www.fda.gov/science-research/science-and-research-special-topics/real-world-evidence.
5. Egan C, Zhu H, Lee A, et al. The United Kingdom Diabetic Retinopathy Electronic Medical Record Users Group, Report 1: baseline characteristics and visual acuity outcomes in eyes treated with intravitreal injections of ranibizumab for diabetic macular oedema. Br J Ophthalmol. 2017;101:75-80.
6. Bhandari S, Nguyen V, Fraser-Bell S, et al. Ranibizumab or aflibercept for diabetic macular edema: comparison of 1-year outcomes from the Fight Retinal Blindness! registry. Ophthalmology. 2020;127:608-615.
7. Rayess N, Vail D and Mruthyunjaya P. Rates of reoperation in 10,114 patients with epiretinal membranes treated by vitrectomy with or without inner limiting membrane peeling. Ophthalmol Retina. 2021;5:664-669.
8. Faramawi MF, Delhey LM, Chancellor JR and Sallam AB. The influence of diabetes status on the rate of cataract surgery following pars plana vitrectomy. Ophthalmol Retina. 2020;4:486-493.
9. Calster JV, Jacob J, Wirix M, et al. A Belgian retrospective study in patients treated for macular edema secondary to CRVO: 1-year real-world intravitreal aflibercept data. Presented at EURETINA 2018.
10. Korobelnik JF, Daien V, Faure C, et al. Real-world outcomes following 12 months of intravitreal aflibercept monotherapy in patients with diabetic macular edema in France: results from the APOLLON study. Graefes Arch Clin Exp Ophthalmol. 2020;258:521-528.
11. Fajgenbaum MAP, Neffendorf JE, Wong RS, et al. Intraoperative and postoperative complications in phacovitrectomy for epiretinal membrane and macular hole: a clinical audit of 1,000 consecutive eyes. Retina. 2018;38:1865-1872.
12. Bolislis WR, Fay M and Kühler TC. Use of real-world data for new drug applications and line extensions. Clin Ther. 2020;42:926-938.
13. Leahy TP, Ramagopalan S and Sammon C. The use of UK primary care databases in health technology assessments carried out by the National Institute for health and care excellence (NICE). BMC Health Serv Res. 2020;20:675.
14. Finger RP, Daien V, Talks JS, et al. A novel tool to assess the quality of RWE to guide the management of retinal disease. Acta Ophthalmol. 2021;99:604-610.
15. Daien V, Eldem BM, Talks JS, et al. Real-world data in retinal diseases treated with anti-vascular endothelial growth factor (anti-VEGF) therapy–a systematic approach to identify and characterize data sources. BMC Ophthalmol. 2019;19:206.
16. Bro T, Derebacka M, Jørstad ØK, Grzybowski A. Off-label use of bevacizumab for wet age-related macular degeneration in Europe. Graefes Arch Clin Exp Ophthalmol. 2020;258:503-511.
17. Blonde L, Khunti K, Harris SB, et al. Interpretation and impact of real-world clinical data for the practicing clinician. Adv Ther. 2018;35:1763-1774.
18. Moher D, Jadad AR, Nichol G, et al. Assessing the quality of randomized controlled trials: an annotated bibliography of scales and checklists. Control Clin Trials. 1995;16:62-73.
19. Dreyer NA, Bryant A, Velentgas P. The GRACE Checklist: a validated assessment tool for high quality observational studies of comparative effectiveness. J Manag Care Spec Pharm. 2016;22:1107-1113.