5 takeaways:
➀ Real-world data can be used to replicate results from randomized controlled trials. While randomized controlled trials — RCTs — have long been considered the gold standard of medical evidence, researchers are now working to see if other pools of data can be used to answer the same questions. Dr. Sebastian Schneeweiss of Harvard Medical School is conducting 30 studies in the “RCT DUPLICATE” initiative, funded by the U.S. Food and Drug Administration (first results here). The science behind the studies can get complicated, but at their core they use patient-level claims data from insurers to reach the same causal conclusions as an RCT. In an RCT, Patient Group A gets the drug under study and Patient Group B gets either a different drug or a placebo. Both groups are then observed, and researchers eventually know with a high level of statistical confidence whether the new drug performed better than the alternative. In Schneeweiss’ studies [see video], insurance claims for existing patients show which ones are on which drugs. That allows researchers to design a cohort study by building a Patient Group A and a Patient Group B. These studies work for drugs that are already on the market but that researchers want to study for secondary reasons — say, in a specific gender or a certain age group. Schneeweiss has finished 20 of his 30 DUPLICATE studies, with generally positive results. “It is very encouraging news,” he said. “… Real-world evidence may offer causal insights when RCT data are either not available or cannot be quickly and feasibly generated.”
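For readers who want to see the mechanics, the cohort-building idea above can be sketched in a few lines of Python. This is a hypothetical illustration only: the patient records, drug names and outcome field are invented, and real analyses like Schneeweiss’ involve propensity-score matching and extensive confounder adjustment that this sketch omits.

```python
# Hypothetical sketch: split claims records into two drug cohorts and
# compare outcome rates. All records and field names are invented.
from collections import defaultdict

claims = [
    # (patient_id, drug, had_outcome)
    (1, "drug_a", False),
    (2, "drug_a", True),
    (3, "drug_a", False),
    (4, "drug_a", False),
    (5, "drug_b", True),
    (6, "drug_b", True),
    (7, "drug_b", False),
    (8, "drug_b", True),
]

def cohort_event_rates(records):
    """Group patients by drug and compute each cohort's outcome rate."""
    counts = defaultdict(lambda: [0, 0])  # drug -> [events, patients]
    for _pid, drug, outcome in records:
        counts[drug][1] += 1
        if outcome:
            counts[drug][0] += 1
    return {drug: events / n for drug, (events, n) in counts.items()}

rates = cohort_event_rates(claims)          # {'drug_a': 0.25, 'drug_b': 0.75}
risk_ratio = rates["drug_a"] / rates["drug_b"]
print(rates, round(risk_ratio, 2))
```

In practice the two cohorts must first be balanced on age, sex, comorbidities and other confounders before any such rate comparison means anything — that balancing is where the hard science in these studies lives.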
➁ The randomized controlled trial offers advantages that are hard to replicate. Dr. Steven Nissen, a prominent cardiologist and drug-safety expert from the Cleveland Clinic, cautioned against the move away from randomized controlled trials, arguing that real-world evidence isn’t ready for prime time. “Real-world studies should not be viewed as conclusive evidence,” he said. A clinical trial is a controlled experiment designed to isolate the new treatment by minimizing variability and bias. The problem, Nissen said, is that clinical trials are expensive. He sees the push for real-world studies as a cost-saving gambit but no substitute for RCTs. Hormone replacement therapy was taken by millions of women based on the strength of real-world studies, he noted, but RCTs conducted in the early 2000s undercut the supposed benefits of the therapy. “Everybody believed based upon the real-world evidence that it would go the other way,” Nissen said. “And the evidence from observational studies was wrong.” He concluded that while real-world studies can be helpful in limited situations, “the randomized controlled trial is the gold standard for a very good reason.”
➂ Real-world evidence can be used to supplement randomized controlled trials. Dr. Michele Jonsson Funk of the University of North Carolina at Chapel Hill detailed nine shortcomings of RCTs, including that they are too specific to answer broader questions, too small to study rare outcomes and too short to study long-term effects. Real-world data are helpful in addressing some of those flaws, such as the inability to study rare outcomes or subpopulations. Dr. Eberechukwu Onukwugha of the University of Maryland School of Pharmacy noted significant differences among white, Black and Latinx users of health services; those differences often emerge in diagnoses and treatments as well. Real-world data such as cancer registries and Medicare claims can be used to study such populations and improve outcomes.
➃ A caution for reporters: Real-world data can be easily abused. Reporters who cover health issues are often familiar with the side-effects data collected by the FDA. The data are easily accessible and detail adverse events in patients who have used a drug, vaccine or medical device. But the data are notoriously flawed, given that they are self-reported by patients or inconsistently reported by doctors. Moreover, adverse-event reports fail the causality test: Just because somebody got sick the same week they took a new drug doesn’t mean the new drug caused the illness. The misuse of vaccine safety data mushroomed into a story in 2021, with some TV pundits claiming the COVID-19 vaccine caused deaths that likely happened purely by chance around the time of vaccination. Dr. Nandita Mitra, a biostatistics professor from the University of Pennsylvania, said the side-effects data can be used to spot potential problems. But reviewing the raw data is the beginning, not the end, of the interpretation process.
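The chance-coincidence point above can be made concrete with a back-of-the-envelope calculation: even if a vaccine caused no deaths at all, some deaths would still occur shortly after vaccination simply because deaths occur all the time. The numbers below are invented for illustration and are not real vaccine-safety figures.

```python
# Hypothetical illustration: how many deaths would we expect within a
# week of vaccination purely by chance, assuming no causal link?
# All numbers are assumed for illustration, not real data.

def expected_background_events(n_people, daily_event_rate, window_days):
    """Expected events in the window under zero causation."""
    return n_people * daily_event_rate * window_days

# Suppose 1,000,000 people are vaccinated and the background death rate
# in that group is 2 per 100,000 per day (an assumed figure).
expected = expected_background_events(1_000_000, 2e-5, 7)
print(expected)  # 140.0 deaths expected within a week, with zero causation
```

A raw count of post-vaccination deaths therefore tells a reporter nothing by itself; the question is whether the observed count exceeds the background rate one would see anyway.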
➄ The FDA is reviewing the use of real-world evidence in drug and device approvals. Two laws — the Prescription Drug User Fee Act and the 21st Century Cures Act — have mandated that the FDA enhance the use of real-world evidence for approving and monitoring drugs, devices and other medical products. Draft guidance for the industry on how to do so is scheduled to be issued by December 2021. Dr. John Concato, the FDA’s associate director for real-world evidence analytics, noted that “the standard for ‘substantial evidence’ remains unchanged” — meaning real-world data still must clear a high bar. And in recent years, it has. Concato shared several examples from 2010 or later in which the FDA used real-world evidence in approvals. In July, for example, the FDA approved a drug used in liver transplant patients for lung transplant recipients. The data came from a registry of all transplant recipients in the U.S. The FDA determined the real-world study met its “adequate and well-controlled” standard. Journalists should expect more such approvals to follow.
Speakers:
Dr. Amy Abernethy, President, Clinical Research Platforms, Verily, a subsidiary of Alphabet Inc.
Dr. John Concato, Associate Director for Real-World Evidence Analytics, Office of Medical Policy, Center for Drug Evaluation and Research, U.S. Food and Drug Administration
Dr. Michele Jonsson Funk, Associate Professor, Department of Epidemiology; Director, Center for Pharmacoepidemiology, UNC Gillings School of Global Public Health
Dr. Nandita Mitra, Professor, Department of Biostatistics, Epidemiology and Informatics; Co-Director, Center for Causal Inference; University of Pennsylvania
Dr. Steven Nissen, Chief Academic Officer, Heart and Vascular Institute; Chair in Cardiovascular Medicine and Professor of Medicine; Cleveland Clinic Lerner College of Medicine
Dr. Eberechukwu Onukwugha, Associate Professor of Pharmaceutical Health Services Research, University of Maryland School of Pharmacy
Dr. Sebastian Schneeweiss, Professor of Medicine, Harvard Medical School; Professor in Epidemiology, Harvard T.H. Chan School of Public Health
This program is sponsored by ISPOR, the Professional Society for Health Economics and Outcomes Research, with support from the BMS-Pfizer Alliance. NPF is solely responsible for the content, which does not necessarily reflect the views of BMS or Pfizer.