Discounting Effect Size When Borrowing External Data in Clinical Studies
Talks
Presenting Author Academic/Professional Position
Faculty
Academic/Professional Position (Other)
School of Mathematical and Statistical Sciences
Academic Level (Author 1)
Faculty
Discipline/Specialty (Author 1)
Population Health and Biostatistics
Presentation Type
Oral Presentation
Discipline Track
Community/Public Health
Abstract Type
Research/Clinical
Abstract
Background: When borrowing information from external data to augment a current trial, many available methods discount the sample size but retain the effect size from previous studies. Discounting the sample size is only one way to discount the prior information, and it may not be appropriate when the underlying assumption of an unbiased treatment effect does not hold, for example, when the treatment effect in the historical study is likely higher than that expected in the current trial.
Methods: To address this potential issue, we study several methods that shrink the effect size from previous studies under the assumption that the prior effect size is higher than the true effect size. These methods include the weighted mean, a Bayesian hierarchical method for an individual study/subgroup, a multiplicity-adjusted mean, and dynamic/conditional shrinkage when borrowing information from external data for a patient population of interest. We evaluate the performance of these methods for normal and binomial endpoints through Monte Carlo simulation studies and compare them with existing methods for borrowing external data.
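To make the shrinkage idea concrete, the following minimal Python sketch (an illustration under our own assumptions, not the authors' exact formulation) shows a simple weighted mean that pulls a historical effect size toward a pre-specified target, together with the posterior-mean shrinkage implied by a normal-normal hierarchical model; the weight w, the shrinkage target, and the hypervariance tau2 are all hypothetical inputs.

```python
def weighted_mean_shrinkage(theta_hist, theta_target, w):
    """Shrink a historical effect size toward a pre-specified target.

    theta_hist   : effect size estimated from the external/historical data
    theta_target : value to shrink toward (e.g., 0 or the effect expected
                   in the current trial)
    w            : weight in [0, 1] given to the historical estimate
    """
    return w * theta_hist + (1.0 - w) * theta_target


def normal_normal_shrinkage(theta_hist, se_hist, mu0, tau2):
    """Posterior-mean shrinkage under a normal-normal hierarchical model.

    The historical estimate theta_hist (standard error se_hist) is treated
    as a draw around a study-level mean with overall mean mu0 and
    hypervariance tau2; the posterior mean pulls theta_hist toward mu0,
    and the pull is stronger when tau2 is small.
    """
    precision_data = 1.0 / se_hist**2
    precision_prior = 1.0 / tau2
    w = precision_data / (precision_data + precision_prior)
    return w * theta_hist + (1.0 - w) * mu0


# Example: a historical effect of 0.5 shrunk toward the 0.3 expected currently.
print(weighted_mean_shrinkage(0.5, 0.3, w=0.5))        # 0.40
print(normal_normal_shrinkage(0.5, 0.15, 0.3, 0.02))   # approximately 0.394
```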
Results: Numerical results demonstrate that the proposed multiplicity-adjustment method performs well in terms of bias, type I error control, and power. The Bayesian hierarchical modeling method performs comparably to the multiplicity-adjustment method when the hypervariance is well chosen or estimated. Some potential application scenarios are discussed and illustrated through a few hypothetical case studies.
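To illustrate how such a simulation study can be set up, here is a hedged Monte Carlo sketch for a normal endpoint (all settings are illustrative and not taken from the study): it pools the current-trial estimate with a borrowed historical effect that may be shrunk by a weight w, and estimates the one-sided rejection rate, showing how an optimistically high historical effect inflates type I error unless it is discounted.

```python
import numpy as np

rng = np.random.default_rng(2025)

def rejection_rate(true_delta, theta_hist, w, n=100, sigma=1.0,
                   n_hist_equiv=50, n_sim=5000):
    """Empirical rejection rate of H0: delta <= 0 for a normal endpoint.

    The current-trial mean is pooled with a historical effect theta_hist
    shrunk by weight w (w = 1 keeps the historical effect as-is, w = 0
    ignores it); n_hist_equiv is the effective sample size credited to
    the borrowed information.  Hypothetical settings, not from the paper.
    """
    theta_borrow = w * theta_hist              # discounted historical effect
    rejections = 0
    for _ in range(n_sim):
        x = rng.normal(true_delta, sigma, size=n)
        # precision-weighted pooling of the current and borrowed estimates
        w_cur, w_hist = n / sigma**2, n_hist_equiv / sigma**2
        est = (w_cur * x.mean() + w_hist * theta_borrow) / (w_cur + w_hist)
        se = 1.0 / np.sqrt(w_cur + w_hist)
        if est / se > 1.959964:                # one-sided z-test, alpha = 0.025
            rejections += 1
    return rejections / n_sim

# Type I error (true effect 0) with an optimistic historical effect of 0.3:
print(rejection_rate(true_delta=0.0, theta_hist=0.3, w=1.0))  # well above 0.025
print(rejection_rate(true_delta=0.0, theta_hist=0.3, w=0.3))  # much closer to 0.025
```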
Conclusions: This research adds to the toolbox of methods for utilizing external data to generate evidence in clinical studies.
Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 International License.
Recommended Citation
Ma, Zhuanzhuan; Ahn, Chul; Wang, Bin; and Li, Xuefeng, "Discounting Effect Size When Borrowing External Data in Clinical Studies" (2025). Research Symposium. 18.
https://scholarworks.utrgv.edu/somrs/2025/talks/18
Included in
Applied Statistics Commons, Biostatistics Commons, Clinical Trials Commons, Statistical Methodology Commons, Statistical Models Commons