Improving methods for discrete choice experiments to measure patient preferences


Bibliographic Details
Main Author: Ellis, Alan R.
Corporate Author: Patient-Centered Outcomes Research Institute (U.S.)
Format: eBook
Language: English
Published: Washington, DC : Patient-Centered Outcomes Research Institute, [2021]
Series: Final research report
Collection: National Center for Biotechnology Information - Collection details see MPG.ReNa
Description
Summary: BACKGROUND: Discrete choice experiments (DCEs) measure preferences by presenting choice tasks in which respondents choose a preferred alternative. We identified 4 knowledge gaps related to DCE design and analysis: (1) Current trends (since 2012) in design, analysis, and reporting are unknown. (2) DCE design decisions can affect findings by influencing respondent behavior, and task complexity and respondent fatigue may play important roles. These and other problems, such as sampling error, selection bias, and unmeasured interactions, can reduce the usefulness of DCE findings, but their combined effects are unknown. (3) Random parameter logit (RPL) models, commonly used in DCEs, are sensitive to the number of Halton draws (numeric sequences) used for simulation to estimate parameters, but little detail is available about the effects of the number of draws on results with different numbers of random parameters. (4) The multinomial logit model, also commonly used in DCEs, does not account for repeated observations from the same individual and therefore underestimates standard errors. It is not known whether bootstrapping at the level of the individual can be used to generate correct confidence intervals for multinomial logit models in the presence of preference heterogeneity.
OBJECTIVES: This report describes a project to explore issues with DCE design, analysis, and reporting. (1) We collaborated on a systematic review to capture the state of the science of health-related DCEs. We then conducted computer simulations to (2) improve understanding of the effects of selected DCE design features and statistical model assumptions on DCE results, (3) demonstrate problems in RPL estimation with numerous random parameters and inadequate numbers of Halton draws, and (4) explore the use of bootstrap methods to improve variance estimation when using fixed-effect multinomial logit models in lieu of RPL models.
METHODS: We systematically reviewed health-related DCEs published in 2013 to 2017. We implemented 864 simulation scenarios based on data from 2 actual DCEs to examine the effects of DCE design features and model assumptions on bias, variance, and overall error. We analyzed real and simulated data to demonstrate problems with RPL estimation using numerous random parameters and different numbers of Halton draws. Finally, using simulated data that reflect preference heterogeneity, we ran multinomial logit models with both bootstrapped and conventional variance estimation.
RESULTS: (1) Our review found growing use of health-related DCEs (an average of 60 per year from 2013-2017) with increasingly sophisticated methods (design, software, and econometric models), as well as inadequate reporting of methodologic details (eg, incorporation of interactions into the study design, use of blocking, method used to create choice sets, distributional assumptions, number of draws, use of internal validity testing). (2) In our simulations involving DCE design features and model assumptions, problems in the pilot phase tended to have little effect on the main DCE results; however, in 1 of the 2 study settings, problems in the pilot and main DCEs had widespread effects on bias and variance estimation, possibly related to correlations among attributes. (3) In analyses of real DCE data, the stability of RPL results depended on the number of Halton draws used for a given number of random parameters; some parameters, especially deviation parameters, failed to stabilize even with 20 000 draws. (4) Compared with traditional variance estimation, bootstrapping in simple DCE data sets with 3 random parameters did not yield confidence intervals with actual coverage closer to the nominal (95%) coverage.
CONCLUSIONS: Reporting DCE design and analysis methods in greater detail would strengthen health-related DCEs. Reporting guidelines may be a means to that end. Small problems in a pilot study may not have drastic effects on the main DCE results, but certain DCE designs (eg, those with correlated attributes) may require special care, such as added sensitivity analyses. In settings with numerous random parameters, RPL models should use more Halton draws, or perhaps a different type of draws, to produce valid findings that can support good health and health policy decisions. Sensitivity analyses can increase confidence in estimating a given number of random parameters with a particular number of draws. Although bootstrapping is known to correct variance estimates in settings with correlated errors, it should not be used as an alternative to traditional variance estimation for multinomial logit models in the presence of preference heterogeneity.
LIMITATIONS: Our conclusions largely depend on our specific source data sets and simulation parameters and should therefore be replicated under a wide range of conditions. For example, our simulations involving DCE design features and model assumptions were complex and included 1 study setting (study 1) that was not typical of recent health-related DCEs. Also, our simulated random parameters followed the normal distribution, which does not apply to all DCE parameters. Finally, our findings on Halton draws in RPL estimation enable us to make only general suggestions, not to specify precise numeric thresholds for the number of random parameters to estimate or the number of draws to use.
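As context for the summary's points about Halton draws, the following minimal Python sketch (not taken from the report; the function names halton and simulated_choice_prob and all parameter values are hypothetical) illustrates how a simulated choice probability in an RPL-style model is approximated by averaging over Halton-based draws of a normally distributed coefficient, and how the approximation depends on the number of draws.

```python
"""Illustrative sketch (not from the report): effect of the number of
Halton draws on a simulated choice probability in a mixed (RPL) logit.
All parameter values below are made up for demonstration."""
import numpy as np
from scipy.stats import norm

def halton(n_draws: int, base: int = 2) -> np.ndarray:
    """Return the first n_draws points of the Halton sequence in (0, 1)."""
    draws = np.empty(n_draws)
    for i in range(n_draws):
        f, x, k = 1.0, 0.0, i + 1      # start at 1 so no draw equals 0
        while k > 0:
            f /= base
            x += f * (k % base)
            k //= base
        draws[i] = x
    return draws

def simulated_choice_prob(mu, sigma, x_a, x_b, n_draws):
    """Average the logit probability of alternative A over Halton-based
    draws of a normally distributed coefficient beta ~ N(mu, sigma)."""
    beta = mu + sigma * norm.ppf(halton(n_draws))   # uniform -> normal
    v_a, v_b = beta * x_a, beta * x_b               # utilities of A and B
    return np.mean(np.exp(v_a) / (np.exp(v_a) + np.exp(v_b)))

# Hypothetical attribute levels and preference distribution.
for r in (50, 500, 5000, 20000):
    p = simulated_choice_prob(mu=0.5, sigma=1.0, x_a=2.0, x_b=1.0, n_draws=r)
    print(f"{r:>6} draws: simulated P(choose A) = {p:.4f}")
```

Rerunning the loop with increasing numbers of draws shows the simulated probability settling toward a stable value; checking for that kind of stability is the sensitivity analysis the summary recommends when deciding how many draws to use for a given number of random parameters.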
Physical Description: 1 PDF file (142 pages), illustrations
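The respondent-level (cluster) bootstrap that the report evaluates can be sketched as follows. This is an illustrative Python example on simulated data, not the authors' code; the data-generating values, the simple conditional logit fitted with scipy.optimize, and the helper names neg_loglik and fit are assumptions made for demonstration.

```python
"""Illustrative sketch (not from the report): bootstrapping at the level of
the respondent to get confidence intervals for a conditional (multinomial)
logit fitted to repeated choice tasks."""
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
N_RESP, N_TASKS, N_ALT, N_ATTR = 200, 8, 3, 2
TRUE_BETA = np.array([1.0, -0.5])

# Simulate choices with respondent-level preference heterogeneity,
# so observations from the same person are correlated.
X = rng.normal(size=(N_RESP, N_TASKS, N_ALT, N_ATTR))
beta_i = TRUE_BETA + rng.normal(scale=0.5, size=(N_RESP, 1, 1, N_ATTR))
utility = (X * beta_i).sum(axis=-1) + rng.gumbel(size=(N_RESP, N_TASKS, N_ALT))
choice = utility.argmax(axis=-1)                      # (N_RESP, N_TASKS)

def neg_loglik(beta, X, choice):
    """Negative log-likelihood of a fixed-coefficient conditional logit."""
    v = X @ beta                                      # (resp, task, alt)
    v -= v.max(axis=-1, keepdims=True)                # numerical stability
    logp = v - np.log(np.exp(v).sum(axis=-1, keepdims=True))
    chosen = np.take_along_axis(logp, choice[..., None], axis=-1)
    return -chosen.sum()

def fit(X, choice):
    res = minimize(neg_loglik, x0=np.zeros(N_ATTR), args=(X, choice),
                   method="BFGS")
    return res.x

beta_hat = fit(X, choice)

# Cluster bootstrap: resample whole respondents (all their tasks together),
# refit, and take percentile confidence intervals.
B = 200
boot = np.empty((B, N_ATTR))
for b in range(B):
    idx = rng.integers(0, N_RESP, size=N_RESP)
    boot[b] = fit(X[idx], choice[idx])
ci_low, ci_high = np.percentile(boot, [2.5, 97.5], axis=0)

for j in range(N_ATTR):
    print(f"beta[{j}]: estimate {beta_hat[j]:.3f}, "
          f"95% bootstrap CI [{ci_low[j]:.3f}, {ci_high[j]:.3f}]")
```

Resampling whole respondents keeps each person's repeated tasks together, which is the feature intended to address within-person correlation; the report found that, in its simulated settings with preference heterogeneity, this approach did not bring confidence interval coverage closer to the nominal 95% than conventional variance estimation.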