The online version of this article (doi:10.1007/s11136-015-1115-3) contains supplementary material, which is available to authorized users.
On behalf of the MAUCa Consortium.
An erratum to this article is available at http://dx.doi.org/10.1007/s11136-017-1546-0.
An erratum to this article is available at http://dx.doi.org/10.1007/s11136-016-1289-3.
To assess the feasibility of using a discrete choice experiment (DCE) to value health states within the QLU-C10D, a utility instrument derived from the QLQ-C30, and to assess clarity, difficulty, and respondent preference between two presentation formats.
We ran a DCE valuation task in an online panel (N = 430). Respondents answered 16 choice pairs; in half of these, differences between dimensions were highlighted, and in the remainder, common dimensions were described in text and differing attributes were tabulated. To simplify the cognitive task, only four of the QLU-C10D’s ten dimensions differed per choice set. We assessed difficulty and clarity of the valuation task with Likert-type scales, and respondents were asked which format they preferred. We analysed the DCE data by format with a conditional logit model and used Chi-squared tests to compare other responses by format. Semi-structured telephone interviews (N = 8) explored respondents’ cognitive approaches to the valuation task.
Four hundred and forty-nine individuals were recruited, 430 completed at least one choice set, and 422/449 (94%) completed all 16 choice sets. Interviews revealed that respondents found the ten-dimension task difficult but manageable, with many adopting simplifying heuristics. Ratings of clarity and difficulty were identical between formats, but the "highlight" format was preferred by 68% of respondents. Conditional logit parameter estimates were monotonic within domains, suggesting respondents were able to complete the DCE sensibly, yielding valid results.
A DCE valuation task in which only four of the QLU-C10D’s ten dimensions differed in any choice set is feasible for deriving utility weights for the QLU-C10D.
Using a discrete choice experiment to value the QLU-C10D: feasibility and sensitivity to presentation format
N. K. Aaronson
J. E. Brazier
D. S. J. Costa
P. M. Fayers
A. S. Pickard
D. J. Street
T. A. Young
M. T. King
Springer International Publishing