Simulation study to determine the impact of different design features on design efficiency in discrete choice experiments

Thuva Vanniyasingam et al. BMJ Open. 2016 Jul 19;6(7):e011985. doi: 10.1136/bmjopen-2016-011985

Abstract

Objectives: Discrete choice experiments (DCEs) are routinely used to elicit patient preferences to improve health outcomes and healthcare services. While many fractional factorial designs can be created, some are more statistically optimal than others. The objective of this simulation study was to investigate how varying the number of (1) attributes, (2) levels within attributes, (3) alternatives and (4) choice tasks per survey affects the statistical efficiency of an experimental design.

Design and methods: A total of 3204 DCE designs were created to assess how relative design efficiency (d-efficiency) is influenced by varying the number of choice tasks (2-20), alternatives (2-5), attributes (2-20) and attribute levels (2-5) of a design. Choice tasks were created by randomly allocating attribute and attribute level combinations into alternatives.
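The random allocation described above can be sketched as follows. This is our interpretation of the stated procedure, not the authors' code: each alternative in each choice task is a randomly drawn combination of attribute levels, and the function name and parameters are illustrative.

```python
import itertools
import random

def random_dce_design(n_attributes, n_levels, n_alternatives, n_choice_tasks, seed=0):
    """Sketch of random DCE design generation: each choice task pairs
    n_alternatives distinct profiles, where a profile is one combination
    of levels across all attributes."""
    rng = random.Random(seed)
    # Enumerate the full factorial set of attribute-level combinations.
    profiles = list(itertools.product(range(n_levels), repeat=n_attributes))
    design = []
    for _ in range(n_choice_tasks):
        # Randomly allocate distinct profiles to the alternatives of this task.
        task = rng.sample(profiles, n_alternatives)
        design.append(task)
    return design

# Example: 3 two-level attributes, 2 alternatives per task, 4 choice tasks.
design = random_dce_design(n_attributes=3, n_levels=2, n_alternatives=2, n_choice_tasks=4)
for i, task in enumerate(design, 1):
    print(f"Task {i}: {task}")
```

Sampling without replacement within a task avoids presenting identical alternatives, but this sketch does not impose the level-balance or orthogonality constraints that purpose-built design software would.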

Outcome: Relative d-efficiency was used to measure the optimality of each DCE design.
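For readers unfamiliar with the metric, d-efficiency can be sketched with the standard definition D = 100·|XᵀX/N|^(1/p), where X is the N×p coded design matrix; an orthogonal effects-coded design scores 100%. The abstract does not state the exact formula the authors used, so treat this as an illustrative computation.

```python
import numpy as np

def d_efficiency(X):
    """D-efficiency (%) of a design matrix X with N runs and p columns,
    using the common definition D = 100 * |X'X / N|^(1/p)."""
    n, p = X.shape
    info = X.T @ X / n                 # scaled information matrix
    det = np.linalg.det(info)
    if det <= 0:
        return 0.0                     # singular design: some effect is inestimable
    return 100.0 * det ** (1.0 / p)

# A 2^3 full factorial in +/-1 effects coding is orthogonal, so D = 100%.
levels = np.array([-1.0, 1.0])
full = np.array([[a, b, c] for a in levels for b in levels for c in levels])
print(round(d_efficiency(full), 1))    # 100.0

# An arbitrary 5-run subset loses orthogonality and scores below 100%.
print(d_efficiency(full[:5]) < 100.0)  # True
```

The determinant of the scaled information matrix rewards designs whose columns are balanced and uncorrelated, which is why efficiency drops as attributes and levels crowd a fixed number of choice tasks.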

Results: DCE design complexity influenced statistical efficiency. Across all designs, relative d-efficiency decreased as the number of attributes and attribute levels increased, and increased for designs with more alternatives. Lastly, relative d-efficiency converged as the number of choice tasks increased, though not necessarily to 100% statistical optimality.

Conclusions: Achieving 100% d-efficiency is heavily dependent on the number of attributes, attribute levels, choice tasks and alternatives. Further exploration of overlaps and block sizes is needed. This study's results are widely applicable for researchers interested in creating optimal DCE designs to elicit individual preferences on health services, programmes, policies and products.

Keywords: conjoint analysis; design efficiency; discrete choice experiment; patient preferences.


Figures

Figure 1
(A–D) Relative d-efficiencies (%) of designs with two, three, four and five alternatives, respectively, across 2–20 attributes, 2–5 attribute levels and 20 choice sets each.
Figure 2
(A–D) The effect of 2–5, 6–10, 11–15 and 16–20 attributes, respectively, on relative d-efficiency (%) across different choice tasks for designs with two alternatives and two-level attributes.
Figure 3
The effect of 6–10 attributes on relative d-efficiency (%) across different choice tasks for designs with two alternatives and three-level attributes.

