A Tutorial on Adaptive Design Optimization

Jay I Myung et al. J Math Psychol. 2013 Jun;57(3-4):53-67. doi: 10.1016/j.jmp.2013.05.005.

Abstract

Experimentation is ubiquitous in psychology and fundamental to the advancement of its science. One of the biggest challenges researchers face is designing experiments that can conclusively discriminate the theoretical hypotheses or models under investigation. Recognition of this challenge has led to the development of sophisticated statistical methods that aid in the design of experiments and that are within the reach of everyday experimental scientists. This tutorial introduces the reader to an implementable experimentation methodology, dubbed Adaptive Design Optimization, that can help scientists conduct "smart" experiments that are maximally informative and highly efficient, which in turn should accelerate scientific discovery in psychology and beyond.


Figures

Figure 1
Sample power (POW) and exponential (EXP) functions, generated from a narrow range of model parameters (see text). The time intervals between 1 and 5 seconds, where the models are most discriminable, are indicated by the blue circles. In contrast, the green ellipses indicate the time intervals (15-20 seconds) that offer the least discriminability.
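To reproduce the intuition behind Figure 1, the Python sketch below draws power and exponential retention curves from a narrow parameter range and scores how well each candidate retention interval separates the two families of predictions. The functional forms a(t + 1)^(-b) and a*exp(-b*t), the parameter ranges, and the discriminability index are assumptions of this sketch, not the paper's exact specification.

import numpy as np

rng = np.random.default_rng(0)

# Assumed functional forms for the two retention models; the paper's exact
# parameterization may differ.
def pow_model(t, a, b):
    return a * (t + 1.0) ** (-b)      # power-law forgetting

def exp_model(t, a, b):
    return a * np.exp(-b * t)         # exponential forgetting

# Draw many curves from a narrow range of parameters, as in Figure 1.
n = 200
a = rng.uniform(0.7, 0.9, n)
b = rng.uniform(0.3, 0.5, n)
lags = np.arange(1, 21)               # candidate retention intervals (s)

pow_pred = pow_model(lags[None, :], a[:, None], b[:, None])
exp_pred = exp_model(lags[None, :], a[:, None], b[:, None])

# Crude discriminability index: separation of the two prediction clouds
# relative to their spread, at each candidate interval.
sep = np.abs(pow_pred.mean(axis=0) - exp_pred.mean(axis=0))
spread = pow_pred.std(axis=0) + exp_pred.std(axis=0)
for t, s in zip(lags, sep / spread):
    print(f"t = {int(t):2d} s   discriminability index = {s:.2f}")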
Figure 2
Schematic illustration of the traditional versus the adaptive experimentation paradigm. (a) In the traditional paradigm, the vertical arrow on the left represents optimization of the values of the design variables before data collection, and the vertical arrow on the right represents the analysis and modeling of the collected data, using model selection or parameter estimation methods, for example. (b) In the adaptive paradigm, the three parts of experimentation (design optimization, experiment, and data modeling) are closely integrated to form a cycle of inference steps in which the output of one part is fed as input to the next.
Figure 3
Schematic illustration of the steps involved in adaptive design optimization (ADO).
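The cycle in Figure 3 can be made concrete with a minimal, grid-based simulation of the three ADO steps: optimize the design, run a trial, and update beliefs by Bayes' rule. The sketch below assumes Bernoulli recall outcomes, the power and exponential forms used above, and mutual information between the model indicator and the outcome as the utility; the paper's framework allows more general utility functions and computational methods.

import numpy as np

rng = np.random.default_rng(1)

# Parameter grids and uniform priors for each model (illustrative ranges).
a_grid, b_grid = np.meshgrid(np.linspace(0.05, 0.95, 19),
                             np.linspace(0.05, 0.95, 19))
a_grid, b_grid = a_grid.ravel(), b_grid.ravel()

def pow_pred(t):    # P(recall | t, a, b) under the power model (assumed form)
    return a_grid * (t + 1.0) ** (-b_grid)

def exp_pred(t):    # P(recall | t, a, b) under the exponential model (assumed form)
    return a_grid * np.exp(-b_grid * t)

models = [pow_pred, exp_pred]
param_post = [np.full(a_grid.size, 1.0 / a_grid.size) for _ in models]
model_prob = np.array([0.5, 0.5])      # prior model probabilities
designs = np.arange(1, 21)             # candidate retention intervals (s)

def true_prob(t):                      # data-generating process: power, a = 0.8, b = 0.4
    return 0.8 * (t + 1.0) ** (-0.4)

for stage in range(10):
    # Step 1 -- design optimization: expected information gain about the model
    # indicator (mutual information between model and outcome), per design.
    utility = []
    for t in designs:
        p_m = np.array([w @ f(t) for f, w in zip(models, param_post)])  # P(y=1 | m, t)
        p = model_prob @ p_m                                            # P(y=1 | t)
        u = 0.0
        for pm, prm in zip(p_m, model_prob):
            for py_m, py in ((pm, p), (1.0 - pm, 1.0 - p)):
                if py_m > 0:
                    u += prm * py_m * np.log(py_m / py)
        utility.append(u)
    t_star = int(designs[int(np.argmax(utility))])

    # Step 2 -- experiment: observe a Bernoulli recall outcome at the chosen lag.
    y = rng.random() < true_prob(t_star)

    # Step 3 -- Bayesian updating of parameter posteriors and model probabilities.
    marg_like = []
    for i, f in enumerate(models):
        p1 = f(t_star)
        lik = p1 if y else 1.0 - p1
        marg_like.append(param_post[i] @ lik)
        param_post[i] = param_post[i] * lik
        param_post[i] /= param_post[i].sum()
    model_prob = model_prob * np.array(marg_like)
    model_prob /= model_prob.sum()
    print(f"stage {stage + 1:2d}: t = {t_star:2d} s, y = {int(y)}, "
          f"P(power model) = {model_prob[0]:.3f}")

Each pass through the loop corresponds to one traversal of the cycle shown in Figure 3.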
Figure 4
The grid search algorithm.
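As a toy instance of the grid search idea in Figure 4, the snippet below discretizes a one-dimensional design space (the retention interval), evaluates the expected utility at every grid point, and keeps the maximizer. The single exponential model with a fixed intercept, the uniform parameter grid, and expected information gain as the utility are assumptions made for brevity.

import numpy as np

theta = np.linspace(0.05, 0.95, 50)            # grid over the decay rate b
prior = np.full(theta.size, 1.0 / theta.size)  # uniform prior over the grid
designs = np.linspace(0.5, 20.0, 40)           # grid of candidate lags (s)

def recall_prob(t, b, a=0.8):
    return a * np.exp(-b * t)                  # assumed exponential model

best_u, best_t = -np.inf, None
for t in designs:
    p1 = recall_prob(t, theta)                 # P(y = 1 | theta, t) on the grid
    u = 0.0
    for py_th in (p1, 1.0 - p1):               # outcomes y = 1 and y = 0
        marg = prior @ py_th                   # P(y | t), marginalized over theta
        u += prior @ (py_th * np.log(py_th / marg))   # I(theta; y | t)
    if u > best_u:
        best_u, best_t = u, t

print(f"grid search picks t = {best_t:.1f} s "
      f"(expected information gain = {best_u:.3f} nats)")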
Figure 5
Illustration of sequential Monte Carlo search design optimization with simulated annealing.
Figure 6
The SMC search algorithm.
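Figures 5 and 6 concern searching large design spaces by sequential Monte Carlo with simulated annealing: candidate designs are treated as samples from a density proportional to the utility raised to a power, and that power is increased over iterations so the particle population collapses onto the utility-maximizing design. The toy sketch below uses a hand-built utility surface in place of the expected information gain that ADO would estimate, and a simple resample-and-jitter move; it illustrates the annealing idea rather than the full algorithm of Figure 6.

import numpy as np

rng = np.random.default_rng(2)

# Stand-in utility surface over a one-dimensional design space; in ADO this
# would be an expected information gain estimated by Monte Carlo.
def utility(d):
    return (np.exp(-0.5 * ((d - 3.0) / 1.5) ** 2)
            + 0.6 * np.exp(-0.5 * ((d - 12.0) / 2.0) ** 2))

n_particles, lo, hi = 500, 0.0, 20.0
d = rng.uniform(lo, hi, n_particles)               # initial particle population

for gamma in np.linspace(1.0, 20.0, 15):
    w = utility(d) ** gamma                        # anneal: sharpen toward the maximum
    w /= w.sum()
    idx = rng.choice(n_particles, n_particles, p=w)                  # resample high-utility designs
    d = np.clip(d[idx] + rng.normal(0.0, 0.3, n_particles), lo, hi)  # jitter to keep diversity

print(f"particles concentrate near d = {np.median(d):.2f} "
      f"(the toy utility peaks at d = 3)")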
Figure 7
Heat map of the data-generating model in the ADO simulation. Darker colors indicate regions of higher probability.
Figure 8
Heat maps of the Power model (left) and Exponential model (right) representing ADO’s prior estimates of each model. Darker colors indicate regions of higher probability. The lag time of t = 7 (blue rectangle) is chosen for testing in the first stage because it is the place where the two models differ the most, based on the priors.
Figure 9
Heat maps of the Power model (left) and Exponential model (right) representing ADO’s estimates of each model after the first stage of testing (prior to the second stage). Estimates have converged around the observed data point (white dot in each heat map). ADO selects t = 1 (blue rectangle) for testing in Stage 2 because it is the place where the two models differ the most, based on these updated estimates.
Figure 10
Heat maps of the Power model (left) and Exponential model (right) representing ADO’s estimates of each model after ten stages of testing. Both models attempt to fit the observed data points (white dots) as well as possible, but the exponential model cannot do so as well as the power model. The difference is so extreme that the power model is over 1000 times more likely than the exponential model to have generated this pattern of data.
Figure 11
Posterior model probability curve from a sample run of the ADO simulation experiment. The data were generated from the power model with parameters a = 0.80 and b = 0.40. See the text for additional details of the simulation.
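The posterior model probability plotted in Figure 11 can be obtained, in simplified form, by comparing grid-based marginal likelihoods of the two models on data simulated from the power model with a = 0.80 and b = 0.40. The lags, number of trials per lag, uniform parameter priors, and equal prior model odds in the sketch below are assumptions for illustration.

import numpy as np

rng = np.random.default_rng(3)

# Grid over the parameters (a, b) shared by both models, with uniform priors.
a_grid, b_grid = np.meshgrid(np.linspace(0.05, 0.95, 30),
                             np.linspace(0.05, 0.95, 30))
a_grid, b_grid = a_grid.ravel(), b_grid.ravel()
prior = np.full(a_grid.size, 1.0 / a_grid.size)

def pow_p(t):
    return a_grid * (t + 1.0) ** (-b_grid)   # assumed power-model form

def exp_p(t):
    return a_grid * np.exp(-b_grid * t)      # assumed exponential-model form

# Simulated experiment: n Bernoulli recall trials at each of a few lags,
# generated from the power model with a = 0.8 and b = 0.4.
lags, n = np.array([1, 3, 7, 12, 18]), 20
y = rng.binomial(n, 0.8 * (lags + 1.0) ** (-0.4))   # observed successes per lag

def log_marginal(pred):
    # log p(data | model), integrating the parameter grid against the prior.
    # The binomial coefficient is omitted because it cancels between models.
    loglik = np.zeros(a_grid.size)
    for t, k in zip(lags, y):
        p = pred(t)
        loglik += k * np.log(p) + (n - k) * np.log(1.0 - p)
    m = loglik.max()
    return m + np.log(prior @ np.exp(loglik - m))    # log-sum-exp for stability

lm_pow, lm_exp = log_marginal(pow_p), log_marginal(exp_p)
post_pow = 1.0 / (1.0 + np.exp(lm_exp - lm_pow))     # equal prior model odds
print(f"Bayes factor (power vs exponential): {np.exp(lm_pow - lm_exp):.1f}")
print(f"posterior probability of the power model: {post_pow:.3f}")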
