Review

Convex optimization algorithms in medical image reconstruction-in the age of AI

Jingyan Xu et al. Phys Med Biol. 2022 Mar 23;67(7). doi: 10.1088/1361-6560/ac3842.

Abstract

The past decade has seen the rapid growth of model based image reconstruction (MBIR) algorithms, which are often applications or adaptations of convex optimization algorithms from the optimization community. We review some state-of-the-art algorithms that have enjoyed wide popularity in medical image reconstruction, emphasize known connections between different algorithms, and discuss practical issues such as computation and memory cost. More recently, deep learning (DL) has forayed into medical imaging, where the latest developments try to exploit the synergy between DL and MBIR to elevate MBIR's performance. We present existing approaches and emerging trends in DL-enhanced MBIR methods, with particular attention to the underlying role of convexity and convex algorithms in network architecture. We also discuss how convexity can be employed to improve the generalizability and representation power of DL networks in general.

Keywords: artificial intelligence; convex optimization; deep learning (DL); first order methods; inverse problems; machine learning (ML); model based image reconstruction.


Figures

Figure 5.
(a) When τ ≤ 1 and x̃ ≤ τ, the objective ϕ continuously increases as a function of x. There is a global minimizer at x = 0. (b) When τ ≤ 1 and x̃ > τ, there is a unique intersection point (the filled red marker) between the two gradient curves f(x) and q(x).
Figure 6.
(a) If τ > 1 and x̃ > τ, there is a unique intersection between f(x) (blue curve) and q(x) (green line), indicated by the filled red marker. (b) If τ > 1 and x̃ < x̃_t, there is no intersection between f(x) and q(x); the solution to (8.19) is x = 0. Here x_t = √τ - 1 and x̃_t = 2√τ - 1.
Figure 7.
Two cases when x̃_t < x̃ ≤ τ. The intersections between the blue curve f(x) and the green line q(x) are marked by the open and the filled red markers; the former indicates a local maximum, the latter a local minimum. There is another local minimum at x = 0. (a) When area A > area B, the global minimizer of (8.19) is at x = 0. (b) When area A < area B, the global minimizer is at x = x*, the second (larger) intersection point. The critical point x̃ = x̃_c separating the two cases is where area A = area B.
Figure 8.
The thresholding solution given by (8.20); the solution for x̃ < 0 is appended by symmetry. (a) If τ ≤ 1, the objective (8.19) is convex and the solution x* is a continuous function of x̃. (b) If τ > 1, the objective (8.19) is nonconvex and the solution x* has a jump at x̃ = x̃_c, given by (8.22).
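The thresholding map of figure 8 can be computed directly. Below is a minimal sketch, assuming (8.19) is the scalar log-penalty problem min_{x ≥ 0} ½(x - x̃)² + τ log(1 + x); this assumed form is consistent with the tangency values x_t = √τ - 1 and x̃_t = 2√τ - 1 quoted in figure 6, but it is an assumption, not the paper's own equation. Instead of the closed-form break point (8.22), the sketch compares the objective at the nonzero stationary point against the objective at x = 0, which is exactly the area A versus area B test of figure 7.

```python
import numpy as np

def prox_log_penalty(x_tilde, tau):
    """Thresholding map of figure 8 under the assumed log penalty:
    argmin_x 0.5 * (x - x_tilde)**2 + tau * log(1 + |x|).
    Solved for |x_tilde| on x >= 0; the sign is restored by symmetry."""
    s, y = np.sign(x_tilde), abs(x_tilde)
    # Stationary points satisfy f(x) = q(x), i.e. tau / (1 + x) = y - x,
    # equivalently x**2 + (1 - y) * x + (tau - y) = 0.
    disc = (y + 1.0) ** 2 - 4.0 * tau
    if disc < 0.0:
        return 0.0      # y < x_tilde_t = 2*sqrt(tau) - 1: no intersection (figure 6(b))
    x_star = 0.5 * ((y - 1.0) + np.sqrt(disc))  # larger root: candidate local minimum
    if x_star <= 0.0:
        return 0.0      # candidate infeasible: global minimizer is x = 0 (figure 5(a))
    # Area A vs area B test (figure 7): keep the candidate only if it beats x = 0.
    phi = lambda x: 0.5 * (x - y) ** 2 + tau * np.log1p(x)
    return s * x_star if phi(x_star) < phi(0.0) else 0.0
```

Sweeping x̃ over a grid and plotting prox_log_penalty(x̃, τ) reproduces figure 8: a continuous curve for τ ≤ 1 and a jump at x̃ = x̃_c for τ > 1.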
Figure 1.
(a) An iterative algorithm where the data consistency (DC) term and the regularizer (Reg) connect in series. The loop sign (green) indicates the recurrent nature of the iterations. (b) A variational network (VN) unrolls an iterative algorithm and replaces the regularizers by CNNs. The multiple CNNs can share weights (θ_k = θ for all k) or have different weights, although the former adheres more closely to the recurrent nature of an iterative algorithm. The serial connection in (a) can model algorithms such as proximal gradient or alternating update schemes (Liang et al 2019). Parallel connection is also possible, e.g. as in gradient descent, which gives rise to different VN architectures (Liang et al 2019).
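To make the unrolling in (b) concrete, here is a minimal PyTorch sketch of a proximal-gradient-style VN: each unrolled iteration takes a gradient step on the DC term ½‖Ax - y‖², then applies a CNN in place of the proximal/regularization step. The toy matrix forward model A, the network size, and the class name are illustrative assumptions, not the architecture of any specific paper.

```python
import torch
import torch.nn as nn

class UnrolledVN(nn.Module):
    """K unrolled iterations of x <- CNN_k(x - alpha * A^T (A x - y))."""

    def __init__(self, A, num_iters=5, share_weights=True):
        super().__init__()
        self.register_buffer("A", A)                  # (m, n) toy forward model
        self.alpha = nn.Parameter(torch.tensor(0.1))  # learnable step size

        def make_cnn():
            return nn.Sequential(
                nn.Conv1d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
                nn.Conv1d(16, 1, kernel_size=3, padding=1))

        if share_weights:   # theta_k = theta for all k: the recurrent form
            cnn = make_cnn()
            self.cnns = nn.ModuleList([cnn] * num_iters)
        else:               # separate weights per unrolled iteration
            self.cnns = nn.ModuleList([make_cnn() for _ in range(num_iters)])

    def forward(self, y, x0):
        x = x0                                        # (batch, n) initial image
        for cnn in self.cnns:
            residual = x @ self.A.t() - y             # A x - y, shape (batch, m)
            z = x - self.alpha * (residual @ self.A)  # gradient step on the DC term
            x = cnn(z.unsqueeze(1)).squeeze(1)        # CNN replaces the regularizer
        return x
```

Training proceeds end to end: feed (y, x0) pairs, compare the output with reference images under some loss, and backpropagate through all unrolled iterations.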
Figure 2.
Using a CNN to parametrize the unknown image x, as proposed in (Gong et al 2018a). The output of the CNN, which is pretrained to perform image denoising, is the reconstructed image. Image reconstruction is formulated as minimizing the loss function with respect to z or θ.
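As a concrete illustration of this formulation, the fragment below fits x = CNN_θ(z) to the measurements by gradient descent on θ with the input z held fixed. The quadratic data term and the matrix forward model A are simplifying assumptions for the sketch; they are not the objective or system model used in the cited work.

```python
import torch

def reconstruct(net, z, A, y, num_steps=200, lr=1e-3):
    """Fit a CNN-parametrized image to the data:
    min_theta 0.5 * ||A net(z) - y||^2, with z fixed."""
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(num_steps):
        opt.zero_grad()
        x = net(z).flatten()                  # reconstructed image = CNN output
        loss = 0.5 * torch.sum((A @ x - y) ** 2)
        loss.backward()                       # backpropagate through the CNN
        opt.step()
    return net(z).detach()                    # final reconstruction
```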
Figure 3.
The hyperparameter learning framework proposed in (Xu and Noo 2021). The CNN, parametrized by θ, generates the patient-specific and spatially variant hyperparameter η needed for optimization-based image reconstruction. End-to-end learning requires backpropagating the gradient from the loss to the CNN parameter θ. During testing/inference, the image reconstruction module can run outside of a DL library.
Figure 4.
A convex optimization layer (COL) outputs the solution of a convex optimization problem f(x; θ), where θ lumps together both the input y and the nuisance parameters η. A COL can be embedded as a component in a larger network. End-to-end training of such networks requires differentiation through the argmin.
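Differentiable convex optimization layers of exactly this kind are implemented in the cvxpylayers package of Agrawal et al (2019a), cited in the references below. The sketch that follows builds a toy COL solving a small LASSO-type problem; the objective, problem sizes, and variable names are illustrative assumptions. The same mechanism is one way to realize the end-to-end backpropagation required in figure 3.

```python
import cvxpy as cp
import torch
from cvxpylayers.torch import CvxpyLayer

# Toy COL: x*(theta) = argmin_x ||A x - y||^2 + eta * ||x||_1,
# where theta = (A, y, eta) lumps the input y and the nuisance parameter eta.
m, n = 8, 5
A = cp.Parameter((m, n))
y = cp.Parameter(m)
eta = cp.Parameter(nonneg=True)
x = cp.Variable(n)
problem = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - y) + eta * cp.norm1(x)))
layer = CvxpyLayer(problem, parameters=[A, y, eta], variables=[x])

# The forward pass solves the problem; backward differentiates through the argmin.
A_t = torch.randn(m, n, requires_grad=True)
y_t = torch.randn(m, requires_grad=True)
eta_t = torch.tensor(0.1, requires_grad=True)
x_star, = layer(A_t, y_t, eta_t)
x_star.sum().backward()    # gradients w.r.t. A_t, y_t and eta_t
```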

References

    1. Abdalah M, Mitra D, Boutchko R and Gullberg GT 2013 Optimization of regularization parameter in a reconstruction algorithm 2013 IEEE Nuclear Science Symposium and Medical Imaging Conference (Seoul, South Korea, 27 October–2 November 2013) (Piscataway, NJ: IEEE) pp 1–4
    2. Adler J and Öktem O 2018 Learned primal-dual reconstruction IEEE Trans. Med. Imaging 37 1322–32
    3. Agrawal A, Amos B, Barratt S, Boyd S, Diamond S and Kolter Z 2019a Differentiable convex optimization layers Proceedings of 2019 Advances in Neural Information Processing Systems 32 pp 9562–74 arXiv:1910.12430
    4. Agrawal A, Barratt S, Boyd S, Busseti E and Moursi WM 2019b Differentiating through a cone program Journal of Applied and Numerical Optimization 1 107–15 (http://jano.biemdas.com/archives/931)
    5. Aggarwal HK and Jacob M 2020 J-MoDL: joint model-based deep learning for optimized sampling and reconstruction IEEE Journal of Selected Topics in Signal Processing 14 1151–62
