An Efficient Preconditioner for Stochastic Gradient Descent Optimization of Image Registration

Yuchuan Qiao, Boudewijn P.F. Lelieveldt, Marius Staring*

*Corresponding author for this work

Research output: Contribution to journal › Article › Scientific › peer-review

7 Citations (Scopus)


Stochastic gradient descent (SGD) is commonly used to solve (parametric) image registration problems. For badly scaled problems, however, SGD exhibits only sublinear convergence. In this paper, we propose an efficient preconditioner estimation method to improve the convergence rate of SGD. Based on the observed distribution of voxel displacements in the registration, we estimate the diagonal entries of a preconditioning matrix, thus rescaling the optimization cost function. The preconditioner is efficient to compute and employ, and can be used for mono-modal as well as multi-modal cost functions, in combination with different transformation models such as the rigid, affine, and B-spline models. Experiments on different clinical datasets show that the proposed method indeed improves the convergence rate compared with SGD, with speedups around 25 in all tested settings, while retaining the same level of registration accuracy.
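To illustrate the core idea of the abstract, the sketch below shows diagonally preconditioned SGD on a badly scaled toy problem. This is a minimal illustration, not the paper's method: the displacement-based estimation of the diagonal entries is replaced by an assumed, idealized preconditioner, and `grad_fn`, `precond_diag`, and the quadratic test function are hypothetical names introduced here for demonstration only.

```python
import numpy as np

def preconditioned_sgd(grad_fn, theta0, precond_diag, lr=1.0, n_iter=100):
    """Minimal sketch of diagonally preconditioned (stochastic) gradient descent.

    grad_fn(theta) returns a (stochastic) gradient of the cost function;
    precond_diag holds assumed diagonal entries of the preconditioning
    matrix P, which rescale each parameter's update elementwise.
    """
    theta = theta0.copy()
    for _ in range(n_iter):
        g = grad_fn(theta)
        theta = theta - lr * precond_diag * g  # elementwise rescaling by diag(P)
    return theta

# Toy usage: a quadratic cost with condition number 100, where plain SGD
# with a single step size converges slowly along the flat direction.
A = np.diag([1.0, 100.0])
grad = lambda th: A @ th          # deterministic gradient, for illustration
P = 1.0 / np.diag(A)              # idealized diagonal preconditioner
theta = preconditioned_sgd(grad, np.array([1.0, 1.0]), P, lr=0.5, n_iter=50)
# With P = diag(A)^{-1}, every coordinate contracts by the same factor per step.
```

With the idealized choice P = diag(A)^{-1}, both coordinates shrink by the factor (1 - lr) each iteration, so the badly scaled problem behaves like a well-conditioned one; this rescaling of the cost function is the effect the paper's estimated preconditioner aims for.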

Original language: English
Article number: 8638803
Pages (from-to): 2314-2325
Number of pages: 12
Journal: IEEE Transactions on Medical Imaging
Issue number: 10
Publication status: Published - 2019


  • image registration
  • optimization
  • preconditioning
  • stochastic gradient descent


