MIRTorch


A PyTorch-based differentiable Image Reconstruction Toolbox, developed at the University of Michigan.

The work is inspired by MIRT, a widely used toolbox for medical image reconstruction.

The overarching goal is to provide fast iterative and data-driven image reconstruction across CPUs and GPUs. Researchers can rapidly develop new model-based and learning-based methods (i.e., unrolled neural networks) with convenient abstraction layers. With the full support of auto-differentiation, one may optimize imaging protocols and image reconstruction parameters with gradient methods.

Documentation: https://mirtorch.readthedocs.io/en/latest/


Installation

We recommend installing PyTorch first. To install the MIRTorch package, clone the repo and run python setup.py install.

requirements.txt details the package dependencies. We recommend installing pytorch_wavelets directly from its source repository rather than via pip.


Features

Linear maps

The LinearMap class overloads common matrix operations, such as +, -, and *.
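To illustrate the idea behind this operator algebra, here is a minimal self-contained sketch (a toy class, not MIRTorch's actual LinearMap implementation) showing how + and * can be overloaded so that sums, compositions, and applications of linear operators read like matrix algebra:

```python
import torch

class ToyLinearMap:
    """Toy sketch of a linear-operator wrapper (not MIRTorch's actual class)."""
    def __init__(self, apply_fn):
        self.apply_fn = apply_fn

    def __mul__(self, other):
        if isinstance(other, ToyLinearMap):
            # Composition of operators: (A * B)(x) = A(B(x))
            return ToyLinearMap(lambda x: self.apply_fn(other.apply_fn(x)))
        # Application of the operator to a tensor
        return self.apply_fn(other)

    def __add__(self, other):
        # Sum of operators: (A + B)(x) = A(x) + B(x)
        return ToyLinearMap(lambda x: self.apply_fn(x) + other.apply_fn(x))

    def __sub__(self, other):
        return ToyLinearMap(lambda x: self.apply_fn(x) - other.apply_fn(x))

A = ToyLinearMap(lambda x: 2.0 * x)    # scaling operator
B = ToyLinearMap(lambda x: x.flip(0))  # reversal operator
x = torch.arange(4.0)
y = (A + B) * x   # elementwise: 2*x plus reversed(x)
z = (A * B) * x   # composition: 2 * reversed(x)
```

MIRTorch's own LinearMap instances additionally carry shapes, devices, and adjoints, but the overloading pattern is the same.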

Instances include basic linear operations (such as convolution), classical image processing operations, and MRI system matrices (Cartesian and non-Cartesian, with sensitivity- and B0-informed system models). We are planning to add support for PET/CT this year.

Since the Jacobian of a linear operator is the operator itself, the toolbox can compute these Jacobians explicitly during backpropagation, avoiding the large caching cost that generic auto-differentiation requires.
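A minimal sketch of this idea, assuming a forward operator and its adjoint are available as callables (this is illustrative, not MIRTorch's internal code): a custom torch.autograd.Function whose backward pass applies the adjoint directly, so no intermediate activations need to be cached.

```python
import torch

class LinearOpFn(torch.autograd.Function):
    """Apply a linear operator with an explicit adjoint in the backward pass,
    so autograd need not cache intermediates from the forward computation."""

    @staticmethod
    def forward(ctx, x, forward_op, adjoint_op):
        ctx.adjoint_op = adjoint_op  # store only the adjoint closure, no tensors
        return forward_op(x)

    @staticmethod
    def backward(ctx, grad_out):
        # For a linear map A, the vector-Jacobian product is simply A^H(grad_out).
        return ctx.adjoint_op(grad_out), None, None

# Example: A = diag(w); for real w its adjoint is also diag(w).
w = torch.tensor([1.0, 2.0, 3.0])
x = torch.ones(3, requires_grad=True)
y = LinearOpFn.apply(x, lambda v: w * v, lambda v: w * v)
y.sum().backward()  # x.grad is A^H applied to the all-ones gradient
```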

When defining linear operators, please make sure that all torch tensors are on the same device and have compatible dtypes. For example, torch.cfloat is compatible with torch.float but not with torch.double. For image data, two conventions are in use: [num_batch, num_channel, nx, ny, (nz)] and [nx, ny, (nz)]. Some LinearMaps provide a boolean batchmode flag to switch between them.

Proximal operators

The toolbox contains common proximal operators, such as soft thresholding. These operators also support regularizers that involve multiplication by diagonal or unitary matrices, such as orthogonal wavelets.
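As a concrete example of such an operator, here is a sketch of soft thresholding, the proximal operator of the scaled L1 norm (an illustrative implementation, not necessarily MIRTorch's). Thresholding the magnitude makes it valid for complex tensors as well:

```python
import torch

def soft_threshold(x, lam):
    """Proximal operator of lam * ||x||_1: shrink magnitudes toward zero.
    Works for real and complex tensors by thresholding the magnitude."""
    mag = x.abs()
    # Scale factor max(|x| - lam, 0) / |x|; clamp denominator to avoid 0/0.
    scale = torch.clamp(mag - lam, min=0.0) / mag.clamp(min=1e-12)
    return x * scale

x = torch.tensor([-2.0, -0.5, 0.0, 0.3, 1.5])
out = soft_threshold(x, 1.0)  # entries with |x| <= 1 are zeroed, others shrink by 1
```

For a regularizer of the form lam * ||W x||_1 with W unitary (e.g., an orthogonal wavelet transform), the prox is W^H applied to the soft-thresholded coefficients W x.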

Iterative reconstruction (MBIR) algorithms

Currently, the package includes the conjugate gradient (CG), fast iterative shrinkage-thresholding (FISTA), proximal optimized gradient method (POGM), and forward-backward primal-dual (FBPD) algorithms for image reconstruction.
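To show the structure these solvers share, here is a compact FISTA sketch for the L1-regularized least-squares problem (a hand-rolled illustration, not MIRTorch's Alg API; the operator and its adjoint are passed as plain callables):

```python
import torch

def fista(A, At, y, lam, n_iter=100, step=1.0):
    """Sketch of FISTA for argmin_x 0.5*||Ax - y||_2^2 + lam*||x||_1.
    A / At are callables for the operator and its adjoint.
    step should be 1/L with L = ||A||_2^2; step=1.0 assumes ||A||_2 <= 1."""
    x = torch.zeros_like(At(y))
    z, t = x.clone(), 1.0
    for _ in range(n_iter):
        grad = At(A(z) - y)                 # gradient of the data term at z
        u = z - step * grad                 # gradient step
        # Proximal (soft-thresholding) step
        x_new = torch.sign(u) * torch.clamp(u.abs() - step * lam, min=0.0)
        # Momentum update
        t_new = (1.0 + (1.0 + 4.0 * t * t) ** 0.5) / 2.0
        z = x_new + ((t - 1.0) / t_new) * (x_new - x)
        x, t = x_new, t_new
    return x

# Tiny sanity check: with A = identity, the solution is soft-thresholding of y.
y = torch.tensor([3.0, 0.2, -1.5])
x_hat = fista(lambda v: v, lambda v: v, y, lam=0.5)
```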

Dictionary learning

For dictionary learning-based reconstruction, we implemented an efficient dictionary learning algorithm (SOUP-DIL) and orthogonal matching pursuit (OMP). Due to PyTorch's limited support for sparse matrices, we use SciPy as the backend.
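For intuition about the sparse-coding step, here is a short OMP sketch in NumPy (illustrative only; the package's implementation differs and uses SciPy sparse matrices for efficiency). OMP greedily selects the dictionary atom most correlated with the residual, then refits all selected coefficients by least squares:

```python
import numpy as np

def omp(D, y, sparsity):
    """Sketch of orthogonal matching pursuit: greedily select atoms
    (columns of D, assumed normalized) and refit by least squares."""
    residual, support = y.copy(), []
    coef = np.zeros(D.shape[1])
    for _ in range(sparsity):
        # Atom most correlated with the current residual
        idx = int(np.argmax(np.abs(D.T @ residual)))
        if idx not in support:
            support.append(idx)
        # Least-squares refit of the coefficients on the current support
        sol, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ sol
    coef[support] = sol
    return coef

# A 2-sparse signal in a trivially orthonormal dictionary (identity):
D = np.eye(4)
y = np.array([0.0, 2.0, 0.0, -1.0])
coef = omp(D, y, sparsity=2)  # recovers the two nonzero entries exactly
```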

Multi-GPU support

Currently, MIRTorch uses torch.nn.DataParallel to support multiple GPUs. One may wrap a LinearMap, Prox, or Alg inside a torch.nn.Module to enable data parallelism. See this tutorial for details.
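A minimal sketch of such a wrapper (the class name and the lambda operator are hypothetical; any LinearMap-like callable would go in its place):

```python
import torch
import torch.nn as nn

class LinearMapModule(nn.Module):
    """Hypothetical wrapper: expose an operator-like callable as an nn.Module
    so it can be used with torch.nn.DataParallel."""
    def __init__(self, linear_map):
        super().__init__()
        self.linear_map = linear_map

    def forward(self, x):
        return self.linear_map(x)

op = LinearMapModule(lambda x: 2.0 * x)  # stand-in for a LinearMap
if torch.cuda.device_count() > 1:
    op = nn.DataParallel(op)  # splits the batch dimension across GPUs

# Batched input in the [num_batch, num_channel, nx, ny] convention:
y = op(torch.ones(4, 1, 8, 8))
```

DataParallel scatters the leading (batch) dimension across devices, so this works naturally with the batched [num_batch, num_channel, nx, ny, (nz)] data format.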


Usage and examples

Generally, MIRTorch solves image reconstruction problems with cost functions of the form $\arg\min_{x} \|Ax-y\|_2^2 + \lambda R(x)$. Here, $A$ denotes the system matrix; when it is linear, one may use a LinearMap to compute it efficiently. $y$ usually denotes the measurements. $R(\cdot)$ denotes the regularizer, which determines which Alg to use. One may refer to 1, 2 and 3 for more tutorials on optimization.
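For the smooth quadratic case (e.g., $R$ absent or quadratic), CG applied to the normal equations is the workhorse. Here is a compact sketch (an illustrative implementation, not MIRTorch's CG class), with the operator passed as a callable:

```python
import torch

def conjugate_gradient(A, b, n_iter=50):
    """Sketch of CG for A x = b with A symmetric positive definite.
    For least squares min ||Ax - y||_2^2, apply it to A^H A x = A^H y."""
    x = torch.zeros_like(b)
    r = b - A(x)          # initial residual
    p = r.clone()         # initial search direction
    rs = torch.dot(r, r)
    for _ in range(n_iter):
        Ap = A(p)
        alpha = rs / torch.dot(p, Ap)   # exact line search along p
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = torch.dot(r, r)
        if rs_new < 1e-12:              # residual small enough: stop early
            break
        p = r + (rs_new / rs) * p       # conjugate direction update
        rs = rs_new
    return x

# Solve a small SPD system as a stand-in for A^H A x = A^H y:
M = torch.tensor([[4.0, 1.0], [1.0, 3.0]])
b = torch.tensor([1.0, 2.0])
x = conjugate_gradient(lambda v: M @ v, b)
```

Because every step is built from differentiable tensor ops, gradients can flow through the whole solve, which is what enables the protocol and parameter optimization described below.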

Here we provide several notebook tutorials focused on MRI, where $A$ is FFT or NUFFT.

  • /example/demo_mnist.ipynb shows the LASSO on MNIST with FISTA and POGM.

  • /example/demo_mri.ipynb contains the SENSE (CG-SENSE) and B0-informed reconstruction with penalized weighted least squares (PWLS).

  • /example/demo_cs.ipynb shows the compressed sensing reconstruction of under-determined MRI signals.

  • /example/demo_dl.ipynb exhibits the dictionary learning results.

Since MIRTorch is differentiable, one may use auto-differentiation (AD) to update many parameters, for example, the weights of a reconstruction neural network. More importantly, one may optimize the imaging system itself via gradient-based, data-driven methods. As a use case, the Bjork repo contains MRI sampling pattern optimization examples. One may use the reconstruction loss as the objective function to jointly optimize the reconstruction algorithm and the sampling pattern. See this video on how to jointly optimize reconstruction and acquisition.


Acknowledgments

This work is inspired by (but not limited to):

  • SigPy: https://github.com/mikgroup/sigpy

  • MIRT: https://github.com/JeffFessler/mirt

  • MIRT.jl: https://github.com/JeffFessler/MIRT.jl

  • PyLops: https://github.com/PyLops/pylops

If the code is useful to your research, please cite:

@article{wang:22:bjork,
  author={Wang, Guanhua and Luo, Tianrui and Nielsen, Jon-Fredrik and Noll, Douglas C. and Fessler, Jeffrey A.},
  journal={IEEE Transactions on Medical Imaging}, 
  title={B-spline Parameterized Joint Optimization of Reconstruction and K-space Trajectories (BJORK) for Accelerated 2D MRI}, 
  year={2022},
  pages={1-1},
  doi={10.1109/TMI.2022.3161875}}

License

This package is released under the BSD 3-Clause license.