mriqc.qc package

Module contents

This module contains the actual computation of IQMs included within MRIQC.

Note

Most of the IQMs in this module are adapted, derived, or reproduced from the QAP project [QAP].

Submodules

mriqc.qc.anatomical module

Computation of the quality assessment measures on structural MRI

mriqc.qc.anatomical.artifacts(img, seg, calculate_qi2=False, bglabel=0)[source]

Detect artifacts in the image using the method described in [Mortamet2009]. Calculates QI1, the fraction of total voxels that lie within artifacts.

Optionally, it also calculates QI2, the distance between the intensity distribution of noise voxels (non-artifact background voxels) and a Rician distribution.

Parameters:
  • img (numpy.ndarray) – input data
  • seg (numpy.ndarray) – input segmentation
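Conceptually, QI1 is a ratio of artifact-flagged background voxels to all background voxels. A minimal sketch of that final step, assuming the artifact mask itself has already been computed (the morphological detection of [Mortamet2009] is not reproduced here, and `qi1_fraction` is a hypothetical helper, not MRIQC's implementation):

```python
import numpy as np

def qi1_fraction(seg, artifact_mask, bglabel=0):
    """Fraction of background voxels flagged as artifacts (QI1 sketch).

    ``artifact_mask`` is assumed to be a precomputed boolean array of the
    same shape as ``seg``.
    """
    background = seg == bglabel
    n_bg = background.sum()
    if n_bg == 0:
        return 0.0
    return float((artifact_mask & background).sum()) / n_bg
```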
mriqc.qc.anatomical.cnr(img, seg, lbl=None)[source]

Calculate the CNR

\[\text{CNR} = \frac{|\mu_\text{GM} - \mu_\text{WM} |}{\sigma_B}\]
Parameters:
  • img (numpy.ndarray) – input data
  • seg (numpy.ndarray) – input segmentation
Returns:

the computed CNR

mriqc.qc.anatomical.efc(img)[source]

Calculate the EFC [Atkinson1997]

The original equation is normalized by the maximum entropy, so that the EFC can be compared across images with different dimensions.

Parameters:img (numpy.ndarray) – input data
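A sketch of one common formulation of the normalized EFC: the Shannon entropy of voxel intensities, with the image divided by its Euclidean norm so it behaves like a distribution, and the result divided by the maximum entropy an image of the same size can attain (all voxels equal). This is an illustration of the idea, not necessarily MRIQC's exact code:

```python
import numpy as np

def efc_sketch(img):
    """Entropy focus criterion, normalized by the maximum entropy."""
    n = img.size
    # Maximum entropy: every voxel has intensity 1/sqrt(n)
    efc_max = n * (1.0 / np.sqrt(n)) * np.log(1.0 / np.sqrt(n))
    # Euclidean norm of the image used to normalize intensities
    b_max = np.sqrt((img.astype(float) ** 2).sum())
    return float((1.0 / efc_max) * np.sum(
        (img / b_max) * np.log((img + 1e-16) / b_max)))
```

A perfectly uniform image attains the maximum entropy, so its normalized EFC is 1.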
mriqc.qc.anatomical.fber(img, seg, fglabel=None, bglabel=0)[source]

Calculate the FBER

\[\text{FBER} = \frac{E[|F|^2]}{E[|B|^2]}\]
Parameters:
  • img (numpy.ndarray) – input data
  • seg (numpy.ndarray) – input segmentation
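The FBER formula is a ratio of mean squared intensities (energies) in foreground versus background. A minimal sketch, with the label values as assumptions:

```python
import numpy as np

def fber_sketch(img, seg, fglabel=1, bglabel=0):
    """FBER = E[|F|^2] / E[|B|^2] (sketch; label values assumed)."""
    fg_energy = np.mean(img[seg == fglabel].astype(float) ** 2)
    bg_energy = np.mean(img[seg == bglabel].astype(float) ** 2)
    return fg_energy / bg_energy
```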
mriqc.qc.anatomical.rpve(pvms, seg)[source]

Computes the residual partial volume effect (rPVE) for each tissue class.

mriqc.qc.anatomical.snr(img, seg, fglabel, bglabel='bg')[source]

Calculate the SNR

\[\text{SNR} = \frac{\mu_F}{\sigma_B}\]

where \(\mu_F\) is the mean intensity of the foreground and \(\sigma_B\) is the standard deviation of the background, where the noise is computed.

Parameters:
  • img (numpy.ndarray) – input data
  • seg (numpy.ndarray) – input segmentation
  • fglabel (str) – foreground label in the segmentation data.
  • bglabel (str) – background label in the segmentation data.
Returns:

the computed SNR for the foreground segmentation
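The SNR formula above can be sketched in a few lines. Integer labels are used here for simplicity, whereas the documented signature takes string labels:

```python
import numpy as np

def snr_sketch(img, seg, fglabel=1, bglabel=0):
    """SNR = mu_F / sigma_B (sketch; integer labels assumed)."""
    mu_f = img[seg == fglabel].mean()      # mean foreground intensity
    sigma_b = img[seg == bglabel].std()    # background standard deviation
    return mu_f / sigma_b
```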

mriqc.qc.anatomical.summary_stats(img, pvms)[source]

Estimates the mean, the standard deviation, and the 5th and 95th percentiles of each tissue distribution.
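A sketch of those statistics, assuming each tissue's voxels are selected by thresholding its partial volume map (the threshold of 0.5 is a hypothetical choice for illustration):

```python
import numpy as np

def summary_stats_sketch(img, pvms, threshold=0.5):
    """Per-tissue mean, std, and 5th/95th percentiles (sketch)."""
    stats = {}
    for label, pvm in enumerate(pvms):
        vals = img[pvm > threshold]  # voxels dominated by this tissue
        stats[label] = {
            "mean": float(vals.mean()),
            "stdv": float(vals.std()),
            "p95": float(np.percentile(vals, 95)),
            "p05": float(np.percentile(vals, 5)),
        }
    return stats
```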

mriqc.qc.anatomical.volume_fraction(pvms)[source]

Computes the intracranial volume (ICV) fraction of each tissue class from the partial volume maps.

Parameters:pvms (list) – list of numpy.ndarray of partial volume maps.
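A minimal sketch, assuming each tissue's fraction is its total partial volume divided by the summed partial volume of all classes (the ICV):

```python
import numpy as np

def volume_fraction_sketch(pvms):
    """ICV fraction per tissue class from partial volume maps (sketch)."""
    totals = np.array([pvm.sum() for pvm in pvms], dtype=float)
    return totals / totals.sum()  # fractions sum to 1 by construction
```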

mriqc.qc.functional module

Computation of the quality assessment measures on functional MRI

mriqc.qc.functional.dvars(func, mask, output_all=False, out_file=None)[source]

Compute the mean DVARS [Power2012].

Particularly, the standardized DVARS [Nichols2013] are computed.

Note

Implementation details

Uses the implementation of the Yule-Walker equations from nitime for the AR filtering of the fMRI signal.

Parameters:
  • func (numpy.ndarray) – functional data, after head-motion-correction.
  • mask (numpy.ndarray) – a 3D mask of the brain
  • output_all (bool) – write out all computed DVARS variants
  • out_file (str) – a path to which the standardized dvars should be saved.
Returns:

the standardized DVARS
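Plain (non-standardized) DVARS is the root mean square, over brain voxels, of the backward temporal difference at each time point. A minimal sketch of that core step; the standardized variant additionally rescales by an expected value derived from an autoregressive noise model (per [Nichols2013] and the nitime-based AR filtering noted above), which is omitted here:

```python
import numpy as np

def dvars_sketch(func, mask):
    """Non-standardized DVARS: RMS of frame-to-frame signal change."""
    data = func[mask > 0]          # voxels x timepoints
    diffs = np.diff(data, axis=1)  # backward temporal differences
    return np.sqrt(np.mean(diffs ** 2, axis=0))
```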

mriqc.qc.functional.fd_jenkinson(in_file, rmax=80.0, out_file=None)[source]

Compute the FD [Jenkinson2002] on a 4D dataset, after 3dvolreg has been executed (generally a file named *.affmat12.1D).

Parameters:
  • in_file (str) – path to epi file
  • rmax (float) – the default radius (as in FSL) of a sphere that represents the brain, onto which the angular displacements are projected.
  • out_file (str) – a path for the output file with the FD
Returns:

the output file with the FD, and the average FD along the time series

Return type:

tuple(str, float)

Note

in_file should contain one 3dvolreg affine matrix per row - NOT the motion parameters
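Working from 4x4 rigid-body transforms rather than the file format, the Jenkinson FD between consecutive frames can be sketched as follows. For transforms \(T_{i-1}, T_i\), the relative motion is \(dM = T_i T_{i-1}^{-1} - I\), and \(\text{FD}_i = \sqrt{\frac{1}{5} r_\text{max}^2\, \text{tr}(A^T A) + b^T b}\), where \(A\) and \(b\) are the rotation and translation parts of \(dM\). This is an illustrative helper, not MRIQC's file-parsing implementation:

```python
import numpy as np

def fd_jenkinson_sketch(affines, rmax=80.0):
    """Jenkinson FD from a sequence of 4x4 transforms (sketch)."""
    fds = []
    for prev, curr in zip(affines[:-1], affines[1:]):
        dm = curr @ np.linalg.inv(prev) - np.eye(4)  # relative motion
        a = dm[:3, :3]   # rotation/scaling part
        b = dm[:3, 3]    # translation part
        fds.append(np.sqrt(0.2 * rmax ** 2 * np.trace(a.T @ a) + b @ b))
    return np.array(fds)
```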

mriqc.qc.functional.gcor(func, mask)[source]

Compute the GCOR.

Parameters:
  • func (numpy.ndarray) – input fMRI dataset, after motion correction
  • mask (numpy.ndarray) – 3D brain mask
Returns:

the computed GCOR value
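GCOR is the average of all pairwise temporal correlations among brain voxels, which equals the squared L2 norm of the mean of the unit-variance (demeaned, L2-normalized) voxel time series. A minimal sketch under that formulation (voxels with constant time series are assumed absent):

```python
import numpy as np

def gcor_sketch(func, mask):
    """Average pairwise correlation via the mean unit time series."""
    data = func[mask > 0].astype(float)            # voxels x timepoints
    data = data - data.mean(axis=1, keepdims=True) # demean each voxel
    unit = data / np.linalg.norm(data, axis=1, keepdims=True)
    g_u = unit.mean(axis=0)                        # mean unit time series
    return float(g_u @ g_u)                        # squared norm = GCOR
```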

mriqc.qc.functional.gsr(epi_data, mask, direction='y', ref_file=None, out_file=None)[source]

Computes the GSR [Giannelli2010]. The procedure is as follows:

  1. Create a Nyquist ghost mask by circularly shifting the original mask by \(N/2\).
  2. Rotate by \(N/2\)
  3. Remove the intersection with the original mask
  4. Generate a non-ghost background
  5. Calculate the GSR

Warning

This should be used with EPI images for which the phase encoding direction is known.

Parameters:
  • epi_data (numpy.ndarray) – the EPI data
  • mask (numpy.ndarray) – the brain mask
  • direction (str) – the direction of phase encoding (x, y, all)
Returns:

the computed gsr
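The steps above can be sketched as follows, assuming GSR is computed as (mean ghost intensity minus mean non-ghost background intensity) divided by mean signal intensity; this is an illustration, not MRIQC's exact code:

```python
import numpy as np

def gsr_sketch(epi_data, mask, direction="y"):
    """Ghost-to-signal ratio along the phase-encoding axis (sketch)."""
    mask = mask.astype(bool)
    axis = {"x": 0, "y": 1}[direction]
    n = mask.shape[axis]
    shifted = np.roll(mask, n // 2, axis=axis)  # circular shift by N/2
    ghost = shifted & ~mask                     # ghost region only
    non_ghost = ~mask & ~shifted                # background, ghost excluded
    signal = epi_data[mask].mean()
    return (epi_data[ghost].mean() - epi_data[non_ghost].mean()) / signal
```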

mriqc.qc.functional.zero_variance(func, mask)[source]

Mask out voxels with zero variance across the \(t\)-axis

Parameters:
  • func (numpy.ndarray) – input fMRI dataset, after motion correction
  • mask (numpy.ndarray) – 3D brain mask
Returns:

the 3D mask of voxels with nonzero variance across \(t\).

Return type:

numpy.ndarray
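This operation reduces to restricting the brain mask to voxels whose time series actually varies. A minimal sketch:

```python
import numpy as np

def zero_variance_sketch(func, mask):
    """Brain mask restricted to voxels with nonzero temporal variance."""
    variance = func.var(axis=-1)       # per-voxel variance across time
    return (mask > 0) & (variance > 0)
```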

References

[Atkinson1997]Atkinson et al., Automatic correction of motion artifacts in magnetic resonance images using an entropy focus criterion, IEEE Trans Med Imag 16(6):903-910, 1997. doi:10.1109/42.650886.
[Giannelli2010]Giannelli et al., Characterization of Nyquist ghost in EPI-fMRI acquisition sequences implemented on two clinical 1.5 T MR scanner systems: effect of readout bandwidth and echo spacing. J App Clin Med Phy, 11(4). 2010. doi:10.1120/jacmp.v11i4.3237.
[Jenkinson2002]Jenkinson et al., Improved Optimisation for the Robust and Accurate Linear Registration and Motion Correction of Brain Images. NeuroImage, 17(2), 825-841, 2002. doi:10.1006/nimg.2002.1132.
[Mortamet2009]Mortamet B et al., Automatic quality assessment in structural brain magnetic resonance imaging, Mag Res Med 62(2):365-372, 2009. doi:10.1002/mrm.21992.
[Nichols2013]Nichols, Notes on Creating a Standardized Version of DVARS, 2013.
[Power2012]Power et al., Spurious but systematic correlations in functional connectivity MRI networks arise from subject motion, NeuroImage 59(3):2142-2154, 2012, doi:10.1016/j.neuroimage.2011.10.018.
[QAP]The QAP project.