COMMIT2
This tutorial illustrates the basics of using COMMIT2.
This tutorial uses the same dataset as the Getting Started tutorial:
Then, download the following ZIP archive and extract its content into the same folder:
- IASF.nii.gz: the intra-axonal signal fraction map estimated with NODDI (other models could be used instead);
- IASF.scheme: the corresponding scheme needed to use this map in COMMIT;
- GM_1x1x1.nii.gz: parcellation of the gray matter used to define the bundles of interest.
After downloading all the data, the content of the folder should look like this:
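(At minimum, given the files used throughout this tutorial, the folder should contain the items listed below; the Getting Started data may include additional files.)
demo01_fibers.tck
WM.nii.gz
GM_1x1x1.nii.gz
IASF.nii.gz
IASF.scheme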
NB: in this tutorial we make use of the MRtrix toolbox for several key operations to prepare the data for COMMIT; however, one could also use their preferred software to achieve the same result.
Open the Python interpreter in the folder containing all the files and import the required libraries:
import numpy as np
import os
import commit
from commit import trk2dictionary
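Optionally, one can verify beforehand that the MRtrix commands used in the next steps are available on the PATH; this quick check is not part of the original tutorial:
# optional sanity check (not in the original tutorial): make sure the MRtrix
# commands used below can be found on the PATH
import shutil
for cmd in ( 'tck2connectome', 'connectome2tck', 'tckedit' ) :
    if shutil.which( cmd ) is None :
        raise RuntimeError( '%s not found; please install MRtrix3' % cmd )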
First, we extract the connecting streamlines, i.e. those that connect two valid gray-matter ROIs as defined in the file GM_1x1x1.nii.gz, and then we re-organize them as contiguous groups in the resulting tractogram; this ordering, with the streamlines of each bundle stored one after the other, is what will allow us to define the groups for the regularization later on:
# assign each streamline to a pair of gray-matter ROIs and build the connectome
os.system( 'tck2connectome -force -assignment_radial_search 2 -out_assignments fibers_assignment.txt demo01_fibers.tck GM_1x1x1.nii.gz connectome.csv' )

# split the tractogram into one file per bundle (i.e. per connectome edge)
if not os.path.isdir( 'bundles' ) :
    os.mkdir( 'bundles' )
os.system( 'connectome2tck -force -exclusive -files per_edge -keep_self demo01_fibers.tck fibers_assignment.txt bundles/bundle_' )

# concatenate the bundles into a single tractogram with contiguous groups
C = np.loadtxt( 'connectome.csv', delimiter=',' )
CMD = 'tckedit -force'
for i in range(C.shape[0]):
    for j in range(i,C.shape[0]):
        if C[i,j] > 0 :
            CMD += ' bundles/bundle_%d-%d.tck' %(i+1,j+1)
os.system( CMD + ' demo01_fibers_connecting.tck' )
Import the tractogram just created into COMMIT:
commit.core.setup()
trk2dictionary.run(
    filename_tractogram = 'demo01_fibers_connecting.tck',
    filename_mask       = 'WM.nii.gz',
    fiber_shift         = 0.5
)
COMMIT2 requires an initial guess for the relative importance of each bundle, in order to consider each bundle equally regardless of its cardinality (see Eq. 6 in the corresponding publication here). This is required so as not to penalize small bundles just because they are small, or favour big ones just because they are big. We can obtain such an initial guess by running the standard COMMIT:
# load the data
mit = commit.Evaluation( '.', '.' )
mit.set_config( 'doNormalizeSignal', False )
mit.set_config( 'doNormalizeKernels', True )
mit.load_data( 'IASF.nii.gz', 'IASF.scheme' )
# set the simplistic model that fits the streamlines to the IASF map
mit.set_model( 'VolumeFractions' )
mit.model.set( hasISO=False )
mit.generate_kernels( regenerate=True )
mit.load_kernels()
# create the sparse data structures to handle the matrix A
mit.load_dictionary( 'COMMIT' )
mit.set_threads()
mit.build_operator()
# perform the fit
mit.fit( tol_fun=1e-3, max_iter=1000 )
mit.save_results( path_suffix="_COMMIT1" )
We save the streamline contributions estimated by COMMIT for later use in COMMIT2:
x_nnls = mit.x.copy()
To set up the bundle regularization, we first retrieve the number of streamlines in each bundle from the connectome previously computed, i.e. connectome.csv:
C = np.loadtxt( 'connectome.csv', delimiter=',' )
group_size = C[C>0].astype(np.int32)
With this information, we create an array with the indices of the streamlines composing each bundle:
tmp = np.insert( np.cumsum(group_size), 0 , 0)
group_idx = np.array( [np.arange(tmp[i],tmp[i+1]) for i in range(len(tmp)-1)], dtype=object )
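As a quick sanity check (not part of the original tutorial), the grouping assumes that every streamline of the connecting tractogram was kept by trk2dictionary, so the total number of streamlines across all bundles should match the number of estimated contributions:
# optional sanity check: the bundle sizes should account for all streamline coefficients;
# if trk2dictionary discarded some streamlines, the grouping would need to be adjusted
if np.sum(group_size) != x_nnls.size :
    print( 'WARNING: bundle sizes do not match the number of streamline coefficients' )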
Now, we compute the individual weight of each bundle, in order to penalize all groups of streamlines in the same manner regardless of their cardinality, i.e. Eq. 6 of the publication:
group_w = np.empty_like( group_size, dtype=np.float64 )
for k in range(group_size.size) :
    group_w[k] = np.sqrt(group_size[k]) / ( np.linalg.norm(x_nnls[group_idx[k]]) + 1e-12 )
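For reference, the weight computed in the loop above corresponds to Eq. 6 of the publication, i.e.

$$ w_g = \frac{\sqrt{|g|}}{\lVert x_g^{\mathrm{NNLS}} \rVert_2 + \epsilon} $$

where |g| is the number of streamlines in bundle g, x_g^NNLS are their contributions estimated by the standard COMMIT fit, and ε is the small constant (1e-12) added for numerical stability.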
Finally, we create the regularization term implementing the anatomical prior that axons are organized in bundles. Please recall that this requires a regularization parameter lambda to be defined, i.e. reg_lambda:
reg_lambda = 1e-5 # change to suit your needs
regularisation = commit.solvers.init_regularisation(
    mit,
    regnorms    = [commit.solvers.group_sparsity, commit.solvers.non_negative, commit.solvers.non_negative],
    structureIC = group_idx,
    weightsIC   = group_w,
    group_norm  = 2,
    lambdas     = [reg_lambda, 0.0, 0.0]
)
We can now fit the streamlines to the IASF map using COMMIT2 (i.e. COMMIT with bundle regularization):
mit.fit( tol_fun=1e-4, max_iter=1000, regularisation=regularisation )
mit.save_results( path_suffix="_COMMIT2" )
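The value of reg_lambda controls the strength of the bundle regularization and is data-dependent. If needed, one can compare the solutions obtained with a few candidate values; a minimal sketch, reusing the setup above (the candidate values are only illustrative):
# sweep a few (illustrative) values of the regularization parameter and
# save each solution to a separate folder for comparison
for reg_lambda in [ 5e-6, 1e-5, 5e-5 ] :
    regularisation = commit.solvers.init_regularisation(
        mit,
        regnorms    = [commit.solvers.group_sparsity, commit.solvers.non_negative, commit.solvers.non_negative],
        structureIC = group_idx,
        weightsIC   = group_w,
        group_norm  = 2,
        lambdas     = [reg_lambda, 0.0, 0.0]
    )
    mit.fit( tol_fun=1e-4, max_iter=1000, regularisation=regularisation )
    mit.save_results( path_suffix="_COMMIT2_lambda_%.0e" % reg_lambda )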
The following figure shows the (binary) connectomes corresponding to the raw input tractogram (left) as well as after the filtering with COMMIT (center) and COMMIT2 (right):
As we can see, the connectomes become progressively sparser with COMMIT and COMMIT2: the network density (computed with the Brain Connectivity Toolbox) is 0.66, 0.48 and 0.33, respectively.
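As an aside, the following is a minimal sketch (not part of the original tutorial) of how such a binary connectome and its density could be recomputed from the streamline weights estimated by the fit; it assumes, as does the rest of the tutorial, that the connectome matrix is upper triangular and that the bundles were concatenated in row-major order:
# rebuild a binary connectome from the estimated streamline weights:
# a bundle survives if at least one of its streamlines has a non-negligible weight
x = mit.x.copy()
ii, jj = np.nonzero( C )   # (i,j) pairs of the bundles, in the same row-major order used to build group_size
C_bin = np.zeros_like( C )
for k in range(group_size.size) :
    if np.max( x[group_idx[k]] ) > 1e-12 :
        C_bin[ii[k],jj[k]] = 1
n = C.shape[0]
density = np.count_nonzero( np.triu(C_bin,1) ) / ( n*(n-1)/2.0 )   # undirected density, self-connections ignored
print( 'network density = %.2f' % density )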
However, despite being sparser, the streamlines kept by COMMIT/COMMIT2 explain the reference IASF map equally well, as shown in the following figure: