Functional connectivity and resting state#

Functional connectivity and resting-state data can be studied in many different ways. Nilearn provides tools to construct “connectomes” that capture functional interactions between regions, or to extract regions and networks via resting-state networks or parcellations. For a much more detailed guide, see Nilearn’s Connectivity section; here we only want to show you a few basics.

Speaking of which, we will be covering the following sections:

  1. Extracting time series to build a functional connectome

  2. Single subject maps of seed-to-voxel correlation

  3. Single subject maps of seed-to-seed correlation

  4. Group analysis of resting-state fMRI with ICA (CanICA)

Setup#

Before we start, let’s import the libraries we need and set up inline plotting:

from nilearn import plotting
%matplotlib inline
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
from nilearn import image, datasets
dataset_adhd = datasets.fetch_adhd(n_subjects=40, data_dir='/Users/peerherholz/Desktop/')
Downloading data from https://www.nitrc.org/frs/download.php/7812/adhd40_4046678.tgz ...
Downloaded 16130048 of 21375806 bytes (75.5%,    1.0s remaining) ...done. (4 seconds, 0 min)
Extracting data from /Users/peerherholz/Desktop/adhd/49718167d83b032094ad6d17e64b4e49/adhd40_4046678.tgz..... done.
Downloading data from https://www.nitrc.org/frs/download.php/7813/adhd40_4134561.tgz ...
Downloaded 69173248 of 69914913 bytes (98.9%,    0.1s remaining) ...done. (13 seconds, 0 min)
Extracting data from /Users/peerherholz/Desktop/adhd/49718167d83b032094ad6d17e64b4e49/adhd40_4134561.tgz..... done.
Downloading data from https://www.nitrc.org/frs/download.php/7814/adhd40_4164316.tgz ...
Downloaded 40837120 of 45506732 bytes (89.7%,    0.8s remaining) ...done. (8 seconds, 0 min)
Extracting data from /Users/peerherholz/Desktop/adhd/49718167d83b032094ad6d17e64b4e49/adhd40_4164316.tgz..... done.
Downloading data from https://www.nitrc.org/frs/download.php/7815/adhd40_4275075.tgz ...
Downloaded 30121984 of 32363673 bytes (93.1%,    0.4s remaining) ...done. (6 seconds, 0 min)
Extracting data from /Users/peerherholz/Desktop/adhd/49718167d83b032094ad6d17e64b4e49/adhd40_4275075.tgz..... done.
Downloading data from https://www.nitrc.org/frs/download.php/7816/adhd40_6115230.tgz ...
Downloaded 68526080 of 73484949 bytes (93.3%,    0.9s remaining) ...done. (14 seconds, 0 min)
Extracting data from /Users/peerherholz/Desktop/adhd/49718167d83b032094ad6d17e64b4e49/adhd40_6115230.tgz..... done.
Downloading data from https://www.nitrc.org/frs/download.php/7817/adhd40_7774305.tgz ...
Downloaded 39190528 of 42188959 bytes (92.9%,    0.6s remaining) ...done. (8 seconds, 0 min)
Extracting data from /Users/peerherholz/Desktop/adhd/49718167d83b032094ad6d17e64b4e49/adhd40_7774305.tgz..... done.
Downloading data from https://www.nitrc.org/frs/download.php/7818/adhd40_8409791.tgz ...
Downloaded 66658304 of 70396354 bytes (94.7%,    0.7s remaining) ...done. (14 seconds, 0 min)
Extracting data from /Users/peerherholz/Desktop/adhd/49718167d83b032094ad6d17e64b4e49/adhd40_8409791.tgz..... done.
Downloading data from https://www.nitrc.org/frs/download.php/7819/adhd40_8697774.tgz ...
Downloaded 40656896 of 45075978 bytes (90.2%,    0.8s remaining) ...done. (8 seconds, 0 min)
Extracting data from /Users/peerherholz/Desktop/adhd/49718167d83b032094ad6d17e64b4e49/adhd40_8697774.tgz..... done.
Downloading data from https://www.nitrc.org/frs/download.php/7820/adhd40_9744150.tgz ...
Downloaded 62537728 of 63380505 bytes (98.7%,    0.2s remaining) ...done. (14 seconds, 0 min)
Extracting data from /Users/peerherholz/Desktop/adhd/49718167d83b032094ad6d17e64b4e49/adhd40_9744150.tgz..... done.
Downloading data from https://www.nitrc.org/frs/download.php/7821/adhd40_9750701.tgz ...
Downloaded 41795584 of 46607053 bytes (89.7%,    1.0s remaining) ...done. (9 seconds, 0 min)
Extracting data from /Users/peerherholz/Desktop/adhd/49718167d83b032094ad6d17e64b4e49/adhd40_9750701.tgz..... done.

Also, let’s specify which subjects we want to use for this notebook. So, who do we have?

!nib-ls /Users/peerherholz/Desktop/adhd/*/*.nii.gz
/Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x2.00
/Users/peerherholz/Desktop/adhd/0010064/0010064_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x2.00
/Users/peerherholz/Desktop/adhd/0010128/0010128_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x2.00
/Users/peerherholz/Desktop/adhd/0021019/0021019_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x2.00
/Users/peerherholz/Desktop/adhd/0027018/0027018_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x1.96
/Users/peerherholz/Desktop/adhd/0027034/0027034_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x1.96
/Users/peerherholz/Desktop/adhd/0027037/0027037_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x1.96
/Users/peerherholz/Desktop/adhd/1517058/1517058_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x2.00
/Users/peerherholz/Desktop/adhd/1562298/1562298_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x2.00
/Users/peerherholz/Desktop/adhd/2497695/2497695_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x2.00
/Users/peerherholz/Desktop/adhd/2950754/2950754_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x2.00
/Users/peerherholz/Desktop/adhd/3007585/3007585_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x1.96
/Users/peerherholz/Desktop/adhd/3520880/3520880_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x2.00
/Users/peerherholz/Desktop/adhd/3994098/3994098_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x2.00
/Users/peerherholz/Desktop/adhd/4134561/4134561_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x1.96
/Users/peerherholz/Desktop/adhd/6115230/6115230_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x1.96
/Users/peerherholz/Desktop/adhd/8409791/8409791_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x1.96
/Users/peerherholz/Desktop/adhd/8697774/8697774_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x2.00
/Users/peerherholz/Desktop/adhd/9744150/9744150_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x2.00
/Users/peerherholz/Desktop/adhd/9750701/9750701_rest_tshift_RPI_voreg_mni.nii.gz int16 [ 61,  73,  61,  90] 3.00x3.00x3.00x2.00

For each subject we also have a regressor file, containing important signal confounds such as motion parameters, CompCor components, and the mean signal in CSF, GM, WM, and overall. Let’s take a look at one of those regressor files:

pd.read_table('/Users/peerherholz/Desktop/adhd/0010042/0010042_regressors.csv').head()
csf constant linearTrend wm global motion-pitch motion-roll motion-yaw motion-x motion-y motion-z gm compcor1 compcor2 compcor3 compcor4 compcor5
0 12140.708282 1.0 0.0 9322.722489 9955.469315 -0.0637 0.1032 -0.1516 -0.0376 -0.0112 0.0840 10617.938409 -0.035058 -0.006713 -0.071532 0.009847 -0.027601
1 12123.146913 1.0 1.0 9314.257684 9947.987176 -0.0708 0.0953 -0.1562 -0.0198 0.0021 0.0722 10611.036827 -0.026949 -0.091152 -0.030126 0.020055 -0.099798
2 12085.963127 1.0 2.0 9319.610045 9945.132852 -0.0795 0.0971 -0.1453 -0.0439 -0.0241 0.0972 10591.877177 0.002552 0.069165 0.090166 -0.016608 -0.071980
3 12109.299348 1.0 3.0 9299.841075 9943.648622 -0.0607 0.0918 -0.1601 -0.0418 -0.0133 0.0877 10592.008336 0.079391 0.029959 -0.098036 0.062493 0.024105
4 12072.330305 1.0 4.0 9297.870869 9925.640852 -0.0706 0.0873 -0.1482 -0.0313 -0.0118 0.0712 10570.445905 0.075471 -0.030123 0.084739 0.088217 0.012996

So let’s create two lists, containing the path to the resting-state images and confounds of all subjects:

# Which subjects to consider
sub_idx = ['0010042', '0010064', '0010128', '0021019', '0027018',
           '0027034', '0027037', '1517058', '1562298', '2497695',
           '2950754', '3007585', '3520880', '3994098', '4134561',
           '6115230', '8409791', '8697774', '9744150', '9750701']
# Path to resting state files
rest_files = ['/Users/peerherholz/Desktop/adhd/%s/%s_rest_tshift_RPI_voreg_mni.nii.gz' % (sub, sub) for sub in sub_idx]
# Path to confound files
confound_files = ['/Users/peerherholz/Desktop/adhd/%s/%s_regressors.csv' % (sub, sub) for sub in sub_idx]

Perfect, now we’re good to go!

1. Extracting time series to build a functional connectome#

So let’s start with something simple: Extracting activation time series from regions defined by a parcellation atlas.

Brain parcellation#

As a first step, let’s define the regions we want to extract the signal from. For this we can use nilearn’s convenient dataset-fetching functions. How about the Harvard-Oxford atlas? First, we download the atlas and corresponding data via fetch_atlas_harvard_oxford. Please note that we use the default version, but one can also request probabilistic versions and other resolutions.

from nilearn import datasets
atlas_ho = datasets.fetch_atlas_harvard_oxford('cort-maxprob-thr25-2mm')

Now we can access the atlas from the downloaded dataset:

# Location of HarvardOxford parcellation atlas
atlas_file = atlas_ho.maps

# Visualize parcellation atlas
plotting.plot_roi(atlas_file, draw_cross=False, annotate=False);

As mentioned above, the corresponding labels are also part of the dataset and can be accessed via:

# Load labels for each atlas region
labels = atlas_ho.labels[1:]
labels[:10]
['Frontal Pole',
 'Insular Cortex',
 'Superior Frontal Gyrus',
 'Middle Frontal Gyrus',
 'Inferior Frontal Gyrus, pars triangularis',
 'Inferior Frontal Gyrus, pars opercularis',
 'Precentral Gyrus',
 'Temporal Pole',
 'Superior Temporal Gyrus, anterior division',
 'Superior Temporal Gyrus, posterior division']
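Note that atlas_ho.labels includes ‘Background’ as its first entry (corresponding to label value 0 in the atlas image), which is why the slice [1:] above drops it. A toy sketch of this convention (hypothetical miniature label list, not the real atlas):

```python
# Index i in the full list corresponds to voxel value i in the label image
labels_full = ['Background', 'Frontal Pole', 'Insular Cortex']
region_names = dict(enumerate(labels_full))

# Drop the background entry, as in the cell above
regions = labels_full[1:]
```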

Extracting signals on a parcellation#

To extract signals on a parcellation, the easiest option is to use NiftiLabelsMasker. Like all “maskers” in nilearn, it is a processing object that is created by specifying all the important parameters, but not the data:

from nilearn.maskers import NiftiLabelsMasker
masker = NiftiLabelsMasker(labels_img=atlas_file, standardize=True, verbose=1,
                           memory="nilearn_cache", memory_level=2)

The Nifti data can then be turned into time series by calling the NiftiLabelsMasker fit_transform method, which takes either filenames or NiftiImage objects.

Let’s do this now for the first subject:

time_series = masker.fit_transform(rest_files[0], confounds=confound_files[0])
[NiftiLabelsMasker.wrapped] loading data from Nifti1Image(
shape=(91, 109, 91),
affine=array([[   2.,    0.,    0.,  -90.],
       [   0.,    2.,    0., -126.],
       [   0.,    0.,    2.,  -72.],
       [   0.,    0.,    0.,    1.]])
)
Resampling labels
________________________________________________________________________________
[Memory] Calling nilearn.image.resampling.resample_img...
resample_img(<nibabel.nifti1.Nifti1Image object at 0x10e34f640>, interpolation='nearest', target_shape=(61, 73, 61), target_affine=array([[  -3.,    0.,    0.,   90.],
       [   0.,    3.,    0., -126.],
       [   0.,    0.,    3.,  -72.],
       [   0.,    0.,    0.,    1.]]))
_____________________________________________________resample_img - 0.0s, 0.0min
________________________________________________________________________________
[Memory] Calling nilearn.maskers.base_masker._filter_and_extract...
_filter_and_extract('/Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz', 
<nilearn.maskers.nifti_labels_masker._ExtractionFunctor object at 0x1335d23b0>, { 'background_label': 0,
  'clean_kwargs': {},
  'detrend': False,
  'dtype': None,
  'high_pass': None,
  'high_variance_confounds': False,
  'keep_masked_labels': True,
  'labels': None,
  'labels_img': <nibabel.nifti1.Nifti1Image object at 0x10e34f640>,
  'low_pass': None,
  'mask_img': None,
  'reports': True,
  'smoothing_fwhm': None,
  'standardize': True,
  'standardize_confounds': True,
  'strategy': 'mean',
  't_r': None,
  'target_affine': None,
  'target_shape': None}, confounds=['/Users/peerherholz/Desktop/adhd/0010042/0010042_regressors.csv'], sample_mask=None, dtype=None, memory=Memory(location=nilearn_cache/joblib), memory_level=2, verbose=1)
[NiftiLabelsMasker.transform_single_imgs] Loading data from /Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz
[NiftiLabelsMasker.transform_single_imgs] Extracting region signals
________________________________________________________________________________
[Memory] Calling nilearn.maskers.nifti_labels_masker.nifti_labels_masker_extractor...
nifti_labels_masker_extractor(<nibabel.nifti1.Nifti1Image object at 0x1335d3df0>)
____________________________________nifti_labels_masker_extractor - 0.7s, 0.0min
[NiftiLabelsMasker.transform_single_imgs] Cleaning extracted signals
________________________________________________________________________________
[Memory] Calling nilearn.signal.clean...
clean(array([[8161.796524, ..., 8092.3425  ],
       ...,
       [8180.454026, ..., 8224.133125]]), detrend=False, standardize=True, standardize_confounds=True, t_r=None, low_pass=None, high_pass=None, confounds=['/Users/peerherholz/Desktop/adhd/0010042/0010042_regressors.csv'], sample_mask=None, runs=None)
____________________________________________________________clean - 0.0s, 0.0min
_______________________________________________filter_and_extract - 1.2s, 0.0min
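Conceptually, what the labels masker computes is the mean signal over all voxels sharing each label, at every timepoint. A minimal numpy-only sketch of that idea with toy shapes (hypothetical 4×4×4 volume with 10 timepoints, not the ADHD data):

```python
import numpy as np

# Synthetic 4D data: 4x4x4 voxels, 10 timepoints
rng = np.random.default_rng(0)
data = rng.normal(size=(4, 4, 4, 10))

# Synthetic label image: 0 = background, regions labelled 1 and 2
labels_img = np.zeros((4, 4, 4), dtype=int)
labels_img[:2] = 1
labels_img[2:] = 2

# For each label, average the signal over its voxels at every timepoint
label_ids = [1, 2]
time_series = np.column_stack(
    [data[labels_img == lab].mean(axis=0) for lab in label_ids]
)
print(time_series.shape)  # (timepoints, regions)
```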

Compute and display the correlation matrix#

Now we’re read to compute the functional connectome with ConnectivityMeasure.

from nilearn.connectome import ConnectivityMeasure
correlation_measure = ConnectivityMeasure(kind='correlation')
correlation_matrix = correlation_measure.fit_transform([time_series])[0]

And finally we can visualize this correlation matrix:

# Mask the main diagonal for visualization:
np.fill_diagonal(correlation_matrix, 0)

# Plot correlation matrix - note: matrix is ordered for block-like representation
plotting.plot_matrix(correlation_matrix, figure=(10, 8), labels=labels,
                     vmax=0.8, vmin=-0.8, reorder=True);
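For intuition, ConnectivityMeasure(kind='correlation') computes Pearson correlations between all pairs of region time courses, which for a single subject is equivalent to np.corrcoef. A numpy-only sketch on synthetic data with the same layout (timepoints × regions; toy shapes, not the ADHD data):

```python
import numpy as np

# Toy time series: 90 timepoints x 5 regions
rng = np.random.default_rng(42)
time_series = rng.normal(size=(90, 5))

# Pearson correlation between every pair of region time courses
correlation_matrix = np.corrcoef(time_series.T)

# Zero the diagonal for display, as in the plotting cell above
np.fill_diagonal(correlation_matrix, 0)
```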

Same thing without confounds, to stress the importance of including confounds#

Let’s do the same thing as before, but this time without using the confounds.

# Extract the signal from the regions
time_series_bad = masker.fit_transform(rest_files[0]) # Note that we haven't specified confounds here

# Compute the correlation matrix
correlation_matrix_bad = correlation_measure.fit_transform([time_series_bad])[0]

# Mask the main diagonal for visualization
np.fill_diagonal(correlation_matrix_bad, 0)

# Plot the correlation matrix
plotting.plot_matrix(correlation_matrix_bad, figure=(10, 8), labels=labels,
                     vmax=0.8, vmin=-0.8, title='No confounds', reorder=True)
[NiftiLabelsMasker.wrapped] loading data from Nifti1Image(
shape=(91, 109, 91),
affine=array([[   2.,    0.,    0.,  -90.],
       [   0.,    2.,    0., -126.],
       [   0.,    0.,    2.,  -72.],
       [   0.,    0.,    0.,    1.]])
)
________________________________________________________________________________
[Memory] Calling nilearn.maskers.base_masker._filter_and_extract...
_filter_and_extract('/Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz', 
<nilearn.maskers.nifti_labels_masker._ExtractionFunctor object at 0x1335ddcf0>, { 'background_label': 0,
  'clean_kwargs': {},
  'detrend': False,
  'dtype': None,
  'high_pass': None,
  'high_variance_confounds': False,
  'keep_masked_labels': True,
  'labels': None,
  'labels_img': <nibabel.nifti1.Nifti1Image object at 0x10e34f640>,
  'low_pass': None,
  'mask_img': None,
  'reports': True,
  'smoothing_fwhm': None,
  'standardize': True,
  'standardize_confounds': True,
  'strategy': 'mean',
  't_r': None,
  'target_affine': None,
  'target_shape': None}, confounds=None, sample_mask=None, dtype=None, memory=Memory(location=nilearn_cache/joblib), memory_level=2, verbose=1)
[NiftiLabelsMasker.transform_single_imgs] Loading data from /Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz
/Users/peerherholz/anaconda3/envs/nowaschool/lib/python3.10/site-packages/joblib/memory.py:655: JobLibCollisionWarning: Cannot detect name collisions for function 'nifti_labels_masker_extractor'
  return self._cached_call(args, kwargs)[0]
[MemorizedFunc(func=<nilearn.maskers.nifti_labels_masker._ExtractionFunctor object at 0x1335ddcf0>, location=nilearn_cache/joblib)]: Clearing function cache identified by nilearn/maskers/nifti_labels_masker/nifti_labels_masker_extractor
[NiftiLabelsMasker.transform_single_imgs] Extracting region signals
________________________________________________________________________________
[Memory] Calling nilearn.maskers.nifti_labels_masker.nifti_labels_masker_extractor...
nifti_labels_masker_extractor(<nibabel.nifti1.Nifti1Image object at 0x133b881f0>)
____________________________________nifti_labels_masker_extractor - 0.7s, 0.0min
[NiftiLabelsMasker.transform_single_imgs] Cleaning extracted signals
________________________________________________________________________________
[Memory] Calling nilearn.signal.clean...
clean(array([[8161.796524, ..., 8092.3425  ],
       ...,
       [8180.454026, ..., 8224.133125]]), detrend=False, standardize=True, standardize_confounds=True, t_r=None, low_pass=None, high_pass=None, confounds=None, sample_mask=None, runs=None)
____________________________________________________________clean - 0.0s, 0.0min
_______________________________________________filter_and_extract - 1.2s, 0.0min
<matplotlib.image.AxesImage at 0x133086800>

As you can see, without any confounds almost all regions appear correlated with each other! One reference that discusses the importance of confounds is Varoquaux & Craddock 2013.
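To see why a shared nuisance signal inflates correlations, here is a small numpy-only illustration (synthetic signals, not the ADHD data): two independent “region” signals pick up a common confound, and regressing it out by least squares brings their correlation back toward zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Two independent signals contaminated by a shared confound
confound = rng.normal(size=n)
sig_a = rng.normal(size=n) + 2 * confound
sig_b = rng.normal(size=n) + 2 * confound

corr_before = np.corrcoef(sig_a, sig_b)[0, 1]

# Regress the confound (plus an intercept) out of each signal
X = np.column_stack([confound, np.ones(n)])

def residualize(y):
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

corr_after = np.corrcoef(residualize(sig_a), residualize(sig_b))[0, 1]
print(corr_before, corr_after)  # the shared-confound correlation shrinks toward 0
```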

Probabilistic atlas#

Above we used a parcellation atlas. With nilearn, you can do the same thing with a probabilistic atlas. Let’s use, for example, the MSDL atlas.

# Path to MSDL atlas
msdl_atlas = datasets.fetch_atlas_msdl()

# Extract only default mode network nodes
dmn_nodes = image.index_img(msdl_atlas.maps, [3, 4, 5, 6])

# Plot MSDL probability atlas
plotting.plot_prob_atlas(dmn_nodes, cut_coords=(0, -60, 29), draw_cross=False,
                         annotate=False, title="DMN nodes in MSDL atlas")
<nilearn.plotting.displays._slicers.OrthoSlicer at 0x133a2b730>

The only difference from before is that we now need to use the NiftiMapsMasker function to create the masker that extracts the time series:

from nilearn.maskers import NiftiMapsMasker
masker = NiftiMapsMasker(maps_img=msdl_atlas.maps, standardize=True, verbose=1,
                         memory="nilearn_cache", memory_level=2)

Now, as before:

# Extract the signal from the regions
time_series = masker.fit_transform(rest_files[0], confounds=confound_files[0])

# Compute the correlation matrix
correlation_matrix = correlation_measure.fit_transform([time_series])[0]

# Mask the main diagonal for visualization
np.fill_diagonal(correlation_matrix, 0)
[NiftiMapsMasker.wrapped] loading regions from None
Resampling maps
________________________________________________________________________________
[Memory] Calling nilearn.image.resampling.resample_img...
resample_img(<nibabel.nifti1.Nifti1Image object at 0x133bab3d0>, interpolation='continuous', target_shape=(61, 73, 61), target_affine=array([[  -3.,    0.,    0.,   90.],
       [   0.,    3.,    0., -126.],
       [   0.,    0.,    3.,  -72.],
       [   0.,    0.,    0.,    1.]]))
_____________________________________________________resample_img - 2.3s, 0.0min
________________________________________________________________________________
[Memory] Calling nilearn.maskers.base_masker._filter_and_extract...
_filter_and_extract('/Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz', 
<nilearn.maskers.nifti_maps_masker._ExtractionFunctor object at 0x13358b4f0>, { 'allow_overlap': True,
  'clean_kwargs': {},
  'detrend': False,
  'dtype': None,
  'high_pass': None,
  'high_variance_confounds': False,
  'keep_masked_maps': True,
  'low_pass': None,
  'maps_img': '/Users/peerherholz/nilearn_data/msdl_atlas/MSDL_rois/msdl_rois.nii',
  'mask_img': None,
  'reports': True,
  'smoothing_fwhm': None,
  'standardize': True,
  'standardize_confounds': True,
  't_r': None,
  'target_affine': None,
  'target_shape': None}, confounds=['/Users/peerherholz/Desktop/adhd/0010042/0010042_regressors.csv'], sample_mask=None, dtype=None, memory=Memory(location=nilearn_cache/joblib), memory_level=2, verbose=1)
[NiftiMapsMasker.transform_single_imgs] Loading data from /Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz
[NiftiMapsMasker.transform_single_imgs] Extracting region signals
________________________________________________________________________________
[Memory] Calling nilearn.maskers.nifti_maps_masker.nifti_maps_masker_extractor...
nifti_maps_masker_extractor(<nibabel.nifti1.Nifti1Image object at 0x133c79f30>)
______________________________________nifti_maps_masker_extractor - 2.1s, 0.0min
[NiftiMapsMasker.transform_single_imgs] Cleaning extracted signals
________________________________________________________________________________
[Memory] Calling nilearn.signal.clean...
clean(array([[26299.723984, ..., 24082.704925],
       ...,
       [26439.039557, ..., 24011.2596  ]]), detrend=False, standardize=True, standardize_confounds=True, t_r=None, low_pass=None, high_pass=None, confounds=['/Users/peerherholz/Desktop/adhd/0010042/0010042_regressors.csv'], sample_mask=None, runs=None)
____________________________________________________________clean - 0.0s, 0.0min
_______________________________________________filter_and_extract - 2.9s, 0.0min

Before we plot the new correlation matrix, we also need to define the labels of the MSDL atlas. At the same time we will also get the coordinates:

# Labels and coordinates of the MSDL atlas regions
msdl_labels = msdl_atlas.labels
msdl_coords = msdl_atlas.region_coords

Perfect! Now as before, we can plot the correlation matrix as follows:

# Plot the correlation matrix
plotting.plot_matrix(correlation_matrix, figure=(10, 8), labels=msdl_labels,
                     vmax=0.8, vmin=-0.8, reorder=True)
<matplotlib.image.AxesImage at 0x1335d2170>

Display corresponding graph on glass brain#

A square matrix, such as a correlation matrix, can also be seen as a “graph”: a set of “nodes”, connected by “edges”. When these nodes are brain regions, and the edges capture interactions between them, this graph is a “functional connectome”.

As the MSDL atlas comes with (x, y, z) MNI coordinates for the different regions, we can visualize the matrix as a graph of interactions in the brain. To avoid having too dense a graph, we represent only the 20% of edges with the highest values. For another atlas, this information can be computed for each region with the nilearn.plotting.find_xyz_cut_coords function:

plotting.plot_connectome(correlation_matrix, msdl_coords, edge_threshold="80%",
                         colorbar=True)
<nilearn.plotting.displays._projectors.OrthoProjector at 0x135ea8c10>
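The "80%" edge threshold keeps roughly the strongest 20% of edges by absolute value. A numpy sketch of that selection on a toy symmetric matrix (one way to interpret the thresholding, not nilearn’s exact implementation):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 39  # the MSDL atlas has 39 regions

# Symmetric toy "correlation" matrix with a zero diagonal
m = rng.uniform(-1, 1, size=(n, n))
mat = (m + m.T) / 2
np.fill_diagonal(mat, 0)

# Threshold at the 80th percentile of absolute off-diagonal values,
# keeping only the strongest ~20% of edges
triu = np.triu_indices(n, k=1)
threshold = np.percentile(np.abs(mat[triu]), 80)
adjacency = np.where(np.abs(mat) >= threshold, mat, 0)
```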

As you can see, the correlation matrix gives a very “full” graph: every node is connected to every other one. This is because it also captures indirect connections.

From version 0.5.0 on, nilearn also provides an interactive plot for connectomes:

plotting.view_connectome(correlation_matrix, msdl_coords, edge_threshold="80%", edge_cmap='bwr',
                         symmetric_cmap=True, linewidth=6.0, node_size=3.0)

2. Single subject maps of seed-to-voxel correlation#

Above we computed the correlation between different regions. But what if we want to compute a seed-to-voxel correlation map for a single subject? The procedure is very similar to the one from before.

Time series extraction#

First, we need to extract the time series from the seed region. For this example, let’s specify a sphere of radius 8 (in mm) located in the Posterior Cingulate Cortex. This sphere is considered to be part of the Default Mode Network.

# Sphere radius in mm
sphere_radius = 8

# Sphere center in MNI-coordinate
sphere_coords = [(0, -52, 18)]

In this case, we will use the NiftiSpheresMasker function to extract the time series within a given sphere. Before signal extraction, we can also directly detrend, standardize, and bandpass-filter the data.

from nilearn.maskers import NiftiSpheresMasker
seed_masker = NiftiSpheresMasker(sphere_coords, radius=sphere_radius, detrend=True,
                                 standardize=True, low_pass=0.1, high_pass=0.01,
                                 t_r=2.0, verbose=1, memory="nilearn_cache", memory_level=2)
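A quick sanity check on these filter settings: the pass band must lie below the Nyquist frequency implied by the repetition time.

```python
t_r = 2.0          # repetition time in seconds
high_pass = 0.01   # Hz: removes slow drifts (periods longer than 100 s)
low_pass = 0.1     # Hz: removes fast fluctuations (periods shorter than 10 s)

# The band must lie below the Nyquist frequency of the acquisition
nyquist = 1.0 / (2 * t_r)
assert high_pass < low_pass < nyquist
```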

Now we’re read to extract the mean time series within the seed region, while also regressing out the confounds.

seed_time_series = seed_masker.fit_transform(rest_files[0], confounds=confound_files[0])
________________________________________________________________________________
[Memory] Calling nilearn.maskers.base_masker._filter_and_extract...
_filter_and_extract('/Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz', 
<nilearn.maskers.nifti_spheres_masker._ExtractionFunctor object at 0x136aa9450>, { 'allow_overlap': False,
  'clean_kwargs': {},
  'detrend': True,
  'dtype': None,
  'high_pass': 0.01,
  'high_variance_confounds': False,
  'low_pass': 0.1,
  'mask_img': None,
  'radius': 8,
  'reports': True,
  'seeds': [(0, -52, 18)],
  'smoothing_fwhm': None,
  'standardize': True,
  'standardize_confounds': True,
  't_r': 2.0}, confounds=['/Users/peerherholz/Desktop/adhd/0010042/0010042_regressors.csv'], sample_mask=None, dtype=None, memory=Memory(location=nilearn_cache/joblib), memory_level=2, verbose=1)
[NiftiSpheresMasker.transform_single_imgs] Loading data from /Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz
[NiftiSpheresMasker.transform_single_imgs] Extracting region signals
________________________________________________________________________________
[Memory] Calling nilearn.maskers.nifti_spheres_masker.nifti_spheres_masker_extractor...
nifti_spheres_masker_extractor(<nibabel.nifti1.Nifti1Image object at 0x136af99c0>)
___________________________________nifti_spheres_masker_extractor - 2.6s, 0.0min
[NiftiSpheresMasker.transform_single_imgs] Cleaning extracted signals
________________________________________________________________________________
[Memory] Calling nilearn.signal.clean...
clean(array([[10273],
       ...,
       [10254]], dtype=int16), detrend=True, standardize=True, standardize_confounds=True, t_r=2.0, low_pass=0.1, high_pass=0.01, confounds=['/Users/peerherholz/Desktop/adhd/0010042/0010042_regressors.csv'], sample_mask=None, runs=None)
____________________________________________________________clean - 0.0s, 0.0min
_______________________________________________filter_and_extract - 3.2s, 0.1min
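Conceptually, the sphere masker averages all voxels whose world coordinates fall within `radius` mm of the seed. A simplified numpy sketch of that idea (hypothetical grid and affine with scaling and translation only; not nilearn’s implementation):

```python
import numpy as np

# Toy 4D image on a 3 mm grid: 20x20x20 voxels, 10 timepoints
rng = np.random.default_rng(0)
data = rng.normal(size=(20, 20, 20, 10))
voxel_size = 3.0
origin = np.array([-30.0, -30.0, -30.0])  # world coordinate of voxel (0, 0, 0)

# World coordinates of every voxel centre
idx = np.indices(data.shape[:3]).reshape(3, -1).T
world = idx * voxel_size + origin

# Select voxels within 8 mm of the seed coordinate
seed = np.array([0.0, -15.0, 18.0])
in_sphere = np.linalg.norm(world - seed, axis=1) <= 8.0

# Mean time course over the sphere
seed_time_series = data.reshape(-1, 10)[in_sphere].mean(axis=0)
```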

Next, we need to do a similar procedure for each voxel in the brain. For this, we can use the NiftiMasker, which takes the same arguments as before and additionally smooths the signal with a 6 mm kernel.

from nilearn.maskers import NiftiMasker
brain_masker = NiftiMasker(smoothing_fwhm=6, detrend=True, standardize=True,
                           low_pass=0.1, high_pass=0.01, t_r=2., verbose=1,
                           memory="nilearn_cache", memory_level=2)

Now we can extract the time series for every voxel while regressing out the confounds:

brain_time_series = brain_masker.fit_transform(rest_files[0], confounds=confound_files[0])
[NiftiMasker.fit] Loading data from /Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz
[NiftiMasker.fit] Computing the mask
________________________________________________________________________________
[Memory] Calling nilearn.masking.compute_background_mask...
compute_background_mask('/Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz', verbose=0)
__________________________________________compute_background_mask - 0.3s, 0.0min
[NiftiMasker.fit] Resampling mask
________________________________________________________________________________
[Memory] Calling nilearn.image.resampling.resample_img...
resample_img(<nibabel.nifti1.Nifti1Image object at 0x135e77fd0>, target_affine=None, target_shape=None, copy=False, interpolation='nearest')
_____________________________________________________resample_img - 0.0s, 0.0min
________________________________________________________________________________
[Memory] Calling nilearn.maskers.nifti_masker._filter_and_mask...
_filter_and_mask('/Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz', 
<nibabel.nifti1.Nifti1Image object at 0x135e77fd0>, { 'clean_kwargs': {},
  'detrend': True,
  'dtype': None,
  'high_pass': 0.01,
  'high_variance_confounds': False,
  'low_pass': 0.1,
  'reports': True,
  'runs': None,
  'smoothing_fwhm': 6,
  'standardize': True,
  'standardize_confounds': True,
  't_r': 2.0,
  'target_affine': None,
  'target_shape': None}, memory_level=2, memory=Memory(location=nilearn_cache/joblib), verbose=1, confounds=['/Users/peerherholz/Desktop/adhd/0010042/0010042_regressors.csv'], sample_mask=None, copy=True, dtype=None)
[NiftiMasker.transform_single_imgs] Loading data from Nifti1Image('/Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz')
[NiftiMasker.transform_single_imgs] Smoothing images
________________________________________________________________________________
[Memory] Calling nilearn.image.image.smooth_img...
smooth_img(<nibabel.nifti1.Nifti1Image object at 0x136af9510>, 6)
/Users/peerherholz/anaconda3/envs/nowaschool/lib/python3.10/site-packages/joblib/memory.py:655: JobLibCollisionWarning: Cannot detect name collisions for function 'nifti_masker_extractor'
  return self._cached_call(args, kwargs)[0]
[MemorizedFunc(func=<nilearn.maskers.nifti_masker._ExtractionFunctor object at 0x1368764a0>, location=nilearn_cache/joblib)]: Clearing function cache identified by nilearn/maskers/nifti_masker/nifti_masker_extractor
_______________________________________________________smooth_img - 1.7s, 0.0min
[NiftiMasker.transform_single_imgs] Extracting region signals
________________________________________________________________________________
[Memory] Calling nilearn.maskers.nifti_masker.nifti_masker_extractor...
nifti_masker_extractor(<nibabel.nifti1.Nifti1Image object at 0x136a3c550>)
___________________________________________nifti_masker_extractor - 0.4s, 0.0min
[NiftiMasker.transform_single_imgs] Cleaning extracted signals
________________________________________________________________________________
[Memory] Calling nilearn.signal.clean...
clean(array([[ 887.6601, ..., 1161.3188],
       ...,
       [ 907.5894, ..., 1176.775 ]], dtype=float32), detrend=True, standardize=True, standardize_confounds=True, t_r=2.0, low_pass=0.1, high_pass=0.01, confounds=['/Users/peerherholz/Desktop/adhd/0010042/0010042_regressors.csv'], sample_mask=None, runs=None)
____________________________________________________________clean - 7.1s, 0.1min
_________________________________________________filter_and_mask - 10.6s, 0.2min

Performing the seed-based correlation analysis#

Now that we have two arrays (the mean signal in the seed region and the signal of each brain voxel), we can correlate them with each other. This can be done with the dot product of the two matrices.

Note that the signals were variance-standardized during extraction. To turn the dot product into correlation values, we additionally have to divide the result by the length of the time series.

seed_based_correlations = np.dot(brain_time_series.T, seed_time_series)
seed_based_correlations /= seed_time_series.shape[0]
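As a quick sanity check on the math: for variance-standardized signals, the dot product divided by the number of time points equals the Pearson correlation. A minimal sketch with synthetic data (not the ADHD data used above):

```python
import numpy as np

rng = np.random.default_rng(0)
n_scans = 100

# Two correlated synthetic time series
seed = rng.standard_normal(n_scans)
voxel = 0.5 * seed + rng.standard_normal(n_scans)

# Variance-standardize both (zero mean, unit variance), as the maskers do
seed = (seed - seed.mean()) / seed.std()
voxel = (voxel - voxel.mean()) / voxel.std()

# Dot product divided by the number of time points ...
corr_dot = np.dot(voxel, seed) / n_scans

# ... matches the Pearson correlation coefficient
corr_pearson = np.corrcoef(voxel, seed)[0, 1]
print(np.allclose(corr_dot, corr_pearson))  # True
```

This is exactly why the division by `seed_time_series.shape[0]` above yields values in the familiar [-1, 1] correlation range.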

Plotting the seed-based correlation map#

Finally, we can transform the correlation array back into a Nifti image.

seed_based_correlation_img = brain_masker.inverse_transform(seed_based_correlations.T)
________________________________________________________________________________
[Memory] Calling nilearn.masking.unmask...
unmask(array([[ 0.101436, ..., -0.243823]], dtype=float32), <nibabel.nifti1.Nifti1Image object at 0x135e77fd0>)
___________________________________________________________unmask - 0.2s, 0.0min

And this we can of course plot again to take a closer look at the correlation pattern:

display = plotting.plot_stat_map(seed_based_correlation_img, threshold=0.333,
                                 cut_coords=sphere_coords[0])
display.add_markers(marker_coords=sphere_coords, marker_color='black',
                    marker_size=200)
../../../_images/42bb238bb2155f061b55707a98d9038f816f67079ee2428b9e7a1c579af7a3a2.png

The map above depicts the temporal correlation of a seed region with the rest of the brain. For a similar example but on the cortical surface, see this example.

3. Single subject maps of seed-to-seed correlation#

The next question is, of course: how can we compute the correlation between different seed regions? This is actually very easy, even simpler than the seed-to-voxel analysis above.

Time series extraction#

First, we need to extract the time series from the seed regions. As before, we need to define the sphere radius and the centers of the seed regions:

# Sphere radius in mm
sphere_radius = 8

# Sphere centers in MNI coordinates (default mode network nodes:
# posterior cingulate, left/right angular gyrus, medial prefrontal cortex)
sphere_center = [(  0, -52, 18),
                 (-46, -68, 32),
                 ( 46, -68, 32),
                 (  1,  50, -5)]

Now we can extract the time series from those spheres:

# Create masker object to extract average signal within spheres
from nilearn.maskers import NiftiSpheresMasker
masker = NiftiSpheresMasker(sphere_center, radius=sphere_radius, detrend=True,
                            standardize=True, low_pass=0.1, high_pass=0.01,
                            t_r=2.0, verbose=1, memory="nilearn_cache", memory_level=2)
# Extract average signal in spheres with masker object
time_series = masker.fit_transform(rest_files[0], confounds=confound_files[0])
________________________________________________________________________________
[Memory] Calling nilearn.maskers.base_masker._filter_and_extract...
_filter_and_extract('/Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz', 
<nilearn.maskers.nifti_spheres_masker._ExtractionFunctor object at 0x139106500>, { 'allow_overlap': False,
  'clean_kwargs': {},
  'detrend': True,
  'dtype': None,
  'high_pass': 0.01,
  'high_variance_confounds': False,
  'low_pass': 0.1,
  'mask_img': None,
  'radius': 8,
  'reports': True,
  'seeds': [(0, -52, 18), (-46, -68, 32), (46, -68, 32), (1, 50, -5)],
  'smoothing_fwhm': None,
  'standardize': True,
  'standardize_confounds': True,
  't_r': 2.0}, confounds=['/Users/peerherholz/Desktop/adhd/0010042/0010042_regressors.csv'], sample_mask=None, dtype=None, memory=Memory(location=nilearn_cache/joblib), memory_level=2, verbose=1)
[NiftiSpheresMasker.transform_single_imgs] Loading data from /Users/peerherholz/Desktop/adhd/0010042/0010042_rest_tshift_RPI_voreg_mni.nii.gz
/Users/peerherholz/anaconda3/envs/nowaschool/lib/python3.10/site-packages/joblib/memory.py:655: JobLibCollisionWarning: Cannot detect name collisions for function 'nifti_spheres_masker_extractor'
  return self._cached_call(args, kwargs)[0]
[MemorizedFunc(func=<nilearn.maskers.nifti_spheres_masker._ExtractionFunctor object at 0x139106500>, location=nilearn_cache/joblib)]: Clearing function cache identified by nilearn/maskers/nifti_spheres_masker/nifti_spheres_masker_extractor
[NiftiSpheresMasker.transform_single_imgs] Extracting region signals
________________________________________________________________________________
[Memory] Calling nilearn.maskers.nifti_spheres_masker.nifti_spheres_masker_extractor...
nifti_spheres_masker_extractor(<nibabel.nifti1.Nifti1Image object at 0x1391067d0>)
___________________________________nifti_spheres_masker_extractor - 3.1s, 0.1min
[NiftiSpheresMasker.transform_single_imgs] Cleaning extracted signals
________________________________________________________________________________
[Memory] Calling nilearn.signal.clean...
clean(array([[10273, ..., 12400],
       ...,
       [10254, ..., 12444]], dtype=int16), detrend=True, standardize=True, standardize_confounds=True, t_r=2.0, low_pass=0.1, high_pass=0.01, confounds=['/Users/peerherholz/Desktop/adhd/0010042/0010042_regressors.csv'], sample_mask=None, runs=None)
____________________________________________________________clean - 0.0s, 0.0min
_______________________________________________filter_and_extract - 3.7s, 0.1min

Display mean signal per sphere#

If we want, we can even plot the average signal per sphere:

fig = plt.figure(figsize=(8, 4))
plt.plot(time_series)
plt.xlabel('Scan number')
plt.ylabel('Normalized signal')
plt.tight_layout();
../../../_images/1832e124eb80b3ea8e88d3a4eb81f2e23e0e9ee575edc6ea62d231d64c0d2f80.png

Compute partial correlation matrix#

Now that we have the average signal per sphere, we can compute the partial correlation matrix using nilearn's ConnectivityMeasure class.

from nilearn.connectome import ConnectivityMeasure
connectivity_measure = ConnectivityMeasure(kind='partial correlation')
partial_correlation_matrix = connectivity_measure.fit_transform([time_series])[0]
# Plotting the partial correlation matrix
fig, ax = plt.subplots()
plt.imshow(partial_correlation_matrix, cmap='Spectral')
for (j, i), label in np.ndenumerate(partial_correlation_matrix):
    ax.text(i, j, round(label, 2), ha='center', va='center', color='w')
../../../_images/17c111956b803a4d18ddd441aa94ce66ecdb98fa62f65d44c05baf500c39ece7.png
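Under the hood, the partial correlation between two regions is their correlation after regressing out all the other regions, and it can be read off the inverse covariance (precision) matrix. A hedged numpy sketch with synthetic signals (note that ConnectivityMeasure additionally applies a shrinkage covariance estimator by default, so its values on real data will differ slightly from this plain inverse):

```python
import numpy as np

rng = np.random.default_rng(42)
signals = rng.standard_normal((100, 4))  # 100 scans, 4 regions

# Precision matrix = inverse of the empirical covariance matrix
precision = np.linalg.inv(np.cov(signals, rowvar=False))

# Partial correlation between regions i and j: -p_ij / sqrt(p_ii * p_jj)
d = np.sqrt(np.diag(precision))
partial_corr = -precision / np.outer(d, d)
np.fill_diagonal(partial_corr, 1.0)

print(partial_corr.shape)  # (4, 4)
```

Because the diagonal is filled with ones and the off-diagonal entries come from a symmetric precision matrix, the result is a symmetric matrix like the one plotted above.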

Display connectome#

Now that we have the partial correlation matrix, we can also plot it on the glass brain with plot_connectome:

from nilearn.plotting import plot_connectome
plot_connectome(partial_correlation_matrix, sphere_center,
                display_mode='ortho', colorbar=True,  node_size=150,
                title="Default Mode Network Connectivity")
<nilearn.plotting.displays._projectors.OrthoProjector at 0x13906f040>
../../../_images/678e3f1b98fb8303d15a916bf8bb5d341b4006526be181cd05349ab0c23ab547.png

And again with nilearn’s interactive connectome viewer function:

plotting.view_connectome(partial_correlation_matrix, sphere_center, edge_cmap='bwr',
                         symmetric_cmap=True, linewidth=6.0, node_size=10.0)