Nulls for parcellated data don't show right number of regions #161

Open

lshoheisel opened this issue May 8, 2024 · 3 comments
Comments

@lshoheisel

Description of issue

Hi,

I'm trying to produce some null distributions of parcellated data and have run into an issue. The atlas file I have has 70 regions and is in MNI space. Parcellating data with it works fine:

import nibabel as nb
from neuromaps import parcellate, datasets, nulls, transforms, resampling, images

atlas_file = '070-DesikanKilliany-1mm.nii.gz'
atlas = nb.load(atlas_file)

parcellation = images.relabel_gifti(
    transforms.mni152_to_fsaverage(atlas, fsavg_density='10k', method='nearest'))
dk = parcellate.Parcellater(parcellation, 'fsaverage').fit()

abagen = datasets.fetch_annotation(source='abagen')
abagen = dk.transform(abagen, 'fsaverage')

This gets me the expected 70-region vector, but when I try to produce nulls, I get an error message:

rotated = nulls.alexander_bloch(abagen, 'fsaverage', '10k',
                                n_perm=100, seed=1234, parcellation=parcellation)
  File C:\anaconda\envs\TVB\Lib\site-packages\neuromaps\nulls\nulls.py:139 in alexander_bloch
    return load_data(data)[spins]

IndexError: index 70 is out of bounds for axis 0 with size 70

I tried producing generic nulls by passing data=None, but the resulting matrix has 72 regions, two more than the atlas contains.

Any idea what went wrong? Am I missing a mistake in the code, or could this be an issue with the atlas file?

Thanks,
Linnea

@Picardian14

Hello,

I've been having a similar issue using the AAL90 atlas which is also in MNI space.
If you look at the output of the transform function, it gives you a left and a right hemisphere surface. Taking the unique values in each one, you will probably see that some labels from the left hemisphere have been assigned to the right, and vice versa. Or, as in my case, some regions may not be present at all (see the sketch below).
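Here is a minimal sketch of that check. It reuses the DK atlas file from the original post, so substitute your own MNI-space atlas; the variable names are just assumptions on my part:

import numpy as np
import nibabel as nb
from neuromaps import transforms

atlas = nb.load('070-DesikanKilliany-1mm.nii.gz')   # or your own MNI-space atlas
expected = set(np.unique(atlas.get_fdata()).astype(int)) - {0}

# Inspect the raw output of the transform (before any relabelling)
left, right = transforms.mni152_to_fsaverage(atlas, fsavg_density='10k', method='nearest')
left_ids = set(np.unique(left.agg_data()).astype(int)) - {0}
right_ids = set(np.unique(right.agg_data()).astype(int)) - {0}

print('labels on the left surface:  ', sorted(left_ids))
print('labels on the right surface: ', sorted(right_ids))
print('labels lost in the transform:', sorted(expected - left_ids - right_ids))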

If you don't need rotated maps, you can use the Burt 2018, Burt 2020 or Moran null models to generate surrogate null maps directly on an MNI parcellation (see the sketch below).
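For what it's worth, a minimal sketch of that route might look like the following. I'm reusing the variable names from the original post and assuming that nulls.burt2020 accepts a volumetric parcellation when atlas='MNI152'; double-check the signature in your neuromaps version:

from neuromaps import nulls

# `abagen` is the 70-region parcellated vector from the original example and
# `atlas_file` the volumetric (MNI-space) parcellation it was derived from.
surrogates = nulls.burt2020(abagen, atlas='MNI152', density='1mm',
                            parcellation=atlas_file, n_perm=100, seed=1234)
print(surrogates.shape)  # expected: (n_parcels, n_perm)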

In my case I need the rotated maps, but the neuromaps transformations are not working for me and I cannot get a surface version of the AAL90 parcellation. If someone could help me with this, I would be very thankful.

Best,
Ivan

@sarahkreuzer

Hi there,

I have observed the same problem as described by Linnea, using a different atlas: transforming the Neuromorphometrics atlas from MNI to fsLR space and parcellating some maps with it in fsLR works fine. However, creating the nulls (with data=None) gives me 2 more regions than it should. Is there a way to get the right nulls?

Thanks,

Sarah

@VinceBaz
Member

VinceBaz commented Jun 3, 2024

Hey!

I apologize for the delay. I will try to offer a general explanation that hopefully covers most of the issues you have encountered.

When we call alexander_bloch (or any other spin function) to generate spin permutations, the function looks at the label of each parcel in the parcellation and drops the parcels whose labels appear in the PARCIGNORE list, stored as a global variable in the toolbox. This list contains the following labels: unknown, corpuscallosum, Background+FreeSurfer_Defined_Medial_Wall, ???, Unknown, Medial_wall, Medial wall, medial_wall.

In your cases, you are transforming a parcellation from volumetric space to a surface space. When this happens, the parcellation is mapped to the surface, but no parcel labels are written into the new GIFTI file storing the parcellation data. As a result, when you try to generate a null map using this transformed parcellation file, the function does not drop any parcel: it expects as many parcels as there are unique values in the parcellation file (i.e., 72 in @lshoheisel's case). The check below illustrates this.
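For instance, a quick way to see this (a sketch assuming the parcellation tuple of GiftiImages created in the original example; the PARCIGNORE list is written out inline since its exact location in the package may differ across versions):

# Labels that the spin functions ignore (copied from the list above)
PARCIGNORE = ['unknown', 'corpuscallosum',
              'Background+FreeSurfer_Defined_Medial_Wall', '???',
              'Unknown', 'Medial_wall', 'Medial wall', 'medial_wall']

for gii_hemi in parcellation:
    file_labels = [label.label for label in gii_hemi.labeltable.labels]
    print('labels stored in the gifti:  ', file_labels)  # empty after the volume-to-surface transform
    print('labels that would be dropped:',
          [label for label in file_labels if label in PARCIGNORE])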

However, the Parcellater class works differently: it does not look at the labels to decide which parcels to drop from the data. Instead, it assumes that 0 is the background parcel, and that specific parcel is dropped. (You can actually call relabel_gifti to set the IDs of parcels you want to drop from the data, e.g. those in the PARCIGNORE list, to 0.) Thus, in @lshoheisel's case, after calling dk.transform we end up with a 70-region vector, since the parcels with ID 0 in both the left and right hemispheres are removed.

As a result of this discrepancy (72 parcels expected by the spin function vs. 70 values returned by the Parcellater), we get an IndexError. The sketch below makes the mismatch visible.
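A sketch, assuming the parcellation and abagen variables from the original example, and assuming (on my part) that the spin function counts the unique values of each hemisphere separately:

import numpy as np

# Unique values per hemisphere, counting the 0 "background" on each side
n_per_hemi = [len(np.unique(g.agg_data())) for g in parcellation]
print(n_per_hemi, sum(n_per_hemi))   # e.g. [36, 36] -> 72 parcels expected by the spins

# What the Parcellater actually returns after dropping the 0s
print(abagen.shape)                  # (70,)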

Currently, there is no option to manually provide a list of indices to drop when generating spin nulls. Adding a drop parameter to the spin functions would definitely make it easier to ignore some parcels. I'll try to create a PR to fix this issue.

In the meantime, a fix is to go over the parcels in the transformed .gii images of the parcellation and add a label from PARCIGNORE to those you want to ignore. For @lshoheisel's example, if you want to ignore parcel 0 (the parcel that the Parcellater class drops), it could look as follows:

import numpy as np
import nibabel as nib

for gii_hemi in parcellation:
    # Build one GiftiLabel per parcel ID present in this hemisphere
    parc_ids = np.unique(gii_hemi.agg_data()).astype('int')
    new_labels = [nib.gifti.GiftiLabel(i) for i in parc_ids]
    for label in new_labels:
        label.label = str(label.key)
    # np.unique sorts the IDs, so new_labels[0] is parcel 0; give it a name
    # from PARCIGNORE so the spin functions drop it
    new_labels[0].label = 'unknown'
    gii_hemi.labeltable.labels = new_labels

Then, the following:

rotated = nulls.alexander_bloch(abagen, 'fsaverage', '10k', n_perm=100, seed=1234, parcellation=parcellation)

should work as expected.

I hope this helps! Let me know if anything is not clear or if the proposed fix does not work!

Best,
Vince
