Jan 27, 2015  03:01 PM | Shengwei Zhang
number of clusters
Hi all,

I'm using the latest version of the software to parcellate the GM of my own datasets. However, when I set the number of clusters to 600 or higher, the output contains only 255 clusters, with a maximum label of 255. The voxel size of my dataset is 3 mm isotropic. Does the program require the voxel size to match that of the test data, i.e. 4 mm isotropic? Any help would be appreciated.

Shengwei
Aug 13, 2015  06:08 PM | Adam Steel - NIH/University of Oxford
RE: number of clusters
Hi Shengwei -

I'm having the same issue at the moment. Was this issue ever resolved?

Thanks

Adam
Aug 13, 2015  09:08 PM | Shengwei Zhang
RE: number of clusters
Unfortunately not, Adam.

Shengwei
Aug 15, 2015  10:08 AM | S Mody - Indian Institute of Science, Bangalore
RE: number of clusters
Originally posted by Adam Steel:
Hi Shengwei -

I'm having the same issue at the moment. Was this issue ever resolved?

Thanks

Adam

It appears that (at least some of) the code may not complete in a practical amount of time at higher resolutions. I used preprocessed fMRI data from the Human Connectome Project, which has a voxel resolution of 2 x 2 x 2 mm, with a cluster count of 1000. The function make_local_connectivity_tcorr(), with a grey-matter mask of about 255,000 voxels, runs successfully and creates a correlation matrix with around 1,000,000 non-zero entries. However, the next step, binfile_parcellate(), runs forever: the eigenvalue decomposition of the Laplacian of the correlation matrix does not converge even after 24+ hours of run time.

There is a comment in the ncut() function of the toolbox about regularizing the matrix for better stability, so it appears that the author himself may have had problems with the eigenvalue decomposition, which is not at all unexpected: finding the eigenvectors of a large sparse matrix can be hell. Hopefully the author reads this and comments.
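For anyone who wants to experiment, the sketch below is roughly what that regularization amounts to. This is my own untested reconstruction, not the toolbox's code: add a small offset to the diagonal of the sparse affinity matrix W before taking the leading eigenvectors of the normalized affinity with scipy.sparse.linalg.eigsh. The function name ncut_eigs and the default offset value are mine.

import numpy as np
import scipy.sparse as sps
from scipy.sparse.linalg import eigsh

def ncut_eigs(W, K, offset=0.5):
    # Regularize: adding 'offset' to the diagonal keeps the degrees away
    # from zero and tends to stabilize the eigensolver.
    n = W.shape[0]
    W = W + offset * sps.identity(n, format='csr')
    d = np.asarray(W.sum(axis=1)).ravel()
    D_inv_sqrt = sps.diags(1.0 / np.sqrt(d))
    # The largest eigenvectors of D^-1/2 W D^-1/2 correspond to the smallest
    # eigenvectors of the normalized Laplacian I - D^-1/2 W D^-1/2.
    A = D_inv_sqrt @ W @ D_inv_sqrt
    vals, vecs = eigsh(A, k=K, which='LA')
    return vals, D_inv_sqrt @ vecs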

My next step is to resample the data and grey-mask to (4 x 4 x 4) and rerun the code.
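In case it is useful to anyone, something along these lines should do the resampling (assuming nilearn is installed; the file names are placeholders):

import numpy as np
from nilearn.image import resample_img

target_affine = np.diag([4.0, 4.0, 4.0])

# Functional data: continuous interpolation.
func_4mm = resample_img('func.nii.gz', target_affine=target_affine,
                        interpolation='continuous')
func_4mm.to_filename('func_4mm.nii.gz')

# Grey-matter mask: nearest-neighbour interpolation to keep it binary.
mask_4mm = resample_img('gm_mask.nii.gz', target_affine=target_affine,
                        interpolation='nearest')
mask_4mm.to_filename('gm_mask_4mm.nii.gz')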

Wondering whether anyone else has had a similar experience.

Regards,
Sandeep Mody.
Aug 15, 2015  10:08 AM | S Mody - Indian Institute of Science, Bangalore
RE: number of clusters
@Shengwei, perhaps the reason you get so few clusters is that the correlation matrix is so sparse that only 255 of the eigenvectors are numerically significant. Did you consider this possibility?
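One rough way to check this (a sketch, assuming W is the sparse correlation matrix produced by make_local_connectivity_tcorr and that only the eigenvalues are needed):

from scipy.sparse.linalg import eigsh

# Count how many of the leading eigenvalues are numerically significant.
vals = eigsh(W, k=600, return_eigenvectors=False)
print((abs(vals) > 1e-8).sum())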

Regards,
Sandeep Mody.
Aug 17, 2015  07:08 AM | Daniel Lurie
RE: number of clusters
Hi all,

I'm also experiencing this issue (212 subjects, 4mm voxels, HO 25% GM mask, group average parcellation).

It looks like the actual parcel data (in the .npy files) contains the right number of parcels. You can check this on your own results with the following python code:
import numpy as np

# Load the parcellation for your chosen k (replace the path with your own).
parcel_data = np.load('/path/to/parcellation_k.npy')

# First array: the unique parcel labels; second: the voxel count for each label.
np.unique(parcel_data, return_counts=True)

This will return two arrays of equal length. The first array contains every unique value in the data, so if your k=100 it should have *around* 100 entries (as Cameron mentions in the paper, NCUT can sometimes produce clusterings with empty clusters, so your actual k may end up somewhat smaller than your requested k). The second array gives the number of voxels carrying the value at the same index in the first array.
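For example, to count the non-empty parcels (assuming background voxels are labelled 0):

labels, counts = np.unique(parcel_data, return_counts=True)
n_parcels = (labels != 0).sum()   # drop the background label
print(n_parcels)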

I think the issue is with the scripts that convert the NumPy parcellation data into a NIfTI volume. Working on a fix now, will report back if/when I'm successful.

Dan
Aug 17, 2015  09:08 AM | Daniel Lurie
RE: number of clusters
Hi again,

This script should work until Cameron releases an official fix: https://github.com/danlurie/cluster_roi/blob/master/nifti_gen_fix.py

The problem was essentially an overflow error; the data type from the mask image (to which the parcellation values are written) is uint8 by default, which can't handle values over 255. My fix converts everything to float64, which is overkill, but works.
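For anyone curious, the general idea of the fix looks roughly like this (a sketch only, with placeholder file names and int32 instead of float64; it assumes the label array is stored in the same voxel order as the non-zero mask voxels, and the actual script linked above handles the toolbox's ordering):

import nibabel as nb
import numpy as np

mask_img = nb.load('gm_mask.nii.gz')
mask = mask_img.get_fdata() > 0               # boolean grey-matter mask

labels = np.load('parcellation_k.npy')        # one label per in-mask voxel

out = np.zeros(mask_img.shape, dtype=np.int32)    # int32 instead of uint8
out[mask] = labels                            # labels above 255 now fit

out_img = nb.Nifti1Image(out, mask_img.affine)
out_img.set_data_dtype(np.int32)              # make the NIfTI header agree
nb.save(out_img, 'parcellation_k.nii.gz')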

Let me know if there are any issues.

Dan