Hi,
I am trying to run a GLM analysis using 3dDeconvolve on a very large dataset. During run concatenation, 3dDeconvolve fails with a memory allocation error. A solution to this problem would be for 3dDeconvolve to read the data in sections or "tiles" and cobble the results together in a final step,
e.g. 3dDeconvolve -tiles 10 ...
would divide the dataset into 10 chunks, each of which would be read into memory sequentially.
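In the meantime, something like this can be approximated by hand with the existing 3dZcutup and 3dZcat programs, cutting the volume into slabs of slices and running 3dDeconvolve on each slab separately. A rough sketch (the 40-slice/4-slab split, dataset names, and single-regressor design below are just placeholders, not my actual analysis):

  # process a 40-slice dataset in 4 slabs of 10 slices each
  for slab in 0 1 2 3 ; do
    bot=$(( slab * 10 ))
    top=$(( bot + 9 ))
    # cut out slices bot..top into a smaller dataset
    3dZcutup -prefix slab${slab} -keep $bot $top epi+orig
    # run the regression on just this slab
    3dDeconvolve -input slab${slab}+orig \
                 -num_stimts 1 \
                 -stim_file 1 stim.1D \
                 -stim_label 1 task \
                 -bucket stats_slab${slab}
  done
  # glue the per-slab statistics back into one dataset
  3dZcat -prefix stats_all stats_slab0+orig stats_slab1+orig \
                           stats_slab2+orig stats_slab3+orig

A built-in -tiles option would still be much cleaner, since all of the slab bookkeeping would be hidden from the user.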
I suspect problems with large datasets will only become more common as spatial resolution increases, TR decreases, and collection times lengthen.
thanks,
Brad Buchsbaum