AFNI Message Board

Posted by Brad Buchsbaum, January 25, 2008 03:33PM


Hi,

I am trying to run a GLM analysis using 3dDeconvolve on a very large dataset. During run concatenation, 3dDeconvolve fails with a memory allocation error. A solution to this problem would be for 3dDeconvolve to read the data in sections or "tiles" and assemble the results in a final step.

e.g.,

3dDeconvolve -tiles 10 ...

would divide the data set into 10 chunks, where each chunk would be read into memory sequentially.
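
Because the GLM is fit independently at each voxel, the tiles could be processed one after another and the results simply concatenated. As a rough illustration of the idea (in Python/NumPy, not AFNI code), here is a minimal sketch; the function name fit_glm_tiled and the synthetic data are hypothetical, and a real implementation would read each tile from disk rather than slice an array already held in memory:

import numpy as np

def fit_glm_tiled(data, design, n_tiles=10):
    # Ordinary least-squares GLM fit, processed one "tile" of voxels at a time.
    # data: (n_timepoints, n_voxels); design: (n_timepoints, n_regressors).
    betas = np.empty((design.shape[1], data.shape[1]))
    pinv = np.linalg.pinv(design)  # computed once, reused for every tile
    for tile in np.array_split(np.arange(data.shape[1]), n_tiles):
        # A real tool would read this tile's voxels from disk here,
        # keeping only one chunk in memory at a time.
        betas[:, tile] = pinv @ data[:, tile]
    return betas

# Synthetic example: 200 time points, 5 regressors, 100,000 voxels
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))
Y = rng.standard_normal((200, 100000))
print(fit_glm_tiled(Y, X, n_tiles=10).shape)  # (5, 100000)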

I suspect problems with large data sets will become more common in the future as spatial resolution increases, TR decreases, and scan durations grow.

Thanks,

Brad Buchsbaum
Subject | Author | Posted
3dDeconvolve feature request for large data sets | Brad Buchsbaum | January 25, 2008 03:33PM
Re: 3dDeconvolve feature request for large data sets | Daniel Glen | January 25, 2008 04:04PM
Re: 3dDeconvolve feature request for large data sets | bob cox | January 28, 2008 09:29AM
Re: 3dDeconvolve feature request for large data sets | Brad Buchsbaum | January 29, 2008 05:05PM
Re: 3dDeconvolve feature request for large data sets | Daniel Glen | January 29, 2008 05:21PM