
reducing 3ddeconvolve memory usage?
October 14, 2008 10:43AM
Hello AFNI experts,

I'm running a very straightforward 3dDeconvolve script that works without problems. However, memory usage is currently 1.1 GB for a single session (112x112x30x1548 voxels), and we plan to concatenate data from several sessions, which would push memory usage above 2 GB. Since my computer has only 2 GB, I expect this will cause an enormous amount of swapping, if the program does not crash outright.
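As a rough sanity check on that 1.1 GB figure (a sketch, assuming the dataset is stored as 2-byte shorts, a common AFNI storage type; 4-byte float storage would double it):

```python
# Back-of-the-envelope memory estimate for a 112x112x30 volume with 1548
# time points, held entirely in RAM.
nx, ny, nz, nt = 112, 112, 30, 1548
bytes_per_value = 2  # assumption: short (2-byte) storage; floats would be 4

gib = nx * ny * nz * nt * bytes_per_value / 2**30
print(round(gib, 2))  # ~1.09 GiB as shorts, ~2.17 GiB as 4-byte floats
```

which matches the observed usage if the data are kept as shorts, and explains why concatenating sessions quickly exceeds 2 GB.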

Since 3dDeconvolve uses a purely univariate approach, my question is whether memory usage can be reduced significantly, possibly at the cost of a small increase in CPU time or I/O.

I'm using 3dDeconvolve with both the multiple-jobs option and a mask. I observed that the mask does not decrease memory usage: the whole dataset is still loaded (CPU time *is* decreased, though). This makes sense, I guess, because loading only the voxels within the mask would require a lot of seeking on the hard drive.

Nevertheless, it seems to me that memory usage could be decreased by loading only part of the dataset at a time. For example, the data could be loaded and processed slice by slice. I haven't found such an option in 3dDeconvolve; am I missing something, and if not, would it be hard to implement?
Note that I found another post [1] where 3dDeconvolve is run slice by slice using a mask, but since the whole dataset is loaded anyway, I don't expect that will reduce memory usage.
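To illustrate why slice-by-slice processing should work at all: because the GLM is univariate, fitting the model one chunk of voxels at a time gives exactly the same betas as fitting the whole volume at once, so only one slice's time series ever needs to sit in memory. A minimal numpy sketch (toy dimensions, not AFNI code):

```python
import numpy as np

rng = np.random.default_rng(0)
n_t, n_slices, vox_per_slice = 200, 4, 50          # toy dimensions
X = rng.normal(size=(n_t, 3))                      # design matrix, 3 regressors
Y = rng.normal(size=(n_t, n_slices * vox_per_slice))  # all voxel time series

# Full fit: one least-squares solve over every voxel at once.
betas_full, *_ = np.linalg.lstsq(X, Y, rcond=None)

# Slice-wise fit: solve one slice's worth of voxels at a time, so only
# that slice would need to be loaded into memory.
chunks = []
for s in range(n_slices):
    Y_s = Y[:, s * vox_per_slice:(s + 1) * vox_per_slice]
    b_s, *_ = np.linalg.lstsq(X, Y_s, rcond=None)
    chunks.append(b_s)
betas_chunked = np.concatenate(chunks, axis=1)

# Univariate model => identical results either way.
assert np.allclose(betas_full, betas_chunked)
```

The only cost of the chunked version is repeated I/O per slice, which is the small CPU/IO trade-off mentioned above.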

Furthermore, am I the only one dealing with this problem? Is there another solution, other than resampling to a coarser grid, living with a machine that swaps a lot, or begging my supervisor to buy more memory?

thanks,
nick

[1] [afni.nimh.nih.gov]
Subject | Author | Posted
reducing 3ddeconvolve memory usage? | Nick Oosterhof | October 14, 2008 10:43AM
Re: reducing 3ddeconvolve memory usage? | Bob Cox | October 14, 2008 10:58AM
Re: reducing 3ddeconvolve memory usage? | Nick Oosterhof | October 14, 2008 11:03AM
Re: reducing 3ddeconvolve memory usage? | Nick Oosterhof | October 20, 2008 01:15PM
Re: reducing 3ddeconvolve memory usage? | Bob Cox | October 20, 2008 01:29PM