Re: 3dDeconvolve killed
Daniel Glen
January 29, 2009 11:53AM
There have been a few threads about how to deal with very large datasets (and avoid crashes caused by memory allocation errors). Here's a good example with some sample solutions:

[afni.nimh.nih.gov]

Your choices are essentially:

1. Process the data a slice (or a chunk of slices) at a time: split the volume with 3dZcutup, run 3dDeconvolve on each piece, and then put the results back together with 3dZcat.
2. Reduce the spatial extent to the region you are interested in (e.g., brain only) with 3dAutobox and/or 3dZeropad.
3. Resample to a coarser resolution with 3dresample.

For the 3dZcutup solution, if you cut into single slices, each one will appear as only a single line of voxels when viewed across the slice plane (not in-plane) until the 3dDeconvolve results are reunited with 3dZcat. Note that 3dZcutup does not require splitting the volume into single slices; splitting it into just two pieces may be enough. A sketch of the split/analyze/reassemble workflow follows.
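As a rough illustration, here is a minimal tcsh sketch of that workflow. The dataset name (epi+orig), the slice counts, and the 3dDeconvolve design (a single regressor in a hypothetical stim.1D file) are placeholder assumptions; substitute your own dataset and your actual 3dDeconvolve command.

#!/bin/tcsh
# Split a 40-slice run into two 20-slice pieces (slice numbering is 0-based)
3dZcutup -prefix epi_lo -keep 0 19 epi+orig
3dZcutup -prefix epi_hi -keep 20 39 epi+orig

# Run the memory-hungry regression on each piece separately
foreach piece ( lo hi )
  3dDeconvolve -input epi_${piece}+orig \
               -polort 2 \
               -num_stimts 1 \
               -stim_file 1 stim.1D -stim_label 1 task \
               -bucket stats_${piece}
end

# Reunite the per-piece results into one whole-volume dataset
3dZcat -prefix stats_all stats_lo+orig stats_hi+orig

The cropping and resampling alternatives are one-liners along these lines (again, prefixes and the voxel size are only examples):

3dAutobox -prefix epi_crop epi+orig
3dresample -dxyz 4.0 4.0 4.0 -prefix epi_coarse -input epi+orig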

If you have further questions, please feel free to post the entire command and results.
Subject                    Author         Posted
3dDeconvolve killed        Jayna Amting   January 29, 2009 10:32AM
Re: 3dDeconvolve killed    Daniel Glen    January 29, 2009 11:53AM
Re: 3dDeconvolve killed    Bob Cox        January 29, 2009 01:56PM