
Posted by Phil Burton on November 14, 2017 12:35PM
I've got a high-resolution data set (six runs of 0.8 mm^3 voxels on a 200x84x162 grid), and 3dDeconvolve crashes even when I allocate 64 GB of memory. Is there a solution to this?
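For a sense of scale, here is a rough back-of-envelope in Python; the 300 TRs per run is an assumed number (the post doesn't give the run length), so treat the result as illustrative only:

```python
# Minimal sketch: approximate memory needed just to hold the time series,
# using an ASSUMED run length of 300 TRs per run.
nx, ny, nz = 200, 84, 162        # matrix size from the post
n_runs, n_trs = 6, 300           # 300 TRs/run is a guess, not from the post
bytes_per_val = 4                # float32

voxels = nx * ny * nz
time_series_bytes = voxels * n_runs * n_trs * bytes_per_val
print(f"{voxels:,} voxels; raw time series alone ~ "
      f"{time_series_bytes / 2**30:.1f} GiB")
# The regression program also holds output sub-bricks (betas, t-stats, etc.)
# and working copies, so the real footprint is a multiple of this figure.
```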

If not, one thought is to analyze each run individually and average the results across runs (rough sketch below). If I just wanted to take the betas to a group analysis, that would be no problem, but this is a functional localizer, so I need the t scores. I have a vague recollection of reading on this board 10+ years ago that it's okay to average z-transformed t values, but I want to confirm that.
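For what it's worth, here is a minimal Python sketch of that per-run "t to z, then average" idea, assuming the per-run t-maps and their degrees of freedom have already been extracted into arrays; the array shapes, the dof value, and the placeholder random data are all made up for illustration:

```python
import numpy as np
from scipy import stats

def t_to_z(t_vals, dof):
    """Convert t-statistics to z-scores by matching tail probabilities,
    preserving sign. Very large |t| can map to inf at float precision."""
    p = stats.t.sf(np.abs(t_vals), dof)   # one-tailed p for |t|
    z = stats.norm.isf(p)                 # z with the same tail area
    return np.sign(t_vals) * z

# Hypothetical per-run t-maps (flattened voxel arrays) and an assumed dof.
run_t_maps = [np.random.randn(1000) * 2 for _ in range(6)]  # placeholder data
dof = 150                                                    # assumed per-run dof

z_maps = [t_to_z(t, dof) for t in run_t_maps]
avg_z = np.mean(z_maps, axis=0)   # simple average of z-transformed t values
```

One caveat worth noting: under the null, a plain mean of k independent z-scores has standard deviation 1/sqrt(k), so if a unit-normal combined statistic is wanted, Stouffer's method (sum of z divided by sqrt(k)) is the usual choice rather than the simple mean.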

Thanks!

Phil
Subject                                  Author          Posted
3dDeconvolve running out of memory       Phil Burton     November 14, 2017 12:35PM
Re: 3dDeconvolve running out of memory   Colm Connolly   November 14, 2017 01:42PM
Re: 3dDeconvolve running out of memory   Phil Burton     November 14, 2017 01:54PM
Re: 3dDeconvolve running out of memory   Colm Connolly   November 14, 2017 03:03PM
Re: 3dDeconvolve running out of memory   Phil Burton     November 14, 2017 03:17PM
Re: 3dDeconvolve running out of memory   rick reynolds   November 14, 2017 10:05PM
Re: 3dDeconvolve running out of memory   Phil Burton     November 15, 2017 04:55PM