I've got a high-resolution data set (six runs, 0.8 mm isotropic voxels, 200x84x162 matrix), and 3dDeconvolve crashes even when I allocate 64 GB of memory. Is there a solution to this?
If not, one thought is to analyze each run individually and average the results. If I just wanted to carry the betas to a group analysis, that would be no problem, but this is a functional localizer, so I need the t-scores. I have a vague recollection of reading on this board 10+ years ago that it's okay to average z-transformed t-values, but I want to confirm that.
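
For concreteness, here is a minimal sketch of what I have in mind, assuming the per-run t-maps have already been loaded as NumPy arrays (the shapes, degrees of freedom, and variable names below are hypothetical placeholders):

import numpy as np
from scipy import stats

def t_to_z(t_map, dof):
    # Map t -> p via the t survival function (numerically stable in the
    # tails), then p -> z via the normal inverse survival function.
    # Sign is preserved: negative t gives p > 0.5 and hence negative z.
    return stats.norm.isf(stats.t.sf(t_map, dof))

# Hypothetical per-run t-maps and per-run degrees of freedom.
t_maps = [np.random.standard_t(df=100, size=(200, 84, 162)) for _ in range(6)]
dofs = [100] * 6

z_maps = [t_to_z(t, d) for t, d in zip(t_maps, dofs)]
z_mean = np.mean(z_maps, axis=0)                  # plain average of the z's
z_stouffer = np.sum(z_maps, axis=0) / np.sqrt(6)  # Stouffer-style combination

As I understand it, a plain average of independent z's has standard deviation 1/sqrt(n) under the null, while the Stouffer-style sum divided by sqrt(n) stays N(0,1), so part of what I'd like to confirm is which of these is the right thing to do.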
Thanks!
Phil