3dDeconvolve running out of memory

Phil Burton, November 14, 2017 12:35PM
I've got a high-resolution data set (six runs of 0.8 mm^3 voxels, 200x84x162 matrix), and 3dDeconvolve crashes even when I allocate 64 GB of memory. Is there a solution to this?
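
One workaround I'm considering, sketched very roughly below, is to cut the volume into z-slabs so that each 3dDeconvolve call only sees a fraction of the voxels, and then stack the per-slab stats back together afterwards. Everything in the sketch is a placeholder (dataset names, timing file, model, slab size), the runs are assumed to be already catenated, and run-break/censoring options are left out, so treat it as an illustration of the idea rather than a working script:

#!/bin/bash
# Rough sketch: regress z-slabs separately so each 3dDeconvolve call
# needs far less memory, then stack the per-slab results.

nz=162        # z-dimension of the 200x84x162 grid
slab=27       # slices per slab -> 6 slabs of 27 slices

for (( bot=0; bot<nz; bot+=slab )); do
    top=$(( bot + slab - 1 ))
    tag=$(printf 'slab%03d' $bot)

    # cut this slab out of the (already catenated) time series
    3dZcutup -keep $bot $top -prefix epi.$tag all_runs+orig

    # run the regression on the slab only
    3dDeconvolve -input epi.$tag+orig                          \
                 -polort A                                     \
                 -num_stimts 1                                 \
                 -stim_times 1 stim_times.1D 'BLOCK(12,1)'     \
                 -stim_label 1 task                            \
                 -tout                                         \
                 -bucket stats.$tag
done

# stack the slab-wise stats back into a whole-brain dataset
3dZcat -prefix stats.all stats.slab*+orig.HEAD

Because the slabs are cut and stacked in order along z, the reassembled betas and t maps should land back on the original grid.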

If not, one thought is to analyze each run individually and average the results. If I just wanted to take the betas to a group analysis, that would be no problem, but this is a functional localizer, so I need the t scores. I have a vague recollection of reading on this board 10+ years ago that it's okay to average z-transformed t values, but I want to confirm that.
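
To make the question concrete (this is just the generic textbook logic, assuming the k runs are independent, nothing AFNI-specific): with a per-run t statistic t_i on nu_i degrees of freedom, the z-transform and the average would be

\[ z_i = \Phi^{-1}\!\bigl( F_{\nu_i}(t_i) \bigr), \qquad i = 1, \dots, k \]
\[ \bar{z} = \frac{1}{k} \sum_{i=1}^{k} z_i, \qquad Z_{\mathrm{Stouffer}} = \sqrt{k}\,\bar{z} \sim \mathcal{N}(0,1) \ \text{under the null,} \]

where F_{\nu} is the t CDF and \Phi^{-1} is the standard normal quantile. So the specific thing I want to confirm is whether the plain average \bar{z} is the quantity to carry forward, or whether it should be rescaled by \sqrt{k} (Stouffer's method) so that it stays on a unit-normal scale.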

Thanks!

Phil