Hi all,
I'm trying to run 3dDeconvolve on a large amount of data obtained from a single participant over many sessions (~250 runs of about 500MB each). When I do, 3dDeconvolve crashes with the following error (followed by a long stack trace and memory map):
*** Error in `3dDeconvolve': free(): invalid next size (normal): 0x0000000002bce690 ***
I can run the same command on either the first or second half of the runs without issue (and the full-data run has considerably more than twice as much memory available to it when it fails). Based on
this thread, my understanding is that there shouldn't be a limit on the amount of data AFNI can read in, so long as each volume is a reasonable size and adequate RAM is available.
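For scale, here's a quick back-of-the-envelope sketch of the sizes involved, using the ~250 runs at ~500 MB each mentioned above (these are my approximate figures, not exact measurements):

```python
# Rough size estimate for the dataset described in this post.
# Assumes ~250 runs at ~500 MB each (approximate figures from above).
RUNS = 250
MB = 1024 * 1024
run_bytes = 500 * MB

total_bytes = RUNS * run_bytes
half_bytes = total_bytes // 2

print(f"full dataset: ~{total_bytes / 1024**3:.0f} GiB")  # ~122 GiB
print(f"each half:    ~{half_bytes / 1024**3:.0f} GiB")   # ~61 GiB
```

So each half that runs cleanly is already around 61 GiB, while the full set that crashes is roughly 122 GiB.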
Does anyone know what might be happening here, or how I might work around it?
Thank you!