I'll have to try to reproduce this, since it is strange that it works for "small" datasets and fails for "large" ones.
Please send me the following information about your datasets:
- Dimensions (grid points in each direction, including time)
- Dataset "type" (floats, shorts, ???)
- The largest number of time points at which 3dDeconvolve still works, in your experience
- Number of regression columns used (should be in the 3dDeconvolve stderr output)
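The quantities above determine how much memory the program needs. As a rough back-of-envelope sketch (my own illustrative assumptions about the dominant allocations, not 3dDeconvolve's actual internal scheme), the input time series alone scales as voxels x time points x bytes-per-value:

```python
# Rough memory estimate for a voxelwise time-series regression.
# The three terms below are illustrative assumptions, not a description
# of 3dDeconvolve's actual internal allocations.

def estimate_bytes(nx, ny, nz, nt, ncols, dtype_bytes=4):
    """Approximate peak bytes for grid (nx, ny, nz), nt time points,
    ncols regression columns, dtype_bytes bytes per input value."""
    nvox = nx * ny * nz
    data = nvox * nt * dtype_bytes      # input time series (usually dominant)
    matrix = nt * ncols * 8             # regression matrix held as doubles
    betas = nvox * ncols * 4            # fitted coefficients as floats
    return data + matrix + betas

# Hypothetical example: 128x128x96 grid, 2000 time points, 40 regressors
gib = estimate_bytes(128, 128, 96, 2000, 40) / 2**30
print(f"approx {gib:.1f} GiB")
```

For a dataset of that hypothetical size, the estimate lands around 12 GiB, which is why the "largemem" nodes come into play; doubling the time points roughly doubles the dominant term.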
With this info, I can make up some fake data of the same size and try it myself. Since the datasets are so big, I'll have to do this on the NIH's cluster, using one of the "largemem" nodes. Once I can duplicate the failure, I can try to figure out what is causing it. Clearly it is some issue with memory allocation or misuse, but WHERE in the program it happens is opaque at the moment -- all I can tell so far is that it occurs somewhere during startup.