Yes, 'extents' should have been just 'extent' in the
earlier grep command. If you are not familiar with
running 3dinfo on a dataset to see the details, it
would be good to try it out.
But a few small factors combine to scale the
size by roughly 9 or 10 times:
1. The voxels will come out at 1.75mm^3 unless
you specify otherwise. That scales the size by a
factor of 2.4 (1.7969*1.7969*4 / (1.75^3)).
2. The "extents" of the dataset are larger, to fill
the box of the template. That probably doubles
the size.
3. The data are eventually processed as 32-bit
floats, not 16-bit short integers. That doubles
the size.
So the data size will be scaled by 9.6 (2.4*2*2),
unless you specify otherwise.
Using voxels of 2mm^3 will scale the size by
2/3 or so (1.75^3/2^3).
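If it helps, the arithmetic above can be checked with a
quick script (the voxel dimensions are the ones quoted
above; the extents and float factors are the rough
estimates from points 2 and 3):

```python
# Check the size-scaling arithmetic from the points above.
orig_voxel = 1.7969 * 1.7969 * 4.0  # original voxel volume, mm^3
new_voxel = 1.75 ** 3               # default output voxel volume, mm^3

resample_factor = orig_voxel / new_voxel  # point 1: finer voxels
extents_factor = 2.0                      # point 2: larger box (rough estimate)
float_factor = 2.0                        # point 3: shorts -> 32-bit floats

total = resample_factor * extents_factor * float_factor
print(round(resample_factor, 2))  # 2.41
print(round(total, 1))            # 9.6

# Using 2mm^3 voxels instead scales the size by about 2/3:
print(round(new_voxel / 2.0 ** 3, 2))  # 0.67
```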
Also, use -regress_compute_fitts, which saves
about 40% of the RAM used by the 3dDeconvolve
command.
- rick