Hi Cody,
As with most of the programs, including 3dWarpDrive (called by
the @auto_tlrc script), allocating space for a new dataset is
highly dependent on the size of the input datasets. Some
people work with datasets that are more than 2 GB in size, and
would obviously need 3-5 GB _minimum_ to process them.
In the case of your @auto_tlrc script, note that if you do not
have sufficient memory to process data at 1 mm^3 resolution,
it might be worth trying out the -dxyz option, and working at
2 mm^3, instead (1/8th the memory needs). Though if you are
working with anatomical datasets, that is probably not desired.
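As a rough sanity check on that 1/8 figure: doubling the voxel edge length halves the grid count along each axis, so the total voxel count (and thus the memory) drops by a factor of 2^3 = 8. A small sketch of the arithmetic, using a made-up field of view (the dimensions are hypothetical, not from any real dataset):

```python
def voxel_count(fov_mm, dxyz):
    """Voxels needed to cover a field of view (x, y, z in mm)
    at a cubic voxel size of dxyz mm."""
    nx = fov_mm[0] // dxyz
    ny = fov_mm[1] // dxyz
    nz = fov_mm[2] // dxyz
    return nx * ny * nz

fov = (240, 240, 160)        # hypothetical whole-head field of view, in mm
n1 = voxel_count(fov, 1)     # grid size at 1 mm^3
n2 = voxel_count(fov, 2)     # grid size at 2 mm^3

print(n1 // n2)              # ratio of memory needs -> 8
```

The same scaling applies to any per-voxel storage, which is why coarsening the grid with -dxyz is the quickest way to cut memory use.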
Having said that, we recommend a minimum of 1 GB RAM, in general,
though obviously it is not required for everything. I have only
512 MB at home, but I do not do any serious processing there. If
you do group analyses on functional data, then I would say 2 GB
is minimum and at least 3-4 GB is recommended.
Again, your memory needs depend on your processing needs.
- rick