Rick,
I was initially worried about this, but Ziad suggested I do it this way (submitting all timepoints in each run). In any case, I just ran the following command:
SurfSmooth \
-met HEAT_07 \
-spec CW_BVspec.spec \
-surf_A CW_anat_inhom_crop_SAG_TRF_TAL_LH_RECOSM.srf \
-input CW_RNPAallruns_lh1test.niml.dset \
-blurmaster CW_RNPAallruns_lh1test.niml.dset \
-detpoly_master 3 \
-output CW_RNPASallruns_lh1test.niml.dset \
-target_fwhm 4 \
-cmask '-a CW_RNPAallruns_lh1test.niml.dset[0] -expr bool(a)' > r1_smooth_LHtest.1D
And here's what I get:
** nodemax(+1) of 111634(+1) is larger than the forced_mask_length of 110648
-cmask datasets may be inappropriate for surface used
*** glibc detected *** free(): invalid pointer: 0x002e1800 ***
Abort (core dumped)
-Adam