I do ReHo, ROI-based analyses (correlation), and ALFF. Then I run a t-test with uber_ttest.py. On the t-test dataset I apply clustering in the AFNI GUI, and in the clustering report MOCO has more significant voxels (the total number of voxels in the report) than NOTMOCO.
3dReHo -prefix ReHo11_{$subj} -inset errts.{$subj}.tproject+tlrc -mask mask_group+tlrc
3dmaskdump -noijk -mask mask_group+tlrc ReHo11_{$subj}+tlrc | 1d_tool.py -show_mmms -infile - > tt.txt
grep mean tt.txt | cut -f'2 4' -d ',' | cut -f2 -d ',' | cut -f2 -d '=' > std.txt
set std=`cat std.txt`
grep mean tt.txt | cut -f'2 4' -d ',' | cut -f1 -d ',' | cut -f2 -d '=' > mean.txt
set mean=`cat mean.txt`
echo {$mean}
echo {$std}
3dcalc -a ReHo11_{$subj}+tlrc -b mask_group+tlrc -expr '((a-'{$mean}')/('{$std}'*b))' -prefix ReHo11_Normalized2
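As a side note, the grep/cut chain above can also be done in a single awk pass. Here is a minimal POSIX sh sketch; the sample line is an assumed stand-in for the `1d_tool.py -show_mmms` stats line, so the field positions may need adjusting for your AFNI version:

```shell
#!/bin/sh
# Hypothetical sketch: pull the mean and stdev out of a -show_mmms style line
# in one awk pass. The line below is made up for illustration.
line='col 0: min = -1.20, mean = 0.35, maxabs = 2.10, stdev = 0.80'

# Split on '=' and ',' so the values land in fixed fields, then strip spaces.
mean=$(printf '%s\n' "$line" | awk -F'[=,]' '{gsub(/ /,"",$4); print $4}')
std=$(printf '%s\n' "$line"  | awk -F'[=,]' '{gsub(/ /,"",$8); print $8}')

echo "mean=$mean std=$std"
```

The advantage is that one parsing rule handles both values, so the two cut pipelines cannot silently disagree about the line layout.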
To automate the ReHo step I extract the std and mean values with the commands above. I also tried entering them manually, but the result didn't change. I use ReHo11_Normalized2 for the t-test.
3dUndump -prefix {$roi} -master errts.{$subj}.tproject+tlrc. -srad 6 -xyz {$roi}.txt
3dmaskave -quiet -mask {$roi}+tlrc. errts.{$subj}.tproject+tlrc. > timeCourse.txt
3dfim+ -input errts.{$subj}.tproject+tlrc. -polort 2 -ideal_file timeCourse.txt -out Correlation -bucket {$roi}_Corr
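The per-voxel quantity this step produces (with -out Correlation) is the Pearson correlation between the seed time course and each voxel's time course. A toy sketch of that computation, with made-up numbers, just to show the arithmetic:

```shell
#!/bin/sh
# Illustrative only: Pearson correlation of a seed time course with one
# voxel's time course. Both series here are invented; the voxel is an
# exact scalar multiple of the seed, so r should come out as 1.000.
awk 'BEGIN {
  n = 5
  split("1 2 3 4 5",  seed,  " ")   # seed ROI average time course
  split("2 4 6 8 10", voxel, " ")   # one voxel, perfectly correlated
  for (i = 1; i <= n; i++) {
    sx  += seed[i];    sy  += voxel[i]
    sxx += seed[i]^2;  syy += voxel[i]^2
    sxy += seed[i] * voxel[i]
  }
  r = (n*sxy - sx*sy) / sqrt((n*sxx - sx^2) * (n*syy - sy^2))
  printf "%.3f\n", r
}'
```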
3dcalc -a {$roi}_Corr+tlrc. -expr 'log((1+a)/(1-a))/2' -prefix Corr_{$roi}_m_{$subj}_Z
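The 3dcalc expression here is the Fisher r-to-z transform, z = log((1+r)/(1-r))/2, i.e. atanh(r). A quick sanity check of the arithmetic with an illustrative value (r = 0.5, where log(3)/2 should come out near 0.549):

```shell
#!/bin/sh
# Fisher z-transform of a sample correlation, r = 0.5 (illustrative value).
# z = log((1+r)/(1-r))/2 = atanh(r); for r = 0.5 this is log(3)/2.
awk 'BEGIN { r = 0.5; printf "%.3f\n", log((1 + r)/(1 - r))/2 }'
```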
And this is for the ROI-based analyses. I use the Corr_....._Z datasets for the last t-test.