That script tries to point you in the right direction
when it finishes, suggesting that you try:
sort -n $outdir/$LCfile | head -1
which is probably:
sort -n stim_results/NSD_sums | head -1
Out of the 100 iterations that were run (consider making
that bigger, especially if you want many sets of timing),
that sort command shows the best one (according to the
sums you choose to compute). To pick the best 20, consider
using 1000 iterations instead.
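To make the selection step concrete, here is a small self-contained sketch of that sort-and-head idiom. The NSD_sums contents below are fabricated (real values come from the script), and the "sum then iteration index" line layout is an assumption based on the class example:

```shell
#!/bin/sh
# Hedged sketch: apply the suggested sort to a toy NSD_sums file.
# The three lines written here are made up; a real run of the script
# fills stim_results/NSD_sums itself.
mkdir -p stim_results
printf '0.412 0007\n0.126 0038\n0.377 0091\n' > stim_results/NSD_sums

# numeric sort puts the smallest sum (the best iteration) first
sort -n stim_results/NSD_sums | head -1
```

Swapping head -1 for head -20 would list the 20 best iterations, which is where running 1000 iterations pays off.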
That script basically generates a set of timing files at
each iteration, and then evaluates them according to their
relative normalized standard deviations. So the result of
running the script is the actual timing files one might
use (along with many that are effectively garbage).
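The iterate-and-evaluate pattern can be sketched roughly as below. This is not the script's actual code: the timing contents, the scoring arithmetic, and the file layout are all hypothetical stand-ins, kept only so the loop is runnable and shows how a "score index" log feeds the later numeric sort:

```shell
#!/bin/sh
# Hedged sketch of the loop: per iteration, write candidate timing
# files and append "score index" to a sums file, so the best candidate
# survives "sort -n | head -1". The score here is a toy deterministic
# value, NOT the real relative normalized standard deviation.
mkdir -p stim_results
: > stim_results/NSD_sums
for i in 0 1 2; do
    idx=$(printf '%04d' "$i")
    # placeholder timing file; the real script writes stimes.$idx.* files
    echo "10 20 30" > "stim_results/stimes.$idx.01.1D"
    # hypothetical score in place of the real evaluation
    score=$(( (i * 7) % 5 ))
    echo "$score $idx" >> stim_results/NSD_sums
done
sort -n stim_results/NSD_sums | head -1
```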
Suppose that sort command shows that iteration 0038 produced
the best result (as in the class example). Then those
stimulus timing files would be the ones shown by:
ls -l stim_results/stimes.0038*
The sample 3dDeconvolve command that tested them is in:
stim_results/cmd.3dd.0038
And the evaluation of inter-stimulus intervals is in:
stim_results/out.mrt.0038
- rick