FATAL ERROR: has 1 auxiliary values per time point [nopt=16]
h.becker | November 21, 2016 02:35PM
Hi,

I am currently receiving the following error and I am unsure of the cause or how to fix it:
** FATAL ERROR: '-stim_times 1' file 'final/stimuli/0_Stay.txt' has 1 auxiliary values per time point [nopt=16]

The timing file in question (0_Stay.txt) looks like this, although the same error occurs for all of my event files, no matter the participant or the condition (including files that contain exactly two runs of data):

55.0*2.00257205963 97.5*1.41961884499 110.0*2.65587878227 127.5*2.08984804153 145.0*2.47264695168 155.0*1.80546998978 167.5*2.10321688652 190.0*1.86910700798 195.0*1.88350510597 207.5*1.32351708412 215.0*1.520327489 235.0*1.68914914131 290.0*1.47039198875 305.0*2.05382609367
30.0*1.27251815796 65.0*1.65395784378 77.5*1.62255883217 100.0*2.17324495316 127.5*1.45616602898 137.5*1.29013109207 160.0*1.26953887939 172.5*1.28880596161 177.5*1.40289998055 210.0*1.63748216629 267.5*1.65583705902 285.0*1.520788908
*
*
*
*
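For context on the format: my understanding of the AFNI timing-file conventions is that a value attached to a time with '*' is read as a married amplitude, a value attached with ':' is read as a married duration, and a lone '*' marks a run with no events. So if the attached values are meant to be per-event durations, I assume the first two runs above would be rewritten along these lines (a hypothetical reworking of my own file, with the four empty runs left as they are):

55.0:2.00257205963 97.5:1.41961884499 110.0:2.65587878227 [... rest of run 1 as above ...]
30.0:1.27251815796 65.0:1.65395784378 77.5:1.62255883217 [... rest of run 2 as above ...]
*
*
*
*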



The relevant part of the tcsh script that produces this error (particularly the 3dDeconvolve block) is:

# execute via :
# tcsh -xef new_glm_draft.sh |& tee output.proc.final

# =========================== auto block: setup ============================
# script setup

# take note of the AFNI version
afni -ver

# check that the current AFNI version is recent enough
afni_history -check_date 28 Oct 2015
if ( $status ) then
    echo "** this script requires newer AFNI binaries (than 28 Oct 2015)"
    echo "   (consider: @update.afni.binaries -defaults)"
    exit
endif

# the user may specify a single subject to run with
if ( $#argv > 0 ) then
    set subj = $argv[1]
else
    set subj = vc$VCnumber
endif

# label of GLM - and directory name
set glm_dir='final'

# assign output directory name
set subj_dir = $subj.results

# copy glm.ALLS file into directory
# cp glm.$glm_dir GLMS/glm.$glm_dir

# verify that the results directory does not yet exist
if ( -d $subj_dir/$glm_dir ) then
    echo output dir "$glm_dir" already exists
    exit
endif

# set list of runs
set runs = (`count -digits 2 1 6`)

# create results and stimuli directories
mkdir $subj_dir/$glm_dir
mkdir $subj_dir/$glm_dir/stimuli

# copy stim files into stimulus directory
cp \
    $default_dir/ParticipantNEWTEST/vc$VCnumber/0_Stay.txt  \
    $default_dir/ParticipantNEWTEST/vc$VCnumber/1_Switch.txt \
    $subj_dir/$glm_dir/stimuli

# copy anatomy to results dir
3dcopy $default_dir/NIfTI/vc$VCnumber/Mpragevc$VCnumber.nii \
    $subj_dir/$glm_dir/Mpragevc$VCnumber


# -------------------------------------------------------
# enter the results directory (can begin processing data)
cd $subj_dir

set rundata = "pb04.$subj.r*.scale+tlrc.HEAD"
cat dfile.r*.1D > $glm_dir/dfile_rall.1D # make a single file of registration params


# catenate outlier counts into a single time series
cat outcount.r*.1D > outcount_rall.1D

# create 'full_mask' dataset (union mask)
cp full_mask* $glm_dir

# create an anat_final dataset, aligned with stats
# 3dcopy Mpragevc$VCnumber+orig anat_final.$subj+tlrc.HEAD


# ================================ regress =================================
cd $glm_dir

# compute de-meaned motion parameters (for use in regression)
1d_tool.py -infile dfile_rall.1D -set_nruns 6 \
    -demean -write motion_demean.1D

# compute motion parameter derivatives (for use in regression)
1d_tool.py -infile dfile_rall.1D -set_nruns 6 \
    -derivative -demean -write motion_deriv.1D

# create censor file motion_${subj}_censor.1D, for censoring motion
1d_tool.py -infile dfile_rall.1D -set_nruns 6 \
    -show_censor_count -censor_prev_TR \
    -censor_motion 0.3 motion_${subj}

# note TRs that were not censored
set ktrs = `1d_tool.py -infile motion_${subj}_censor.1D \
    -show_trs_uncensored encoded`

cd ..


# run the regression analysis
3dDeconvolve -input pb04.$subj.r*.scale+tlrc.HEAD \
    -censor $glm_dir/motion_${subj}_censor.1D \
    -polort 3 \
    -num_stimts 8 \
    -stim_times 1 $glm_dir/stimuli/0_Stay.txt 'GAM' \
    -stim_label 1 Stay \
    -stim_times 2 $glm_dir/stimuli/1_Switch.txt 'GAM' \
    -stim_label 2 Switch \
    -stim_file 3 $glm_dir/motion_demean.1D'[0]' -stim_base 3 -stim_label 3 roll \
    -stim_file 4 $glm_dir/motion_demean.1D'[1]' -stim_base 4 -stim_label 4 pitch \
    -stim_file 5 $glm_dir/motion_demean.1D'[2]' -stim_base 5 -stim_label 5 yaw \
    -stim_file 6 $glm_dir/motion_demean.1D'[3]' -stim_base 6 -stim_label 6 dS \
    -stim_file 7 $glm_dir/motion_demean.1D'[4]' -stim_base 7 -stim_label 7 dL \
    -stim_file 8 $glm_dir/motion_demean.1D'[5]' -stim_base 8 -stim_label 8 dP \
    -gltsym 'SYM: Stay -Switch' \
    -glt_label 1 S-S \
    -gltsym 'SYM: 0.5*Stay +0.5*Switch' \
    -glt_label 2 mean.SS \
    -fout -tout -x1D $glm_dir/X.xmat.1D -xjpeg X.jpg \
    -x1D_uncensored $glm_dir/X.nocensor.xmat.1D \
    -fitts $glm_dir/fitts.$subj \
    -errts $glm_dir/errts.${subj} \
    -bucket $glm_dir/stats.$subj


# if 3dDeconvolve fails, terminate the script
if ( $status != 0 ) then
    echo '---------------------------------------'
    echo '** 3dDeconvolve error, failing...'
    echo '   (consider the file 3dDeconvolve.err)'
    exit
endif

cd $glm_dir

# display any large pairwise correlations from the X-matrix
1d_tool.py -show_cormat_warnings -infile X.xmat.1D |& tee out.cormat_warn.txt

# create an all_runs dataset to match the fitts, errts, etc.
3dTcat -prefix all_runs.$subj ../pb04.$subj.r*.scale+tlrc.HEAD

# --------------------------------------------------
# create a temporal signal to noise ratio dataset
# signal: if 'scale' block, mean should be 100
# noise : compute standard deviation of errts
3dTstat -mean -prefix rm.signal.all all_runs.$subj+tlrc"[$ktrs]"
3dTstat -stdev -prefix rm.noise.all errts.${subj}+tlrc"[$ktrs]"
3dcalc -a rm.signal.all+tlrc \
    -b rm.noise.all+tlrc \
    -c full_mask.$subj+tlrc \
    -expr 'c*a/b' -prefix TSNR.$subj

# ---------------------------------------------------
# compute and store GCOR (global correlation average)
# (sum of squares of global mean of unit errts)
3dTnorm -norm2 -prefix rm.errts.unit errts.${subj}+tlrc
3dmaskave -quiet -mask full_mask.$subj+tlrc rm.errts.unit+tlrc \
    > gmean.errts.unit.1D
3dTstat -sos -prefix - gmean.errts.unit.1D\' > out.gcor.1D
echo "-- GCOR = `cat out.gcor.1D`"

# ---------------------------------------------------
# compute correlation volume
# (per voxel: average correlation across masked brain)
# (now just dot product with average unit time series)
3dcalc -a rm.errts.unit+tlrc -b gmean.errts.unit.1D -expr 'a*b' -prefix rm.DP
3dTstat -sum -prefix corr_brain rm.DP+tlrc

# create ideal files for fixed response stim types
1dcat X.nocensor.xmat.1D'[24]' > ideal_Stay.1D
1dcat X.nocensor.xmat.1D'[25]' > ideal_Switch.1D

# --------------------------------------------------------
# compute sum of non-baseline regressors from the X-matrix
# (use 1d_tool.py to get list of regressor columns)
set reg_cols = `1d_tool.py -infile X.nocensor.xmat.1D -show_indices_interest`
3dTstat -sum -prefix sum_ideal.1D X.nocensor.xmat.1D"[$reg_cols]"

# also, create a stimulus-only X-matrix, for easy review
1dcat X.nocensor.xmat.1D"[$reg_cols]" > X.stim.xmat.1D


It is important to note that this is part 2 of two scripts: part 1 runs all of the necessary preprocessing steps, and part 2 runs the GLMs.
Finally, I have read through some previous threads on this board and adjusted my timing files accordingly: I married the values with a colon instead of an asterisk, and I also changed the asterisks that marked runs with no events to "-1:0" (as suggested here: [afni.nimh.nih.gov]). Neither of these changes seemed to resolve the fatal error, but it is possible that I implemented them incorrectly.
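For what it is worth, my reading of the 3dDeconvolve help is that plain -stim_times does not expect married values at all, and that timing files with attached values belong with the amplitude/duration-modulated options instead. So if the attached values are meant to act as per-event durations, I assume the corresponding lines of the command would look something like the following (just a sketch of what I think is intended, using a colon-married version of my files with a dmBLOCK basis rather than GAM):

    -stim_times_AM1 1 $glm_dir/stimuli/0_Stay.txt 'dmBLOCK' \
    -stim_label 1 Stay \
    -stim_times_AM1 2 $glm_dir/stimuli/1_Switch.txt 'dmBLOCK' \
    -stim_label 2 Switch \

Is that the right direction, or is there a different cause for this error?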

Thank you in advance for your help!
Subject | Author | Posted
FATAL ERROR: has 1 auxiliary values per time point [nopt=16] | h.becker | November 21, 2016 02:35PM
Re: FATAL ERROR: has 1 auxiliary values per time point [nopt=16] | rick reynolds | November 21, 2016 03:10PM
Re: FATAL ERROR: has 1 auxiliary values per time point [nopt=16] | h.becker | November 21, 2016 04:00PM