AFNI Soup to Nuts: How to Analyze Data with AFNI from Start to Finish
At the moment, this information is input via a command-line interface, or with an optional question/answer session (afni_proc.py -ask_me). Eventually, a GUI will become available (but not yet).
Illustration of Stimulus Conditions:
Analysis Steps:
Back to afni_proc.py:
Line-by-Line Explanation of the s1.afni_proc.block command:
Line-by-Line Explanation of the s1.afni_proc.block command (cont…)
PART 1: Process Data for Each Individual Subject
Pre-processing is done by the proc.sb23.blk script within the directory AFNI_data4/sb23.blk.results/.
STEP 0 (tcat): Apply 3dTcat to copy datasets into the results directory, while removing the first 3 TRs from each run.
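A minimal sketch of this step for run 1, assuming the input EPI dataset is named epi_r01+orig (the real names come from the -dsets option given to afni_proc.py):

```shell
# Copy run 1 into the results directory while dropping the first 3 TRs;
# the sub-brick selector [3..$] keeps volumes 3 through the last
# (indices start at 0, so TRs 0, 1, and 2 are removed).
3dTcat -prefix pb00.sb23.r01.tcat epi_r01+orig'[3..$]'
```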
STEP 1 (tshift): Check for possible “outliers” in each of the 9 time-series datasets using 3dToutcount. Then perform temporal alignment using 3dTshift.
Subject sb23’s outlier files: in afni, view run 01, time points 38, 39, and 40.
Next, perform temporal alignment using 3dTshift.
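These two operations can be sketched roughly as follows for run 1 (the option values are illustrative, not necessarily those used in proc.sb23.blk):

```shell
# Count the voxels flagged as outliers at each TR; the resulting 1D
# file can be plotted with 1dplot to spot bad time points.
3dToutcount -automask -fraction -polort 3 -legendre \
            pb00.sb23.r01.tcat+orig > outcount.r01.1D

# Shift each voxel's time series so that all slices are aligned to
# the same point within the TR.
3dTshift -tzero 0 -quintic -prefix pb01.sb23.r01.tshift \
         pb00.sb23.r01.tcat+orig
```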
Subject sb23’s newly created time-shifted datasets:
STEP 2 (volreg): Register the volumes in each 3D+time dataset using AFNI program 3dvolreg.
Subject sb23’s 9 newly created volume registered datasets:
View the registration parameters in the text file dfile.rall.1D.
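A sketch of this step for run 1 (the choice of base volume here is illustrative):

```shell
# Register every volume of run 1 to a single base volume, saving the
# 6 estimated motion parameters per TR to a 1D text file.
3dvolreg -verbose -zpad 1 -base pb01.sb23.r01.tshift+orig'[2]' \
         -1Dfile dfile.r01.1D -prefix pb02.sb23.r01.volreg     \
         pb01.sb23.r01.tshift+orig

# Concatenate the per-run motion files into one file covering all runs.
cat dfile.r??.1D > dfile.rall.1D
```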
STEP 3 (blur): Apply a Gaussian filter to spatially blur the volumes using program 3dmerge.
Results from 3dmerge:
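A sketch of this step for run 1 (the blur width is illustrative, not necessarily the value used in proc.sb23.blk):

```shell
# Blur each volume with a 4 mm FWHM Gaussian kernel; -doall applies
# the operation to every sub-brick of the dataset.
3dmerge -1blur_fwhm 4.0 -doall -prefix pb03.sb23.r01.blur \
        pb02.sb23.r01.volreg+orig
```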
STEP 3.5 (mask): Create a union mask. First make a brain mask for each run, then take the union of the run masks; the result is the dataset full_mask.sb23.blk+orig.
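One way to sketch this step; 3dmask_tool is used here for the union, though older afni_proc.py scripts form it with 3dMean followed by 3dcalc:

```shell
# Create an automatic brain mask from each run's EPI data
# (shown for run 1; repeat for runs 02-09).
3dAutomask -dilate 1 -prefix rm.mask_r01 pb03.sb23.r01.blur+orig

# Take the union of the 9 run masks.
3dmask_tool -inputs rm.mask_r*+orig.HEAD -union \
            -prefix full_mask.sb23.blk
```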
STEP 4 (scale): Scale the data as a percent of the mean.
Another example of scaling the data:
Compare EPI graphs from before and after scaling.
Compare EPI images from before and after scaling.
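The scaling itself can be sketched as follows for run 1 (the cap of 200% follows the usual afni_proc.py convention):

```shell
# Compute the voxelwise mean of run 1 across time.
3dTstat -prefix rm.mean_r01 pb03.sb23.r01.blur+orig

# Rescale each voxel's time series to percent of its own mean,
# capping values at 200 and zeroing voxels outside the brain mask.
3dcalc -a pb03.sb23.r01.blur+orig -b rm.mean_r01+orig \
       -c full_mask.sb23.blk+orig                     \
       -expr 'c * min(200, a/b*100)'                  \
       -prefix pb04.sb23.r01.scale
```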
STEP 5 (regress): Perform a regression analysis on Subject sb23’s data with 3dDeconvolve.
3dDeconvolve command - Part 1
3dDeconvolve command - Part 2
3dDeconvolve command - Part 3
3dDeconvolve command - Part 4
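The full command spans those four slides; a skeletal version is sketched below. The stimulus file names, labels, and basis function are placeholders, not the actual values from proc.sb23.blk:

```shell
# Skeletal regression command.  Stimuli 2-9 (the remaining task
# conditions) and 11-15 (the remaining motion parameters) are
# omitted here; a real command lists all 15.
3dDeconvolve -input pb04.sb23.r*.scale+orig.HEAD                       \
    -polort 2 -num_stimts 15                                           \
    -stim_times 1 stimuli/cond01.1D 'BLOCK(30,1)' -stim_label 1 cond01 \
    -stim_file 10 dfile.rall.1D'[0]' -stim_base 10 -stim_label 10 roll \
    -fout -tout -x1D X.xmat.1D                                         \
    -fitts fitts.sb23.blk -errts errts.sb23.blk                        \
    -bucket stats.sb23.blk
```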
There are many reasons why 3dDeconvolve might fail:
After running 3dDeconvolve, an 'all_runs' dataset is created by concatenating the 9 scaled EPI time-series datasets, using program 3dTcat.
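That concatenation can be sketched as:

```shell
# Concatenate the 9 scaled runs into one long time-series dataset,
# for viewing alongside the model fit.
3dTcat -prefix all_runs.sb23.blk pb04.sb23.r*.scale+orig.HEAD
```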
Create ideal response files, in case the user wishes to plot them in the graph window along with the time-series data.
Compute estimates for the amount of blur in the data, as requested via the afni_proc.py option -regress_est_blur_errts.
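A sketch of such an estimate, assuming the residual dataset is named errts.sb23.blk+orig:

```shell
# Estimate the spatial smoothness (FWHM) of the residual time series
# within the brain mask; afni_proc.py averages estimates of this kind
# for later use in cluster-level thresholding.
3dFWHMx -detrend -mask full_mask.sb23.blk+orig errts.sb23.blk+orig
```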
Generate a script that the user can run to review the unprocessed EPI data.
Let’s view some data:
Now plot the all_runs dataset along with the fitts dataset.
STEP 6: For each subject, we need only the 9 beta weights of our stimulus conditions to perform the group analysis. For our class example, the beta weights are located in the following sub-bricks:
Result: One dataset for each of the 16 subjects, containing only the 9 sub-bricks of regressors of interest. These regressors will be used for the ANOVA.
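Extracting those sub-bricks can be done with 3dbucket; the indices below are placeholders, since the actual sub-brick numbers depend on the order of regressors in the 3dDeconvolve bucket:

```shell
# Keep only the 9 beta-weight sub-bricks (indices are illustrative;
# check the real ones with 3dinfo -verb stats.sb23.blk+orig).
3dbucket -prefix sb23.betas stats.sb23.blk+orig'[1,4,7,10,13,16,19,22,25]'
```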
STEP 7 (adwarp): Warp the beta datasets for each subject to Talairach space, by applying the transformation in the anatomical datasets with adwarp.
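A sketch of this step for one subject (the anatomical and beta dataset names are illustrative, as is the output voxel size):

```shell
# Apply the anatomical dataset's +orig -> +tlrc transformation to the
# beta dataset, resampling the output to 3 mm voxels.
adwarp -apar sb23.anat+tlrc -dpar sb23.betas+orig -dxyz 3
```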
PART 2: Run Group Analysis (ANOVA3)
3dANOVA3 Command - Part 1
3dANOVA3 Command - Part 2
3dANOVA3 Command - Part 3
-adiff: Performs contrasts between levels of factor ‘a’ (or -bdiff for factor ‘b’, -cdiff for factor ‘c’), with no collapsing across levels of factor ‘a’.
-aBcontr: Second-order contrast. Performs a comparison between 2 levels of factor ‘a’ at a fixed level of factor ‘B’.
In class -- Let’s run the ANOVA together:
-fa: Produces a main effect for factor ‘a’.
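Putting these options together, a skeletal 3dANOVA3 call might look like the following. The factor assignments, dataset names, and output prefixes are illustrative; a real command needs one -dset line for every (a, b, subject) cell:

```shell
# -type 4: factors a and b fixed, factor c (the 16 subjects) random.
# Only one -dset line is shown; a full command lists one per cell
# per subject, selecting the matching beta sub-brick.
3dANOVA3 -type 4 -alevels 3 -blevels 3 -clevels 16 \
    -dset 1 1 1 sb23.betas+tlrc'[0]'               \
    -fa mode_effect                                \
    -adiff 1 2 mode1_vs_mode2                      \
    -aBcontr 1 -1 0 : 1 mode1_vs_mode2_at_b1       \
    -bucket anova33.results
```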
Brain areas corresponding to “Telephone” (reds) vs. “Face-to-Face” (blues)
Brain areas corresponding to “Positive Telephone” (reds) vs. “Positive Email” (blues)
Many thanks to NIMH LBC for donating the data used in this lecture