14.2.3. Taylor et al. (2024). A Set of FMRI Quality Control Tools in AFNI: Systematic, in-depth and interactive QC with afni_proc.py and more
Introduction
Here we present commands used in the following paper:
- Taylor PA, Glen DR, Chen G, Cox RW, Hanayik T, Rorden C, Nielson DM, Rajendra JK, Reynolds RC (2024). A Set of FMRI Quality Control Tools in AFNI: Systematic, in-depth and interactive QC with afni_proc.py and more. Imaging Neuroscience 2: 1–39. doi: 10.1162/imag_a_00246
Abstract: Quality control (QC) assessment is a vital part of FMRI processing and analysis, and a typically under-discussed aspect of reproducibility. This includes checking datasets at their very earliest stages (acquisition and conversion) through their processing steps (e.g., alignment and motion correction) to regression modeling (correct stimuli, no collinearity, valid fits, enough degrees of freedom, etc.) for each subject. There are a wide variety of features to verify throughout any single subject processing pipeline, both quantitatively and qualitatively. We present several FMRI preprocessing QC features available in the AFNI toolbox, many of which are automatically generated by the pipeline-creation tool, afni_proc.py. These items include: a modular HTML document that covers full single subject processing from the raw data through statistical modeling; several review scripts in the results directory of processed data; and command line tools for identifying subjects with one or more quantitative properties across a group (such as triaging warnings, making exclusion criteria or creating informational tables). The HTML itself contains several buttons that efficiently facilitate interactive investigations into the data, when deeper checks are needed beyond the systematic images. The pages are linkable, so that users can evaluate individual items across a group, for increased sensitivity to differences (e.g., in alignment or regression modeling images). Finally, the QC document contains rating buttons for each “QC block”, as well as comment fields for each, to facilitate both saving and sharing the evaluations. This increases the specificity of QC, as well as its shareability, as these files can be shared with others and potentially uploaded into repositories, promoting transparency and open science. We describe the features and applications of these QC tools for FMRI.
Study keywords: FMRI, EPI, MPRAGE, quality control, visualization, quantitative
Main programs: open_apqc.py, afni_proc.py, sswarper2, @ss_review_basic, @ss_review_driver, gen_ss_review_table.py, gtkyd_check
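As an illustration of the group-level triaging described in the abstract, gen_ss_review_table.py can scan the single-subject review files that afni_proc.py writes and flag subjects by a chosen quantity. A minimal sketch follows; the glob pattern, threshold and output table name are illustrative assumptions, not taken from the paper's scripts:

  # flag subjects whose censor fraction meets or exceeds 0.1 (illustrative criterion)
  gen_ss_review_table.py \
      -outlier_sep space \
      -report_outliers 'censor fraction' GE 0.1 \
      -infiles sub-*/derivatives/ap/sub-*.results/out.ss_review.*.txt

  # or tabulate all review quantities across the group into one file
  gen_ss_review_table.py \
      -write_table review_table.xls \
      -infiles sub-*/derivatives/ap/sub-*.results/out.ss_review.*.txt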
Download scripts
To download, either:
... click the link(s) in the following table (perhaps right-click -> “Save Link As…”):
- do_13_ssw.tcsh: run sswarper2 for nonlinear alignment to a template, and skullstripping of the subject’s T1w anatomical volume
- do_20_ap.tcsh: run afni_proc.py for task-based FMRI analysis; this uses the nonlinear warps estimated with sswarper2
... or copy+paste into a terminal:
curl -O https://afni.nimh.nih.gov/pub/dist/doc/htmldoc/codex/fmri/media/2024_TaylorEtal/do_13_ssw.tcsh
curl -O https://afni.nimh.nih.gov/pub/dist/doc/htmldoc/codex/fmri/media/2024_TaylorEtal/do_20_ap.tcsh
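Once downloaded, each script can be run directly with tcsh. Both scripts set dir_inroot = ${PWD:h} (the parent of the working directory), so they expect to be launched from a directory that sits one level below the data root containing sub-002/; the scripts/ location below is an assumption for illustration:

  cd /path/to/data_root/scripts
  tcsh do_13_ssw.tcsh
  tcsh do_20_ap.tcsh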
View scripts
do_13_ssw.tcsh
#!/bin/tcsh

# SSW: script to run skullstripping and nonlinear alignment for a subject

# --------------------------- env vars and init ----------------------------

# compress BRIK files
setenv AFNI_COMPRESSOR GZIP

# initial exit code
set ecode = 0

# -------------------------- subject and path info -------------------------

# subject ID
set subj = sub-002

# script label, mainly for logs
set label = 13_ssw

# supplementary datasets (here, in known directory via environment var)
set dset_ref = MNI152_2009_template_SSW.nii.gz

# upper (group-level) directories
set dir_inroot = ${PWD:h}            # one dir above scripts/
set dir_log    = ${dir_inroot}/logs  # store tee'ed info
set dir_basic  = ${dir_inroot}       # here, same dir as inroot

# subject input directories
set sdir_basic  = ${dir_basic}/${subj}
set sdir_func   = ${sdir_basic}/func
set sdir_anat   = ${sdir_basic}/anat
set sdir_events = ${sdir_basic}/timing

# subject derivative directories
set sdir_deriv = ${sdir_basic}/derivatives   # for all derived outputs
set sdir_ssw   = ${sdir_deriv}/ssw
set sdir_ap    = ${sdir_deriv}/ap

\mkdir -p ${dir_log}

# ----------------------- data and control variables -----------------------

# dataset inputs
set dset_anat_00 = ${sdir_anat}/${subj}_T1w.nii.gz

# control variables

# check available N_threads and report what is being used
set nthr_avail = `afni_system_check.py -disp_num_cpu`
set nthr_using = `afni_check_omp`

echo "++ INFO: Using ${nthr_using} of available ${nthr_avail} threads"

# ----------------------------- main commands ------------------------------

# run the skullstripping+warping
sswarper2                                      \
    -input  ${dset_anat_00}                    \
    -base   ${dset_ref}                        \
    -subid  ${subj}                            \
    -odir   ${sdir_ssw}                        \
    |& tee ${dir_log}/log_${subj}_${label}.txt

if ( ${status} ) then
    set ecode = 2
    goto COPY_AND_EXIT
endif

echo "++ done proc ok"

# -------------------------- finish and exit -------------------------------

COPY_AND_EXIT:

if ( ${ecode} ) then
    echo "++ BAD FINISH: ${label} (ecode = ${ecode})"
else
    echo "++ GOOD FINISH: ${label}"
endif

exit ${ecode}
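Before running the afni_proc.py script below, one might verify that sswarper2 wrote the datasets that script takes as inputs. The dataset names come from the variables in do_20_ap.tcsh; the relative path assumes the data root as the working directory:

  # quick existence check of the skullstripped anat, anat with skull, and nonlinear warp
  ls sub-002/derivatives/ssw/anatSS.sub-002.nii \
     sub-002/derivatives/ssw/anatU.sub-002.nii \
     sub-002/derivatives/ssw/anatQQ.sub-002.nii \
     sub-002/derivatives/ssw/anatQQ.sub-002.aff12.1D \
     sub-002/derivatives/ssw/anatQQ.sub-002_WARP.nii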
do_20_ap.tcsh
#!/bin/tcsh

# AP: script to run FMRI pipeline with afni_proc.py

# --------------------------- env vars and init ----------------------------

# compress BRIK files
setenv AFNI_COMPRESSOR GZIP

# initial exit code
set ecode = 0

# -------------------------- subject and path info -------------------------

# subject ID
set subj = sub-002

# script label, mainly for logs
set label = 20_ap

# supplementary datasets (here, in known directory via environment var)
set dset_ref = MNI152_2009_template_SSW.nii.gz

# upper (group-level) directories
set dir_inroot = ${PWD:h}            # one dir above scripts/
set dir_log    = ${dir_inroot}/logs  # store tee'ed info
set dir_basic  = ${dir_inroot}       # here, same dir as inroot

# subject input directories
set sdir_basic  = ${dir_basic}/${subj}
set sdir_func   = ${sdir_basic}/func
set sdir_anat   = ${sdir_basic}/anat
set sdir_events = ${sdir_basic}/timing

# subject derivative directories
set sdir_deriv = ${sdir_basic}/derivatives   # for all derived outputs
set sdir_ssw   = ${sdir_deriv}/ssw
set sdir_ap    = ${sdir_deriv}/ap

\mkdir -p ${dir_log}

# ----------------------- data and control variables -----------------------

# dataset inputs
set dsets_epi = ( ${sdir_func}/sub-002_task-avrel_run-*nii.gz )

set dset_anat_cp    = ${sdir_ssw}/anatSS.${subj}.nii
set dset_anat_skull = ${sdir_ssw}/anatU.${subj}.nii
set dsets_NL_warp   = ( ${sdir_ssw}/anatQQ.${subj}.nii \
                        ${sdir_ssw}/anatQQ.${subj}.aff12.1D \
                        ${sdir_ssw}/anatQQ.${subj}_WARP.nii )

set timing_files = ( ${sdir_events}/times.{vis,aud}.txt )
set stim_classes = ( vis aud )

# could add separate section here for control variables in the AP
# command, like blur size, censor thresholds, etc.

# check available N_threads and report what is being used
set nthr_avail = `afni_system_check.py -disp_num_cpu`
set nthr_using = `afni_check_omp`

echo "++ INFO: Using ${nthr_using} of available ${nthr_avail} threads"

# ----------------------------- main commands ------------------------------

# 'tis convenient to run from this dir, for the supplementary outputs
\mkdir -p ${sdir_ap}
cd ${sdir_ap}

# run the FMRI pipeline creation+execution
afni_proc.py \
    -subj_id ${subj} \
    -dsets ${dsets_epi} \
    -copy_anat ${dset_anat_cp} \
    -anat_has_skull no \
    -anat_follower anat_w_skull anat ${dset_anat_skull} \
    -blocks tshift align tlrc volreg blur mask scale \
            regress \
    -radial_correlate_blocks tcat volreg regress \
    -tcat_remove_first_trs 2 \
    -align_opts_aea -cost lpc+ZZ \
                    -giant_move \
                    -check_flip \
    -tlrc_base ${dset_ref} \
    -tlrc_NL_warp \
    -tlrc_NL_warped_dsets ${dsets_NL_warp} \
    -volreg_align_to MIN_OUTLIER \
    -volreg_align_e2a \
    -volreg_tlrc_warp \
    -volreg_compute_tsnr yes \
    -blur_size 4.0 \
    -mask_epi_anat yes \
    -regress_stim_times ${timing_files} \
    -regress_stim_labels ${stim_classes} \
    -regress_basis 'BLOCK(20,1)' \
    -regress_censor_motion 0.3 \
    -regress_censor_outliers 0.05 \
    -regress_motion_per_run \
    -regress_opts_3dD -jobs 2 \
                      -gltsym 'SYM: vis -aud' \
                      -glt_label 1 V-A \
                      -gltsym 'SYM: 0.5*vis +0.5*aud' \
                      -glt_label 2 mean.VA \
    -regress_compute_fitts \
    -regress_make_ideal_sum sum_ideal.1D \
    -regress_est_blur_epits \
    -regress_est_blur_errts \
    -regress_run_clustsim no \
    -html_review_style pythonic \
    -execute \
    |& tee ${dir_log}/log_${subj}_${label}.txt

if ( ${status} ) then
    set ecode = 2
    goto COPY_AND_EXIT
endif

echo "++ done proc ok"

# -------------------------- finish and exit -------------------------------

COPY_AND_EXIT:

if ( ${ecode} ) then
    echo "++ BAD FINISH: ${label} (ecode = ${ecode})"
else
    echo "++ GOOD FINISH: ${label}"
endif

exit ${ecode}
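After do_20_ap.tcsh finishes, the QC items described in the paper can be reviewed. A minimal sketch, assuming afni_proc.py's default output naming (a sub-002.results directory holding the QC_sub-002/ HTML and the review scripts); the paths are relative to the data root and are assumptions here:

  # open the systematic APQC HTML report in a local browser
  open_apqc.py -infiles sub-002/derivatives/ap/sub-002.results/QC_sub-002/index.html

  # run the basic (text) and driver (interactive) single-subject reviews
  cd sub-002/derivatives/ap/sub-002.results
  tcsh @ss_review_basic
  tcsh @ss_review_driver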