14.2.1. Wardle et al. (2025). Brief encounters with real objects modulate medial parietal but not occipitotemporal cortex

Introduction

Here we present commands used in the following paper:

Abstract: Humans are skilled at recognizing everyday objects from pictures, even if we have never encountered the depicted object in real life. But if we have encountered an object, how does that real-world experience affect the representation of its photographic image in the human brain? We developed a paradigm that involved brief real-world manual exploration of everyday objects prior to the measurement of brain activity with fMRI while viewing pictures of those objects. We found that while object-responsive regions in lateral occipital and ventral temporal cortex were visually driven and contained highly invariant representations of specific objects, those representations were not modulated by this brief real-world exploration. However, there was an effect of visual experience in object-responsive regions in the form of repetition suppression of the BOLD response over repeated presentations of the object images. Real-world experience with an object did, however, produce foci of increased activation in medial parietal and posterior cingulate cortex, regions that have previously been associated with the encoding and retrieval of remembered items in explicit memory paradigms. Our discovery that these regions are engaged during spontaneous recognition of real-world objects from their 2D image demonstrates that modulation of activity in medial regions by familiarity is neither stimulus nor task-specific. Overall, our results support separable coding in the human brain of the visual appearance of an object from the associations gained via real-world experience. The richness of object representations beyond their photographic image has important implications for understanding object recognition in both the human brain and in computational models.

Main programs: afni_proc.py

Download scripts

To download, either:

  • ... click the link(s) in the following table (perhaps Right-click -> “Save Link As…”):

    s1.afni_proc.tcsh

    afni_proc.py command for multiecho FMRI

  • ... or copy+paste into a terminal:

    curl -O https://afni.nimh.nih.gov/pub/dist/doc/htmldoc/codex/fmri/media/2025_WardleEtal/s1.afni_proc.tcsh
    
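After downloading, it can be useful to verify the script parses before editing it. This is a hypothetical sanity check, not part of the original instructions; `tcsh -n` reads the commands without executing them:

```shell
#!/bin/sh
# Hypothetical check: confirm the downloaded script is present and parses.
# ('tcsh -n' performs a syntax check without running any commands.)
script=s1.afni_proc.tcsh

if [ -f "$script" ]; then
    tcsh -n "$script" && echo "syntax OK: $script"
else
    echo "script not found: $script"
fi
```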

View scripts

s1.afni_proc.tcsh

This script contains an afni_proc.py command for ME-FMRI processing, with detailed regression modeling done in a separate follow-up script.

#!/bin/tcsh

# For this study, all analyses are conducted in each participant's
# native brain space (i.e., aligned to the in-session anatomical
# MPRAGE) and displayed on the common mesh in SUMA for any group
# analyses.

# This script contains the afni_proc.py command used to process
# task-based multi-echo multiband fMRI data in the study below. A
# dummy 'regress' block is included to generate AFNI QC files;
# however, 3dDeconvolve is run separately in later scripts, as the
# betas are analyzed differently depending on the analysis and run
# type (i.e., block design of localizer runs vs. event-related design
# of experimental runs; single beta estimate per condition for
# univariate analyses vs. one beta estimate per condition per run for
# multivariate analyses).  Here the raw data are preprocessed in the
# order collected, for motion correction and alignment to the
# in-session anatomical.

# Used for processing in:
#
#   Wardle, S. G., Rispoli, B., Roopchansingh, V. & Baker, C. (2024)
#   Brief encounters with real objects modulate medial parietal but
#   not occipitotemporal cortex. bioRxiv. 2024.08.05.606667
#   https://doi.org/10.1101/2024.08.05.606667

# To run for a single participant, type (while providing an actual
# value for P_ID):
#
#   tcsh s1.afni_proc.tcsh P_ID

# =============================================================================

# collect user input
if ( $#argv > 0 ) then
    set pname = $argv[1]
else
    echo 'ERROR: no participant ID was entered; exiting'
    exit 1
endif

# specify project directory
set myroot = <INSERT PROJECT HOME DIRECTORY>

# define directories
set input_root  = ${myroot}/rawdata/rawMRI
set subdir      = ${input_root}/${pname}
set output_root = ${myroot}/preprocessed
set suboutdir   = ${output_root}/${pname}

# print into terminal
echo ${subdir}
echo ${pname}
echo ${input_root}

# create the output root directory if it doesn't exist
# nb: don't create the subject directory here, because afni_proc.py will
\mkdir -pv $output_root
echo $output_root

cd $subdir

# generate the afni_proc.py script for this participant, for multiecho
# FMRI processing

# Notes
#
# + This command includes reverse phase encoding for EPI distortion
#   correction (`-blip_* ..` options)
#
# + AFNI's formulation for the optimal combination (OC; Posse et al.,
#   1999) of multiple echos is used (`-combine_method OC`)
#
# + The 'regress' block is included to enable the QC HTML to be made,
#   but the actual regression commands are run later/separately, so no
#   `-regress_* ..` options were used here
#

afni_proc.py                                                                 \
    -subj_id                ${pname}                                         \
    -script                 ${subdir}/proc.${pname}                          \
    -out_dir                ${suboutdir}                                     \
    -dsets_me_echo          ${pname}*tSeries*e01*orig.HEAD                   \
    -dsets_me_echo          ${pname}*tSeries*e02*orig.HEAD                   \
    -dsets_me_echo          ${pname}*tSeries*e03*orig.HEAD                   \
    -echo_times             12.9 32.228 51.556                               \
    -reg_echo               2                                                \
    -copy_anat              ${subdir}/*anat.nii                              \
    -blocks                 tshift align volreg blur mask combine regress    \
    -tcat_remove_first_trs  8 8 8 8 8 8 8 8 8 8                              \
    -blip_forward_dset      ${pname}blip_forward-e02*orig.HEAD               \
    -blip_reverse_dset      ${pname}blip_reverse-e02*orig.HEAD               \
    -volreg_align_to        MIN_OUTLIER                                      \
    -volreg_align_e2a                                                        \
    -blur_size              4                                                \
    -combine_method         OC

# run the afni_proc.py script for this participant
tcsh -xef ${subdir}/proc.${pname} |& tee ${subdir}/output.proc.${pname}
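The script above processes one participant per invocation. For a group, that call can be wrapped in a simple loop; this is a sketch in POSIX sh, and the participant IDs below are hypothetical placeholders:

```shell
#!/bin/sh
# Sketch: loop the per-participant preprocessing over a group.
# The IDs here (sub01, sub02, sub03) are placeholders for real ones.
for pid in sub01 sub02 sub03; do
    # remove the leading 'echo' to actually launch each run
    echo "tcsh s1.afni_proc.tcsh $pid"
done
```

As written this is a dry run: it only prints the commands, which is a convenient way to verify the participant list before committing to the (long) preprocessing jobs.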