AFNI Message Board


Tlrc-warping out a %signal change dataset
Jim Bjork, December 01, 2003 11:35AM
Hello AFNI minds,

I am interested in transforming 3d+time EPIRT datasets into a percent-signal-change 3d+time dataset, where each voxel at each point in the time series is expressed as the percent difference from that voxel's mean signal across the time series. I do this not only to normalize datasets for between-subject comparisons, but also to apply VOI masks and dump out time-course data ready to go as percent signal change.
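(Concretely, for voxel v at time point t, with M(v) denoting that voxel's mean over the whole series, the transform is PSC(v,t) = 100 * (S(v,t) - M(v)) / M(v); this is exactly the 3dcalc expression in the script below.)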

My event-related paradigm has no "baseline" periods, so I use the session-average signal as a surrogate baseline, even though this artificially blunts signal-change values, since the baseline includes the occasional event of interest.

I can generate this alternative dataset (in native orig space) as part of the following script, which also calculates a functional dataset:

****
foreach subject ( cr la sw ks jg jb mw mrm mn yt vw tc )
    cd ../${subject}*

    # remove any previous output so the prefixes are free
    if ( -e normalized+orig.BRIK ) then
        rm -rf normalized+orig.*
    endif
    if ( -e TimeSerAvg+orig.BRIK ) then
        rm -rf TimeSerAvg+*
    endif

    # voxel-wise mean across the time series (3dTstat's default statistic)
    3dTstat -prefix TimeSerAvg EPIRTseries+orig
    # mark the mean dataset as a bucket so it can be combined with 3d+time data
    3drefit -abuc TimeSerAvg+orig

    # percent signal change from the session-wide mean voxel signal
    3dcalc -datum float -a EPIRTseries+orig -b TimeSerAvg+orig \
           -expr "((a-b)/b)*100" -prefix normalized

    # apply model to the normalized dataset
    3dDeconvolve -input normalized+orig -concat runs.1D -mask 3dCL_mask+orig \
        -nfirst 0 -num_stimts 12 \
        -polort 2 \
        -stim_file 1 'antc.1D' -stim_label 1 'ant' \
        -stim_file 2 'rvnantlc.1D' -stim_label 2 'rewvsneu' \
        etc....

end
*********
The "Normalized" variant .BRIK for some reason (additional decimal places?) is exactly twice the original EPIRTseries.BRIK and results in fairly similar individual statistical maps. With the normalized dataset, I can better compare event-related Beta coefficients between different patient groups. Also, just as individual subject *statisitical maps* made from the raw time series can be tlrc-warped and merged in to a group map, so can the individual maps derived from the normalized data as above.

My problem is that Talairach warping of this normalized dataset itself (using a hi-res MP-RAGE reference scan) via

adwarp -apar MPRAGE+tlrc -dpar normalized+orig

for some reason results in unmanageable file sizes and "core dumps." Thus, I cannot warp the normalized dataset so as to pass the data through a VOI mask placed in common stereotactic coordinates with 3dmaskave, etc.

What can I do to warp out the percent-signal-change normalized dataset? I am using 3.8mm isotropic voxels with a 1mm gap. Is there some resampling parameter or command I could include that would not appreciably dilute the data but would make things more manageable? -dxyz 2.0?
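Something along these lines is what I have in mind (a sketch only: adwarp's -dxyz option sets the resampled grid spacing, which I believe otherwise defaults to 1mm and would explain the huge files; the output prefix and VOI mask name here are made up):

****
# resample near the acquisition resolution instead of the default 1mm grid
adwarp -apar MPRAGE+tlrc -dpar normalized+orig -dxyz 3.5 -prefix normalized_tlrc

# then pass the warped data through a VOI mask in +tlrc space
# (VOImask+tlrc is a hypothetical mask name)
3dmaskave -mask VOImask+tlrc normalized_tlrc+tlrc > voi_psc.1D
*********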

Jim B
