AFNI Message Board

December 31, 2010 03:48AM
Hi, I'm working my way through surface-based analysis for the first time, and I'm now running into an error from SurfSmooth:

Notice SUMA_estimate_FWHM_1dif (SUMA_GeomComp.c:6564):
Distribution of data is possibly random noise (p=0.154304)
Expect fwhm to be no different from 0
FWHM values up to 0.56(segments) or 0.48(mm)
are likely meaningless (at p=0.01) on this mesh.

-- Error SUMA_Chung_Smooth_07_toFWHM_dset (SUMA_GeomComp.c:5743):
Failed to get mean fwhm
-- Error SurfSmooth (SUMA_SurfSmooth.c:2116):
Failed to blur master data dset


What I'm trying to do, in words, is to recreate a standard volume-based group analysis, but on the surface. Since I'm totally new to surfaces, I list my exact processing stream below, with fully specified commands modeled on examples from SUMA documents, help files, and Message Board postings. I would be very grateful if you could tell me not only where I've gone wrong with the surface smoothing, but also point out any other errors or inefficiencies you might see (e.g., do I have to do everything twice, once for each hemisphere, or can I combine the hemispheres at some point?).
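In loop form, the nesting I'm using looks like this (a sketch only; the run labels X/Y/Z stand in for my actual runs, and the echo lines stand in for the real AFNI/SUMA commands listed below):

```shell
#!/bin/bash
# Sketch of the per-subject processing nesting (hypothetical run labels X Y Z).
for hemi in lh rh; do            # per-hemisphere: spec, mesh, mask
  for run in X Y Z; do           # per-run: map to surface, smooth, scale
    echo "map+smooth+scale: hemi=${hemi} run=${run}"
  done
  echo "concat+deconvolve: hemi=${hemi}"
done
```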


For every subject:

####Use FreeSurfer to create surface (using all default settings)
(not shown)

####Make spec file
@SUMA_Make_Spec_FS -sid MYSUBJ1

####Align the surface to my aligned anatomical
@SUMA_AlignToExperiment -exp_anat MPRAGEanat_Nudged+orig -surf_anat surf/SUMA/MYSUBJ1_SurfVol+orig

####Process (tshifted and volreged) EPI data by hemisphere
For each hemi in lh rh

####Create standard mesh
MapIcosahedron -spec MYSUBJ1_${hemi}.spec -ld 141

####Map the brain mask (made by 3dAutomask on EPI data) to that mesh
3dVol2Surf -spec std.MYSUBJ1_${hemi}.spec -surf_A std.${hemi}.smoothwm.asc \
-surf_B std.${hemi}.pial.asc -sv MYSUBJ1_SurfVol_Alnd_Exp+orig \
-grid_parent mask+orig \
-map_func ave -f_steps 10 -f_index nodes -out_niml v2s_std.${hemi}.mask.niml.dset
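One thing I'm unsure about here (my assumption, not something I've verified): averaging a 0/1 mask with -map_func ave should produce fractional node values wherever the segment samples straddle the mask edge, so the result may need re-binarizing before it is used as a -mask later. The thresholding I have in mind, sketched with awk on hypothetical node values:

```shell
# Hypothetical fractional node values, as -map_func ave could yield on a 0/1 mask;
# re-binarize at 0.5 (on the real dataset, something like 3dcalc with step() would do this)
printf '0.0\n0.3\n0.7\n1.0\n' |
awk '{ if ($1 >= 0.5) print 1; else print 0 }'
```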

####Process EPI data by hemisphere and run
For each run X

####Map the EPI run timeseries to the standard mesh
3dVol2Surf -spec std.MYSUBJ1_${hemi}.spec -surf_A std.${hemi}.smoothwm.asc \
-surf_B std.${hemi}.pial.asc -sv MYSUBJ1_SurfVol_Alnd_Exp+orig \
-grid_parent EPI_RUNX+orig \
-map_func ave -f_steps 10 -f_index nodes -out_niml v2s_std.${hemi}.EPI_RUNX.niml.dset

####Smooth the surface time series
SurfSmooth \
-met HEAT_07 \
-spec std.MYSUBJ1_${hemi}.spec \
-surf_A std.${hemi}.smoothwm.asc \
-surf_B std.${hemi}.pial.asc \
-input v2s_std.${hemi}.EPI_RUNX.niml.dset \
-blurmaster v2s_std.${hemi}.EPI_RUNX.niml.dset \
-detpoly_master 3 \
-output v2s_std.${hemi}.EPI_RUNX.BL.niml.dset \
-target_fwhm 4 \
-bmall \
-cmask "-a v2s_std.${hemi}.EPI_RUNX.niml.dset -expr bool(a)"

####Normalize
3dTstat -prefix v2s_std.${hemi}.EPI_RUNX_MEAN.BL.niml.dset v2s_std.${hemi}.EPI_RUNX.BL.niml.dset
3dcalc -fscale -a v2s_std.${hemi}.EPI_RUNX.BL.niml.dset -b v2s_std.${hemi}.EPI_RUNX_MEAN.BL.niml.dset -expr 'step(b)*min(200,a/b*100)' -prefix v2s_std.${hemi}.Norm_EPI_RUNX.BL.niml.dset
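My understanding of the 3dcalc expression (please correct me if I've misread it): it converts each time point to percent of the run mean, zeroes nodes where the mean is not positive, and caps the result at 200. The same arithmetic on hypothetical values a=105, b=100:

```shell
# Hypothetical check of step(b)*min(200, a/b*100) for a=105, b=100
awk -v a=105 -v b=100 'BEGIN { v = (b > 0) ? a/b*100 : 0; if (v > 200) v = 200; printf "%.1f\n", v }'
```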

End (run loop)

####Concatenate runs
3dTcat -prefix v2s_std.${hemi}.Norm_EPI_allruns.BL.niml.dset v2s_std.${hemi}.Norm_EPI_RUNX.BL.niml.dset v2s_std.${hemi}.Norm_EPI_RUNY.BL.niml.dset v2s_std.${hemi}.Norm_EPI_RUNZ.BL.niml.dset ...

####Run 3dDeconvolve (no polort because detrending took place in smoothing???)
3dDeconvolve -basis_normall 1 -input v2s_std.${hemi}.Norm_EPI_allruns.BL.niml.dset -mask v2s_std.${hemi}.mask.niml.dset -concat startpoints.1D -num_stimts 26 \
[-stims....blah blah blah] \
-tout -bucket deconallruns_stdsurf_${hemi}

End (hemi loop)

(Repeat for all subjects)

####This is where I would normally bucket out beta weights, run ANOVAs across subjects, and compute a cluster-size threshold, but I haven't gotten that far with surfaces yet


Thanks,

Chris
Subject                            Author           Posted
SurfSmooth problem                 Chris Ackerman   December 31, 2010 03:48AM
Re: SurfSmooth problem             Ziad             January 03, 2011 09:24PM
Re: SurfSmooth problem             Chris Ackerman   January 04, 2011 10:52AM
Re: SurfSmooth problem             Ziad             January 04, 2011 10:32PM
Re: SurfSmooth problem             Ziad             January 06, 2011 10:25AM
Re: SurfSmooth problem - Rebuild   bob cox          January 06, 2011 10:28AM