> There's already some
> smoothness in your original data, so blurring it
> further will give you a smoothness that is higher
> than the size of your smoothing kernel.
Is that true without exception? I used
-- 3dmerge -1blur_fwhm to blur datasets with FWHM values of 4, 6, 8, 10 and 12 mm (in parallel, to compare the effect of the different FWHM values),
-- 3dDeconvolve to reduce the influence of nuisance variables, and
-- 3dFourier for temporal filtering;
afterwards, 3dFWHMx estimated the resulting FWHM values as 6.2, 7.5, 8.6, 9.2 and 9.2 mm.
Is it because of the order of the preprocessing steps? I tried to follow the recommendations of Weissenbacher et al. (2009; Neuroimage). As I am most interested in the 8 mm blurring, I am happy with the increase to 8.6 mm. But what if I wanted to report the results of blurring with 12 mm?
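For what it's worth, here is a rough sanity check of the numbers above under the common first-order assumption that Gaussian smoothness adds in quadrature (final FWHM = sqrt(intrinsic² + applied²)). This assumption is mine, not something stated in the thread; the intrinsic smoothness is backed out from the 4 mm case:

```python
import math

def combined_fwhm(intrinsic, applied):
    # Gaussian smoothing kernels compose by adding variances,
    # so FWHMs (proportional to sigma) add in quadrature.
    return math.sqrt(intrinsic**2 + applied**2)

# Back out the intrinsic smoothness from the 4 mm case:
# observed 6.2 mm -> intrinsic = sqrt(6.2^2 - 4^2) ~ 4.7 mm
intrinsic = math.sqrt(6.2**2 - 4.0**2)

for applied in (4, 6, 8, 10, 12):
    print(f"{applied:2d} mm kernel -> predicted {combined_fwhm(intrinsic, applied):.1f} mm")
```

Under this model the predictions keep growing with the kernel size, so the observed plateau at 9.2 mm for the 10 and 12 mm kernels would not be explained by quadrature addition alone; that discrepancy may indeed be related to the other preprocessing steps or to the estimation itself.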
Many thanks to the AFNI team for their wonderful programs and their friendly and generous help!
Wolfram