Intensity normalization on the surface

dmoracze, January 25, 2015 08:53AM
Hello,

I am using the following 3dcalc options to normalize:

-a data
-b mean
-expr 'a/b * 100'
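In NumPy terms, that expression scales each voxel's time series to percent of its temporal mean; here is a minimal sketch with made-up numbers (the array values and variable names are illustrative, not from my data):

```python
import numpy as np

# Made-up time series: 3 time points x 3 voxels (illustrative values only).
data = np.array([[100.0, 50.0, 10.0],
                 [110.0, 55.0, 11.0],
                 [ 90.0, 45.0,  9.0]])

mean = data.mean(axis=0)        # voxel-wise temporal mean (what 3dTstat -mean gives)
scaled = data / mean * 100.0    # the 3dcalc expression: 'a/b * 100'

# After scaling, every voxel's time series averages 100.
print(scaled.mean(axis=0))      # → [100. 100. 100.]
```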

I have tried 2 processing streams with different results:

1) slice time > volreg > co-registration > mean WM and LV extraction > nuisance regression > vol2surf > surfsmooth > intensity normalization

or

2) slice time > volreg > co-registration > intensity normalization > mean WM and LV extraction > nuisance regression > vol2surf > surfsmooth

For 1) I get very strange results. Using 3dTstat to get the voxel-wise mean, I end up with a mean dataset that is full of holes (i.e., many nodes have a value of 0). I find it hard to believe that the mean of so many voxels would truly be 0. Just to see what would happen, I ran the 3dcalc normalization anyway and ended up with intensity values on the order of trillions (even for voxels whose mean is not 0).
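Just to illustrate the arithmetic (made-up numbers, not a diagnosis of my actual data): if a time series has already had its mean removed, e.g. by regression, its temporal mean is only near zero up to floating-point residue, and dividing by that tiny mean produces exactly this kind of astronomical value:

```python
import numpy as np

# A demeaned time series (values are illustrative): its mean is not
# exactly zero, just a tiny floating-point leftover.
ts = np.array([1.0, -1.0, 0.5, -0.5, 1e-9])
mean = ts.mean()            # ~2e-10: tiny, but nonzero

print(ts / mean * 100.0)    # entries on the order of 1e11
```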

For 2) I get more reasonable results.

My question is: why might this difference happen? I thought it wouldn't matter much where intensity normalization fell in my processing stream, but the differences are huge. The only difference between the two streams is the position of the normalization step; no commands were changed. This makes me question stream 2), because I'm worried that something might be off there as well.

Thanks very much for all your help.

Dustin


