Hi Gang,
Thanks for this tip!
I just want to make sure I am understanding correctly, and that this data shouldn't be ringing any alarm bells, so I have a couple of additional questions:
1. As my data is currently processed, I have not used the scale block in afni_proc.py. Do you think that for a beta series correlation, since I am concatenating betas together, the data should have been normalized/scaled prior to 3dDeconvolve? Or is it okay to not use normalized betas? My sense is that the data should be normalized, but I'd love some feedback. (A sketch of what I mean by scaling is at the end of this message.)
2. As a test, I created a beta series (with unscaled betas) for one of my conditions, which has 18 individual trials, so the beta series has 18 sub-bricks, one for each stimulus presentation. There is quite a LARGE range of beta values across these sub-bricks: one brick has a voxel with a beta as low as -67, while another brick has a beta as high as +70. When I calculate the whole-brain average beta for each stimulus (i.e., each sub-brick), I get a much more normal story: the average betas all fall between -1 and +1, and the standard deviations range from 1 to 2.6 (one way to compute these summaries is sketched below).

Are these GIANT voxel-specific betas alarming? Or do you think they are not a concern, since the average betas are within the normally expected range?
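To make question 1 concrete: by "scaling" I mean adding the scale block to afni_proc.py, which scales each voxel's time series to a mean of 100 before regression, so the betas come out in percent-signal-change units. A minimal sketch of what that call might look like, assuming a hypothetical single-run pipeline (the subject ID, dataset, and stim-timing names are placeholders, not my actual files):

    # hypothetical afni_proc.py call -- the point is the 'scale' block;
    # all names are placeholders for my actual pipeline
    afni_proc.py \
        -subj_id             sub01 \
        -dsets               run1+orig.HEAD \
        -blocks              tshift volreg blur mask scale regress \
        -regress_stim_times  stim_times_cond1.txt \
        -regress_stim_labels cond1 \
        -regress_basis       'GAM'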
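And for question 2, here is one way to get the per-trial summaries I described, sketched with placeholder dataset names rather than my exact script:

    # whole-brain mean and SD of every sub-brick in the beta series,
    # restricted to a brain mask -- one output line per trial
    3dmaskave -quiet -sigma -mask mask_group+tlrc betas_cond1+tlrc

    # min/max of a single sub-brick, e.g. trial 4 (0-based index 3)
    3dBrickStat -min -max 'betas_cond1+tlrc[3]'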