Hi Tamara,
There are no particular guidelines for GCOR, though indeed, higher values are usually not good. I do not know that anyone has applied a GCOR threshold for the purposes of dropping subjects, if that is what you are thinking about.
For a subject with a high GCOR value, consider using @radial_correlate for further investigation. That can even be added to the proc script using -radial_correlate yes.
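As a sketch of where that option goes, here is a command fragment (the "..." stands for the rest of your usual afni_proc.py options, which are not shown here):

```shell
# fragment only: add radial correlation checks to the proc script
afni_proc.py ...              \
    -radial_correlate yes     \
    ...
```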
Using 0.1 as an outlier threshold means censoring any time point where more than 10 percent of the brain is considered an outlier. You might even consider using 0.05. Either way, that value is applied as a fraction of the masked brain.
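To illustrate just the idea (this is not AFNI code, and the fractions are made up): given the fraction of masked-brain voxels flagged as outliers at each time point, censor any TR whose fraction exceeds the chosen limit.

```python
# Sketch of outlier-fraction censoring: 1 = keep the TR, 0 = censor it.
def censor_from_outlier_fractions(fractions, limit=0.1):
    """Censor any time point whose outlier fraction exceeds the limit."""
    return [0 if frac > limit else 1 for frac in fractions]

# Hypothetical per-TR outlier fractions (one value per time point).
fractions = [0.01, 0.07, 0.15, 0.03, 0.12, 0.00]
print(censor_from_outlier_fractions(fractions))        # limit of 0.1
print(censor_from_outlier_fractions(fractions, 0.05))  # stricter 0.05
```

Note how dropping the limit from 0.1 to 0.05 censors additional time points.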
The original degrees of freedom available equal the number of time points in the data (after any censoring), or "TRs total". The "degrees of freedom used" is basically the number of regressors in the X-matrix. So the "degrees of freedom left" is their difference.
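The bookkeeping is just subtraction; as a small worked example (all numbers made up):

```python
# Degrees of freedom bookkeeping, with hypothetical counts.
n_trs_total  = 450   # time points across all runs
n_censored   = 12    # TRs removed by censoring
n_regressors = 30    # columns in the X-matrix (stim + baseline + motion)

dof_available = n_trs_total - n_censored   # "TRs total" after censoring
dof_used      = n_regressors               # "degrees of freedom used"
dof_left      = dof_available - dof_used   # "degrees of freedom left"
print(dof_available, dof_used, dof_left)   # 438 30 408
```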
For X.stim.xmat.1D, review it to see whether it seems correct. There are any number of mistakes one can make when generating timing, and the software will only look for issues that we have decided to have it look for. A researcher's eyeballs are important.
Yes, sum_ideal.1D has the sum of the X.stim.xmat.1D regressors. Again, there is no rule here, particularly because the shape varies wildly across studies. Look for anything strange. Does it hover around one number? Does it drop to zero during the run breaks? Does it shoot up high at just one or two points? Evaluating these takes practice.
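As a complement to eyeballing a plot, here is a small sketch that summarizes such a column of values. It assumes a simple text format of one number per line with '#' comment lines ignored (as in *.1D files); the example values are made up:

```python
# Summarize a column of ideal-regressor sums: the numeric counterpart
# of the eyeball checks (range, typical level, where it sits at zero).
def summarize_1d(lines):
    vals = [float(s) for s in lines
            if s.strip() and not s.lstrip().startswith('#')]
    return {
        'n'     : len(vals),
        'min'   : min(vals),
        'max'   : max(vals),
        'mean'  : sum(vals) / len(vals),
        'n_zero': sum(1 for v in vals if v == 0.0),
    }

# Hypothetical contents, e.g. read via open('sum_ideal.1D').readlines().
lines = ['# comment', '0', '0.3', '1.1', '0.9', '0', '0']
print(summarize_1d(lines))
```

A max that towers over the mean, or zeros where stimulation should be ongoing, would both be worth a closer look.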
There are many checks that can be performed, but computers can only be programmed to do particular ones. If we could itemize all important problems, we would just write software to look for them. But new issues keep coming up. Just practice looking, that is the most important thing. Also, @radial_correlate can help find particular issues within the data itself.
- rick