Hello,
I want to find the time to minimum (TM) from Dynamic Susceptibility Contrast (DSC) imaging data.
I have used:
3dTstat -argmin -prefix min dsc+orig
and then multiplied the result by the TR with 3dcalc.
This works fine, but since -argmin returns sample indices, the results are quantized in steps of TR.
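For illustration, here is a toy single-voxel sketch (all values, including the TR and the synthetic time course, are assumptions, not from my data) of why the per-voxel argmin is quantized to multiples of TR, and how a simple three-point parabolic fit around the minimum can recover sub-TR timing:

```python
import numpy as np

# Hypothetical single-voxel DSC time course, sampled every TR seconds,
# with a signal dip whose true minimum falls between two samples.
TR = 2.0  # seconds (assumed)
t = np.arange(0, 60, TR)
signal = 100 - 40 * np.exp(-((t - 25.3) ** 2) / 20.0)  # true minimum at t = 25.3 s

# argmin approach (what 3dTstat -argmin does per voxel):
# the answer can only be a multiple of TR.
i = int(np.argmin(signal))
tm_argmin = i * TR

# Three-point parabolic interpolation around the minimum sample
# gives a sub-TR estimate of the minimum's location.
y0, y1, y2 = signal[i - 1], signal[i], signal[i + 1]
offset = 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)  # fractional shift in samples, in (-0.5, 0.5)
tm_interp = (i + offset) * TR

print(tm_argmin)  # quantized to a multiple of TR
print(tm_interp)  # sub-TR estimate, close to the true 25.3 s
```

This is only meant to show the quantization issue, not to reproduce what the Hilbert Delay plugin computes internally.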
For this reason, I have tried the Hilbert Delay plugin with a reference vector consisting of a single sharp negative peak, i.e. all zeros except for one value set to, say, -100 (with Tstim = 0).
It seems to work fine, and in this case it gives interpolated values.
Is it appropriate to use the Hilbert Delay plugin in this way?
Thank you.
Julien.