Hello,
I'm using 3dDeconvolve with the -iresp and -stimlag options to dump out event-averaged HRF responses. Since this produces several datasets, I created a dedicated "subject_HRF" subdirectory for them, one level below the directory where the main 3d+time EPIRT dataset resides. My script is written so that the 3dDeconvolve command is actually executed in that subject_HRF subdirectory; it reads the necessary stim files and source time series from other directories, and it successfully creates the bucket dataset in the subdirectory where 3dDeconvolve was launched.
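For reference, the command is structured roughly like this (the dataset names, stim file paths, and lag value below are simplified placeholders, not the exact ones from my script):

    cd subject_HRF
    3dDeconvolve                              \
        -input ../EPIRT+orig                  \
        -num_stimts 1                         \
        -stim_file 1 ../stim_files/cond1.1D   \
        -stim_label 1 cond1                   \
        -stim_maxlag 1 12                     \
        -iresp 1 HRF_cond1                    \
        -bucket subject_bucket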
However, the dozen or so -iresp datasets keep getting created in the main directory one level above, where the 3d+time dataset resides.
Is there some glitch in the code such that these supplemental datasets are written to the directory where the input dataset resides, rather than the directory from which the 3dDeconvolve command is launched?
If this is the intended behavior, is there a way I can make it write these files to the directory where the command was launched?
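For example, would giving the -iresp prefix an explicit path, along the lines of

    -iresp 1 ./HRF_cond1

or the full path to the subject_HRF directory, force the output into the launch directory? I'm not sure whether 3dDeconvolve honors a directory component in that prefix, so this is just a guess on my part.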
Jim