3dDeconvolve -nodata option with -stim_files and -ortvec
lhopkins | October 17, 2017 07:36PM

Hey guys:

I'm using the 3dDeconvolve -nodata option to test some experimental designs, and I have two quick questions.
1) Each stimulus is 12 s long, so I originally made binary input files as described in the 2002 manual by Doug Ward, treating each stimulus as a run of six 1s in a row (2 s TRs). Each stim_file essentially looked like this: 0 0 0 0 1 1 1 1 1 1 0 0 0, and there were three of those (one per stimulus). However, I wanted to specify a gamma response function, so I ran the 'make_stim_times.py' tool on these files and essentially got 'onsets' at each place the 1s are (e.g., for stimulus 1: 8 10 12 14 16 18 64, etc.). Is one approach more accurate than the other for testing multicollinearity? Since all three stimuli are quite long (12 s) and the intervals between them are relatively short (8 s on average), I want to make sure I can resolve each stimulus.
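
To make question 1 concrete, this is roughly what I'm comparing (the file names, the single run of 200 TRs, and -polort 2 are just placeholders I made up for this example):

# binary 0/1 regressors via -stim_file (2 s TR)
3dDeconvolve -nodata 200 2.0 \
    -polort 2 \
    -num_stimts 3 \
    -stim_file 1 stim1_binary.1D -stim_label 1 stim1 \
    -stim_file 2 stim2_binary.1D -stim_label 2 stim2 \
    -stim_file 3 stim3_binary.1D -stim_label 3 stim3 \
    -x1D X.binary.xmat.1D

# onset-time version: convert the binary files, then use -stim_times with a gamma model
# (assuming the outputs come out named stim_onsets.01.1D, .02.1D, .03.1D)
make_stim_times.py -files stim1_binary.1D stim2_binary.1D stim3_binary.1D \
    -prefix stim_onsets -tr 2.0 -nruns 1 -nt 200
3dDeconvolve -nodata 200 2.0 \
    -polort 2 \
    -num_stimts 3 \
    -stim_times 1 stim_onsets.01.1D 'GAM' -stim_label 1 stim1 \
    -stim_times 2 stim_onsets.02.1D 'GAM' -stim_label 2 stim2 \
    -stim_times 3 stim_onsets.03.1D 'GAM' -stim_label 3 stim3 \
    -x1D X.times.xmat.1D

In both cases I then check the design matrix correlations with 1d_tool.py -show_cormat_warnings -infile X.times.xmat.1D (or X.binary.xmat.1D).
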
2) I will also be delivering shocks, which I'd like to include as a regressor of no interest. Can I pass a binary file to the -ortvec option to make sure the shocks don't contaminate the estimates of interest?
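
For question 2, the sort of call I mean is below (shock_binary.1D would just be a 0/1 column on the TR grid; again, the names and run length are made up). As I understand it, -ortvec adds the column to the baseline model as-is, with no HRF convolution, which is part of what I'm unsure about:

3dDeconvolve -nodata 200 2.0 \
    -polort 2 \
    -num_stimts 3 \
    -stim_times 1 stim_onsets.01.1D 'GAM' -stim_label 1 stim1 \
    -stim_times 2 stim_onsets.02.1D 'GAM' -stim_label 2 stim2 \
    -stim_times 3 stim_onsets.03.1D 'GAM' -stim_label 3 stim3 \
    -ortvec shock_binary.1D shock \
    -x1D X.shock.xmat.1D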

Thanks!
Lauren