Hi Rick,
Thank you for your response. Correcting the last parameter in the -concat option did indeed resolve the second set of warning messages I was getting.
Regarding the timing files, this relates to a question I posted earlier (https://afni.nimh.nih.gov/afni/community/board/read.php?1,159421,159421#msg-159421), where I provided one of my timing files, which uses only duration modulation:
10.15:16.78 90.12:19.79 116.94:22.43 173.97:7.11 219.81:4.58 231.41:1.58 286.55:11.17 304.75:17.96
1.02:11.35 19.4:4.85 42.13:12.0 63.16:5.61 128.22:12.11 147.36:10.15 170.42:2.35 181.8:19.86 222.31:2.82 238.48:3.67 311.72:22.09
1.03:14.16 55.54:4.05 66.63:15.91 117.45:10.59 135.08:5.05 169.68:7.57 194.14:6.99 244.7:2.8 281.33:3.94 311.82:2.92 321.76:30.74
20.38:7.09 34.51:21.36 100.33:2.37 124.36:13.71 145.09:4.81 162.5:1.82 173.35:11.78 218.0:8.48 233.51:1.72 262.39:42.91 314.33:8.67
1.02:28.95 35.0:18.26 80.74:30.0 160.28:26.33 191.64:2.73 221.25:27.31
1.03:7.51
I'm also interested in amplitude modulation because my task involves presenting neutral and aversive images, and I worry that habituation to the aversive stimuli over the course of the experiment will attenuate the response amplitude. Assuming this logic is correct, it looks like I would need to adjust my timing files to include the AM parameter(s). In one of your responses, you provided an example of '10.15*1.83:16.78', and the 3dDeconvolve help file provides an example of '30*5,3:12'. My question is: is there a mathematical formula or rule of thumb for determining the AM parameter(s)? Or would I be better off just focusing on duration modulation instead?
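For what it's worth, here is the kind of conversion I had in mind: a small sketch that rewrites my 'onset:duration' events into the married 'onset*amplitude:duration' form, tagging each event with a per-event amplitude. The linear habituation model here is purely hypothetical (a made-up starting amplitude and decay per trial), just to show the file format change, not a claim about the right values:

```python
# Hypothetical sketch: convert duration-only timing lines
# ("onset:duration" pairs, one run per line) into AFNI's
# married format "onset*amplitude:duration".
# The linear decay below is an illustrative habituation model only.

def add_amplitudes(lines, start_amp=1.0, decay=0.05):
    out = []
    for line in lines:
        married = []
        for i, event in enumerate(line.split()):
            onset, dur = event.split(":")
            # assumed amplitude: linear decrease per trial, floored at 0
            amp = max(start_amp - decay * i, 0.0)
            married.append(f"{onset}*{amp:.2f}:{dur}")
        out.append(" ".join(married))
    return out

run1 = "10.15:16.78 90.12:19.79 116.94:22.43"
print(add_amplitudes([run1])[0])
# -> 10.15*1.00:16.78 90.12*0.95:19.79 116.94*0.90:22.43
```

Of course, this only makes sense if there is a principled way to pick the amplitudes in the first place, which is what I'm asking about.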
I have one last issue that I forgot to include in the initial post. Since my input dataset is very large (6383 sub-bricks), I'm getting the following memory error:
++ total shared memory needed = 16,784,167,400 bytes (about 17 billion [giga])
++ current memory malloc-ated = 4,223,042 bytes (about 4.2 million [mega])
** FATAL ERROR: Can't create shared mmap() segment
** Unix message: Cannot allocate memory
I'm running this on my university's HPC, where I requested 20 GB of virtual memory, but the error persists.
Thanks again,
Dan