3dLME on surface - error
dmoracze, March 21, 2017 12:01PM

Hi Gang,

I am running 3dLME on 24 subjects with repeated measures (2 conditions per subject), so 48 datasets in total. The input file for each dataset contains one z value per node, and I am using the std.60 standardized mesh (36,002 nodes per hemi). Here is a schematic of my command, where $h is the hemisphere (I loop over lh and rh):

3dLME -jobs 24                                                \
      -prefix LME.int-scr.$h.isc.niml.dset                    \
      -model 'con'                                            \
      -ranEff '~1'                                            \
      -SS_type 3                                              \
      -num_glt 1                                              \
      -gltLabel 1 'int-scr' -gltCode 1 'con : 1*int -1*scr'   \
      -dataTable                                              \
      Subj con InputFile                                      \
      S101 int S101_int.$h.zVal.niml.dset                     \
      S101 scr S101_scr.$h.zVal.niml.dset                     \
      S102 int S102_int.$h.zVal.niml.dset                     \
      S102 scr S102_scr.$h.zVal.niml.dset                     \
      ...
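
For reference, the call above sits inside a loop over hemispheres. Here is a rough sketch of that wrapper in tcsh (illustrative, not my exact script), with a 3dinfo check thrown in to confirm that each input reports the expected 36,002 nodes along its first axis:

#!/usr/bin/tcsh
# Rough sketch of the wrapper (illustrative, not the exact script):
# loop over hemispheres, check the input dimensions, then run 3dLME.
foreach h ( lh rh )
    # each std.60 input should report Ni = 36002
    # (3dinfo -n4 prints Ni Nj Nk Nvals)
    foreach f ( S???_{int,scr}.$h.zVal.niml.dset )
        echo -n "$f : "
        3dinfo -n4 $f
    end

    # ... the 3dLME call exactly as above, using this $h ...
end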

My input files have 36,002 nodes per hemi, but here is the relevant output of 3dinfo on the results from this test:

R-to-L extent:     0.000     -to- 144007.000 [L] -step-     1.000 mm [144008 voxels]
A-to-P extent:     0.000     -to-     0.000     -step-     1.000 mm [  1 voxels]
I-to-S extent:     0.000     -to-     0.000     -step-     1.000 mm [  1 voxels]
Number of values stored at each pixel = 1
  -- At sub-brick #0 '(Intercept)  F' datum type is float
     statcode = fift;  statpar = 1 23

The output has 144,008 nodes. In addition, my main effect and post hoc t-test are not included in the output bucket. I have used this command on other, similarly formatted data, and I have checked that all input files have the correct dimensions. I am using:

Precompiled binary linux_openmp_64: Dec 31 2016 (Version AFNI_16.3.20)

Any ideas?

Thanks,

Dustin


EDIT: I ran the same code on old data where this analysis previously worked fine. The resulting niml file had the same problem as described above (incorrect number of nodes and no main effects or post hoc tests). Interestingly, the number of nodes was not the same as above, but rather 432,024.

144,008 and 432,024 are both multiples of 36,002 (4 and 12, respectively). This must have something to do with the number of effects: the previous analysis was more complicated, with 2 quantitative and 2 factor-level variables, whereas the analysis quoted above has just 1 factor-level effect.
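
A quick check of that arithmetic, just to confirm the multiples are exact (e.g. with bc):

# both counts divide evenly by the std.60 per-hemisphere node count
echo "144008 / 36002" | bc    # 4
echo "144008 % 36002" | bc    # 0
echo "432024 / 36002" | bc    # 12
echo "432024 % 36002" | bc    # 0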

This analysis ran fine in November, so I'm wondering whether something has changed between November and my "current" AFNI version?



Subject                         Author        Posted
3dLME on surface - error        dmoracze      March 21, 2017 12:01PM
Re: 3dLME on surface - error    gang          March 21, 2017 12:40PM
Re: 3dLME on surface - error    dmoracze      March 21, 2017 02:52PM
Re: 3dLME on surface - error    hangjoonjo    July 03, 2017 11:31PM