3dLME model troubles
Isaac Schwabacher, July 24, 2015 03:48PM
Hi Gang,

I'm having trouble using 3dLME to examine errors in a cognitive task. The program hangs at the single-voxel model test step, which I assume means I did something dumb. Here's my command:
3dLME -prefix data/group_processed/LME/LME.PASAT_errors \
    -model '(Group*Scan):Time' -qVars t -qVarCenters 0 \
    -ranEff '~1' -SS_type 3 -num_glf 1 \
    -glfLabel 1 Before -glfCode 1 'Time : 1*0_Coef & 1*1_Coef & 1*2_Coef & 1*3_Coef & 1*4_Coef' \
    -dataTable @data/group_processed/LME.errors.dataTable.txt
And here's the beginning of the data table:
$ head data/group_processed/LME.errors.dataTable.txt
Subj	Group	Scan	Run	Time	t	InputFile
020VTV	xx	pre	PASAT1	0_Coef	-16	data/processed/020VTV_1/epis/REML.errors_mdtwbs+tlrc.HEAD[PASAT1_errors#0_Coef]
020VTV	xx	pre	PASAT1	1_Coef	-12	data/processed/020VTV_1/epis/REML.errors_mdtwbs+tlrc.HEAD[PASAT1_errors#1_Coef]
020VTV	xx	pre	PASAT1	2_Coef	-8	data/processed/020VTV_1/epis/REML.errors_mdtwbs+tlrc.HEAD[PASAT1_errors#2_Coef]
020VTV	xx	pre	PASAT1	3_Coef	-4	data/processed/020VTV_1/epis/REML.errors_mdtwbs+tlrc.HEAD[PASAT1_errors#3_Coef]
020VTV	xx	pre	PASAT1	4_Coef	0	data/processed/020VTV_1/epis/REML.errors_mdtwbs+tlrc.HEAD[PASAT1_errors#4_Coef]
020VTV	xx	pre	PASAT1	5_Coef	4	data/processed/020VTV_1/epis/REML.errors_mdtwbs+tlrc.HEAD[PASAT1_errors#5_Coef]
020VTV	xx	pre	PASAT1	6_Coef	8	data/processed/020VTV_1/epis/REML.errors_mdtwbs+tlrc.HEAD[PASAT1_errors#6_Coef]
020VTV	xx	pre	PASAT1	7_Coef	12	data/processed/020VTV_1/epis/REML.errors_mdtwbs+tlrc.HEAD[PASAT1_errors#7_Coef]
020VTV	xx	pre	PASAT1	8_Coef	16	data/processed/020VTV_1/epis/REML.errors_mdtwbs+tlrc.HEAD[PASAT1_errors#8_Coef]

Here I modeled each error in 3dREMLfit with a 'CSPLIN(-16,16,9)' HRF, and the F-test is meant to examine the time points before each error. Each subject is in one of two groups (xx/yy) and has data for two scans (pre/post) with three runs (PASAT[1-3]) per scan, but the few runs in which a subject made no errors are omitted, so there is some missing data. I originally had `-corStr 't : AR1'` in the command, but took it out in light of [afni.nimh.nih.gov]. It seems to me that the model I want is Group*Scan*(0+Time), which is what I think I'm getting here. Any ideas what I've done wrong?
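In case it's useful, here is a minimal R sketch of the sanity check I have in mind, run on the data table alone, outside 3dLME. The `Beta` column is a hypothetical stand-in for a single voxel's values, and the lme() call is just my reading of how the -model and -ranEff options above translate to nlme:

    # Check whether the model formula itself converges in nlme,
    # independent of 3dLME and of the actual imaging data.
    library(nlme)

    dat <- read.table("data/group_processed/LME.errors.dataTable.txt",
                      header = TRUE)

    # The real response lives in the InputFile bricks; fake one voxel here.
    set.seed(1)
    dat$Beta <- rnorm(nrow(dat))

    # Mirrors -model '(Group*Scan):Time' with -ranEff '~1'
    # (random intercept per subject).
    fm <- try(lme(Beta ~ (Group * Scan):Time,
                  random = ~ 1 | Subj,
                  data   = dat))

    # -SS_type 3 corresponds to marginal (Type III) tests.
    if (!inherits(fm, "try-error")) print(anova(fm, type = "marginal"))

If the lme() call errors out or hangs even on fake data, that would point at the formula rather than the input files.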

ijs

P.S.: The output is here:
Read 10591 items
Loading required package: nlme
Package nlme loaded successfully!

Loading required package: phia
Loading required package: car
Loading required package: MASS
Loading required package: nnet
Package phia loaded successfully!


++++++++++++++++++++++++++++++++++++++++++++++++++++
***** Summary information of data structure *****
30 subjects :  020VTV 039MX4 129EW1 154BA4 178AEY 241AHN 261UMC 268DFV 296AMW 306VT5 318LYO 362HR9 387RJJ 399YR1 430EEN 440KZC 465FE2 487ZR9 533PLH 552LCJ 554GB6 584VPT 617KGL 662UHF 669YTH 690HKF 733XF9 744GXX 829XGB 977MEG 
1512 response values
2 levels for factor Group : xx yy 
2 levels for factor Scan : post pre 
3 levels for factor Run : PASAT1 PASAT2 PASAT3 
9 levels for factor Time : 0_Coef 1_Coef 2_Coef 3_Coef 4_Coef 5_Coef 6_Coef 7_Coef 8_Coef 
1512 centered values for numeric variable t : -16 -12 -8 -4 0 4 8 12 16 [this 9-value sequence repeats 168 times in total; trimmed]
0 post hoc tests

Contingency tables of subject distributions among the categorical variables:


Tabulation of subjects against all categorical variables
~~~~~~~~~~~~~~
Subj vs Group:
        
[redacted]

~~~~~~~~~~~~~~
Subj vs Scan:
        
         post pre
  020VTV   27  27
  039MX4   27  27
  129EW1   27  27
  154BA4   27  27
  178AEY   27  27
  241AHN   27  27
  261UMC   27  27
  268DFV   27  27
  296AMW   18  27
  306VT5   27  27
  318LYO   27  27
  362HR9   27  27
  387RJJ   18  27
  399YR1   27  27
  430EEN   27  27
  440KZC   27  27
  465FE2   18  18
  487ZR9    9  18
  533PLH    9  27
  552LCJ   27  27
  554GB6   18  27
  584VPT   27  27
  617KGL   27  27
  662UHF   27  27
  669YTH   27  27
  690HKF   27  27
  733XF9   27  27
  744GXX   27  27
  829XGB   27  27
  977MEG   18  18

~~~~~~~~~~~~~~
Subj vs Run:
        
         PASAT1 PASAT2 PASAT3
  020VTV     18     18     18
  039MX4     18     18     18
  129EW1     18     18     18
  154BA4     18     18     18
  178AEY     18     18     18
  241AHN     18     18     18
  261UMC     18     18     18
  268DFV     18     18     18
  296AMW     18      9     18
  306VT5     18     18     18
  318LYO     18     18     18
  362HR9     18     18     18
  387RJJ     18     18      9
  399YR1     18     18     18
  430EEN     18     18     18
  440KZC     18     18     18
  465FE2      9     18      9
  487ZR9      9      9      9
  533PLH      9     18      9
  552LCJ     18     18     18
  554GB6     18     18      9
  584VPT     18     18     18
  617KGL     18     18     18
  662UHF     18     18     18
  669YTH     18     18     18
  690HKF     18     18     18
  733XF9     18     18     18
  744GXX     18     18     18
  829XGB     18     18     18
  977MEG     18     18      0

~~~~~~~~~~~~~~
Subj vs Time:
        
         0_Coef 1_Coef 2_Coef 3_Coef 4_Coef 5_Coef 6_Coef 7_Coef 8_Coef
  020VTV      6      6      6      6      6      6      6      6      6
  039MX4      6      6      6      6      6      6      6      6      6
  129EW1      6      6      6      6      6      6      6      6      6
  154BA4      6      6      6      6      6      6      6      6      6
  178AEY      6      6      6      6      6      6      6      6      6
  241AHN      6      6      6      6      6      6      6      6      6
  261UMC      6      6      6      6      6      6      6      6      6
  268DFV      6      6      6      6      6      6      6      6      6
  296AMW      5      5      5      5      5      5      5      5      5
  306VT5      6      6      6      6      6      6      6      6      6
  318LYO      6      6      6      6      6      6      6      6      6
  362HR9      6      6      6      6      6      6      6      6      6
  387RJJ      5      5      5      5      5      5      5      5      5
  399YR1      6      6      6      6      6      6      6      6      6
  430EEN      6      6      6      6      6      6      6      6      6
  440KZC      6      6      6      6      6      6      6      6      6
  465FE2      4      4      4      4      4      4      4      4      4
  487ZR9      3      3      3      3      3      3      3      3      3
  533PLH      4      4      4      4      4      4      4      4      4
  552LCJ      6      6      6      6      6      6      6      6      6
  554GB6      5      5      5      5      5      5      5      5      5
  584VPT      6      6      6      6      6      6      6      6      6
  617KGL      6      6      6      6      6      6      6      6      6
  662UHF      6      6      6      6      6      6      6      6      6
  669YTH      6      6      6      6      6      6      6      6      6
  690HKF      6      6      6      6      6      6      6      6      6
  733XF9      6      6      6      6      6      6      6      6      6
  744GXX      6      6      6      6      6      6      6      6      6
  829XGB      6      6      6      6      6      6      6      6      6
  977MEG      4      4      4      4      4      4      4      4      4
***** End of data structure information *****
++++++++++++++++++++++++++++++++++++++++++++++++++++

Reading input files now...

Reading input files: Done!

If the program hangs here for more than, for example, half an hour,
kill the process because the model specification or the GLT coding
is likely inappropriate.


