Show all posts by user
Dear AFNI users-
We are very pleased to announce that the new AFNI Message Board framework is up! Please join us at:
https://discuss.afni.nimh.nih.gov
Existing user accounts have been migrated, so returning users can log in by requesting a password reset. New users can create accounts through the standard account-creation process. Please note that these setup emails may initially go to spam folders (especially for NIH users!), so please check there at first.
Existing discussion threads have been migrated to the new framework. The old Message Board will remain visible, but read-only, for a little while.
Sincerely,
AFNI HQ
Results 1 - 14 of 14
Hello,
My second-level file is failing, and I'm receiving the "model test failed, Possible Reasons: ..." error message. That leaves things a little vague, and I really have no clue at this point what the problem may be. Does any of my model specification look incorrect? For reference, this is a within-subject analysis of drug/placebo effects on one of our first-level contrasts (winVsNoWin)
by MikeyMalina - AFNI Message Board
Hey Gang, thanks for the quick response.
Unfortunately, that did not solve the problem. I get the same error message after updating my -bsVars line.
Thanks,
Mikey
by MikeyMalina - AFNI Message Board
Hello,
I've received an error when attempting to run a 3x2 ANOVA through 3dMVM. I have never seen this error before; it seems to be R-related. Any ideas on solutions, or better ways of structuring the ANOVA? The job was run with more than enough memory. My runfile code is below, and in the actual .txt file the columns are separated appropriately.
Thank you,
Mikey
3dMVM -prefix ST
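Since the posted command is cut off, here is a minimal sketch of how a 3x2 ANOVA is typically laid out for 3dMVM, with one 3-level between-subject factor and one 2-level within-subject factor. All factor names, levels, and dataset names below are hypothetical placeholders, not taken from the post:

```shell
# Hypothetical 3x2 ANOVA sketch for 3dMVM:
#   Group = 3-level between-subject factor
#   Cond  = 2-level within-subject factor
# Subject/dataset names are placeholders; adjust for your data.
3dMVM -prefix ST_ANOVA -jobs 8          \
      -bsVars 'Group'                   \
      -wsVars 'Cond'                    \
      -mask   mask+tlrc                 \
      -dataTable                        \
      Subj  Group   Cond  InputFile     \
      s01   ctrl    A     s01_A+tlrc    \
      s01   ctrl    B     s01_B+tlrc    \
      s02   drugLo  A     s02_A+tlrc    \
      s02   drugLo  B     s02_B+tlrc    \
      s03   drugHi  A     s03_A+tlrc    \
      s03   drugHi  B     s03_B+tlrc
```

One row per subject-per-condition in -dataTable; within-subject factors are encoded by repeating the subject across rows rather than by extra columns.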
by MikeyMalina - AFNI Message Board
Hello,
I have a second-level script (the spacing is correct in the actual file):
3dMVM -prefix MJ_MVM1 -jobs 64 \
-bsVars 'Use*Age+WRAT' \
-wsVars 'contrasts' \
-qVars 'Age,WRAT' \
-qVarCenters '25.85,97.85' \
-num_glt 2 \
-gltLabel 1 WvsL_hvy_vs_low -gltCode 1 'Use : 1*Heavy -1*Low contrasts : 1*WvsL' \
-gltLabel 2 RvsS_hvy_vs_low -g
by MikeyMalina - AFNI Message Board
Hello,
I am attempting to execute afni_proc with linear warp, and am getting an error I've never seen before:
*+ WARNING: Problems with the X matrix columns, listed below:
*+ WARNING: !! * Columns 0 and 18 #0] are (nearly?) collinear!
*+ WARNING: !! * Columns 0 and 19 #0] are (nearly?) collinear!
*+ WARNING: !! * Columns 0 and 20 #0] are (nearly?) collinear!
*+ WARNING: !! *
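When the baseline column (column 0) is reported as collinear with stimulus regressors, a common first diagnostic step is to inspect the design matrix directly. Assuming the conventional X.xmat.1D output name from an afni_proc.py results directory, one way to list the correlation warnings is:

```shell
# Report pairs of design-matrix columns with worrisome correlations;
# X.xmat.1D is the conventional afni_proc.py output name -- adjust
# the path for your own results directory.
1d_tool.py -infile X.xmat.1D -show_cormat_warnings
```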
by MikeyMalina - AFNI Message Board
Just checking back in. Were you able to access the relevant files within the Box folder?
by MikeyMalina - AFNI Message Board
Sorry for the delayed response; it took a bit to get my AFNI version updated. Attached are the images output from the @djunct_overlap_check commands. It seems like something is quite wrong...
by MikeyMalina - AFNI Message Board
Here is that code; thanks for the tip. As for @djunct_overlap_check, my AFNI (19.0) does not seem to recognize it as a valid command. What version are you running it on?
#!/usr/bin/env tcsh
# created by uber_subject.py: version 0.37 (April 14, 2015)
# creation date: Thu Dec 31 14:20:32 2015
# set data directories
set top_dir = /project2/jschnei1/MRI/MJE/MJ_022_10012020
set anat_
by MikeyMalina - AFNI Message Board
Hello, the attached script is crashing, with the following error message:
++ Processing -nwarp ** FATAL ERROR: malloc (out of memory) error for dataset sub-brick #0
This memory error message is odd, since I have more than enough memory-per-CPU allocated for this processing, it is only 20 minutes of task-fMRI data, and I have processed more data with less memory in the past. Is anyone able to noti
by MikeyMalina - AFNI Message Board
I tried that out, and for some reason it isn't working, though logically everything seems like the right move. It is performing voxelwise addition, right? So voxel (x1, y1, z1) from each of my 120 sub-bricks should be added together? When I look at the results, this does not seem to be the case, as my output files have extremely small values at each voxel (0.0000x), despite the input sub-bricks bei
by MikeyMalina - AFNI Message Board
Hello,
It is my goal to add all 124 sub-bricks from a dataset into one grand-sum 3D file. Is there a more efficient way to do this than 3dcalc -a data[0] -b data[1] ... ? Is there some way to specify the n that I want to divide by for 3dmean? Right now it automatically assumes the divisor to be 124, obviously. Any ideas? If not, I've never used 3dcalc beyond -z, once it loops back aro
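One way to avoid chaining 26 letters through 3dcalc: 3dTstat treats the sub-brick index as a time axis, so its -sum option adds all sub-bricks voxelwise in a single pass. The dataset and prefix names below are placeholders:

```shell
# Sum all sub-bricks of a dataset voxelwise in one pass;
# 3dTstat treats the sub-brick index as "time".
3dTstat -sum -prefix grand_sum data+tlrc

# To control the divisor yourself, divide the sum afterward
# (e.g., by 124) instead of relying on 3dmean's automatic n:
3dcalc -a grand_sum+tlrc -expr 'a/124' -prefix grand_mean
```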
by MikeyMalina - AFNI Message Board
Hello,
I am trying to mean-center a 4D dataset that contains the GLM parameter estimate images (if anyone is familiar, the stage-2 file output from FSL's dual_regression function) for a paired t-test pharmacology study. I am just looking for a function that mean-centers a 4D dataset, independent of running any statistical tests. I know that 3dMVM has an option for mean-centering of the qu
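A two-step sketch of voxelwise mean-centering with standard AFNI tools: compute the mean across sub-bricks with 3dTstat, then subtract it with 3dcalc, which replicates a single-volume dataset across the time axis of a 4D input. File names here are hypothetical:

```shell
# Step 1: voxelwise mean across all sub-bricks of the 4D file
3dTstat -mean -prefix stage2_mean.nii.gz stage2.nii.gz

# Step 2: subtract the mean volume from every sub-brick;
# 3dcalc reuses the single-volume dataset (b) for each
# sub-brick of the 4D dataset (a).
3dcalc -a stage2.nii.gz -b stage2_mean.nii.gz \
       -expr 'a-b' -prefix stage2_demeaned.nii.gz
```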
by MikeyMalina - AFNI Message Board
Hi Rick,
I'm having the same problem as Colm, as far as the errors listed above, and I'm wondering: when you say "For now, uncompress those WARP datasets and generate a new proc scripts", what do you mean by uncompress those WARP datasets? I'm inputting them as they exist in my directory, as standard .nii files that aren't compressed in any way. Is there some way
by MikeyMalina - AFNI Message Board