Show all posts by user
Dear AFNI users-
We are very pleased to announce that the new AFNI Message Board framework is up! Please join us at:
https://discuss.afni.nimh.nih.gov
Existing user accounts have been migrated, so returning users can log in by requesting a password reset. New users can create accounts as well, through a standard account-creation process. Please note that these setup emails may initially go to spam folders (especially for NIH users!), so please check those folders at first.
The current Message Board discussion threads have been migrated to the new framework. The current Message Board will remain visible, but read-only, for a little while.
Sincerely,
AFNI HQ
Results 1 - 27 of 27
Hi,
Will you share any resources from this bootcamp? I live in Turkey and can't attend the event, but I would really like to learn useful AFNI tricks and understand every step of the analysis.
Thanks,
Abdullah
by trabz - AFNI Message Board
Hi,
I performed a two-sample t-test on my dataset, and when I choose uncorrected p < 0.001 it shows min q < 0.25 (this differs by analysis: ALFF, fALFF, ReHo, etc.). Now I have also used CONN on the same datasets, and CONN has thresholds called "FDR-Analysis Level" and "FDR-Seed Level". The first takes the whole brain and calculates FDR, and the second takes only the ROI and corresponding resul
by trabz - AFNI Message Board
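For readers comparing these thresholds: q values come from the false discovery rate over the set of voxels actually tested, so restricting the test to an ROI changes the q attached to the same p. As a rough illustration, here is a sketch of the standard Benjamini-Hochberg conversion from p-values to q-values (a generic sketch, not AFNI's or CONN's exact implementation):

```python
import numpy as np

def bh_qvalues(pvals):
    """Benjamini-Hochberg q-values: for each p, the smallest FDR
    level at which that test would still be declared significant."""
    p = np.asarray(pvals, dtype=float)
    n = p.size
    order = np.argsort(p)                          # ascending p
    scaled = p[order] * n / np.arange(1, n + 1)    # p_(k) * n / k
    # enforce monotonicity: q_(k) = min over j >= k of scaled_(j)
    q_sorted = np.minimum.accumulate(scaled[::-1])[::-1]
    q = np.empty(n)
    q[order] = np.clip(q_sorted, 0.0, 1.0)
    return q
```

For example, `bh_qvalues([0.001, 0.01, 0.02, 0.8])` gives q ≈ [0.004, 0.02, 0.027, 0.8]. Running the conversion over fewer tests (e.g. only voxels in an ROI mask) generally yields smaller q for the same p, which is one reason whole-brain and seed/ROI-level FDR thresholds disagree.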
Thanks for this fast response.
Can I ask one more question, which is not about this topic? As you can see, my afni_proc.py uses both regress_bandpass and regress_RSFC. Will bandpassing cause any harm to fALFF? I asked this before, and that was my understanding. Is this the correct way to use them?
Thanks
Abdullah
by trabz - AFNI Message Board
Hi,
Sorry for my questions, which have kept coming for a while, and thanks for your help. I have a question about the error time series, fitts, and stats. I perform RS-fMRI analysis based on the uber scripts, and I wanted to ask about part of my afni_proc.py-generated script, which follows.
# run the regression analysis
3dDeconvolve -input pb04.$subj.r*.blur+tlrc.HEAD
by trabz - AFNI Message Board
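On the question about errts, fitts, and stats: at each voxel, 3dDeconvolve decomposes the input time series into a fitted part plus a residual, so input = fitts + errts. A toy least-squares sketch showing that identity (generic OLS with made-up regressors, not 3dDeconvolve itself):

```python
import numpy as np

rng = np.random.default_rng(0)
n_tr = 40
# toy design matrix: baseline, linear drift, one regressor of interest
X = np.column_stack([
    np.ones(n_tr),
    np.linspace(-1.0, 1.0, n_tr),
    rng.standard_normal(n_tr),
])
# simulated voxel time series = true signal + noise
y = X @ np.array([10.0, 2.0, 0.5]) + 0.1 * rng.standard_normal(n_tr)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitts = X @ beta        # fitted time series (the role of -fitts)
errts = y - fitts       # residual/error time series (the role of -errts)
```

The residual is orthogonal to every regressor, which is why the stats (t/F on the betas) and the error time series are complementary views of the same fit.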
Hi,
Thanks for all your help and advice. I will reconsider all the steps. The whole process is much clearer now.
Thanks
Abdullah
by trabz - AFNI Message Board
Hi,
Thanks for reply.
Briefly, I am trying to compute ReHo, ALFF, and fALFF from the resting-state fMRI data. Following your reply, I will not use the bandpass filter (0.01-0.1 Hz) in afni_proc.py; after the processing, I will compute them via 3dLomb and 3dAmp. By the way, I do not have any good reason to use the bandpass filter; I use it because it is in the default parameters for all the other A
by trabz - AFNI Message Board
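On why bandpassing conflicts with fALFF: fALFF is the low-frequency amplitude divided by the amplitude over the full spectrum, so pre-filtering to 0.01-0.1 Hz pushes the ratio toward 1 for every voxel. A minimal sketch of the two measures (definitions vary across tools in sum-vs-mean and normalization details; this is not 3dRSFC's exact computation):

```python
import numpy as np

def alff_falff(ts, tr, band=(0.01, 0.1)):
    """ALFF: mean spectral amplitude inside `band` (Hz).
    fALFF: band amplitude as a fraction of the full-spectrum amplitude."""
    ts = np.asarray(ts, dtype=float)
    ts = ts - ts.mean()                       # drop the DC component
    amp = np.abs(np.fft.rfft(ts))             # amplitude spectrum
    freqs = np.fft.rfftfreq(ts.size, d=tr)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    alff = amp[in_band].mean()
    falff = amp[in_band].sum() / amp.sum()
    return alff, falff
```

If `ts` has already been bandpassed to 0.01-0.1 Hz, the denominator nearly equals the numerator and fALFF is forced toward 1 everywhere, which is the reason the experts advise against combining bandpass with fALFF.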
Hi,
Firstly, I am trying to compute ALFF, fALFF, and the others, like mALFF. For 3dRSFC I first used the analysis published in the afni_proc.py examples (example 11, I guess). In one Message Board answer to another user, the experts said that you can't use bandpass and fALFF together because fALFF needs the full spectrum, and he said he was working on a new function that can compute it even after bandpassing. Now my
by trabz - AFNI Message Board
Hi,
Thanks for all your help. I will not use those subjects, then.
Abdullah
by trabz - AFNI Message Board
Hi,
Sure. You're right, the alignment was good there, but it lost information, even though that information is in the raw dataset. Also, the first image in my last post has that information but is not properly aligned. I marked it in the image below. Is this because of the dataset, or am I doing something wrong?
Thanks
Abdullah
by trabz - AFNI Message Board
Hi,
Here is my afni_proc.py again, for the @Align-processed run:
afni_proc.py -subj_id $subj \
-script proc.$subj -scr_overwrite \
-blocks despike tshift align tlrc volreg blur mask regress \
-copy_anat $top_dir/mprage+orig \
-dsets $top_dir/rest2N+orig.HEAD
by trabz - AFNI Message Board
Hi,
After using giant_move, the fitting errors in some of the pre-processing outputs are gone, but some now show a new problem: data loss in the EPI set. You can see this in the images below.
Here is my afni_proc.py again
afni_proc.py -subj_id female4 -script proc.female4 -scr_overwrite -blocks \
despike tshift align tlrc volreg blur mask regress -copy_anat
by trabz - AFNI Message Board
Of course. Here is what I tried:
#!/usr/bin/env tcsh
# created by uber_subject.py: version 1.2 (April 5, 2018)
# creation date: Tue Jul 24 21:58:36 2018
# set subject and group identifiers
set subj = female
set gname = peking
# set data directories
set top_dir = /mnt/e/SemraHoca/Peking/Gr1/${subj}/1860323/session_1
set anat_dir = $top_dir/anat_1
set epi_dir = $top_dir/rest_1
# run afni_
by trabz - AFNI Message Board
Hi Rick,
I tried to use the ADHD200 Peking University dataset, which is publicly released, but I don't think I performed the pre-processing properly. I used uber_subject.py to generate the afni_proc.py script, and I copied the warnings and errors from output.proc here. For the other subjects it didn't give any fatal errors. When I copied your code for the censor motion count, it gives 83 for 0.2.
by trabz - AFNI Message Board
Hi,
I am trying to use a dataset in NIfTI format, but when I look at the pre-processing results, I see that the EPI set and the anatomical set do not fit each other. How can this happen, and how can I fix it?
Thanks
by trabz - AFNI Message Board
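When the EPI and anatomical sets look misaligned, one quick quantitative check is the overlap of their brain masks (which might come, for example, from 3dAutomask on the EPI and a skull-stripped anatomical resampled to the same grid; those specifics are an assumption here, not a prescribed pipeline). A sketch of the Dice overlap on toy arrays:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice overlap between two binary masks: 1 = identical, 0 = disjoint."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    return 2.0 * (a & b).sum() / denom if denom else 1.0
```

A Dice value far below what you see for a well-aligned subject in the same study is a simple red flag that alignment (rather than the data itself) is the problem.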
Hi again,
As a quick summary: I am trying to compare two sets, as before. When we talked about this, you said the MOCO set (motion-corrected by the MR scanner) probably has more blur than the NOTMOCO set (raw data, not motion-corrected by the scanner) after the pre-processing steps, because of the extra time interpolation. But I have to compare these sets, so how can I use exactly the same blur for each set? Is there any way to do it?
by trabz - AFNI Message Board
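On matching the final blur of two datasets: Gaussian blurs add in quadrature, so to bring each set to a common target smoothness you apply an extra blur of sqrt(target² − measured²); this is the arithmetic behind blurring to a fixed FWHM (e.g. afni_proc.py's -blur_to_fwhm, which uses 3dBlurToFWHM). A sketch:

```python
import math

def extra_blur_fwhm(target, measured):
    """FWHM of the additional Gaussian blur needed to raise a dataset's
    measured smoothness to a common target.
    Gaussian blurs add in quadrature: target^2 = measured^2 + extra^2."""
    if measured >= target:
        return 0.0          # already at or above the target
    return math.sqrt(target ** 2 - measured ** 2)
```

With the estimates quoted below (NOTMOCO 10.06, MOCO 10.35) and a hypothetical common target of 12, NOTMOCO would need slightly more extra blur than MOCO, after which both sets sit at the same final smoothness and can be compared fairly.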
Thanks for that summary. I now understand almost every step of the pre-processing, but I have two more questions.
1- How can I compare MOCO and NOTMOCO correctly, with the same parameters? (When I look in the files_acf folder, it says FWHM = 10.06 for NOTMOCO and 10.35 for MOCO; I chose 6.)
2- How can I use data pre-processed by SPM?
Thanks for all your help. I will keep this favor in mind.
Regards
Abdull
by trabz - AFNI Message Board
Thanks for your attention, first :). I just wonder whether there is any mistake in my code. I posted my code, but if you want I can post it in one piece. One more question: can I use pre-processed data created by SPM, and how can I convert it to AFNI format? When I tried, it created separate datasets from every volume. Sorry for my questions, but I am new to neuroscience and AFNI, and I have nobody to
by trabz - AFNI Message Board
I do ReHo, ROI-based analyses (correlation), and ALFF. Then I do a t-test with uber_ttest.py. On the t-test dataset I use clustering in the AFNI GUI, and when I look at the clustering report, MOCO has more significant voxels (the total number of voxels in the report) than NOTMOCO.
3dReHo -prefix ReHo11_${subj} -inset errts.${subj}.tproject+tlrc -mask mask_group+tlrc
3dmaskdump -noijk -mask mask_group+tlrc Re
by trabz - AFNI Message Board
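For reference, ReHo is Kendall's coefficient of concordance (W) computed over a voxel's neighborhood: each neighbor's time series is ranked, the ranks are summed per time point, and W measures how concordant those rankings are. A small sketch (simple argsort ranking that ignores ties, unlike 3dReHo's full implementation):

```python
import numpy as np

def kendalls_w(ts):
    """Kendall's coefficient of concordance across the rows of `ts`
    (m voxels x n time points) -- the measure underlying ReHo."""
    ts = np.asarray(ts, dtype=float)
    m, n = ts.shape
    ranks = ts.argsort(axis=1).argsort(axis=1) + 1  # rank each series
    rank_sums = ranks.sum(axis=0)                   # per time point
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12.0 * s / (m * m * (n ** 3 - n))
```

Perfectly concordant neighbors give W = 1, fully discordant neighbors give 0, which is why ReHo maps are sensitive to anything (motion, blur) that changes local time-series similarity.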
Sorry, I couldn't explain myself. Also, I couldn't understand why AFNI gives lower activations for NOTMOCO than for MOCO. If I use 3 in AFNI, it will not censor any TRs in either MOCO or NOTMOCO, so why does NOTMOCO give lower activations? We use 6 FWHM, 3 mm censor motion, and a 0.01-0.1 Hz bandpass filter. Did I enter those parameters correctly? Do I suppress any activations at NOTMOCO wit
by trabz - AFNI Message Board
But we use exactly the same value as the other packages. Our project is about motion, so I don't want to use motion correction.
by trabz - AFNI Message Board
I made the changes, but it is still the same as before.
afni_proc.py -subj_id $subj \
-script proc.$subj -scr_overwrite \
-blocks despike tshift align tlrc volreg blur mask regress \
-copy_anat $anat_dir/anatZ+orig \
-dsets $epi_dir/rZ+orig.HEAD
by trabz - AFNI Message Board
First, thanks for your attention.
TR = 2800 ms, TE = 25 ms, flip angle = 90°, field of view = 192 mm, 36 slices covering the whole brain, slice thickness = 3 mm, in-plane resolution = 2×2 mm. Resting-state data were collected for 9 min 44 s, resulting in 205 volumes of BOLD fMRI data per subject. The resting-state fMRI scans were performed on a 1.5 Tesla Siemens MR device.
And for AFNI format I use the co
by trabz - AFNI Message Board
In our project, we are trying to understand head motion's effect on brain activations. The NOTMOCO data are the raw data, with head motion in the 1-2 mm range; the MOCO data are the same NOTMOCO data after motion correction, a process done by the MR scanner itself. Because of that, we are
by trabz - AFNI Message Board
afni_proc.py -subj_id $subj \
-script proc.$subj -scr_overwrite \
-blocks despike tshift align tlrc volreg blur mask regress \
-copy_anat $anat_dir/anat+orig \
-dsets $epi_dir/r01+orig.HEAD \
-tcat_remove_first_trs 5
by trabz - AFNI Message Board
I use AFNI and uber_subject.py, and when I look at my afni_proc.py script it looks OK; it also performs all the processing without any errors. In our project we are trying to compare non-stationary data (NOTMOCO) and stationary data (MOCO): MOCO has motion in the 0.1-0.3 mm band, NOTMOCO has motion in the 1-2 mm band, and MOCO is the motion-corrected NOTMOCO data created by the MR scanner's own software. I u
by trabz - AFNI Message Board
I use AFNI and uber_subject.py, and when I look at my afni_proc.py script it looks OK; it also performs all the processing without any errors. In our project we are trying to compare non-stationary data (NOTMOCO) and stationary data (MOCO): MOCO has motion in the 0.1-0.3 mm band, NOTMOCO has motion in the 1-2 mm band, and MOCO is the motion-corrected NOTMOCO data created by the MR scanner's own software. I use motion ce
by trabz - AFNI Message Board