Hi Paul,
First point:
You are right. This produces an error:
3dBrickStat -slow -min -max C01_2_t1_tl+tlrc
*** failure while reading from brick file ./C01_2_t1_tl+tlrc.BRIK
*** desired 17060042 bytes but only got 1805258
*** Unix error message: Undefined error: 0
THD_load_datablock
Max_func
3dBrickStat main
** Command line was:
3dBrickStat -slow -min -max C01_2_t1_tl+tlrc
Fatal Signal 11 (SIGSEGV) received
Max_func
3dBrickStat main
Bottom of Debug Stack
** Command line was:
3dBrickStat -slow -min -max C01_2_t1_tl+tlrc
** AFNI version = AFNI_20.0.23 Compile date = Mar 27 2020
** [[Precompiled binary macos_10.12_local: Mar 27 2020]]
** Program Death **
** If you report this crash to the AFNI message board,
** please copy the error messages EXACTLY, and give
** the command line you used to run the program, and
** any other information needed to repeat the problem.
** You may later be asked to upload data to help debug.
** Crash log is appended to file /Users/bl/.afni.crashlog
Second point:
I don't have a particular reason for using @auto_tlrc outside of afni_proc.py for these EPI runs, so I will look into @SSwarper. Thank you!
However, I have a separate set of EPI runs for representational similarity analysis (RSA). For those datasets I need to do the preprocessing and GLM in native space, run the RSA searchlight for each individual subject, and then normalize the searchlight results to standard space. Is it appropriate to use @auto_tlrc in that situation?
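For concreteness, this is the kind of command I have in mind for that last step, using @auto_tlrc in "follower" mode to push a native-space map through a transform already computed for the anatomical. The dataset names are placeholders, and I may well be misusing the options:

```shell
# Hypothetical sketch (placeholder dataset names): apply the transform from a
# previously @auto_tlrc'ed anatomical (anat_at+tlrc) to a native-space
# searchlight result, resampling the output to 2 mm isotropic voxels.
@auto_tlrc -apar anat_at+tlrc \
           -input searchlight_rsa+orig \
           -dxyz 2
```

Is this roughly the right approach, or should the searchlight maps go through @SSwarper's nonlinear warp instead?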
Thank you very much!
-Joy