It's still not in the regular release, but here are some instructions for using the beta version.
==================================================================================================
INSTALLATION INSTRUCTIONS
==================================================================================================
1. Install Anaconda (or Miniconda) using the downloadable installer for your platform. On Linux systems:
wget https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh
bash Miniconda3-latest-Linux-x86_64.sh
Or use one of the installers from the Anaconda download page at https://www.anaconda.com. On a Mac, I downloaded the Python 2.7 installer; you can still use Python 3.6 later in your Anaconda environment.
2. The installation program should ask you to put these lines in your .bashrc file. (The space after the first dot is required: "." is the shell's source command.)
You can source .bashrc or start a new bash shell. These lines differ from those in recent previous versions,
but conda tells you which lines to put in.
. /home/glend/miniconda3/etc/profile.d/conda.sh
conda activate
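The leading "." in the first line above is the POSIX source command: it runs a file inside the current shell, so the settings it makes persist. A small self-contained demo of that behavior (the file and variable names here are made up):

```shell
# "." runs a file in the current shell (same as "source"), so variables
# it sets survive; the space separates the command from its argument.
tmpf=$(mktemp)
echo 'DEMO_VAR=hello' > "$tmpf"
. "$tmpf"                # same as: source "$tmpf"
echo "$DEMO_VAR"         # prints: hello
rm -f "$tmpf"
```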
3. Start a new bash shell
bash
4. Make a new conda environment for Dask, called "dask" below. (The command pins python=3, but python2 should also work.)
conda create -y -c conda-forge -n dask python=3 dask
conda activate dask
5. Install dask-jobqueue using pip:
pip install dask-jobqueue
6. Install this special version of the AFNI python code (a beta version, not yet fully integrated into AFNI) from afni.nimh.nih.gov, and then pip install the package.
The local pip install must be done from the directory above the afni_python directory, the one containing the setup.py file that specifies afni_python; after unpacking, traverse down to that "python_scripts" directory. (This step lets python know where to look for the other python files to import.)
mkdir afni_template_beta
cd afni_template_beta
wget [afni.nimh.nih.gov]
tar xzvf python_template_beta1.tgz
cd python_scripts
pip install -e .
7. Test the template script by displaying its help. The script is one level down from the previous directory, in the "afni_python" directory.
cd afni_python
./make_template_dask.py
8. Add the full path of the afni_python directory to your PATH (put the absolute path in your .bashrc file). This allows you to call the python programs from the command line.
export PATH=`pwd`:$PATH
and add this to your .bashrc file with something like this:
echo export PATH=`pwd`:'$PATH' >> ~/.bashrc
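Note the quoting in that echo line: the backticks expand `pwd` immediately, while the single quotes keep $PATH literal so that it is re-expanded each time .bashrc is read. A self-contained demo of the distinction:

```shell
# Backticks / $( ) expand now; single-quoted '$PATH' stays literal,
# deferring its expansion to the time .bashrc is sourced.
line="export PATH=$(pwd):"'$PATH'
echo "$line"    # prints: export PATH=<current dir>:$PATH
```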
9. There are three new python files needed here that are not yet in the standard AFNI python distribution:
make_template_dask.py - the controlling script. Configures Dask and checks options
construct_template_graph.py - the "meat" of the processing. Loops across subjects to create various mean templates and does the image processing.
regwrap.py - configuration of options and simple utility functions
The other python scripts are similar to the standard distribution except that the python imports for afni are all from the pip-installed afni_python package.
10. Run this script with data and options (for debugging, precede the command with "python -m pdb"):
make_template_dask.py -ok_to_exist -dsets /data/DSST/template_making/testdata/sub-*_T1w.nii.gz \
-init_base /usr/local/apps/afni/current/linux_openmp_64/MNI152_2009_template.nii.gz \
-bokeh_port 8790 -dask_mode localcluster -anisosmooth \
-final_space SLU_elderly1.0
A localcluster is used here, but on a SLURM cluster we use "-dask_mode SLURM". Most importantly, you need to choose a name for the output space of the data (the -final_space option).
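For SLURM runs, dask-jobqueue can also pick up cluster defaults from a config file (normally ~/.config/dask/jobqueue.yaml) in addition to command-line options. A hedged sketch of what such a file might contain; the path and all values below are illustrative only, not part of this package:

```shell
# Illustrative only: write a sample dask-jobqueue SLURM config.
# The real location is ~/.config/dask/jobqueue.yaml.
cfg=./jobqueue.yaml
cat > "$cfg" <<'EOF'
jobqueue:
  slurm:
    cores: 8                # threads per SLURM job (example value)
    memory: 16GB            # example value
    walltime: '71:59:00'    # cf. the -cluster_walltime option below
EOF
```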
11. Monitor progress with Bokeh graphs on the port number used above.
Start a browser (firefox, for example) and open http://localhost:8790.
An example run script (set $outdir, $inilist, $inibase, and $vol_affx beforehand):
\mkdir -p $outdir
cd $outdir
# ------------- Run!
export OMP_NUM_THREADS=1
#python -m pdb `which make_template_dask.py` \
#make_template_dask.py \
python -m pdb `which make_template_dask.py` \
-ok_to_exist \
-dsets `cat ${inilist}|tr '\n' ' '` \
-init_base $inibase \
-bokeh_port 8791 \
-no_strip -anisosmooth \
-aniso_iters 1 \
-max_threads 2 \
-aff_vol_rsz ${vol_affx} \
-cluster_walltime "71:59:00" \
-dask_mode localcluster \
-findtypical_final \
-final_space MYSPACENAME
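The -dsets line in the script above turns a text file of dataset paths (one per line, in $inilist) into the space-separated list the option expects. A self-contained demo of that construct (the filenames are made up):

```shell
# Build a throwaway list file, then flatten newlines to spaces,
# exactly as `cat ${inilist} | tr '\n' ' '` does in the script above.
inilist=$(mktemp)
printf 'sub-01_T1w.nii.gz\nsub-02_T1w.nii.gz\n' > "$inilist"
dsets=$(cat "$inilist" | tr '\n' ' ')
echo "$dsets"    # the two names on one line, space-separated
rm -f "$inilist"
```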