3dGrayplot


Make a grayplot from a 3D+time dataset, sort of like Jonathan Power:
  https://www.ncbi.nlm.nih.gov/pubmed/27510328
  https://www.jonathanpower.net/2017-ni-the-plot.html
Result is saved to a PNG image for your viewing delight.

 * This style of plot is also called a carpet plot,
   but REAL carpets are much more attractive, IMHO.

 * The horizontal axis of the grayplot is time, and the
   vertical axis is all 3 spatial dimensions collapsed into 1.

 * Also see AFNI script @grayplot, as well as the QC output
   generated by afni_proc.py.

Usage:

  3dGrayplot [options] inputdataset

OPTIONS: [lots of them]
--------

 -mask mset      = Name of mask dataset
                    * Voxels that are 0 in mset will not be processed.
                    * Dataset must be byte-valued (8 bits: 0..255);
                      shorts (16 bits) are also acceptable, but only
                      values from 1..255 will be processed.
                    * Each distinct value from 1..255 will be processed
                      separately, and the output image will be ordered
                      with the mask=1 voxels on top, mask=2 voxels next,
                      and so on down the image.
                    * A partition (e.g., mask=3) with fewer than 9 voxels
                      will not be processed at all.
                    * If there is more than one partition, horizontal dashed
                      lines will be drawn between them.
                    * If '-mask' is not given, then all voxels will be used,
                      except those at the very edge of a volume. Doing this is
                      usually not a good idea, as the non-brain tissue will
                      take up a lot of useless space in the output grayplot.

 -input dataset  = Alternative way to input the dataset to process.

 -prefix ppp.png = Name for output file.
                    * Default is Grayplot.png (if you don't use this option)
                    * If the filename ends in '.jpg', a JPEG file is output.
                    * If the filename ends in '.pgm', a PGM file is output.
                        [PGM files can be manipulated with the NETPBM package.]
                    * If the filename does not end in '.jpg' OR in '.png'
                      OR in '.pgm', then '.png' will be added at the end.
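                     * For example, to get JPEG output instead of PNG (the
                       input filename here is just a placeholder):
                         3dGrayplot -input errts.nii -prefix Gray.jpg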

 -dimen X Y      = Output size of image in pixels.
                    * X = width  = time axis direction
                    * Y = height = voxel/space dimensions
                    * Defaults are X=1024 Y=512 -- suitable for screen display.
                    * For publication, you might want more pixels, as in
                        -dimen 1800 1200
                      which would be 6 inches wide by 4 inches high, at the usual
                      300 dots-per-inch (dpi) of high resolution image printing.
                   ** Note that there are usually many more voxels in the Y direction
                      than there are pixels in the output image. This fact requires
                      coarsening the Y output grid and resampling the data to match.
                      See the next option for a little more information about
                      how this resampling is implemented.
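                    ** For instance, a publication-sized image might be made
                       like this (the input filename is just a placeholder):
                         3dGrayplot -dimen 1800 1200 -pvorder \
                                    -input errts.nii -prefix GrayBig.png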

 -oldresam       = The method for resampling the processed dataset to the final
                   grayscale image size was changed/improved in a major way.
                   If you want to use the original method, then give this option.
                    * The only reason for using this option is for
                      comparison with the new method.
                    * The new resampling method uses minimum-sidelobe local averaging
                      when coarsening the grid (vertical direction Y = voxels/space)
                      -- whose purpose is to reduce aliasing artifacts
                    * And uses cubic interpolation when refining the grid
                      (usually horizontal direction = time) -- whose purpose
                      is purely beauty -- compared to the older linear interpolation.
                     * Note that collapsing multiple voxels into one pixel in
                       the Y direction will tend to cancel out signals that
                       change sign between neighboring voxels in whichever
                       voxel ordering you choose.
                       (See the 'order' options below.)

 -polort p       = Order of polynomials for detrending.
                    * Default value is 2 (mean, slope, quadratic curve).
                     * Use '-1' if the data are already detrended and de-meaned
                       (e.g., an AFNI errts.* file or other residual dataset).

 -fwhm f         = FWHM of blurring radius to use in the dataset before
                   making the image.
                    * Each partition (i.e., mask=1, mask=2, ...) is blurred
                      independently, as in program 3dBlurInMask.
                    * Default value is 0 mm = no blurring.
                        [In the past, the default value was 6.]
                    * If the dataset was NOT previously blurred, a little
                      spatial blurring here will help bring out larger scale
                       features in the time series, which might otherwise
                      look very noisy.
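                     * For instance, to blur an unblurred raw dataset a little
                       before plotting (the filename is a placeholder, and 4 mm
                       is an arbitrary illustrative value):
                         3dGrayplot -fwhm 4 -percent -range 3 \
                                    -input rest_run1.nii -prefix GrayBlur.png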

  ** The following four options control the ordering of **
  ** voxels in the grayplot, in the vertical direction. **

 -pvorder        = Within each mask partition, order the voxels (top to
                   bottom) by how well they match the two leading principal
                    components of that partition. The result is that the top
                    part of each partition is made up of voxels with similar
                    time series, while the bottom part looks more 'random'.
                  ++ The presence of a lot of temporal structure in a
                      grayplot of an 'errts' residual dataset indicates
                     that the 'removal' of unwanted time series components
                     did not work well.
                  ++ Using '-pvorder' to put all the structured time series
                     close together will make such problems more visible.
                  ++ IMHO, this is the most useful ordering.
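                   ++ For instance, a PV-ordered plot of a residual dataset
                      might be made like this (the filenames are placeholders):
                        3dGrayplot -pvorder -mask mask.nii -range 4 \
                                   -input errts.nii -prefix GrayPV.png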

 -LJorder        = Within each mask partition, order the voxels (top to
                   bottom) by their Ljung-Box statistics, which is a measure
                   of temporal correlation.
                  ++ Experimental; probably not useful.

 -peelorder      = Within each mask partition, order the voxels by how
                   many 'peel' steps are needed to get from the partition
                   boundary to a given voxel.
                   ++ This ordering puts voxels in 'similar' geometrical
                      positions sort-of close together in the image,
                      and is usually not very interesting, IMHO.

 -ijkorder       = Set the intra-partition ordering to the default, by
                   dataset 3D index ('ijk').
                   ++ In AFNI's +tlrc ordering, this ordering will primarily
                      run from Inferior to Superior in the brain (from top to
                      bottom in the grayplot image).
                  ++ This is the default ordering method, but not the best.

  ** These options control the scaling from voxel value to gray level **

 -range X        = Set the range of the data to be plotted to be 'X'.
                    Each time series is first normalized to:
                      Z[i] = (t[i] - mean_t)/stdev_t.
                   When this option is used, then:
                    * a value of 0 will be plotted as middle-gray
                    * a value of +X (or above) will be plotted as white
                    * a value of -X (or below) will be plotted as black
                   Thus, this option should be used with data that is centered
                   around zero -- or will be so after '-polort' detrending.
                    * For example, if you are applying this option to an
                      afni_proc.py 'errts' (residuals) dataset, a good value
                       of X to use is 3 or 4, since those data are in percent units.
                    * The @grayplot script uses '-range 3.89' since that is the
                      value at which a standard normal N(0,1) deviate has a 1e-4
                      two-sided tail probability. (If nothing else, this sounds cool.)
                    If you do NOT use '-range', then the data will be automatically
                    normalized so each voxel time series has RMS value 1, and then
                    the grayscale plot will run from black to white over the
                    min-to-max range, where the min and max are computed over the
                    entire detrended and normalized dataset.
                    * This default automatic normalizing and scaling makes it
                      almost impossible to directly compare grayplots from
                      different datasets. This difficulty is why the '-range'
                      and '-percent' options were added.
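                     * For instance, a plot of an afni_proc.py residual dataset
                       might be made like this (the filename is a placeholder):
                         3dGrayplot -range 4 -polort -1 -pvorder \
                                    -input errts.nii -prefix GrayRange.png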

 -percent        = Use this option on 'raw' time series datasets to compute
                    the mean of each voxel's time series and then use that mean
                    to scale the values to percent differences from the mean.
                    * NOT suitable for use with a residual 'errts' dataset!
                    * Should be combined with '-range'.
                    * Detrending will be applied while calculating the mean.
                      By default, that will be quadratic detrending of each
                      voxel time series, but that can be changed with the
                      '-polort' option.
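                     * For instance, on an un-scaled EPI time series (the
                       filenames are placeholders):
                         3dGrayplot -percent -range 3 -mask mask.nii \
                                    -input rest_run1.nii -prefix GrayPct.png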

 -raw_with_bounds A B
                  = Use this option on 'raw' time series datasets to map values
                   <= A to black, those >= B to white, and intermediate values
                   to grays.
                    * Can be used with any kind of dataset, but probably makes
                      most sense to use with scaled ones (errts, fitts or
                      all_runs).
                    * Should NOT be combined with '-range' or '-percent'.
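                     * For instance, if the data have been scaled so that each
                       voxel's mean sits near 100 (as afni_proc.py's scaling step
                       does), bounds a few units either side of 100 might be
                       reasonable (the filename and bounds are just illustrative):
                         3dGrayplot -raw_with_bounds 96 104 \
                                    -input all_runs.nii -prefix GrayRaw.png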

** Quick hack for Cesar Caballero-Gaudes, April 2018, by @AFNIman.
   As such, this program may be modified in the future to be more useful,
   or at least more beautifully gorgeous.

** Applied to 'raw' EPI data, the results may not be very informative.
   It seems to be more useful to look at the grayplot calculated from
   pre-processed data (e.g., time series registered, filtered, etc.).

** See also the script @grayplot, which can process the results from
   afni_proc.py and produce an image with the grayplot combined with
   a graph of the motion magnitude, and with the GM, WM, and CSF in
   different partitions.

** afni_proc.py uses this program to create grayplots of the residuals
   from regression analysis, as part of its Quality Control (QC) output.

--------
EXAMPLE:
--------
The following commands first generate a time series dataset,
then create grayplots using each of the ordering methods
(so you can compare them). No mask is given.

 3dcalc -a jRandomDataset:64,64,30,256 -datum float \
        -prefix Qsc.nii -expr 'abs(.3+cos(0.1*i))*sin(0.1*t+0.1*i)+gran(0,3)'
 3dGrayplot -pvorder   -prefix QscPV.png   -input Qsc.nii -fwhm 8
 3dGrayplot -ijkorder  -prefix QscIJK.png  -input Qsc.nii -fwhm 8
 3dGrayplot -peelorder -prefix QscPEEL.png -input Qsc.nii -fwhm 8