AFNI file: README.compression
Compressed Dataset .BRIK Files
==============================
AFNI now supports the use of compressed .BRIK files. The routines
that open and read these files detect the compression mode using
the filename suffix, and will use the correct decompression program
to read them in from disk. The character 'z' is added to the end
of a dataset's listing in the AFNI menus if the .BRIK is compressed;
for example, "elvis [epan]z".
No other files used by AFNI can be compressed and still be readable
by the software; this includes the .HEAD files, timeseries (.1D)
files, etc. Note also that the programs 2swap and 4swap don't
do compression or decompression, so if you need to byte-swap a
compressed .BRIK file, you must manually decompress it, swap the
bytes, and (optionally) recompress the file.
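The decompress/swap/recompress cycle might look like the sketch below
(the dataset name is hypothetical, and the 4swap step is shown as a
comment because it requires the AFNI binaries):

```shell
# Stand-in for a real compressed dataset brick, just for demonstration.
printf 'demo' > elvis+orig.BRIK
gzip -1 elvis+orig.BRIK             # start compressed: elvis+orig.BRIK.gz

# 1. Decompress so the bytes can be swapped in place.
gzip -d elvis+orig.BRIK.gz

# 2. Swap bytes with AFNI's 4swap (or 2swap for 2-byte data):
#      4swap elvis+orig.BRIK

# 3. Optionally recompress once the swap is done.
gzip -1 elvis+orig.BRIK
```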
How to Compress
===============
You can compress the .BRIK files manually. The following 3 programs
are supported:
Name Suffix Compression Command Uncompress Command
-------- ------ ------------------- -------------------
compress .Z compress -v *.BRIK uncompress *.BRIK.Z
gzip .gz gzip -1v *.BRIK gzip -d *.BRIK.gz
bzip2 .bz2 bzip2 -1v *.BRIK bzip2 -d *.BRIK.bz2
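For example, a gzip round trip on a dataset brick looks like this
(using a throwaway file here; substitute your real .BRIK):

```shell
printf 'brick-data' > demo+orig.BRIK   # stand-in for a real .BRIK file

gzip -1v demo+orig.BRIK      # compress   -> demo+orig.BRIK.gz
gzip -d  demo+orig.BRIK.gz   # decompress -> demo+orig.BRIK again
```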
"compress" is available on almost all Unix systems.
"gzip" is available on many Unix systems, and can also be
ftp-ed from the AFNI distribution site.
"bzip2" is included in the AFNI distribution. It generally
compresses more than the other two programs, but is much
slower at both compression and uncompression. (See the
file README.bzip2 for details about this program.)
For large MR image datasets, "compress" and "gzip" have about the
same compression factor and take about the same CPU time (at least
in the samples I've tried here).
Do NOT compress the .HEAD files! AFNI will not be able to read them.
Automatic Compression
=====================
If you set the environment variable AFNI_COMPRESSOR to one of
the strings "COMPRESS", "GZIP", or "BZIP2", then most programs
will automatically pass .BRIK data through the appropriate
compression program as it is written to disk. Note that this
will slow down dataset write operations.
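For example, to have .BRIK output gzip-compressed automatically, set
the variable in your shell startup file or before running the programs:

```shell
# Ask AFNI programs to gzip each .BRIK as it is written to disk.
export AFNI_COMPRESSOR=GZIP
echo "compressor: $AFNI_COMPRESSOR"

# To return to writing uncompressed .BRIKs, remove the variable:
#   unset AFNI_COMPRESSOR
```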
Penalties for Using Compression
===============================
Datasets must be uncompressed when they are read into AFNI (or other
programs), which takes time. In AFNI itself, a dataset .BRIK file
is only read into the program when its values are actually needed
-- when an image or graph window is opened. When this happens, or
when you "Switch" to a compressed dataset, there can be a noticeable
delay. For "compress" and "gzip", this may be a few seconds. For
"bzip2", the delays will generally be longer.
The speed penalty means that it is probably best to keep the
datasets you are actively using in uncompressed form. This can
be done by compressing datasets manually, and avoiding the use
of AFNI_COMPRESSOR (which will compress all .BRIKs). Datasets
that you want to keep on disk, but don't think you will use
often, can be compressed. They can still be viewed when the
need arises without manual decompression.
Large .BRIK files are normally directly mapped to memory. This
technique saves system swap space, but isn't useful with compressed
files. Compressed .BRIK files are read into "malloc" allocated
memory, which will take up swap space. This may limit the number
of datasets that can be used at once. AFNI will try to purge unused
datasets from memory if a problem arises, but it may not succeed.
If necessary, the "-purge" option can be used when starting AFNI.
Very large datasets (larger than the amount of RAM on your system)
should not be compressed, since it will be impossible to read such
an object into memory in its entirety. It is better to rely on
the memory mapping facility in such cases.
Effect on Plugins and Other Programs
====================================
If you use the AFNI supplied routines to read in a dataset, then
everything should work well with compressed .BRIK files. You can
tell if a dataset is compressed after you open it by using the
DSET_COMPRESSED(dset) macro -- it returns 1 if "dset" is compressed,
0 otherwise.
How it Works
============
Using Unix pipes. Files are opened with COMPRESS_fopen_read or
COMPRESS_fopen_write, and closed with COMPRESS_fclose. The code
is in files thd_compress.[ch], if you want to have fun. If you
have a better compression utility that can operate as a filter,
let me know and I can easily include it in the AFNI package.
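In shell terms, the effect of COMPRESS_fopen_read is like reading from
a decompression pipe instead of from the file itself; a rough sketch
(with a throwaway file standing in for a real .BRIK):

```shell
printf 'brick bytes' > pipe_demo.BRIK
gzip -1 pipe_demo.BRIK                  # -> pipe_demo.BRIK.gz

# gzip -dc writes the uncompressed stream to stdout, and the consumer
# on the other end of the pipe reads it as if it were the plain file.
gzip -dc pipe_demo.BRIK.gz | wc -c      # counts the uncompressed bytes
```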
=================================
| Robert W. Cox, PhD |
| Biophysics Research Institute |
| Medical College of Wisconsin |
=================================