ASHS
====

There is a specially patched version of the original `ASHS`_ (Automatic
Segmentation of Hippocampal Subfields) tool available on the Tardis.

In most cases you will probably run this tool with a pre-installed atlas and a
pair of T1 and T2 NIfTI images to start the automatic segmentation pipeline. To
do this you have to tell ASHS where its root folder is located by exporting a
variable called ``ASHS_ROOT`` like this:

.. code-block:: bash

    export ASHS_ROOT=/opt/ashs

Alternatively, you can get your own copy (about 300 MB) of ASHS, for example
from our GitLab server at https://gitlab.mpib-berlin.mpg.de/krause/ashs/tree/mpib,
and place it in your home directory.

.. code-block:: bash

    mkdir -p ~/src && cd ~/src
    git clone --depth 1 --branch mpib https://gitlab.mpib-berlin.mpg.de/krause/ashs.git
    export ASHS_ROOT=$HOME/src/ashs

Segmentation
-------------

Assuming your files are already in place, a typical invocation would be:

.. code-block:: bash

    $ASHS_ROOT/bin/ashs_main.sh -a /opt/ashs/data/atlas_upennpmc/ -g T1.nii -f T2.nii -w wdir -T -Q

This will run the ASHS pipeline in the foreground, submitting jobs and waiting
for them to finish. Results will be placed in the specified folder *wdir*. You
can use a different atlas by specifying its location after the ``-a`` switch.
To conserve disk space, please always use the tidy-up option ``-T``. It is also
possible to run selected stages only. Use the option ``-h`` to list all
parameters and a help text.

To run the program in the background and capture all of its output, so you
needn't keep the shell open, use the ampersand (*&*) and shell redirection like this:

.. code-block:: bash

    $ASHS_ROOT/bin/ashs_main.sh [..opts..]    >ashs.log 2>&1 &
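
Depending on your shell's settings, a job started with a plain *&* may still be
terminated when you log out. Prefixing the call with ``nohup`` makes it ignore
the hangup signal. A minimal sketch, with a short ``sleep`` standing in for the
actual ``ashs_main.sh`` call:

```shell
# nohup detaches the command from the terminal's hangup signal so the
# job survives a closed SSH session; 'sleep 1' is only a placeholder
# for the real ashs_main.sh invocation
nohup sleep 1 >ashs.log 2>&1 &
pid=$!
wait "$pid"
echo "background job $pid finished"
```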

To process a number of image pairs you can use a simple for loop:

.. code-block:: bash

    for id in 01 03 10 32 ; do
       $ASHS_ROOT/bin/ashs_main.sh -a /opt/ashs/data/atlas_upennpmc/ -g T1_${id}.nii -f T2_${id}.nii -w wdir_${id} -T -Q >ashs_${id}.log 2>&1 &
    done

Be careful with looping over a large number (>30) of image pairs in this way,
as some intermediate steps generate a lot of jobs on their own. Also check
``qstat`` and the log files you specified to track ASHS's progress.
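
One simple way to stay under such a limit is to let the shell ``wait`` for each
batch of background invocations before starting the next. A sketch, with a
placeholder function in place of the real ``ashs_main.sh`` call and illustrative
subject IDs:

```shell
#!/bin/bash
# Submit in batches: start at most $batch background jobs, then block
# with 'wait' until all of them have exited before continuing.
batch=10
count=0
run_one() {       # placeholder for the real ashs_main.sh invocation
    sleep 0.1
}
for id in 01 03 10 32 ; do
    run_one "$id" &
    count=$(( count + 1 ))
    if [ $(( count % batch )) -eq 0 ] ; then
        wait    # current batch is full; wait for it to drain
    fi
done
wait    # catch the final, possibly incomplete batch
echo "all batches finished"
```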


Known Issues
------------

Due to the very large number of intermediate jobs, script submission may become
unreliable when submitting more than 20 or 30 images at once.

Please consider either manually submitting smaller chunks of jobs or using a
script similar to the one below, which keeps track of the current chunk
position. Each subsequent run of this script will submit another chunk of new
``ashs_main.sh`` calls:


.. code-block:: bash

    #!/bin/bash

    # This script submits ASHS jobs in small chunks. To submit the next
    # chunk, just run it again. The state is inferred solely from the
    # existence of each working directory.

    # a somewhat safe value for concurrent ASHS runs at the moment
    imagesperchunk=30

    # only start images that do not have a working directory yet,
    # at most $imagesperchunk of them per run
    submitted=0
    for subject in *.nii ; do
        if [ $submitted -ge $imagesperchunk ] ; then
            echo "Submitted $imagesperchunk images, done for now."
            exit 0
        fi

        name=${subject%.nii}
        wdir=workdir_${name}_dir
        if [ -d "$wdir" ] ; then
            echo "Skipping $subject, working directory exists."
            continue
        fi
        submitted=$(( submitted + 1 ))
        echo "Starting script for subject: $subject"
        ASHS_ROOT=/opt/ashs /opt/ashs/bin/ashs_main.sh \
          -a /opt/ashs/data/atlas_paul/ \
          -g T1/t1_${subject} \
          -f ${subject} \
          -w "$wdir" \
          -T \
          -Q >"$subject".log 2>&1 &

        sleep 2s
    done
    echo "No images left, exiting."
    exit 0

Atlas Creation
--------------

We had some success creating our own atlas from manual segmentations, following the official `Atlas Creation Guide <https://sites.google.com/site/hipposubfields/building-an-atlas>`_. Basically you need to prepare the following data:

1. 20 T1 and T2 images
2. Segmentations in T2 space in NIfTI format for both hemispheres
3. A label description file (ITK-SNAP format) mapping the values in the segmentations to a name, a color and some options, like this::

      0      0    0    0       0  0  0    "Clear Label"
      11   102  205  170       1  1  1    "CA1L"
      12     0    0  128       1  1  1    "CA1R"
      ...

Assuming your NIfTI files are in a folder called niftis/ and the segmentations in a folder called Segmentations/ relative to the current directory, you additionally need:

4. A data manifest file listing all the necessary files for each subject in this format::

     1102 niftis/1102/T1*gz niftis/1102/high*gz Segmentations/1102_*left.nii.gz Segmentations/1102_*right.nii.gz
     1105 niftis/1105/T1*gz niftis/1105/high*gz Segmentations/1105_*left.nii.gz Segmentations/1105_*right.nii.gz
     ...
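
The manifest can usually be generated rather than typed by hand. A hedged
sketch, assuming the niftis/ and Segmentations/ layout described above; the
``mktemp``/``mkdir`` lines only create an empty demo layout so the loop has
something to iterate over and are not needed in your real project:

```shell
# build manifest.txt from the per-subject folder layout; one line per
# subject ID found under niftis/ (the glob patterns are written into the
# manifest literally, as ashs_train.sh expects)
demo=$(mktemp -d) && cd "$demo"
mkdir -p niftis/1102 niftis/1105   # demo layout only

for d in niftis/*/ ; do
    id=${d#niftis/}; id=${id%/}
    echo "$id niftis/$id/T1*gz niftis/$id/high*gz Segmentations/${id}_*left.nii.gz Segmentations/${id}_*right.nii.gz"
done > manifest.txt
cat manifest.txt
```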


At the moment you need to either download a copy of ASHS from GitLab as shown above or use the one in ``/opt/ashs-git/``:

.. code-block:: bash

   export ASHS_ROOT=/opt/ashs-git

With all the files in place you can run ``ashs_train.sh`` like this:


.. code-block:: bash

   $ASHS_ROOT/bin/ashs_train.sh -D manifest.txt -L labels.txt -w atlas_wdir

.. _ASHS: https://sites.google.com/site/hipposubfields/