This section describes how to run the ASHS_ (Automatic Segmentation of Hippocampal Subfields) tool available on the Tardis.
In most cases you will run this tool with a pre-installed atlas and a pair of
T1 and T2 NIfTI images to start the automatic segmentation pipeline. To do
this you have to tell ASHS where its root folder is located by exporting a
variable called ``ASHS_ROOT`` like this:
.. code-block:: bash

   # path of the pre-installed ASHS copy on the Tardis
   export ASHS_ROOT=/opt/ashs
You can also get your own copy (about 300 MB) of ASHS, for example from our
GitLab server at https://gitlab.mpib-berlin.mpg.de/krause/ashs/tree/mpib, and
place it in your home directory:
.. code-block:: bash

   mkdir -p ~/src && cd ~/src
   git clone --depth 1 --branch mpib https://gitlab.mpib-berlin.mpg.de/krause/ashs.git
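If you use your own checkout instead of the pre-installed copy, point ``ASHS_ROOT``
at it instead (the path below simply assumes the clone from the previous step):

.. code-block:: bash

   export ASHS_ROOT=~/src/ashs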
Example usage
-------------
Given your files are already in place, a typical invocation would be:
.. code-block:: bash

   $ASHS_ROOT/bin/ashs_main.sh -a /opt/ashs/data/atlas_upennpmc/ -g T1.nii -f T2.nii -w wdir -T -Q
This will run the ASHS pipeline in the foreground, submitting jobs and waiting
for them to finish. Results will be put in the specified folder *wdir*. You can
use a different atlas folder by specifying a custom location after the ``-a``
switch. To conserve disk space, please always use the tidy ``-T`` option. It
is also possible to run only some stages selectively; use the ``-h`` option to
list all parameters together with a short help text.
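For example, to see the full list of options (including the flags used above):

.. code-block:: bash

   $ASHS_ROOT/bin/ashs_main.sh -h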
To run the program in the background and capture all of the output so you needn't
keep the shell open, use the ampersand (*&*) and shell redirection like this:
.. code-block:: bash

   $ASHS_ROOT/bin/ashs_main.sh [..opts..] >ashs.log 2>&1 &
For a number of image pairs you can use a simple ``for`` loop:
.. code-block:: bash

   for id in 01 03 10 32 ; do
       $ASHS_ROOT/bin/ashs_main.sh -a /opt/ashs/data/atlas_upennpmc/ -g T1_${id}.nii -f T2_${id}.nii -w wdir_${id} -T -Q >ashs_${id}.log 2>&1 &
   done
Be careful with looping over a large number (>30) of image pairs in this way, as
some intermediate steps generate a lot of jobs on their own. Also check ``qstat`` and the log files you specified to track ASHS' progress.
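For example, to watch your own jobs in the queue and follow one of the log files
from the loop above (``qstat -u`` and ``tail -f`` are standard commands; the log
file name just matches the example above):

.. code-block:: bash

   qstat -u $USER        # list your jobs in the queue
   tail -f ashs_01.log   # follow the log file of subject 01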
Known Issues
------------
Due to the very large number of intermediate jobs, script submission may become
unreliable when submitting more than 20 or 30 images at once.
Please consider either manually submitting smaller chunks of jobs or using a
script similar to the one below that keeps track of the current chunk
position. Each subsequent run of this script will submit ``imagesperchunk`` new
``ashs_main.sh`` calls:
.. code-block:: bash

   #!/bin/bash
   # This script will submit ASHS jobs in small chunks. To submit the next chunk
   # just run this script again. The state is inferred from the existence of the
   # working directory only.

   # somewhat safe value for concurrent ASHS runs at the moment
   imagesperchunk=30

   # this will only submit the first N images that have no working directory yet
   submitted=0
   for subject in $( ls -1 *nii ) ; do
       if [ $submitted -ge $imagesperchunk ] ; then
           echo "Submitted $imagesperchunk images, done for now."
           exit 0
       fi
       name=${subject%%.nii}
       wdir=workdir_${name}_dir
       if [ -d $wdir ] ; then
           echo "Skipping $subject, working directory exists."
           continue
       fi
       submitted=$(( submitted + 1 ))
       echo "Starting script for subject: $subject"
       # NOTE: depending on your file layout you may also need to pass the
       # matching T1 image with -g (see the invocation examples above)
       ASHS_ROOT=/opt/ashs /opt/ashs/bin/ashs_main.sh \
           -a /opt/ashs/data/atlas_paul/ \
           -f ${subject} \
           -w $wdir \
           -T \
           -Q >$subject.log 2>&1 &
       sleep 2s
   done
   echo "No images left, exiting."
   exit 0
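Assuming you saved the script as, say, ``submit_ashs_chunk.sh`` (the name is only an
example) in the folder containing your NIfTI files, submitting the next chunk is a
single call:

.. code-block:: bash

   chmod +x submit_ashs_chunk.sh   # once, to make the script executable
   ./submit_ashs_chunk.sh          # run again later to submit the next chunk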
Atlas Creation
--------------
We had some success creating our own atlas from manual segmentations following the official `Atlas Creation Guide <https://sites.google.com/site/hipposubfields/building-an-atlas>`_. Basically you need to prepare the following data:
1. 20 T1 and T2 images
2. Segmentations in T2 in Nifti format for both hemispheres
3. A label description file (ITK-SNAP format) mapping the values in the segmentations to a name, a color and some options, like this::

      0  0   0   0   0 0 0 "Clear Label"
      11 102 205 170 1 1 1 "CA1L"
      12 0   0   128 1 1 1 "CA1R"
      ...
Assuming your NIfTI files are in a folder called ``niftis/`` and the segmentations in a folder called ``Segmentations/`` relative to the current directory, you additionally need:
4. A data manifest file listing all the necessary files for each subject in this format::

      1102 niftis/1102/T1*gz niftis/1102/high*gz Segmentations/1102_*left.nii.gz Segmentations/1102_*right.nii.gz
      1105 niftis/1105/T1*gz niftis/1105/high*gz Segmentations/1105_*left.nii.gz Segmentations/1105_*right.nii.gz
      ...
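A manifest in this format can be written with a small shell loop; this is only a
sketch and assumes exactly the folder layout described above (one sub-folder per
subject ID under ``niftis/``):

.. code-block:: bash

   # write one manifest line per subject ID found under niftis/
   for id in $( ls -1 niftis/ ) ; do
       echo "$id niftis/$id/T1*gz niftis/$id/high*gz Segmentations/${id}_*left.nii.gz Segmentations/${id}_*right.nii.gz"
   done > manifest.txt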
At the moment you need to either download a copy of ASHS from GitLab as shown above or use the one in ``/opt/ashs-git/``:

.. code-block:: bash

   export ASHS_ROOT=/opt/ashs-git
With all the files in place you can run ``ashs_train.sh`` like this:
.. code-block:: bash

   $ASHS_ROOT/bin/ashs_train.sh -D manifest.txt -L labels.txt -w atlas_wdir
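As with ``ashs_main.sh``, the training run can be sent to the background with its
output captured in a log file (same redirection pattern as above):

.. code-block:: bash

   $ASHS_ROOT/bin/ashs_train.sh -D manifest.txt -L labels.txt -w atlas_wdir >ashs_train.log 2>&1 &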
.. _ASHS: https://sites.google.com/site/hipposubfields/