Commit 1f3b289f authored by Michael Krause

software: ashs typos

parent c26bec3e
set of patches developed at MPIB to work with Torque.
Atlases
-------
The available atlases need to match the ASHS version to function properly.
download your own. Right now there are:
The data in *legacy* is supposed to work with the 0.1.x-branch of ASHS and the
atlases in *1.0* have been created with the newer branch.
Parallelization
---------------
The legacy branch of ASHS necessarily relied on qsub internally. Segmentation,
even for a single subject, was almost prohibitively slow to run on a single
core. The patched version should distribute nicely over the cluster when using
the parameter `-Q` (see below for examples).
original scripts.
The table below shows a comparison for a test segmentation run using the **1.0** branch:

========= =========
cores     real time
========= =========
800 (pbs) 20m
20 (par)  15m
8         23m
4         35m
2         55m
1         110m
========= =========

Due to defensive inter-stage delays, the first approach (using cluster-wide
distribution with `-Q`) was even slower than the 20-core parallel version on a
single node using `-P`. Even single-threaded performance is at acceptable
speeds now. With a recent workstation, a batch segmentation for a small number
of subjects can be done overnight, without a cluster.
Segmentation
-------------
Given your files are already in place, a typical invocation for a single subject
would then be **either**:
Old approach
^^^^^^^^^^^^
(using internal qsub)
.. code-block:: bash

   [krause@master ~] module load ashs
   [krause@master ~] $ASHS_ROOT/bin/ashs_main.sh \
       -a /opt/software/ashs/data/1.0/ashs_atlas_mpib_20180208 \
       -g T1.nii -f T2.nii -w wdir -T -Q

For a number of image pairs, you could use a simple for loop.
.. code-block:: bash

   for id in 01 03 10 32 ; do
       $ASHS_ROOT/bin/ashs_main.sh -a /opt/software/ashs/data/1.0/ashs_atlas_mpib_20180208/ \
           -g T1_${id}.nii -f T2_${id}.nii -w wdir_${id} -T -Q \
           >ashs_${id}.log 2>&1 &
   done

Be careful with looping over a large number (>30) of image pairs in this way,
as some intermediate steps will generate a lot of jobs on their own. Also check
``qstat`` and the log file you specified to track ASHS' progress.
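One way to avoid flooding the scheduler is to throttle the loop so only a few
segmentations are in flight at once. The following is a sketch, not part of the
original instructions: ``run_segmentation`` is a hypothetical placeholder
standing in for the full ``ashs_main.sh`` invocation shown above.

.. code-block:: bash

   # Throttle background runs: at most $max_jobs segmentations at a time.
   run_segmentation() {       # hypothetical placeholder for ashs_main.sh
       sleep 1
   }

   max_jobs=4
   for id in 01 03 10 32 ; do
       # Wait until fewer than $max_jobs background jobs are running.
       while [ "$(jobs -rp | wc -l)" -ge "$max_jobs" ]; do
           sleep 1
       done
       run_segmentation "$id" >"ashs_${id}.log" 2>&1 &
   done
   wait                       # block until every background run has finished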
New approach
^^^^^^^^^^^^
Since ASHS version 1.0, this is the preferred way to submit segmentations for a number of subjects.
Looping over a number of subject IDs would look something like this:
.. code-block:: bash

   for id in 01 03 10 32 ; do
       export SUBJECT_ID=$id
       qsub ashs_job.pbs -l nodes=1,ppn=8,mem=10gb -v SUBJECT_ID
   done

In this case the option `-v` instructs qsub to pass the named, previously
exported variable into the job environment. This way the variable `SUBJECT_ID`
becomes accessible in the job context.
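For reference, a minimal ``ashs_job.pbs`` could look like the following sketch.
This is an assumption rather than the actual script used here: the atlas path
is copied from the examples above, and ``-P`` (node-local parallelization) is
used as described in the Parallelization section.

.. code-block:: bash

   #!/bin/bash
   # Hypothetical ashs_job.pbs -- a sketch, not the site's actual script.
   # Expects SUBJECT_ID to arrive via `qsub -v SUBJECT_ID`.
   module load ashs
   cd "$PBS_O_WORKDIR"
   $ASHS_ROOT/bin/ashs_main.sh \
       -a /opt/software/ashs/data/1.0/ashs_atlas_mpib_20180208/ \
       -g "T1_${SUBJECT_ID}.nii" -f "T2_${SUBJECT_ID}.nii" \
       -w "wdir_${SUBJECT_ID}" -T -P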
Known Issues
------------