Matlab
======

Matlab is a bit of a problem child on the Tardis. While the `MATLAB Distributed
Computing Server`_ product aims to implement a compatibility layer for a number
of PBS-based clusters, it just doesn't work reliably for a number of reasons.

Because there is only a limited number of shared licenses available, it's also
not feasible to run an arbitrary number of Matlab sessions in the form of jobs.
A workaround is to "compile" a script and create a standalone redistribution
environment, which does not require a license to run.

Different Matlab versions are available via environment modules.  You can list
them with :program:`module avail matlab` and activate a specific version with
:program:`module load matlab/<version>`.

Regular sessions
----------------

**If** there are free licenses available and you just need a quick way to spawn a
single Matlab session, there is nothing wrong with just running Matlab as is.
This might be especially useful if you simply need a node with lots of memory
or if you want to test your code. In an interactive job you could simply enter
:program:`matlab`; it will warn you that no display is available and start in
command-line mode.


.. code-block:: bash

   [krause@master ~] qsub -I -l mem=100gb -q testing
   qsub: waiting for job 5395814.master.tardis.mpib-berlin.mpg.de to start
   qsub: job 5395814.master.tardis.mpib-berlin.mpg.de ready

   [krause@ood-9.tardis.mpib-berlin.mpg.de ~] module load matlab/R2012a
   [krause@ood-9.tardis.mpib-berlin.mpg.de ~] matlab


                                       < M A T L A B (R) >
                             Copyright 1984-2012 The MathWorks, Inc.
                              R2012a (7.14.0.739) 64-bit (glnxa64)
                                        February 9, 2012


   To get started, type one of these: helpwin, helpdesk, or demo.
   For product information, visit www.mathworks.com.

In a job context you would just run :program:`matlab -r main` with main.m containing your script:


.. code-block:: bash

   [krause@master ~] echo "module load matlab/R2014b; matlab -r main" | qsub -d.

.. important::

    Always check http://lipsupport.mpib-berlin.mpg.de/licstat to see if there is an available license.

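Since :program:`matlab -r` drops back to the interactive prompt once the script
has finished, it is good practice to end :file:`main.m` with an explicit
``exit`` so the job terminates on its own. A minimal sketch (the computation is
just a placeholder):

.. code-block:: matlab

   % main.m -- minimal example for a non-interactive run
   disp('starting analysis')
   result = rand(10);            % placeholder for the real work
   save('result.mat', 'result')
   exit                          % quit Matlab so the job can finish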

Compiling
---------

Once you leave the testing stage and would like to spawn an arbitrary number of
Matlab jobs/processes you have to compile your script with :program:`mcc`.
A reliable pattern is to create a main file :file:`project.m` that contains
a function with the same name and expects some arguments you would like to loop
over. A little like this maybe:

.. code-block:: matlab

   function project(subject_id, sigma)
   %
   % my main program implementing foo
   %
   % arguments
   % ---------
   %
   % subject_id: a string encoding the subject id
   % sigma: a string encoding values for sigma
   sigma = str2num(sigma);
   repmat(cellstr(subject_id), 1, sigma)



Running :program:`mcc -m project.m` would then "compile" (or rather encrypt and
package) your function and output a system-dependent binary named
:file:`project` and a wrapper script :file:`run_project.sh`. To run it you
now have to combine the wrapper script, the location of a Matlab Compiler
Runtime (MCR) or the local installation path of the Matlab version that was
used by mcc, and a sufficient number of arguments for the function project().

Example:

.. code-block:: bash

   [krause@master ~] mcc -m project.m
   [krause@master ~] ./run_project.sh /opt/matlab/interactive 42 5
   ------------------------------------------
   Setting up environment variables
   ---
   LD_LIBRARY_PATH is .:/opt/matlab/interactive/runtime/glnxa64:/opt/matlab/interactive/bin/glnxa64:/opt/matlab/interactive/sys/os/glnxa64:/opt/matlab/interactive/sys/java/jre/glnxa64/jre/lib/amd64/native_threads:/opt/matlab/interactive/sys/java/jre/glnxa64/jre/lib/amd64/server:/opt/matlab/interactive/sys/java/jre/glnxa64/jre/lib/amd64/client:/opt/matlab/interactive/sys/java/jre/glnxa64/jre/lib/amd64
   Warning: No display specified.  You will not be able to display graphics on the screen.
   ans =
       '42'    '42'    '42'    '42'    '42'
   [krause@master ~]

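To actually spread the work across the cluster you can now submit one job per
parameter combination, calling the wrapper script directly. A sketch (memory
request and argument values are just examples):

.. code-block:: bash

   #!/bin/bash
   # submit one job per subject for the compiled project() function
   for subject in 01 02 03 ; do
       echo "./run_project.sh /opt/matlab/interactive $subject 5" | qsub -d. -l mem=2gb
       sleep 1  # avoid starting jobs at exactly the same time (see the note below)
   done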

To include toolboxes in your script you have to add them during the compile
step so they get included in your package. Matlab built-in toolboxes such as
signal processing or statistics are detected automatically by scanning the
functions used in your script and don't need to be added explicitly. Compiled
scripts can't use the :program:`addpath()` function at runtime. You can guard
those calls, however, with the function :program:`isdeployed()`, which returns
1 when Matlab detects that it runs as a compiled script and 0 otherwise.
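
A guarded call could look like this (the toolbox path is only an example):

.. code-block:: matlab

   if ~isdeployed()
       % only needed when running from source; the compiled package
       % already contains the toolbox (added with -a/-I at compile time)
       addpath('/home/mpib/krause/matlab/tools/project')
   end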

Example: Suppose you collect your project library in a toolbox called project,
which in turn uses the function :program:`normrnd()` from the statistics
package:

.. code-block:: bash

   [krause@master ~] cat matlab/tools/project/myrnd.m
   function X = myrnd(arg)
   X = normrnd(0, 1, arg, arg);


You can then use either the ``-a`` or the ``-I`` switch of mcc to add your own toolbox.

+ **-a** will add the functions or directories listed directly to the compiled package/archive
+ **-I** (uppercase i) will add the location to the mcc search path so it gets included implicitly
Both options should work fine. The example below uses mcc from Matlab R2014b,
but you can use any version. The important part is to use the same Matlab
version for the MCR when invoking the script with :program:`run_project.sh`.

.. code-block:: bash

   [krause@master ~] module load matlab/R2014b
   [krause@master ~] cat project.m
   function project(arg1)
   myrnd(str2num(arg1))
   [krause@master ~] mcc -m project.m -a matlab/tools/project
   [...]
   [krause@master ~] ./run_project.sh /opt/matlab/R2014b 3
   ------------------------------------------
   Setting up environment variables
   ---
   LD_LIBRARY_PATH is .:/opt/matlab/R2014b/runtime/glnxa64:/opt/matlab/R2014b/bin/glnxa64:/opt/matlab/R2014b/sys/os/glnxa64:/opt/matlab/R2014b/sys/opengl/lib/glnxa64
   ans =
       0.5377    0.8622   -0.4336
       1.8339    0.3188    0.3426
      -2.2588   -1.3077    3.5784

.. note::

    You only have to compile your project once and can then use it any number
    of times.  Matlab extracts your package to a shared hidden folder called
    :file:`.mcrCache<Version-Number>`.  Those folders sometimes get corrupted by
    Matlab, especially when multiple jobs start at exactly the same time.  The
    only workaround so far is to add a ``sleep 1s`` between qsub calls and hope
    there is no collision.  Also, it makes sense to regularly remove those
    directories, but make sure all your jobs have finished before removing
    them with :file:`rm -rf .mcrCache*`.
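
For completeness, the same package could also be built with the include-path
switch instead of adding the toolbox to the archive (a sketch, using the same
toolbox location as in the example above):

.. code-block:: bash

   [krause@master ~] mcc -m project.m -I matlab/tools/project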

SPM
---

SPM already comes as a pre-compiled version and can be started, just like in
the examples above, with :program:`run_spm8.sh` or :program:`run_spm12.sh`.
Usually users export a number of batch files with the SPM GUI on their local
machine, change the paths to reflect the locations on the Tardis, and then
call :program:`run_spm12.sh` with the **run** parameter for each batch file.
Example: segmentation for a number of NIfTI images. The file
:file:`batch.template` contains the string ``%%IMAGE%%`` as a placeholder so we
can easily replace it with the current image path and create a number of new
batches from a single template:

.. code-block:: bash

   #!/bin/bash
   i=0
   for image in tp2/Old/*.nii ; do
       fullpath=$PWD/$image
       sed "s#%%IMAGE%%#$fullpath#" batch.template > batch_${i}.m
       echo "run_spm12.sh /opt/matlab/interactive run $PWD/batch_${i}.m" | qsub -d.
       i=$((i+1))
   done

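The template itself is simply a batch file exported from the SPM GUI in which
the image path has been replaced by the placeholder. A hypothetical excerpt
(the exact field names depend on the batch module you exported):

.. code-block:: matlab

   % batch.template (excerpt) -- %%IMAGE%% is substituted by the loop above
   matlabbatch{1}.spm.spatial.preproc.channel.vols = {'%%IMAGE%%,1'};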

Sometimes it might be necessary to recompile the SPM toolbox yourself, for
instance if you need a specific version or if you want to add external
toolboxes to SPM (e.g. cat12).

.. code-block:: matlab

   [krause@master ~] matlab
   Warning: No display specified.  You will not be able to display graphics on the screen.

                                       < M A T L A B (R) >
                             Copyright 1984-2012 The MathWorks, Inc.
                              R2012a (7.14.0.739) 64-bit (glnxa64)
                                        February 9, 2012


   To get started, type one of these: helpwin, helpdesk, or demo.
   For product information, visit www.mathworks.com.
   >> %addpath(genpath('/home/mpib/krause/matlab/tools/spm12')) % overkill
   >> addpath('/home/mpib/krause/matlab/tools/spm12')
   >> addpath('/home/mpib/krause/matlab/tools/spm12/config')
   >> spm_make_standalone()
   [... lots of output and warnings ...]
   Processing /opt/matlab/R2012a/toolbox/matlab/mcc.enc
   [... lots of output and warnings ...]
   Generating file "/home/mpib/krause/matlab/tools/spm_exec/readme.txt".
   Generating file "/home/mpib/krause/matlab/tools/spm12/../spm_exec/run_spm12.sh".
   >>


This should create a folder :file:`spm_exec` below the spm toolbox location
containing the fresh :program:`spm12` and :program:`run_spm12.sh` which you can
then use in your jobs just like above.

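For example (the MCR path matches the R2012a session above, and the batch files
are the ones created earlier):

.. code-block:: bash

   [krause@master ~] echo "~/matlab/tools/spm_exec/run_spm12.sh /opt/matlab/R2012a run $PWD/batch_0.m" | qsub -d.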

To properly add the fieldtrip toolbox we have to jump through some more hoops.
For now the only reliable and flexible way is to run :program:`mcc()` from
within a Matlab session and make sure to run :program:`ft_defaults()` first.
Also, some of the provided mex files won't work out of the box, so we have to
recompile them using :program:`ft_compile_mex()`. This, however, stumbles over
an external C file called :file:`CalcMD5.c` that uses non-standard comments,
which the sed call below strips out. The following Matlab script has been
successfully used to create a compiled script from a :file:`main.m` file, which
relies on internal fieldtrip functions.

.. code-block:: matlab

   % setup path
   basepath='/home/mpib/krause/matlab/tools/ConMemEEGTools/'
   addpath([basepath, '/fieldtrip-20150930'])
   ft_defaults()

   % re-compile mex functions (this has to be done only once per fieldtrip version)
   % "fix" the CalcMD5.c file
   system(['sed -i ''s#//.*##g'' ', basepath, '/fieldtrip-20150930/external/fileexchange/CalcMD5.c'])
   % and compile
   ft_compile_mex(true)

   % build the runtime environment
   mcc('-m', 'main.m')
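
The resulting binary and its :file:`run_main.sh` wrapper can then be used in
jobs just like the examples above, e.g. (MCR path and argument are only
examples):

.. code-block:: bash

   [krause@master ~] echo "./run_main.sh /opt/matlab/R2014b subject01" | qsub -d.
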

Efficient Saving
----------------

We have noticed a couple of times now that Matlab's :program:`save()` function
can lead to undesirable performance issues. This occurs especially when a large
number of jobs try to save big objects at the same time. This is a bit of
a complex issue and I will try to go through it in this section using some
examples. You are free to adapt some of the things highlighted here in your
code, but you don't have to.
There are two aspects to consider when saving larger objects with Matlab:

1. shared Input/Output

   Your home directories are connected via a network file system (NFS). While
   NFS bandwidth, storage size and IO performance are matched to our cluster
   size, its capacity can be quickly saturated. We recommend carefully watching
   your saving and loading operations in your scripts, especially when there
   are a large number of jobs accessing files at the same time. An indication
   of bad IO performance is a job CPU efficiency value considerably less than
   1.0 (check the `Tardis Website`_ for that value).
   The worst case for a networked file system is a huge number of tiny file
   operations, like reading or writing a few bytes or constantly checking for
   changed file attributes (size, modification time and so on). Try to avoid
   these situations by only writing large chunks of data at a time.

2. Compression

   Sometimes it's much faster to compress large chunks of data **before**
   sending it over the wire (i.e. saving an object). Also, storage size is
   a valuable resource and you should therefore consider compressing data with
   Matlab.

Unfortunately, with objects larger than 2GB people usually have to resort to
saving with the Matlab file format v7.3. The default behaviour of Matlab's
:program:`save(.., '-v7.3')` is suboptimal and you might want to save your
objects differently. The following examples highlight why this is necessary and
also why it's not trivial to just recommend a better alternative.

Consider two extreme variants of data, a truly random matrix and a matrix full
of zeros, both 5GB in size:


.. code-block:: matlab

   R = randn(128,1024,1024,5);
   Z = zeros(128,1024,1024,5);

On a drained Tardis, saving these two objects naively takes **forever**:

.. code-block:: matlab

   tic; save('~/test.mat', ['R'], '-v7.3'); toc;
   Elapsed time is 346.156656 seconds.

   tic; save('~/test.mat', ['Z'], '-v7.3'); toc;
   Elapsed time is 145.100863 seconds

There are two reasons for this. Firstly, the compression algorithm Matlab uses
appears to be quite slow. And even with perfectly compressible data (Z can be
easily expressed with 5 bytes), Matlab needs more than two minutes to save the
object, while **still** creating a considerable amount of IO operations.

There is an option to disable compression (which is generally not advisable
anyway). But even then saving the "R" object takes more than 3 minutes:

.. code-block:: matlab

   tic; save('~/test.mat', ['R'], '-v7.3', '-nocompression'); toc
   Elapsed time is 220.320252 seconds.

With version 7.3 Matlab changed the binary format for object serialization to
something based on HDF5. Luckily, the lower level functions for HDF5 file
manipulations are available to you. For this simple case, a matrix of numbers,
saving directly to HDF5 could look like this:

.. code-block:: matlab

   >> h5create('test.h5', '/R', size(R), 'ChunkSize', [128,1024,1024,1]);
   >> tic; h5write('test.h5', '/R', R); toc;
   Elapsed time is 15.225184 seconds.

In comparison to the naive :program:`save()`, we are faster by a factor of 22.
But unfortunately this is not the whole story. One downside to this approach is
connected to the internal structure of HDF5 objects. Saving a struct with
multiple, nested objects of different types (think string annotations, integer
arrays and float matrices) is much more tedious. Timothy E. Holy wrote
a wrapper that more or less automatically creates the necessary structures and
published the function :program:`savefast` on `Matlab's fileexchange`_. It has
a similar interface to the original save and can be used as a drop-in
replacement in many cases. Of course, you need to add the function to your
search path first.

.. code-block:: matlab

    >> tic; savefast('test.mat', 'R'); toc
    Elapsed time is 16.242634 seconds.

Unfortunately, we can't just stop here, because savefast by default does not
compress anything, and the whole point of using HDF5 is that we need to store
**large** matrices, bigger than 2GB. Blindly writing them out will waste
storage space and saturate the disk arrays with only 4 concurrent jobs (based
on the 15s benchmark above).
In other words, using some kind of compression is necessary - unless you
**know** that you generated random or close to random data. To highlight the
differences that come with the compression level, let's look at the 5GB of
zeros stored in Z again. Matlab is a bit faster at compressing and saving with
:program:`save()`, but it still takes 145 seconds. You might be tempted to
combine the savefast() approach and simply compress with something fast like
:program:`gzip` afterwards. This will actually speed things up *and* save
a lot of space:

.. code-block:: bash

   >> Z = zeros(128,1024,1024,5);

   >> tic; save('~/test.mat', ['Z'], '-v7.3'); toc;
   Elapsed time is 145.100863 seconds.
   >> tic; savefast('~/test2.mat', ['Z']); system('gzip test2.mat'); toc;
   Elapsed time is 53.527696 seconds.
   -rw-r--r--  1 krause domain users   31M May 16 17:57 test.mat
   -rw-r--r--  1 krause domain users  5.0M May 16 18:00 test2.mat.gz

At first glance this appears to be faster and more efficient. But this
approach does not scale well to multiple jobs, as it saves the whole 5GB
uncompressed, then reads it all in again (from NFS cache, but still over the
network) and then, after compression, saves it back to disk.

Using HDF5's low-level functions you can fine-tune the compression level from
0 (lowest and fastest) to 9 (highest and slowest). If you set the compression
level, however, you also need to set a chunk size, probably because compression
is done chunk-wise. Recommending a generic compression level is hard and
depends very much on your data. Of course you don't want to waste time
maximizing the compression ratio, gaining only a couple of megabytes, but you
also don't want to waste bandwidth by saving overhead data. Consider our zeros
again:

.. code-block:: bash

    >> h5create('test.h5', '/Z', size(Z), 'ChunkSize', [128,1024,1024,1], 'Deflate', 9);
    >> tic; h5write('test.h5', '/Z', Z); toc;
    Elapsed time is 35.333514 seconds.
    >> ls -lh test.h5
    -rw-r--r-- 1 krause domain users 5.0M May 16 18:35 test.h5

    >> h5create('test.h5', '/Z', size(Z), 'ChunkSize', [128,1024,1024,1], 'Deflate', 6);
    >> tic; h5write('test.h5', '/Z', Z); toc;
    Elapsed time is 36.646509 seconds.
    >> ls -lh test.h5
    -rw-r--r-- 1 krause domain users 5.0M May 16 18:36 test.h5

    >> h5create('test.h5', '/Z', size(Z), 'ChunkSize', [128,1024,1024,1], 'Deflate', 3);
    >> tic; h5write('test.h5', '/Z', Z); toc;
    Elapsed time is 20.002455 seconds.
    >> ls -lh test.h5
    -rw-r--r-- 1 krause domain users 23M May 16 18:37 test.h5

    >> h5create('test.h5', '/Z', size(Z), 'ChunkSize', [128,1024,1024,1], 'Deflate', 0);
    >> tic; h5write('test.h5', '/Z', Z); toc;
    Elapsed time is 50.847998 seconds.
    >> ls -lh test.h5
    -rw-r--r-- 1 krause domain users 5.1G May 16 18:38 test.h5

Here I picked a chunk size of 1GB, compressing with levels 9, 6, 3, and 0. Not
surprisingly, the optimal value in this group is 3, as it only takes 20 seconds
to save the data, while still reducing the 5GB file to 23MB.
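
To read the data back into Matlab you can use :program:`h5read()`. A minimal
sketch, matching the dataset name used above (the second call reads only the
first of the five 1GB chunks):

.. code-block:: matlab

   % read the complete dataset
   Z = h5read('test.h5', '/Z');
   % or read only a subset, e.g. the first 1GB chunk
   Zpart = h5read('test.h5', '/Z', [1 1 1 1], [128 1024 1024 1]);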

.. _`MATLAB Distributed Computing Server`: http://de.mathworks.com/help/mdce/index.html
.. _`Tardis Website`: https://tardis.mpib-berlin.mpg.de/nodes
.. _`Matlab's fileexchange`: https://de.mathworks.com/matlabcentral/fileexchange/39721-save-mat-files-more-quickly