Manual started by Victor Travieso, May 2004
Last updated by Andrey Kaliazin, July 18, 2011
This page lists popular scientific software installed on COSMOS.
All custom-built applications and libraries are installed under the COSMOLIB=/home/cosmos/share/$ARCH/$PP/ folder, so the versions available depend on the current platform (x86-64 for universe and ia64 for cosmos) and SGI ProPack level (PP7, as of 2011). To access the relevant version of the software stack, load the module 'cosmolib'; this is done automatically at login via the /etc/profile script, so users don't have to worry about it. Additional modules can be loaded automatically via your personal .bashrc.local script:
|
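For example, a couple of lines like these in ~/.bashrc.local will pick up the cfitsio and Healpix stacks at every login:
# ~/.bashrc.local - load extra environment modules at login
module load cfitsio
module load healpix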
Once the module is loaded, you should have the right environment variables to use the programs (eg. anafast and synfast from Healpix) and to link with the libraries (eg. cfitsio, using the linker flag -lcfitsio) in the repository. If you want to see the exact commands that the module performs, type 'module show healpix' on the command line.
To see which modules are available at the moment and which are currently loaded, use the corresponding commands:
|
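For example:
$ module avail     # show all modules available on the system
$ module list      # show the modules currently loaded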
Where source distributions of the applications are available, we have tried to create robust builds optimised for the Altix through a careful selection of configuration and compiler options. Although we try to ensure that all the software is well tested and bug free, the best guarantee is having the applications widely tested throughout the consortium. Please let us know if you find any omissions or bugs.
We welcome feedback about the library, so do contact us with comments
or suggestions, eg. other cosmology related software that you would
like to see included in the repository, to tell us about updates that
we might have missed, etc.
High performance, parallel numerical libraries from NAG and Intel are installed on COSMOS. These are strongly recommended to make the most of the Altix, as they have been tuned specifically for the platform. If you need to perform some common numerical task repeatedly in your program (eg. FFTs, ODEs, linear algebra), it is worth exploring the libraries and trying to use the routines included in them. An added benefit is that many of the routines can execute in parallel. In addition, the general purpose GNU Scientific Library is also available.
For more information on the libraries available on COSMOS, refer to the COSMOS developer guide.
Example programs are provided for the NAG libraries: type 'nagfexample' (or 'nagsmpexample_scs' for the SMP library) followed by the name of the routine that you are interested in, eg. 'nagfexample c06puf' to see an example of 3D complex FFTs. The SGI SCSL library is also available (cosmos only), providing easy to use parallel FFT routines. The documentation for the libraries is available online, or accessible via the man pages on cosmos (eg. 'man scsl' and 'man intro_fft').
Current version: 3.2.2
Notes: Static and dynamic versions, single and double precision, OpenMP parallel.
Environment modules: cosmolib, fftw3 (optional)
The FFTW3 library is an open source, high performance library for real and complex Fast Fourier Transforms of any dimension. You can link to the FFTW3 library easily from Fortran, C or C++, and on COSMOS the library can perform FFTs in parallel through OpenMP. You can see examples of use from both the C and Fortran interfaces in the FFTW3 documentation.
To link against the library on COSMOS, you should compile the program using the following linker flags for the double precision version:
|
|
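For illustration, with a standard threaded FFTW 3.2 build the link lines would look something like this (library names assumed, not the exact COSMOS flags):
$ ifort -openmp myfft.f90 -lfftw3_threads -lfftw3 -o myfft      # double precision
$ ifort -openmp myfft.f90 -lfftw3f_threads -lfftw3f -o myfft    # single precision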
The number of processors used by the FFT routines is determined programmatically and through the environment variable OMP_NUM_THREADS.
Current version : 2.1.5
Notes: Static versions, separate real and complex libraries, single and double precision, MPI parallel.
Environment modules: cosmolib, fftw2 (optional)
The older version of FFTW is also installed on COSMOS, mainly to provide access to the MPI FFTs not yet implemented in FFTW3. However, for any other transforms, it is recommended that you use a vendor library or version 3 of the FFTW library for greater performance.
The FFTW2 library has both Fortran and C interfaces making it straightforward to use from any of these languages. For a detailed description of the library routines and code examples, refer to the online manual.
Note that the library comes in two different versions, with separate library files for real and complex transforms - librfftw.a and libfftw.a respectively. On COSMOS, the libraries and include files are also prefixed (d or s) according to the precision of the library, so in general you must use the correct include files and linking flags to select the appropriate version (-lfftw and fftw.h default to complex, double precision transforms). Eg.:
|
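For illustration, following the naming convention just described (library names assumed):
$ ifort myfft.f90 -ldrfftw -ldfftw -o myfft    # real transforms, double precision
$ ifort myfft.f90 -ldfftw -o myfft             # complex transforms, double precision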
Current version: Jan'12
Notes: New WMAP 7yr version.
Additional environment modules: cfitsio healpix
The latest version of CosmoMC (Jan'12) accommodates the WMAP seven-year data, is optimized for hybrid MPI and OpenMP parallelism, and can run very efficiently on the Altix. You can download the latest version from the official site.
For best results, we recommend compiling the distribution with the latest Intel compiler (module: latest - should be loaded by default) and running with MPI parallelism. If you are using a modified camb, or if you want to build your own binary, the recommended build options for COSMOS are as follows:
(Edit camb/Makefile to use the following lines, and build with:)
|
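For illustration, the Intel settings for camb typically look something like the following (these values are assumptions, not the official COSMOS options; F90C and FFLAGS are the variable names used in the stock camb Makefile):
# camb/Makefile - illustrative Intel compiler settings (values assumed)
F90C   = ifort
FFLAGS = -O3 -ip -openmp -fpp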
For camb_fits, the necessary additions are:
(Edit camb/Makefile to use the following lines)
|
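For illustration, the additions typically point camb at the central cfitsio and Healpix installations, along these lines (variable names as in typical camb Makefiles; the $(CFITSIO) and $(HEALPIX) variables are assumed to be set by the corresponding modules):
# camb/Makefile - illustrative camb_fits settings (paths assumed)
FITSDIR    = $(CFITSIO)/lib
FITSLIB    = cfitsio
HEALPIXDIR = $(HEALPIX)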
For CosmoMC:
(Edit source/Makefile to use the following options)
|
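Again for illustration only, an MPI + OpenMP build with the Intel compiler and SGI MPT might use options along these lines (values assumed):
# source/Makefile - illustrative Intel/MPT settings (values assumed)
F90C    = ifort
FFLAGS  = -O3 -ip -openmp -fpp -DMPI
LAPACKL = -mkl=parallel -lmpi    # MKL and SGI MPT (flags assumed)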
To use the WMAP 7yr data, you should build CosmoMC against the centrally installed libraries using:
(Edit source/Makefile to set up the paths as follows)
|
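That is, something along these lines (WMAP is the variable in the stock CosmoMC Makefile; the path is taken from the WMAP7 section below):
# source/Makefile - central WMAP7 likelihood and data (path from the WMAP7 section)
WMAP = ${COSMOLIB}/WMAP7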
If you don't want to build CosmoMC with WMAP support, you can ignore the above path, or simply comment out the WMAP variable above.
Note that there is a perl script (runCosmomc) to submit CosmoMC-type jobs on COSMOS in a straightforward way. The script takes a few command line arguments, then generates and submits the appropriate job to the queues, automating the selection of the right dplace mask. You can also use runCosmomc if you are running a modification of CosmoMC which retains the same program structure - i.e. MPI chains and possibly an OpenMP core. To see the command line options that the script takes, refer to this FAQ.
Also note that CosmoMC can now checkpoint the chains past the initial burn-in phase if you set 'checkpoint = T' in the .ini file. This feature is strongly recommended for long runs on COSMOS in order to minimize data loss in case of a failure. Checkpointing coupled with smaller run time requirements is also a good way to get jobs running earlier and to help the scheduler fill the machine to maximum capacity: if you submit CosmoMC jobs with less than the maximum run time, say 2 or 4 hours, there is a good chance your job will run much earlier, as the queueing system might use it to backfill resources reserved by large jobs. Dividing a long job into smaller ones using the checkpointing option will in general reduce the time your jobs spend waiting to run and give you a much better turnaround.
http://lambda.gsfc.nasa.gov/product/map/current/
Current version: v4 (7yr)
Location: /home/cosmos/share/common/WMAP7/data
${COSMOLIB}/WMAP7 -- (library, MOD files and data)
Environment modules: cosmolib cfitsio
The WMAP likelihood software (V4, WMAP7 - became available Jan'2010), used to calculate likelihoods of various models, is now installed in cosmolib. You can use the compiled version to build other applications (eg. CosmoMC) against. If you are planning on using some other WMAP related software, or need any of the other data files available on COSMOS, please let us know and we will install those in a central location according to demand. This will minimize the impact that multiple downloads of large datasets would have on the disk space.
This version of the software has been compiled with only the '-DOPTIMIZE' flag,
and links against the latest cfitsio library.
Please note that the previous versions of the library, MOD files and data are still available from /home/cosmos/share/common/WMAP5/data and ${COSMOLIB}/WMAP5.
http://sourceforge.net/projects/healpix/
Current version: 2.20a
Notes: Optimised build, includes the IDL tools, C interface and C++ version. OpenMP parallel.
Additional environment modules: healpix, cfitsio
This version features a new P_lm recursion algorithm (2 to 3 times faster than version 1.22), improved IDL tools and Fortran90 visualization software (map2gif).
To use the centrally installed Healpix optimised for the Altix, you should load the healpix module in addition to the standard set (cosmolib latest) from either the command line or your startup scripts:
|
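For example:
$ module load healpix      # or add this line to your ~/.bashrc.local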
If you are using programs that link with the Healpix
library, you will have to recompile them in order to make use of the new version. The C
interface and IDL configuration variables are also included in the
installation, along with the C++ version of the library and tools -
after loading cosmolib, you can access the C++ build under $HEALPIX_CXX. You will need the following flags to build your programs with Healpix:
Library flags: |
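As a sketch, a Fortran90 build line usually looks something like this (the library name, include path and the $HEALPIX variable are assumptions based on a standard Healpix installation; cfitsio is required at link time):
$ ifort -I$HEALPIX/include myprog.f90 -L$HEALPIX/lib -lhealpix -lcfitsio -o myprog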
To maintain compatibility with older codes, the older versions of Healpix are still available on COSMOS and can be accessed by loading a corresponding module, eg.:
|
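For example (substitute a real version label from 'module avail'):
$ module unload healpix
$ module load healpix/<older-version>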
To see which versions are available, use the 'module avail' command.
SHTools is a library for spherical harmonics. It can perform transforms, reconstructions, rotations of coefficients and spectral analyses on the sphere. For documentation and examples of use, see the SHTools web page. To compile against SHTools, you need the following flags:
Library flags: |
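As a sketch (the library name, paths and the $SHTOOLS variable are assumptions; SHTools also needs FFTW and LAPACK/BLAS at link time):
$ ifort -I$SHTOOLS/include myprog.f90 -L$SHTOOLS/lib -lSHTOOLS -lfftw3 -llapack -lblas -o myprog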
Current version: 8.0
Notes: No native ia64 support. Emulation mode might be slow for large data/calculations on cosmos.
Environment modules: default
You can access IDL from COSMOS. However, note that there is no ia64 version of IDL available: the version on COSMOS is the x86 build, which runs on the Itanium via 32-bit emulation mode. Note, however, that this is fairly slow. Please use the x86_64 nodes of COSMOS instead (ariel - available provisionally since 2009).
http://heasarc.gsfc.nasa.gov/docs/software/fitsio/fitsio.html
Current version: 3.29
Notes: Built with Intel Compilers, shared and static versions available.
Environment modules: cosmolib, cfitsio
Cfitsio is the standard library for reading and writing files in the FITS data format. The library includes both C and Fortran interfaces, and has been improved with complete support for large data files (>2.1 GB).
To link programs using the library, you will need to use the following linker flags in your compilation:
|
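For example, using the -lcfitsio flag mentioned earlier:
$ icc myprog.c -lcfitsio -o myprog        # C
$ ifort myprog.f90 -lcfitsio -o myprog    # Fortran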
The current installation of Healpix is always built against the current release of cfitsio. If you want to use a previous release of cfitsio, eg. with an older code, you can still access it by loading the corresponding module prior to compilation/execution (older versions may vary between cosmos and universe):
|
http://hdf.ncsa.uiuc.edu/HDF5/
Current version: 1.8.8
Notes: Built with Intel compilers and the MPT library for efficient parallel I/O.
Environment modules: default
HDF5 is a library for high performance data storage and manipulation. The latest version has been built and optimised for the Altix using the latest Intel compilers and the MPT library for efficient parallel I/O. The library, include files and example programs are installed under `$COSMOLIB/hdf5`. To use the library in your programs, you will need to add the following flags to your compilation:
|
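As a sketch (install prefix from the text above; the exact library list is an assumption and may differ):
$ ifort -I$COSMOLIB/hdf5/include myprog.f90 -L$COSMOLIB/hdf5/lib -lhdf5_fortran -lhdf5 -lz -o myprog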
H5utils, a set of programs to manipulate hdf5 files and convert them to and from other data formats, is also installed with the library, and should be accessible once you load the module 'cosmolib'.
For more information about the hdf5 file format and library, and how to
use it from your Fortran or C code, refer to
the HDF5 documentation.
http://www.astro.caltech.edu/~tjp/pgplot/
Current version: 5.2.2
Notes: There are two builds: one (default) with ifort and another with gcc/gfortran. See 'module avail pgplot' for details.
Environment modules: default, pgplot.
Pgplot is the standard open source graphics library for scientific applications, widely used in a variety of software packages. In order to link your application with the library, you will need to link the Fortran runtime library and sometimes the X11 libraries. If you get missing symbols at the linking stage, you can use the complete set of flags as follows:
|
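For example, a fully explicit link line might look like this (-lX11 and the Intel Fortran runtime -lifcore are the usual suspects for missing symbols; the exact list is an assumption):
$ icc myprog.c -L$PGPLOT_DIR -lpgplot -lX11 -lifcore -o myprog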
Note that some applications require the environment variable PGPLOT_DIR to be set - this is done automatically when the default module 'cosmolib' or the specific 'pgplot' module is loaded. See 'module show pgplot' for details.
http://www.mpa-garching.mpg.de/gadget/
Current version : 2.0.6 - public
Location: Not installed site-wide: please download from the public location and compile
Notes: Compiles successfully with the latest Intel compilers
Environment modules: default
To build Gadget-2 on COSMOS, you should edit the Makefile and add the following information:
|
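A sketch of such a SYSTYPE block (the compiler choice and paths are assumptions; the variable names are the standard ones in Gadget-2's Makefile):
SYSTYPE="COSMOS"
ifeq ($(SYSTYPE),"COSMOS")
CC        = icc
OPTIMIZE  = -O3
GSL_INCL  = -I$(COSMOLIB)/gsl/include     # path assumed
GSL_LIBS  = -L$(COSMOLIB)/gsl/lib         # path assumed
FFTW_INCL = -I$(COSMOLIB)/fftw2/include   # path assumed
FFTW_LIBS = -L$(COSMOLIB)/fftw2/lib       # path assumed
MPICHLIB  = -lmpi                         # SGI MPT
endif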
Current version : 1.5
Location: N/A
Notes: none
Environment modules: default
To build Enzo on Cosmos, you need to place this Make.mach.Cosmos machine config file in the 'enzo/src/' subdirectory.
The latest Intel compiler is selected by default. You can build the distribution by doing:
|
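That is, roughly the standard Enzo build sequence:
$ cd enzo/src/enzo
$ make machine-Cosmos    # select the Cosmos configuration
$ make                   # build the enzo executable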
The Cosmos configuration is selected by the command 'make machine-Cosmos'; if the configuration is loaded successfully, the following commands should produce output similar to this:
|
With the default Cosmos machine config file mentioned above, the Enzo application is built multithreaded with OpenMP. In the current version, only the FFT routines in the Enzo code are thread-enabled, and it is not yet clear in which cases this additional parallelisation is beneficial or desirable. More testing is needed to figure that out.
OpenMP support can be disabled completely by using this Make.mach.Cosmos_nomp machine config file instead. Use
$ wget http://www.damtp.cam.ac.uk/cosmos/software/enzo/Make.mach.Cosmos_nomp
$ make machine-Cosmos_nomp
and then follow with the commands, as in the example above.
Current version: 2010b (on universe), 2007b (on cosmos)
Notes: No native ia64 support. Runs in 32-bit emulation mode on cosmos.
Matlab doesn't currently support the ia64 architecture, and must run in x86 emulation mode on the Itanium2. In most cases this will be fine, but if you are manipulating large data sets or need to perform intensive computations, you are likely to get very poor performance. If you have Matlab applications that could benefit from higher performance or parallel execution, please contact us and let us know - there are now some third party applications that might make Matlab much more efficient on COSMOS, but we would need to evaluate them with real applications in order to get a realistic idea of their usefulness.