Available Software
Contents
- The module command
- Software packages available
- Unclassified by field
- Classified by field, etc
- Packages related to Aeronautics
- Packages related to CFD
- Packages related to ComputerVision
- Packages related to DataCompression
- Packages related to DataFormat
- Packages related to Genomics
- Packages related to MachineLearning
- Packages related to Meteorology
- Packages related to NumericalMethods
- Packages related to SoftwareDevelopment
- Packages related to Utilities
- Packages related to Visualization
- Available by cluster
- Packages for scripting languages
The module command
A large number of software packages are available on each of the clusters, and for many of them multiple versions are installed. Not all of these packages co-exist well with each other, so many are not available by default. You need to tell the system that you want to use them with the module command.
For example, matlab is available on the HPC clusters, but only after you module load it. E.g.
login-1: matlab
matlab: Command not found.
login-1: module load matlab
login-1: module help matlab
----------- Module Specific Help for 'matlab/2014b' ---------------
This is Matlab Release 2014b
Releases after 2009 require Red Hat 5 or later.
Releases after 2013 require Red Hat 6 or later.
Matlab 2009b is the last release supported under Solaris.
Run command 'matlab' to start up the program,
or 'matlab -h' to see various command-line options.
*** To see all releases, issue the command
module avail matlab
login-1: matlab
< M A T L A B >
Copyright 1984-2014 The MathWorks, Inc.
R2014b (8.4.0.150421) 64-bit (glnxa64)
September 15, 2014
To get started, type one of these: helpwin, helpdesk, or demo.
For product information, visit www.mathworks.com.
>>
You can use the module help and module whatis commands before (or without) loading the package.
The module command
The module command associates a tag with the various packages. It has a hierarchical nature, so we have tags like matlab/2013b and matlab/2011a in addition to plain old matlab. The former specify particular versions of matlab, whereas the latter will enable the default version of matlab.
|
With the rollout of a new software library with the RHEL8 upgrade of the Deepthought2 HPC cluster, we are no longer keeping the "default" versions of software packages fixed over long periods of time. Instead, if you do not provide a version in your "module load" command, you will almost always get the latest installed version compatible with other modules you have loaded. (There will be rare and brief exceptions if we find problems in the latest version of a package.) This is a change from the behavior of many packages on the RHEL6 version of Deepthought2. For reproducibility over time, it is strongly recommended that in job submission batch scripts you provide the full version specification for any packages for which you care about the version number. |
For many packages, the tag is basically just the base name of the package, followed by a slash (/) and a version number. E.g., the examples matlab/2013b and matlab/2011a refer to versions 2013b and 2011a, respectively, of matlab.
You can also just use matlab as the tag when, e.g., loading a module, in which case you will get one of the versions (usually the latest; sometimes we pin to an earlier version if the newest version is still being tested).
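For example, either of the following commands could be used (they are alternatives, not meant to be run together; run module avail matlab to see which releases are actually installed):
module load matlab/2013b    # pinned to a specific release; reproducible over time
module load matlab          # unversioned; resolves to the default (usually latest) release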
Many packages compiled from source will have different builds depending on what CPU microarchitecture they were compiled for. Running a code compiled for a more modern CPU than you are currently using will likely result in the code crashing; running a code compiled for a less modern CPU will generally work, but you might lose some of the optimizations available on the newer chip. Generally, these packages will have two equivalent module tags, one of the form "foo/VERSION/ARCHITECTURE" and one of the form "foo/ARCHITECTURE/VERSION"; both forms load foo version VERSION built for architecture ARCHITECTURE. The module load command will attempt to load the build for the most advanced architecture compatible with the CPU you are running on. If you omit the version, it will look for the latest version built for the most advanced CPU architecture compatible with the CPU you are running on.
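As an illustration of the two equivalent tag forms described above (foo, VERSION, and ARCHITECTURE are placeholders, not actual module tags on the clusters):
# both of these load foo version VERSION built for architecture ARCHITECTURE
module load foo/VERSION/ARCHITECTURE
module load foo/ARCHITECTURE/VERSION
# omitting version and architecture picks the latest version built for the most
# advanced architecture compatible with the CPU you are running on
module load foo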
Some packages are fussy about the compiler and/or MPI libraries they are used with. This is particularly true of libraries which have Fortran90 and/or C++ components; some examples are openmpi, hdf5, and netcdf. In these cases, the package will again have multiple module tags referring to the same build of the software. The first starts like the simple case, with the base name of the package followed by the version of the package, but to this is appended the other dependencies (like compiler family and version, and perhaps MPI library family and version, etc.). For example, one tag for version 1.8.13 of hdf5, built with version 4.8.1 of the GNU compiler suite and version 1.8.1 of openmpi, with shared linkage, would be:
hdf5/1.8.13/gnu/4.8.1/openmpi/1.8.1/shared
There is also an alternative form giving the compiler, MPI library, and linkage portions of the tag immediately after the base package name, and ending with the version of the package, e.g.
hdf5/gnu/4.8.1/openmpi/1.8.1/shared/1.8.13
Both of these tags do exactly the same thing, and set up exactly the same build of hdf5, when loaded. You can use either.
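Concretely, either of the following commands will load that same hdf5 build (pick one; both set up the identical environment):
module load hdf5/1.8.13/gnu/4.8.1/openmpi/1.8.1/shared
module load hdf5/gnu/4.8.1/openmpi/1.8.1/shared/1.8.13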
The advantage of the module command is that you generally do NOT need to give these long tag names, but can leave off components on the right and it will try to do the right thing. In particular, if you try loading just the base package name, e.g. module load hdf5, the system will determine if you have previously loaded a compiler and/or MPI library. If so, then the system will see if there are any builds of any version of the package matching the compiler and MPI library versions you previously loaded, and if one is found, load the build for the most recent version that matches. If no matching build is found, an error will be displayed and no module loaded.
The following excerpt gives an example of this:
login-2:> module purge
login-2:> module list
No Modulefiles Currently Loaded.
login-2:> module load gcc/4.8.1
login-2:> module load openmpi/1.8.1
login-2:> module load hdf5
login-2:> module list
Currently Loaded Modulefiles:
1) gcc/4.8.1(default)
2) openmpi/1.8.1/gnu/4.8.1
3) hdf5/gnu/4.8.1/openmpi/1.8.1/shared/1.8.13
login-2:>
login-2:> module purge
login-2:> module list
No Modulefiles Currently Loaded.
login-2:> module load intel/2016
login-2:> module load openmpi/1.10.2
login-2:> module load hdf5
login-2:> module list
Currently Loaded Modulefiles:
1) ofed/3.12(default)
2) intel/2016.3.210(default)
3) openmpi/1.10.2/intel/2016.3.210
4) hdf5/intel/2016.3.210/openmpi/1.10.2/shared/1.10.0
login-2:>
login-2:> module purge
login-2:> module list
No Modulefiles Currently Loaded.
login-2:> module load hdf5/1.8.15p1/sunstudio/12.4/openmpi/1.8.6/shared
login-2:> module list
Currently Loaded Modulefiles:
1) sunstudio/12.4(default)
2) openmpi/1.8.6/sunstudio/12.4
3) hdf5/1.8.15p1/sunstudio/12.4/openmpi/1.8.6/shared
We start by issuing a module purge command to delete any previously loaded modules. (In general, you should use this with caution. If you do use it, you should immediately load the modules hpcc/deepthought or hpcc/deepthought2, as appropriate for the cluster you are on, and slurm. These are the default modules on the Deepthought HPC clusters and things might not work properly without them.) We then do a module list to confirm that there are no modules loaded. We then load a specific version of the gcc compiler (4.8.1) and the openmpi libraries (1.8.1) and then ask the system to load the best hdf5 module. We then list what was loaded, and note that the system loaded an openmpi build (version 1.8.1) for the compiler we loaded, and an hdf5 version (1.8.13) for the previously loaded compiler and MPI library.
In the second example above, we load the 2016 version of the Intel compiler suite and the 1.10.2 version of openmpi, and when we issue the simple module load hdf5, this time it loads hdf5 version 1.10.0 built for that compiler and openmpi library. (Note also that the openmpi library loaded was also built for the correct compiler.)
In the final example above, we do not explicitly load any compiler or MPI library, but instead give a full path to the hdf5 module, e.g. hdf5/1.8.15p1/sunstudio/12.4/openmpi/1.8.6/shared. In this case, we note that not only was the requested hdf5 module loaded, but the corresponding compiler (sunstudio/12.4) and openmpi (openmpi/1.8.6) modules were loaded as well.
In summary, in almost all cases you need only specify a portion of the tag, and the system will try to default it properly. Generally, you should load the compiler you wish to use first (or do not load anything if you want to use the default gcc 4.6.1 compiler), followed by the MPI library if needed. This will allow the system to pick the correct build of libraries which depend on them.
Alternatively, if you specify the full path to a package which depends on a specific compiler and MPI library, these will be loaded for you if not already loaded. In general, if the package has dependencies, they will be loaded automatically for you if needed.
However, if you want reproducible results, it is generally recommended that you fully specify version numbers when loading modules in your job scripts. I.e., the latest version of the foo package might be 1.5 when you run your job today, but if you run it again a month from today, the latest version might be 1.7 at that time, and that could change the results unexpectedly if your script merely did a module load foo. If you wish to ensure the same version is always used, specifying the version number in the module load command is advised, e.g. module load foo/1.5; this way you will get the same version of the foo package both today and a month from now. But if you always prefer to use the latest package available, then in general you should leave the version numbers off.
|
NOTE: If you are submitting a job using a bourne-style shell (e.g. sh or bash), your dot files will not be read automatically if your default shell is something else (e.g. csh or tcsh). So you will need to include a . ~/.profile line near the top of your script, after the #SBATCH lines but before you try to use tap or module commands.
|
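Putting the above recommendations together, a minimal job script skeleton might look like the following (the resource requests and the foo/1.5 module are placeholders; substitute the packages and versions your job actually needs):
#!/bin/bash
#SBATCH --ntasks=1
#SBATCH --time=00:15:00
# Read your profile so the module command is available even if your
# default login shell is csh/tcsh
. ~/.profile
# Load modules with full version numbers for reproducibility
module load foo/1.5
# ... run your code here ...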
The module command allows you to issue commands of the form module SUBCOMMAND OPTIONAL_ARGUMENT.
Recognized subcommands include (a brief example session follows the list):
- list: This will list the packages you currently have loaded. Note that if a tap entry was migrated to modules, it will show up here if you loaded it using tap.
- avail OPTIONAL_TAG: this will list all the packages available. If a tag is given, it will list all packages which match the tag. Note the matching is purely lexical, so module avail netcdf will pick up all netcdf modules, as well as netcdf-fortran and netcdf-cxx modules. To restrict to just netcdf modules, use something like module avail netcdf/ (note the final /)
- load TAG: this is the most common usage, and it sets up the environment for the application associated with TAG. Again, if you do not specify the full tag, the system will attempt to default it, and usually it can do that with some success. Most of our application definitions will pull in any dependencies as well. Note that unlike tap, this is a silent operation when it succeeds.
- help TAG: This displays a brief intro message about the package. Generally the same message you see when you tap a package. It is advised that you read this once for each package you use.
- unload TAG: This unloads the package; i.e. all the environment that was set up for the package is returned to what it was before you loaded it. This is not 100% successful all the time, but for the most part works. You generally do not need this unless you wish to change the version of an application you are using.
- purge: This unloads ALL currently loaded modules. This should be used sparingly. If you do use it, please follow it with the loading of either hpcc/deepthought or hpcc/deepthought2 (as appropriate for the cluster you are on) and slurm. These modules are loaded for you automatically when you log into the clusters, and if you remove them (e.g. with the purge command), things might not work as expected.
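The following brief sequence illustrates several of these subcommands (output omitted; the versions selected will vary over time):
# list what is currently loaded
module list
# show all netcdf modules; the trailing / excludes netcdf-fortran, netcdf-cxx, etc.
module avail netcdf/
# load the default netcdf build, then unload it again
module load netcdf
module unload netcdf
# purge everything, then restore the default modules for this cluster
# (use hpcc/deepthought instead on the original Deepthought)
module purge
module load hpcc/deepthought2
module load slurm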
Please note that the HPC compute nodes have abridged copies of the software directories available on other Glue/TerpConnect systems. The HPC login nodes have the full software directories. Please look at software lists below, and in particular click on the software package name for the full breakdown by version number to see which, if any, HPC clusters a specific version of the package is on.
Library packages
Some packages are not so much applications as libraries; i.e. they provide libraries which your code can use, and might or might not have any binaries that you can run directly. To make use of these libraries in your code takes a little bit of work. Please see the section on compiling codes for more information on how to do this.
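As a minimal sketch (assuming the loaded modules add the library's include and library directories to the compiler's search paths; check module help for the package and the section on compiling codes for the authoritative details on these clusters), using a library package from C might look like:
module load gcc
module load hdf5
# myprog.c is a hypothetical source file; -lhdf5 links against the hdf5 library
gcc -o myprog myprog.c -lhdf5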
Common errors from module/tap commands
As stated earlier, not all packages co-exist nicely. This is particularly true of different versions of the same package, or related libraries (e.g. the various MPI libraries). The module command will generate an error if you try to load a package which does not play well with packages you have previously loaded; if this occurs you will need to unload the previous package and reload the new package. E.g.
login-1: module load matlab/2012b
login-1:~: module load matlab/2013b
matlab/2013b(8):ERROR:150: Module 'matlab/2013b' conflicts with the currently loaded module(s) 'matlab/2012b'
matlab/2013b(8):ERROR:102: Tcl command execution failed: source $moduledir/common
login-1: module unload matlab
login-1:~: module load matlab/2013b
Although not as user friendly as one would like, you can see that the first time I tried to load matlab/2013b it complained because matlab/2012b was already loaded. Also note that I did not need to specify the full tag to unload matlab/2012b, as there was only one matlab package loaded, so it could default it.
You can also run into similar incompatibility problems if you mix packages (especially library packages) built for one compiler/MPI library/etc. with a different compiler/MPI library/etc. For example
login-1: module avail fftw/3.3.3
------------------- /cell_root/system/common/modulefiles/sys -------------------
fftw/3.3.3/gnu/4.6.1/openmpi/1.6/shared
fftw/3.3.3/gnu/4.6.1/openmpi/1.6/static
fftw/3.3.3/gnu/4.6.1/openmpi/1.6.5/shared
fftw/3.3.3/gnu/4.6.1/openmpi/1.6.5/static
fftw/3.3.3/intel/2013.1.039/openmpi/1.6.5/shared
fftw/3.3.3/intel/2013.1.039/openmpi/1.6.5/static
login-1: module load intel/2013.1.039
login-1: module load fftw/3.3.3/gnu
***** Compiler mismatch
You have loaded compiler intel/2013.1.039
This package wants compiler gnu/4.6.1
login-1:
In the above, we show that there are various builds of fftw/3.3.3 available. We then load the intel compiler module, try to load a GNU compiler build of fftw/3.3.3, and get an error. That is because using a GNU build of fftw/3.3.3 with the Intel compiler suite is likely not to work and will cause you headaches. In general, you should not be specifying more than fftw/3.3.3 anyway, and the module command will figure out which build to load based on which compilers you already have loaded. Alternatively, you can give the full path (e.g. module load fftw/3.3.3/intel/2013.1.039/openmpi/1.6.5) and the module command will auto-load the appropriate Intel compiler and openmpi modules for it, if it can. If you have a conflicting module already loaded, you will still get the error.
For the most part, tap is just a wrapper around the module command for backwards compatibility, and so you will get the same errors with tap if you try to load conflicting modules. The tap command is limited, so you generally do NOT get a whole lot of choice as to which builds of packages get loaded, so it is less likely to be a problem, but it can still arise. Some packages have not been migrated from tap to the module command, and these will NOT generate such errors when you tap incompatible modules. But things will probably NOT work as you expected; the incompatibility is REAL, not an artifact of the module command.
|
We are currently in the process of integrating the software documentation
for RHEL6 and RHEL8. This page primarily shows information for packages in
the RHEL8 software library; information on the RHEL6 software library can
be found at the old RHEL6
software documentation pages. We will gradually be adding the RHEL6
content under here, and in particular will be migrating (and updating as
needed) the package specific documentation here. But for now, you will likely
want to look at the old usage documentation (and module help PACKAGENAME).
|
List of available software packages
In this section we list the various software packages available on the UMD HPC clusters. We start with a list of all packages, unclassified by field of research, and follow with some shorter lists for various fields of research. If there is a mistake in the classification of software (e.g. a package is listed in the wrong category, or is missing from a category where it belongs), please contact systems staff so that we may correct the situation. Similarly, contact us if there is a category for software which is missing and you believe would be useful to have.
For each list, you can click on the name of the package to bring up a page with more detailed information on what versions are available on which clusters, and some basic (and sometimes not so basic) usage and help.
- All packages (unsorted by field)
- Software lists classified by field of research, etc
- Packages related to Aeronautics
- Packages related to CFD
- Packages related to ComputerVision
- Packages related to DataCompression
- Packages related to DataFormat
- Packages related to Genomics
- Packages related to MachineLearning
- Packages related to Meteorology
- Packages related to NumericalMethods
- Packages related to SoftwareDevelopment
- Packages related to Utilities
- Packages related to Visualization
- Software lists by cluster
All packages (unsorted by field)
This section lists all installed research software, unsorted by field of research/category/etc.
Package Name | Description | GPU ready? |
---|---|---|
adios2 | The Adaptable Input/Output System, version 2 | Yes |
arpack-ng | Fortran77 subroutines designed to solve large scale eigenvalue problems | |
beagle | package for phasing genotypes and for imputing ungenotyped markers | |
bedtools2 | swiss-army knife of genomics analysis tasks | |
boost | free, portable C++ source libraries | |
bwa | Burrows-Wheeler Aligner | |
bzip2 | patent-free high quality data compression libraries | |
c-blosc | A blocking, shuffling, and loss-less compression library | |
caffe | Deep learning framework | Yes |
cdo | Climate Data Operators | Yes |
cgal | Computational Geometry Algorithms Library | Yes |
cmake | cross-platform, open-source build system | |
cuda | parallel computing platform/programming model for NVIDIA GPUs | |
cudnn | Library of deep learning/neural network primitives for GPUs | Yes |
curl | command line tool for transferring data | |
dmtcp | Distributed Multithreaded Checkpointing | |
dplr | Hypersonic Continuum CFD Code | |
dyninst | API for dynamic binary instrumentation | Yes |
eccodes | Libraries for encoding/decoding weather related data formats | Yes |
elpa | Eigenvalue solvers for Petaflop Applications | Yes |
fftw | Discrete Fourier Transform library | |
gcc | GNU Compiler Suite | |
gnuplot | command-line driven graphing utility | |
gsl | GNU Scientific Library | |
h5utils | Utilities for visualization/conversion of HDF5 data | |
hdf4 | Libraries for Hierarchical Data Format v4 | |
hdf5 | Libraries for Hierarchical Data Format v5 | |
hello-umd | Simple Hello world binary | |
hmmer | biosequence analyser using profile hidden Markov models | Yes |
hpctoolkit | toolkit for performance measurement and analysis | Yes |
Intel Parallel Studio | Intel Parallel Studio | |
intel-tbb | Intel Threaded Building Blocks | |
kahip | Karlsruhe High Quality Graph Partitioning Framework | Yes |
kallisto | Program for RNA-sequence quantification | |
lz4 | Lossless compression utility and library | |
magics | meteorological plotting software | Yes |
matio | MATLAB MAT file I/O library | |
matlab | Matlab numerical computing environment | |
metis | Metis unstructured graph partitioner | |
mgridgen | Multilevel Serial & Parallel Coarse Grid Construction Library | Yes |
mt-metis | Metis unstructured graph partitioner | |
mumps | MUltifrontal Massively Parallel sparse direct Solver | Yes |
nco | Toolkits for manipulating data in NetCDF formats | Yes |
netcdf | C Libraries for NetCDF data format | |
netcdf-cxx | C++ Libraries for NetCDF data format | |
netcdf-fortran | Fortran Libraries for NetCDF data format | |
openblas | Optimized BLAS libraries | |
opencv | Open-source Computer Vision Library | Yes |
openfoam | Opensource C++ CFD toolbox | Yes |
openjdk | Open-source implementation of Java | |
openmpi | OpenMPI implementation of Message Passing Interface | Yes |
osu-micro-benchmarks | Microbenchmarks for MPI | Yes |
parallel-netcdf | Parallel I/O library for NetCDF file format | Yes |
paraview | data analysis and visualization application | Yes |
parmetis | Parallel Metis | Yes |
perl | Perl scripting language | |
pfft | Parallel Fast Fourier Transform library | Yes |
python | Python scripting language | |
pytorch | Open-source machine learning library | Yes |
scalapack | Scalable linear algebra package | Yes |
scotch | Package for graph and mesh partitioning, graph clustering, and sparse matrix ordering | Yes |
silo | libraries for reading/writing a variety of scientific data | |
suite-sparse | Suite of Sparse Matrix algorithms | Yes |
superlu | Sparse Linear system solver (sequential) | |
superlu-dist | Sparse Linear system solver (distributed mem/MPI) | Yes |
superlu-mt | Sparse Linear system solver (multithreaded) | |
sz | Error-bounded Lossy Compressor for HPC Data | Yes |
Tecplot360 | CFD visualization software | |
us3d | Aerodynamic and Aerothermodynamic Simulations Software | |
visit | interactive, scalable, visualization, animation and analysis tool | Yes |
vtk | 3D computer graphics, image processing and visualization library | Yes |
xz | Lossless compression utility and library | |
zfp | Library for lossy compression of multidimensional arrays | |
zlib | General purpose, patent-free compression library | |
zoltan | Toolkit for parallel combinatorial algorithms | Yes |
zstd | Fast, lossless compression algorithm |
Software packages related to Aeronautics
This section lists various software packages related to Aeronautics, Aviation, Aerospace Sciences and related fields (Astronautics, Hypersonics, Aerothermodynamics, etc.)
Package Name | Description | GPU ready? |
---|---|---|
dplr | Hypersonic Continuum CFD Code | |
us3d | Aerodynamic and Aerothermodynamic Simulations Software |
Software packages related to CFD
This section lists various software packages related to Computational Fluid Dynamics (CFD).
Package Name | Description | GPU ready? |
---|---|---|
dplr | Hypersonic Continuum CFD Code | |
openfoam | Opensource C++ CFD toolbox | Yes |
Tecplot360 | CFD visualization software |
Software packages related to ComputerVision
This section lists various software packages related to Computer Vision and related fields. You might also want to look at the section on packages related to Machine Learning.
Package Name | Description | GPU ready? |
---|---|---|
caffe | Deep learning framework | Yes |
opencv | Open-source Computer Vision Library | Yes |
pytorch | Open-source machine learning library | Yes |
Software packages related to DataCompression
This section lists various software packages related to Data Compression. Basically libraries and binaries for compressing and uncompressing data to/from various compressed formats and/or compression algorithms.
Package Name | Description | GPU ready? |
---|---|---|
bzip2 | patent-free high quality data compression libraries | |
c-blosc | A blocking, shuffling, and loss-less compression library | |
lz4 | Lossless compression utility and library | |
sz | Error-bounded Lossy Compressor for HPC Data | Yes |
xz | Lossless compression utility and library | |
zfp | Library for lossy compression of multidimensional arrays | |
zlib | General purpose, patent-free compression library | |
zstd | Fast, lossless compression algorithm |
Software packages related to DataFormat
This section lists various software packages related to various scientific or other data formats, including libraries and binaries for reading, writing, and manipulating data in specific formats.
Package Name | Description | GPU ready? |
---|---|---|
adios2 | The Adaptable Input/Output System, version 2 | Yes |
arpack-ng | Fortran77 subroutines designed to solve large scale eigenvalue problems | |
cdo | Climate Data Operators | Yes |
eccodes | Libraries for encoding/decoding weather related data formats | Yes |
h5utils | Utilities for visualization/conversion of HDF5 data | |
hdf4 | Libraries for Hierarchical Data Format v4 | |
hdf5 | Libraries for Hierarchical Data Format v5 | |
matio | MATLAB MAT file I/O library | |
nco | Toolkits for manipulating data in NetCDF formats | Yes |
netcdf | C Libraries for NetCDF data format | |
netcdf-cxx | C++ Libraries for NetCDF data format | |
netcdf-fortran | Fortran Libraries for NetCDF data format | |
parallel-netcdf | Parallel I/O library for NetCDF file format | Yes |
silo | libraries for reading/writing a variety of scientific data |
Software packages related to Genomics
This section lists various software packages related to Genomics, Genetics, and related fields.
Package Name | Description | GPU ready? |
---|---|---|
beagle | package for phasing genotypes and for imputing ungenotyped markers | |
bedtools2 | swiss-army knife of genomics analysis tasks | |
bwa | Burrows-Wheeler Aligner | |
hmmer | biosequence analyser using profile hidden Markov models | Yes |
kallisto | Program for RNA-sequence quantification |
Software packages related to MachineLearning
This section lists various software packages related to Machine Learning, Deep Learning, Neural Nets and related fields. You might also wish to look at the section on Computer Vision
Package Name | Description | GPU ready? |
---|---|---|
caffe | Deep learning framework | Yes |
cudnn | Library of deep learning/neural network primitives for GPUs | Yes |
opencv | Open-source Computer Vision Library | Yes |
pytorch | Open-source machine learning library | Yes |
Software packages related to Meteorology
This section lists various software packages related to Meteorology, Weather, Atmospheric and Oceanographic Sciences and related fields.
Package Name | Description | GPU ready? |
---|---|---|
cdo | Climate Data Operators | Yes |
eccodes | Libraries for encoding/decoding weather related data formats | Yes |
magics | meteorological plotting software | Yes |
Software packages related to NumericalMethods
This section lists various software packages related to Numerical Methods and Computational Mathematics. It includes both underlying numerical libraries as well as advanced computational (and often GUI) environments. These packages are generally useful across a wide range of disciplines.
Package Name | Description | GPU ready? |
---|---|---|
cgal | Computational Geometry Algorithms Library | Yes |
elpa | Eigenvalue solvers for Petaflop Applications | Yes |
fftw | Discrete Fourier Transform library | |
gsl | GNU Scientific Library | |
kahip | Karlsruhe High Quality Graph Partitioning Framework | Yes |
matio | MATLAB MAT file I/O library | |
matlab | Matlab numerical computing environment | |
metis | Metis unstructured graph partitioner | |
mgridgen | Multilevel Serial & Parallel Coarse Grid Construction Library | Yes |
mt-metis | Metis unstructured graph partitioner | |
mumps | MUltifrontal Massively Parallel sparse direct Solver | Yes |
openblas | Optimized BLAS libraries | |
parmetis | Parallel Metis | Yes |
pfft | Parallel Fast Fourier Transform library | Yes |
pytorch | Open-source machine learning library | Yes |
scalapack | Scalable linear algebra package | Yes |
scotch | Package for graph and mesh partitioning, graph clustering, and sparse matrix ordering | Yes |
suite-sparse | Suite of Sparse Matrix algorithms | Yes |
superlu | Sparse Linear system solver (sequential) | |
superlu-dist | Sparse Linear system solver (distributed mem/MPI) | Yes |
superlu-mt | Sparse Linear system solver (multithreaded) | |
zoltan | Toolkit for parallel combinatorial algorithms | Yes |
Software packages related to SoftwareDevelopment
This section lists various software packages related to the development and optimization, etc. of software and codes. It includes compilers, debuggers, profiling tools, as well as relatively low level libraries. It also includes various scripting languages.
Package Name | Description | GPU ready? |
---|---|---|
arpack-ng | Fortran77 subroutines designed to solve large scale eigenvalue problems | |
boost | free, portable C++ source libraries | |
cgal | Computational Geometry Algorithms Library | Yes |
cmake | cross-platform, open-source build system | |
cuda | parallel computing platform/programming model for NVIDIA GPUs | |
cudnn | Library of deep learning/neural network primitives for GPUs | Yes |
dmtcp | Distributed Multithreaded Checkpointing | |
dyninst | API for dynamic binary instrumentation | Yes |
eccodes | Libraries for encoding/decoding weather related data formats | Yes |
elpa | Eigenvalue solvers for Petaflop Applications | Yes |
fftw | Discrete Fourier Transform library | |
gcc | GNU Compiler Suite | |
gsl | GNU Scientific Library | |
hdf4 | Libraries for Hierarchical Data Format v4 | |
hdf5 | Libraries for Hierarchical Data Format v5 | |
hello-umd | Simple Hello world binary | |
hpctoolkit | toolkit for performance measurement and analysis | Yes |
Intel Parallel Studio | Intel Parallel Studio | |
intel-tbb | Intel Threaded Building Blocks | |
kahip | Karlsruhe High Quality Graph Partitioning Framework | Yes |
matio | MATLAB MAT file I/O library | |
matlab | Matlab numerical computing environment | |
metis | Metis unstructured graph partitioner | |
mgridgen | Multilevel Serial & Parallel Coarse Grid Construction Library | Yes |
mt-metis | Metis unstructured graph partitioner | |
mumps | MUltifrontal Massively Parallel sparse direct Solver | Yes |
nco | Toolkits for manipulating data in NetCDF formats | Yes |
netcdf | C Libraries for NetCDF data format | |
netcdf-cxx | C++ Libraries for NetCDF data format | |
netcdf-fortran | Fortran Libraries for NetCDF data format | |
openblas | Optimized BLAS libraries | |
opencv | Open-source Computer Vision Library | Yes |
openjdk | Open-source implementation of Java | |
openmpi | OpenMPI implementation of Message Passing Interface | Yes |
osu-micro-benchmarks | Microbenchmarks for MPI | Yes |
parallel-netcdf | Parallel I/O library for NetCDF file format | Yes |
parmetis | Parallel Metis | Yes |
perl | Perl scripting language | |
pfft | Parallel Fast Fourier Transform library | Yes |
python | Python scripting language | |
scalapack | Scalable linear algebra package | Yes |
scotch | Package for graph and mesh partitioning, graph clustering, and sparse matrix ordering | Yes |
silo | libraries for reading/writing a variety of scientific data | |
suite-sparse | Suite of Sparse Matrix algorithms | Yes |
superlu | Sparse Linear system solver (sequential) | |
superlu-dist | Sparse Linear system solver (distributed mem/MPI) | Yes |
superlu-mt | Sparse Linear system solver (multithreaded) | |
zlib | General purpose, patent-free compression library | |
zoltan | Toolkit for parallel combinatorial algorithms | Yes |
Software packages related to Utilities
This section lists various utility software packages, etc. This is sort of a catch-all category for packages which are useful across a wide range of disciplines but do not really fall into any discipline. These are often just newer versions of utilities supplied by the OS.
Package Name | Description | GPU ready? |
---|---|---|
curl | command line tool for transferring data | |
dmtcp | Distributed Multithreaded Checkpointing |
Software packages related to Visualization
This section lists various software packages related to Visualization; basically packages to assist in visualizing data generated from other packages/codes. It includes both basic graphics libraries and high end (GUI) visualization platforms.
Package Name | Description | GPU ready? |
---|---|---|
gnuplot | command-line driven graphing utility | |
magics | meteorological plotting software | Yes |
paraview | data analysis and visualization application | Yes |
visit | interactive, scalable, visualization, animation and analysis tool | Yes |
vtk | 3D computer graphics, image processing and visualization library | Yes |
All packages available on the Deepthought2 cluster (RHEL8)
This section lists all software packages available on the Deepthought2 cluster (RHEL8)
Package Name | Description | GPU ready? |
---|---|---|
adios2 | The Adaptable Input/Output System, version 2 | Yes |
arpack-ng | Fortran77 subroutines designed to solve large scale eigenvalue problems | |
beagle | package for phasing genotypes and for imputing ungenotyped markers | |
bedtools2 | swiss-army knife of genomics analysis tasks | |
boost | free, portable C++ source libraries | |
bwa | Burrows-Wheeler Aligner | |
bzip2 | patent-free high quality data compression libraries | |
c-blosc | A blocking, shuffling, and loss-less compression library | |
caffe | Deep learning framework | Yes |
cdo | Climate Data Operators | Yes |
cgal | Computational Geometry Algorithms Library | Yes |
cmake | cross-platform, open-source build system | |
cuda | parallel computing platform/programming model for NVIDIA GPUs | |
cudnn | Library of deep learning/neural network primitives for GPUs | Yes |
curl | command line tool for transferring data | |
dplr | Hypersonic Continuum CFD Code | |
dyninst | API for dynamic binary instrumentation | Yes |
eccodes | Libraries for encoding/decoding weather related data formats | Yes |
elpa | Eigenvalue solvers for Petaflop Applications | Yes |
fftw | Discrete Fourier Transform library | |
gcc | GNU Compiler Suite | |
gnuplot | command-line driven graphing utility | |
gsl | GNU Scientific Library | |
h5utils | Utilities for visualization/conversion of HDF5 data | |
hdf4 | Libraries for Hierarchical Data Format v4 | |
hdf5 | Libraries for Hierarchical Data Format v5 | |
hello-umd | Simple Hello world binary | |
hmmer | biosequence analyser using profile hidden Markov models | Yes |
hpctoolkit | toolkit for performance measurement and analysis | Yes |
intel-tbb | Intel Threaded Building Blocks | |
kahip | Karlsruhe High Quality Graph Partitioning Framework | Yes |
kallisto | Program for RNA-sequence quantification | |
lz4 | Lossless compression utility and library | |
magics | meteorological plotting software | Yes |
matio | MATLAB MAT file I/O library | |
matlab | Matlab numerical computing environment | |
metis | Metis unstructured graph partitioner | |
mgridgen | Multilevel Serial & Parallel Coarse Grid Construction Library | Yes |
mt-metis | Metis unstructured graph partitioner | |
mumps | MUltifrontal Massively Parallel sparse direct Solver | Yes |
nco | Toolkits for manipulating data in NetCDF formats | Yes |
netcdf | C Libraries for NetCDF data format | |
netcdf-cxx | C++ Libraries for NetCDF data format | |
netcdf-fortran | Fortran Libraries for NetCDF data format | |
openblas | Optimized BLAS libraries | |
opencv | Open-source Computer Vision Library | Yes |
openfoam | Opensource C++ CFD toolbox | Yes |
openjdk | Open-source implementation of Java | |
openmpi | OpenMPI implementation of Message Passing Interface | Yes |
osu-micro-benchmarks | Microbenchmarks for MPI | Yes |
parallel-netcdf | Parallel I/O library for NetCDF file format | Yes |
paraview | data analysis and visualization application | Yes |
parmetis | Parallel Metis | Yes |
perl | Perl scripting language | |
pfft | Parallel Fast Fourier Transform library | Yes |
python | Python scripting language | |
pytorch | Open-source machine learning library | Yes |
scalapack | Scalable linear algebra package | Yes |
scotch | Package for graph and mesh partitioning, graph clustering, and sparse matrix ordering | Yes |
silo | libraries for reading/writing a variety of scientific data | |
suite-sparse | Suite of Sparse Matrix algorithms | Yes |
superlu | Sparse Linear system solver (sequential) | |
superlu-dist | Sparse Linear system solver (distributed mem/MPI) | Yes |
superlu-mt | Sparse Linear system solver (multithreaded) | |
sz | Error-bounded Lossy Compressor for HPC Data | Yes |
visit | interactive, scalable, visualization, animation and analysis tool | Yes |
vtk | 3D computer graphics, image processing and visualization library | Yes |
xz | Lossless compression utility and library | |
zfp | Library for lossy compression of multidimensional arrays | |
zlib | General purpose, patent-free compression library | |
zoltan | Toolkit for parallel combinatorial algorithms | Yes |
zstd | Fast, lossless compression algorithm |
All packages available on the Juggernaut cluster
This section lists all software packages available on the Juggernaut cluster
Package Name | Description | GPU ready? |
---|---|---|
arpack-ng | Fortran77 subroutines designed to solve large scale eigenvalue problems | |
boost | free, portable C++ source libraries | |
bzip2 | patent-free high quality data compression libraries | |
c-blosc | A blocking, shuffling, and loss-less compression library | |
caffe | Deep learning framework | Yes |
cmake | cross-platform, open-source build system | |
cuda | parallel computing platform/programming model for NVIDIA GPUs | |
cudnn | Library of deep learning/neural network primitives for GPUs | Yes |
curl | command line tool for transferring data | |
dmtcp | Distributed Multithreaded Checkpointing | |
dplr | Hypersonic Continuum CFD Code | |
eccodes | Libraries for encoding/decoding weather related data formats | Yes |
fftw | Discrete Fourier Transform library | |
gcc | GNU Compiler Suite | |
gsl | GNU Scientific Library | |
h5utils | Utilities for visualization/conversion of HDF5 data | |
hdf4 | Libraries for Hierarchical Data Format v4 | |
hdf5 | Libraries for Hierarchical Data Format v5 | |
hello-umd | Simple Hello world binary | |
hmmer | biosequence analyser using profile hidden Markov models | Yes |
Intel Parallel Studio | Intel Parallel Studio | |
intel-tbb | Intel Threaded Building Blocks | |
kahip | Karlsruhe High Quality Graph Partitioning Framework | Yes |
lz4 | Lossless compression utility and library | |
matio | MATLAB MAT file I/O library | |
matlab | Matlab numerical computing environment | |
metis | Metis unstructured graph partitioner | |
mgridgen | Multilevel Serial & Parallel Coarse Grid Construction Library | Yes |
mt-metis | Metis unstructured graph partitioner | |
mumps | MUltifrontal Massively Parallel sparse direct Solver | Yes |
netcdf | C Libraries for NetCDF data format | |
netcdf-cxx | C++ Libraries for NetCDF data format | |
netcdf-fortran | Fortran Libraries for NetCDF data format | |
openblas | Optimized BLAS libraries | |
openjdk | Open-source implementation of Java | |
openmpi | OpenMPI implementation of Message Passing Interface | Yes |
osu-micro-benchmarks | Microbenchmarks for MPI | Yes |
parallel-netcdf | Parallel I/O library for NetCDF file format | Yes |
parmetis | Parallel Metis | Yes |
perl | Perl scripting language | |
pfft | Parallel Fast Fourier Transform library | Yes |
python | Python scripting language | |
scalapack | Scalable linear algebra package | Yes |
scotch | Package for graph and mesh partitioning, graph clustering, and sparse matrix ordering | Yes |
silo | libraries for reading/writing a variety of scientific data | |
suite-sparse | Suite of Sparse Matrix algorithms | Yes |
superlu | Sparse Linear system solver (sequential) | |
superlu-dist | Sparse Linear system solver (distributed mem/MPI) | Yes |
superlu-mt | Sparse Linear system solver (multithreaded) | |
Tecplot360 | CFD visualization software | |
us3d | Aerodynamic and Aerothermodynamic Simulations Software | |
visit | interactive, scalable, visualization, animation and analysis tool | Yes |
vtk | 3D computer graphics, image processing and visualization library | Yes |
xz | Lossless compression utility and library | |
zfp | Library for lossy compression of multidimensional arrays | |
zlib | General purpose, patent-free compression library | |
zoltan | Toolkit for parallel combinatorial algorithms | Yes |
zstd | Fast, lossless compression algorithm |
Packages installed for scripting languages
|
We are currently in the process of integrating the software documentation
for RHEL6 and RHEL8. This page primarily shows information for packages in
the RHEL8 software library; information on the RHEL6 software library can
be found at the old RHEL6
software documentation pages. We will gradually be adding the RHEL6
content under here, and in particular will be migrating (and updating as
needed) the package specific documentation here. But for now, you will likely
want to look at the old usage documentation (and module help PACKAGENAME).
|
The Division of Information Technology maintains several versions of several scripting languages on the clusters. These scripting languages can be enhanced by the addition of packages/modules/etc, and the Division of IT maintains a collection of packages for these. In most cases, the packages/modules are fairly lightweight, and we simply install them into the standard system directory location for the particular scripting language, upgrading them as needed. This means that they will just be found when you import/use/etc them; no additional work is needed on your part.
For a few cases, they are major packages in their own right, and we give them their own module. Those packages are included in the lists of software earlier in this document, and you will need to "module load" the appropriate package in order to use them.
This section is for the packages of the first type, and for the convenience of our users we provide a list of the packages/modules available for each install of a scripting language, by cluster and version. Note that these lists do not include the standard library for the scripting language, i.e. those packages/modules that get included automatically in the installation of the scripting language, only those that need to be installed separately.
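For example, after loading one of the python modules, extensions such as numpy from the lists below can be imported directly; a quick check might look like this (the version printed will depend on which python build you load):
module load python
python -c "import numpy; print(numpy.__version__)"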
|
These sections are still under development. We hope to have them more accurate in the near future.
|
Packages/modules for Python by cluster, version
Extensions for Python version 3.7.7 (built for gcc@8.4.0 on Deepthought2)
Extension name | Version |
---|---|
absl-py | 0.7.1 |
astunparse | 1.6.3 |
babel | 2.7.0 |
cached-property | 1.5.1 |
cycler | 0.10.0 |
cython | 0.29.16 |
entrypoints | 0.3 |
flake8 | 3.7.8 |
gast | 0.3.3 |
google-pasta | 0.1.8 |
grpcio | 1.27.2 |
h5py | 2.10.0 |
jinja2 | 2.10.3 |
keras-preprocessing | 1.1.0 |
kiwisolver | 1.1.0 |
mako | 1.0.4 |
markupsafe | 1.1.1 |
matplotlib | 3.2.1 |
mccabe | 0.6.1 |
mpi4py | 3.0.3 |
nose | 1.3.7 |
numpy | 1.18.4 |
opt-einsum | 3.1.0 |
pillow | 7.0.0 |
pkgconfig | 1.5.1 |
protobuf | 3.11.2 |
pybind11 | 2.5.0 |
pycodestyle | 2.5.0 |
pyflakes | 2.1.1 |
pyparsing | 2.4.2 |
python-dateutil | 2.8.0 |
pytz | 2019.3 |
scipy | 1.4.1 |
setuptools-scm | 3.3.3 |
setuptools | 46.1.3 |
six | 1.14.0 |
termcolor | 1.1.0 |
wheel | 0.33.4 |
wrapt | 1.11.2 |
Extensions for Python version 3.7.7 (built for gcc@7.4.0 on Deepthought2)
Extension name | Version |
---|---|
cached-property | 1.5.1 |
cycler | 0.10.0 |
cython | 0.29.16 |
entrypoints | 0.3 |
flake8 | 3.7.8 |
h5py | 2.10.0 |
kiwisolver | 1.1.0 |
mako | 1.0.4 |
markupsafe | 1.1.1 |
matplotlib | 3.2.1 |
mccabe | 0.6.1 |
mpi4py | 3.0.3 |
nose | 1.3.7 |
numpy | 1.18.4 |
pillow | 7.0.0 |
pkgconfig | 1.5.1 |
pybind11 | 2.5.0 |
pycodestyle | 2.5.0 |
pyflakes | 2.1.1 |
pyparsing | 2.4.2 |
python-dateutil | 2.8.0 |
pyyaml | 5.3.1 |
scipy | 1.4.1 |
setuptools-scm | 3.3.3 |
setuptools | 46.1.3 |
six | 1.14.0 |
Extensions for Python version 2.7.16 (built for gcc@8.4.0 on Deepthought2)
Extension name | Version |
---|---|
backports-functools-lru-cache | 1.5 |
cycler | 0.10.0 |
kiwisolver | 1.1.0 |
mako | 1.0.4 |
markupsafe | 1.1.1 |
matplotlib | 2.2.5 |
mpi4py | 3.0.3 |
numpy | 1.16.6 |
pillow | 6.2.2 |
pyparsing | 2.4.2 |
python-dateutil | 2.8.0 |
pytz | 2019.3 |
scipy | 1.2.3 |
setuptools-scm | 3.3.3 |
setuptools | 44.1.0 |
six | 1.14.0 |
subprocess32 | 3.5.4 |