The following points are related to the installation of software for use on the HPC cluster:
- Generally, software is installed/upgraded upon request only.
- HPC staff will install the latest stable version of the software you request to have installed or upgraded.
- HPC staff will attempt to install all software under environment modules control. Exceptions will only occur when new HPC images are deployed (rarely).
- Software will not be installed or upgraded if there is a risk of system/service failure, as assessed by HPC and/or eResearch staff.
The most up-to-date list of scientific software installed on HPC cluster nodes can be obtained by logging onto zodiac.hpc.jcu.edu.au (using an SSH client) and running the following command:
module avail
The list of installed software will be quite long. The environment for a given software package can usually be set up using one of the commands:
module load <software>
module load <software>/<version>
where <software> is replaced by the module name and, if required, <version> is replaced by the specific version desired.
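For example, loading one of the Python versions from the catalogue below could look like this (a minimal sketch; the python/2.7.3 module name assumes the <software>/<version> naming convention described above):

```bash
module avail python          # list the available python modules
module load python/2.7.3     # assumed module name; sets up the environment for Python 2.7.3
python --version             # confirm which interpreter is now first on PATH
module unload python/2.7.3   # remove it from the environment when finished
```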
Operating Systems
JCU HPC cluster nodes are built on the RedHat Enterprise Linux (RHEL 6.x) operating system. There are two main reasons for this choice:
- Hardware maintenance agreements that JCU pays for require the use of a commercially supported operating system.
- JCU ICT has signed up to a RedHat CAUDIT agreement for licensing of RHEL systems and seeks to maximize its return on that investment.
JCU HPC also runs a small VMware ESXi cluster (2 servers) that can be used to deliver small Windows systems to satisfy eResearch requirements that cannot be met on Linux (e.g., web services, databases, or Windows compute). Researchers wanting to run other flavours of Linux (e.g., Ubuntu) should consider taking advantage of NeCTAR resources.
HPC Cluster Software Catalogue
Yellow shaded cells indicate that the software version is only available through use of environment modules.
Shells

| Software | Version |
|---|---|
| bash | 4.1.2 |
| dash | 0.5.5.1 |
| ksh | 20120801 |
| tcsh | 6.17 |

Compilers

| Software | Versions |
|---|---|
| GNU | 4.4.7, 4.9.3, 5.3.0, 6.1.0 |
Compilers / Interpreters

| Software | Versions |
|---|---|
| java | sun-1.6.0.25, oracle-jdk-7u67, oracle-jdk-8u20 |
| matlab | 2008b, 2013a, 2014a, 2014b, 2015a, 2016a |
| perl | 5.10.1 |
| python | 2.6.6, 2.7, 2.7.3, 3.5.1 |
| R | 2.15.1, 3.0.0, 3.2.2 |
| ruby | 1.8.7.374 |
| tcl | 8.5.7 |
| tk | 8.5.7 |
Further detail on MATLAB toolboxes and R add-ons/packages can be found toward the bottom of this page.
Libraries (only)

| Software | Versions |
|---|---|
| atlas | 3.8.4 |
| blas | 3.2.1, 3.6.0 |
| boost | 1.41.0, 1.61.0 |
| gmp* | 4.3.1, 6.1.0 |
| lapack | 3.6.0 |
| mpc* | 1.0.3 |
| mpfr* | 2.4.1, 3.1.4 |

* The gmp, mpc, and mpfr libraries may be built into GNU compilers (not necessarily the above versions though).
Scientific Software
Software Catalogue
Please note that there is an almost endless list of scientific software that could be installed on HPC systems. Unless a request is received, HPC staff do not try to guess what software (including which version) you need or want to use. While you may be able to install software yourself, you should generally avoid doing so - from a whole-of-JCU perspective it is an unsustainable practice in terms of power, cost, and time. Additionally, software installed by HPC staff resides on a different filesystem from users' home directories, which improves performance at times of high I/O load on the filesystem(s) containing home directories. Extra information about software highlighted with a light green background colour is supplied at the end of this page.
| Access Command | Version |
|---|---|
| module load bfast | 0.6.5a |
| module load blacs | 1.1 |
| module load blas | 3.2.1 |
| module load blast | 2.2.29 |
| module load blcr | 0.8.5 |
| module load bowtie | 1.0.0 |
| module load bowtie2 | 2.2.4 |
| module load bwa | 0.7.4 |
| module load caftools | 2.0.2 |
| module load cap3 | |
| module load carthagene | 1.2.2 |
| module load casacore | 1.4.0 |
| module load cd-hit | 4.6.1 |
| module load cernlib | 2006 |
| module load cfitsio | 3.030 |
| module load chlorop | 1.1 |
| module load clipper | 2.1 |
| module load clustalw | 2.0.12 |
| | 1.49 |
| module load cns | 1.3 |
| module load coils | 2.2 |
| module load colony2 | |
| module load consel | 0.1k |
| module load crimap | 2.504a |
| module load crimap_Monsanto | |
| module load cufflinks | 2.2.1 |
| module load dx | 4.4.4 |
| module load elph | 1.0.1 |
| module load EMBOSS | 5.0.0 |
| module load enmtools | 1.3 |
| module load express | |
| module load fasta | |
| module load fastme | |
| module load fastStructure | |
| module load ffmpeg | |
| module load fftw | |
| module load fftw2 | |
| module load fftw3 | |
| module load garli | |
| module load gdal | |
| module load glimmer | |
| module load glpk | |
| module load GMT | |
| module load gnuplot | |
| module load gpp4 | |
| module load grass | |
| module load gromacs | |
| module load gsl | |
| module load hdf | 4.2.5 |
| module load hdf5 | 1.8.5 |
| module load hmmer | |
| module load ima2 | |
| module load jmodeltest | |
| module load lagan | |
| module load lamarc | |
| module load lapack | |
| module load lis | |
| module load Macaulay2 | |
| module load matlab | |
| module load maxent | |
| module load maxima | |
| module load migrate | |
| module load mira | |
| module load molphy | |
| module load mpich2 | |
| module load mrbayes | |
| module load mrmodeltest | |
| module load msbayes | |
| module load ncl | |
| module load netcdf | |
| module load netpbm | |
| module load netphos | |
| module load numpy | |
| module load oases | |
| module load octave | |
| module load ogdi | |
| module load openmpi | |
| module load pari | |
| module load paup | |
| module load proj | |
| module load pvm | |
| module load R | |
| module load r8s | |
| module load rsem | |
| module load rpfits | |
| module load scalapack | |
| module load scipy | |
| module load seadas | |
| module load seg | |
| module load signalp | |
| module load sprng | |
| module load ssaha2 | |
| module load structure | |
| module load suitesparse | |
- BLCR User's Guide: https://upc-bugs.lbl.gov/blcr/doc/html/BLCR_Users_Guide.html
- RSEM: https://github.com/bli25wisc/rsem
MATLAB Components/Toolboxes
| Component | License allows | Toolbox | License allows |
|---|---|---|---|
| MATLAB | 50 user connections | Control System | 50 user connections |
| MATLAB Coder | 50 user connections | Optimization | 50 user connections |
| Simulink | 50 user connections | Signal Processing | 50 user connections |
| Simulink Coder | 50 user connections | Symbolic Math | 50 user connections |
| Simulink Control Design | 50 user connections | System Identification | 50 user connections |
| | | Mapping | 5 user connections |
| | | Neural Network | 5 user connections |
| | | Statistics | 5 user connections |
| | | Distributed Computing | 4 user connections |
| | | Fuzzy Logic | 4 user connections |
| | | Global Optimization | 4 user connections |
| | | Image Processing | 4 user connections |
| | | MATLAB Compiler | 4 user connections |
| | | Wavelet | 4 user connections |
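Before submitting a large MATLAB job, it can be worth confirming that the toolboxes you need are visible to the installation you intend to use. A minimal sketch (the matlab/2016a module name assumes the <software>/<version> convention; ver is MATLAB's own command for listing installed products):

```bash
module load matlab/2016a                      # assumed module name for the 2016a installation
matlab -nodisplay -nosplash -r "ver; exit"    # print the visible MATLAB products/toolboxes, then quit
```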
R Packages
A: abind, acepack, actuar, ade4, ade4TkGUI, adehabitat, AER, akima, alr3, anchors, ape
B: base, bdsmatrix, biglm, BIOMOD, Biobase, bitops, boot, BufferedMatrix
C: car, caTools, chron, CircStats, class, clim.pact, cluster, coda, codetools, coin, colorspace, compiler, CompQuadForm, coxme, cubature
D: DAAG, datasets, DBI, degreenet, deldir, Design, digest, diptest, DynDoc, dynlm
E: e1071, Ecdat, effects, ellipse, ergm, evaluate, expm
F: fBasics, fCalendar, fEcofin, fields, flexmix, foreach, foreign, Formula, fSeries, fts, fUtilities
G: gam, gbm, gclus, gdata, gee, geoR, geoRglm, ggplot2, gpclib, gplots, graphics, grDevices, grid, gtools
H: hdf5, hergm, hexbin, Hmisc, HSAUR
I: igraph, ineq, inline, ipred, iquantitator, ISwR, iterators, itertools, its
K: kernlab, KernSmooth, kinship
L: latentnet, lattice, leaps, limma, lme4, lmtest, locfit, logspline
M: mapproj, maps, maptools, mAr, marray, MASS, Matrix, matrixcalc, MatrixModels, maxLik, mboost, mclust, MCMCpack, mda, MEMSS, methods, mgcv, mice, misc3d, miscTools, mitools, mix, mlbench, mlmRev, mlogit, modeltools, moments, MPV, msm, multcomp, multicore, mutatr, mvtnorm
N: ncdf, network, networksis, nlme, nnet, nor1mix, np, numDeriv, nws
O: oz
P: parallel, party, PBSmapping, permute, pixmap, plm, plyr, png, prabclus, proto, pscl
Q: qtl, quadprog, quantreg
R: RandomFields, randomForest, RANN, RArcInfo, raster, rbenchmark, rcolony, RColorBrewer, Rcpp, RcppArmadillo, ReadImages, relevent, reshape, rgdal, rgenoud, rgeos, rgl, Rglpk, rjags, rlecuyer, rmeta, robustbase, ROCR, RODBC, rpanel, rpart, RSQLite, RUnit
S: sampleSelection, sandwich, scatterplot3d, SDMTools, sem, sfmisc, sgeostat, shapefiles, shapes, slam, sm, sna, snow, snowFT, sp, spam, SparseM, spatial, SpatialTools, spatstat, spdep, splancs, splines, statmod, statnet, stats, stats4, stringr, strucchange, subselect, survey, survival, systemfit
T: tcltk, tcltk2, TeachingDemos, testthat, timeDate, timeSeries, tis, tkrplot, tools, tree, tripack, truncreg, trust, TSA, tseries, tweedie
U: urca, utils
V: vcd, vegan, VGAM, VIM
W: waveslim, wavethresh, widgetTools
X: XML, xtable, xts
Z: Zelig, zoeppritz, zoo
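To check whether a particular package from the lists above is present under a given R version, load the corresponding module and query the installed library. A minimal sketch (the R/3.2.2 module name assumes the <software>/<version> convention; vegan is just one example package from the list above):

```bash
module load R/3.2.2                                    # assumed module name
Rscript -e 'print(rownames(installed.packages()))'     # list every package in this R's library paths
Rscript -e 'library(vegan); sessionInfo()'             # test-load one package and report its version
```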
Linux Shells
The following Linux shells are available on HPC systems (bash is the default): bash, csh, dash, ksh, tcsh, and zsh.
Compression Utilities
The following archiving/compression applications are available on HPC systems: 7za, bzip2, gzip, pbzip2, tar, unzip, xz, and zip.
Note that the versions of zip and unzip installed on HPC have an upper size limit of 2GB. Most active HPC users consume significantly more than 2GB of disk space, so tar (combined with a compressor such as gzip) is usually a better choice for large archives. If you need assistance with using tar, please contact HPC staff.
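As an illustration, a gzip-compressed tar archive avoids the 2GB limitation mentioned above; a minimal example (the file and directory names are placeholders):

```bash
tar -czf results.tar.gz results/    # create a gzip-compressed archive of a directory
tar -tzf results.tar.gz             # list the archive's contents without extracting
tar -xzf results.tar.gz             # extract the archive into the current directory
```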
- Environment module files are located in /sw/modules
- Most scientific software is installed in /sw/<software>/<version>/ (the module show example after this list illustrates how a module points into these paths).
- Components used by programmers (e.g., stand-alone libraries) are generally installed in /sw/common/. Installation of frequently required libraries is additionally done onto local system disks (using yum), if the libraries are available in repositories.
- Live upgrades of software should only be performed when no login/compute node is using the software. Compute nodes should be reimaged rather than live upgraded. The login node may need to be live upgraded, because jobs are always running on this system.
- Some packages (e.g., BLCR) are rebuilt from source RPMs. BLCR, in particular, needs to be recompiled for each new kernel installed.
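To see exactly what a given module does to your environment (and where its software lives under /sw), module show prints the module file contents. A minimal sketch (the python/2.7.3 module name assumes the <software>/<version> convention):

```bash
module show python/2.7.3    # typically reveals prepend-path lines pointing into /sw/<software>/<version>/
module list                 # list the modules currently loaded in this shell
```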
The following table provides information about operating systems used on physical servers managed (in some way) by HPC staff.
| Operating System | Primary service(s) provided | Typically accessed from |
|---|---|---|
| RedHat Enterprise Linux 6.x | HPC login nodes, HPC compute nodes | Desktop or laptop computers, HPC login nodes |
| SUSE Linux Enterprise Server 11.x | CIFS fileshares, NFS fileshares | Desktop or laptop computers, HPC login and compute nodes |
| Windows 2012 Server | CIFS fileshares | Desktop or laptop computers (Cairns) |
Vendors usually require that an enterprise O/S be installed on physical servers covered by a maintenance agreement.
JCU researchers wishing to host servers/storage in a datacentre must contact ITR management before any purchase is considered.
All virtual machines (VMs) offered by HPC are, by default, provided as Infrastructure as a Service (IaaS). VM owners are responsible for daily maintenance operations on their VMs. For security reasons, all systems are registered for application of automatic updates. HPC staff will apply patches/updates (from RedHat and EPEL repositories only) to VMs where the automatic update process fails.
Internal (conducted by ITR) and external security audits take place on all publicly visible systems at times determined by ITR management. VM owners are responsible for fixing any security concerns identified in these audits. HPC staff may be consulted or be asked to provide assistance with such fixes.
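For VM owners performing their own maintenance, a manual update check on a RHEL-based VM might look like the following (illustrative only; it assumes the VM's repositories already point at the RedHat and EPEL repositories mentioned above):

```bash
yum check-update    # show packages with pending updates
sudo yum update     # apply all available updates (requires root privileges)
```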