THIS PAGE CONTAINS OLD INFORMATION - HPC staff will work on updating it soon.

This page is intended as a quick introduction for new users submitting their first job to the HPRC Cluster.

In the typical workflow, the user:

  1. Logs into the cluster
  2. Prepares a submission script for their jobs
  3. Submits their jobs to the Job Scheduler
  4. Monitors their jobs
  5. Collects the output of their jobs.

A few things that new users should be aware of:

  • Typically, jobs are not run interactively, except when:
    • running small one-off jobs
    • evaluating the resources required for bigger jobs
    • using graphical applications such as MATLAB
    • Examples of interactive jobs can be found on pages labelled "interactive-cluster-job".
  • HPRC Cluster software does not run in a window on your desktop, nor is it launched by clicking an icon on a network drive.
  • Users log into the cluster and describe their job to the job scheduler, which runs the job when resources become available.



Logging In

The first step in using the HPRC Cluster is to log in to the login node.


For detailed instructions, see the Logging into the Cluster quick start guide.


Software Packages

The HPRC Cluster uses environment modules to manage the available software packages. This allows multiple versions of the same software to be installed without interfering with each other. To enable the environment module system, the following command needs to be executed on the command line:

No Format
-bash-4.1$ source /etc/profile.d/


The software that is available on the HPRC cluster is listed here: HPRC User Software. Alternatively, you can query the software available on the cluster with the following commands:



  • module avail: displays a list of the available software
  • module help <software>: displays the version number and a brief synopsis for <software>

Example: "module avail" run on Thu Mar 6 11:19:21 EST 2014
No Format
-bash-4.1$ module avail
--------------------------------------------------------------------------------------- /usr/share/Modules/modulefiles ----------------------------------------------------
MPInside/3.5.1     compiler/gcc-4.4.5 module-cvs         modules            mpich2-x86_64      null               perfcatcher
chkfeature         dot                module-info        mpi/intel-4.0      mpt/2.05           perfboost          use.own
---------------------------------------------------------------------------------------------- /etc/modulefiles -----------------------------------------------------------
compat-openmpi-x86_64 openmpi-x86_64
------------------------------------------------------------------------------------------------- /sw/modules -------------------------------------------------------------
4ti2                      blast/2.2.23              crimap_Monsanto           hdf5                      migrate/3.6(default)      picard-tools              tmap/1.1
BEDTools                  blast/2.2.29(default)     dx                        hmmer                     mira                      proj                      tmhmm
EMBOSS                    bowtie                    elph                      ima2                      modeltest                 pvm                       topali
GMT                       bwa/0.7.4(default)        enmtools                  jags                      molphy                    r8s                       towhee
Macaulay2                 caftools                  fasta                     java                      mpich2                    rainbowcrack              towhee-openmpi
Python/2.7                cap3                      fastme                    jcusmart                  mrbayes                   rpfits                    trans-abyss
R/2.15.1(default)         carthagene/1.2.2(default) ffmpeg                    jmodeltest                mrmodeltest               ruby/1.9.3                tree-puzzle
R/3.0.0                   carthagene/1.3.beta       fftw2                     lagan                     msbayes                   ruby/2.0.0                trinityrnaseq
abyss                     casacore                  fftw3                     lamarc                    ncar                      samtools                  udunits
ariadne                   cernlib                   garli                     lapack                    netcdf                    scalapack                 udunits2
arlequin                  cfitsio                   gdal                      libyaml/0.1.4             netphos                   scipy                     velvet
asap                      chlorop                   glimmer                   matlab/2008b              numpy                     seadas/6.2                wcslib
atlas                     clipper                   glpk                      matlab/2012a              oases                     seg                       wise2
bayesass                  clustalw                  gmp                       matlab/2012b              octave                    signalp                   wwatch3
beagle                    cluster                   gnu/4.1.2                 matlab/2013a(default)     openbugs                  sprng                     yasm
beast                     cns                       gnu/4.4.0                 maxent                    openjdk                   ssaha2                    zonation
beast-1.5.4               coils                     gnuplot                   maxima                    openmpi                   stacks
bfast                     colony2                   grass                     merlin                    pari                      structure
blacs                     consel                    gromacs                   migrate/3.2.15            paup                      targetp
blas                      crimap                    hdf                       migrate/3.5.1             phyml                     tclreadline/2.1.0


Running Jobs

New users often have misconceptions about how the HPRC Cluster runs jobs; the HPRC Cluster Explained page covers this in detail.

A simple way to run a job on the cluster is to create a shell script with embedded PBS directives containing the information the scheduler needs to schedule the job.

Example: paup with the ML_analysis.nex sample file

In this example we will run paup with the ML_analysis.nex sample file provided on the paup sample nexus files page. After logging into the cluster, download the example file with the command:

No Format
-bash-4.1$ wget
--2014-03-11 13:08:16--
Connecting to||:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 2990 (2.9K) [text/plain]
Saving to: “ML_analysis.nex”

100%[==============================================================>] 2,990       --.-K/s   in 0s

2014-03-11 13:08:17 (70.7 MB/s) - “ML_analysis.nex” saved [2990/2990]


Creating the job script

Using a text editor (examples include vim and nano), create your shell script with the filename: and the following contents (the colours are only used for illustration purposes below):



  1. The very first line of the script file is the shebang line; it must be the first line.
  2. The second section contains the PBS directives. For more information on PBS directives, please see the HPRC PBS script files page.
  3. The third section outputs information about the job, and is only included as an example of what can be done.
  4. The fourth section contains the commands that are actually run in the job. In this case we are using a bash shell.
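The script contents shown on the original page did not survive the export, so below is a minimal sketch of a script matching the four sections described above. The filename ML_analysis.sh, the `module load paup` line, and the `paup -n` invocation are assumptions for illustration, not the original example; the resource values mirror the qstat listing further down this page (1 node, 1 cpu, 5gb, 500 hours). Check the HPRC PBS script files page for the directives your job actually needs.

```shell
# Write a minimal four-section job script (the filename is an assumption).
cat > ML_analysis.sh <<'EOF'
#!/bin/bash
# Section 2: PBS directives (values are illustrative, matching the
# qstat listing later on this page).
#PBS -N ML_analysis
#PBS -l nodes=1:ppn=1
#PBS -l mem=5gb
#PBS -l walltime=500:00:00
#PBS -j oe

# Section 3: output some information about the job (example only).
echo "Job ${PBS_JOBID:-unknown} started on $(hostname) at $(date)"

# Section 4: the commands that actually do the work.
cd "${PBS_O_WORKDIR:-.}"
module load paup
paup -n ML_analysis.nex
EOF

# Sanity check: the script should at least parse as valid bash.
bash -n ML_analysis.sh && echo "script parses OK"
```

The `bash -n` check only verifies syntax; the script itself is meant to be run by the scheduler on a compute node, not executed by hand.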


Submitting the Job - qsub

The final step is to submit the job to the job scheduler:

No Format
-bash-4.1$ qsub

Monitoring the Job - qstat

Once the job has been submitted you can monitor its progress by using the qstat command.

When you first submit your job it is placed into the job queue, and its status column contains Q, meaning the job is in the queue:

No Format
-bash-4.1$ qstat
Job ID                    Name             User            Time Use S Queue
------------------------- ---------------- --------------- -------- - -----
148122.jobmgr              ML_analysis      jcxxxxxxx             0 Q normal


Once your job starts running its status changes to R:

No Format
-bash-4.1$ qstat
Job ID                    Name             User            Time Use S Queue
------------------------- ---------------- --------------- -------- - -----
148122.jobmgr              ML_analysis      jcxxxxx               0 R normal
More qstat examples
  • You can display all of the jobs you are running:

    No Format
    -bash-4.1$ qstat -u jcxxxxx -1n
                                                                                      Req'd    Req'd       Elap
    Job ID                  Username    Queue    Jobname          SessID  NDS   TSK   Memory   Time    S   Time
    ----------------------- ----------- -------- ---------------- ------ ----- ------ ------ --------- - ---------
    148185.jobmgr.hpc.jcu.  jcxxxxx     normal   ML_analysis           0     1      1    5gb 500:00:00 R  00:00:17   n026/19
  • All of the jobs running on the cluster:

    No Format
    -bash-4.1$ qstat
    Job ID                    Name             User            Time Use S Queue
    ------------------------- ---------------- --------------- -------- - -----
    132501.jobmgr              DDWoodpca1L      jcxxxxxx        581:37:3 R normal
    132502.jobmgr              DDWoodpca0.1L    jcxxxxxx        581:11:0 R normal
    132503.jobmgr              DDWoodpca0.01L   jcxxxxxx        581:02:3 R normal
    132504.jobmgr              DDWoodpca0.001L  jcxxxxxx        581:05:1 R normal
    132560.jobmgr              IBWoodpca1L      jcxxxxxx        575:36:2 R normal
    132561.jobmgr              IBWoodpca0.1L    jcxxxxxx        550:28:1 R normal
    132562.jobmgr              IBWoodpca0.01L   jcxxxxxx        573:21:3 R normal
    132563.jobmgr              IBWoodpca0.001L  jcxxxxxx        575:25:3 R normal
    142918.jobmgr              DDpca0.001pcb1LL jcxxxxxx        275:22:4 R normal
    144744.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        3597:28: R bigmem
    144745.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        2279:22: R bigmem
    144746.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        1098:06: R normal
    144747.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        928:26:1 R normal
    144748.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        942:19:5 R normal
    144753.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        1434:04: R normal
    144754.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        927:20:1 R normal
    144756.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        938:07:5 R normal
    145377.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        1145:52: R normal
    145379.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        1286:49: R normal
    145381.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        1034:06: R normal
    145382.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        740:05:5 R normal
    145384.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        1285:28: R normal
    145386.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        1028:30: R normal
    145387.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        1070:39: R normal
    145390.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        1238:34: R normal
    145391.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        2018:17: R normal
    145392.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        1585:10: R normal
    145708.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        1159:11: R normal
    145756.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        835:32:4 R normal
    145790.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        1083:20: R normal
    146395.jobmgr              ....001pcb2500LL jcxxxxxx        145:38:3 R normal
    146396.jobmgr              ....001pcb7500LL jcxxxxxx        145:00:3 R normal
    146397.jobmgr              ...001pcb12500LL jcxxxxxx        152:09:2 R normal
    146398.jobmgr              ....001pcb5000LL jcxxxxxx        151:49:3 R normal
    146399.jobmgr              ...001pcb15000LL jcxxxxxx        143:16:2 R normal
    146527.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        547:57:0 R normal
    147055.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        362:59:2 R normal
    147059.jobmgr              ..._567_bfw0_PBS jcxxxxxx        46:57:34 R normal
    147060.jobmgr              ..._547_bfw0_PBS jcxxxxxx        46:56:16 R normal
    147063.jobmgr              ..._543_bfw0_PBS jcxxxxxx        46:59:03 R normal
    147065.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        297:58:1 R normal
    147070.jobmgr              ..._573_bfw0_PBS jcxxxxxx        51:54:44 R normal
    147071.jobmgr              ..._549_bfw0_PBS jcxxxxxx        51:45:14 R normal
    147072.jobmgr              ..._550_bfw0_PBS jcxxxxxx        46:32:31 R normal
    147073.jobmgr              ..._532_bfw0_PBS jcxxxxxx        51:36:30 R normal
    147074.jobmgr              ..._575_bfw0_PBS jcxxxxxx        50:08:14 R normal
    147096.jobmgr              ..._582_bfw0_PBS jcxxxxxx        51:10:26 R normal
    147097.jobmgr              ..._578_bfw0_PBS jcxxxxxx        51:08:43 R normal
    147101.jobmgr              ..._568_bfw0_PBS jcxxxxxx        46:43:19 R normal
    147106.jobmgr              ..._539_bfw0_PBS jcxxxxxx        46:40:11 R normal
    147115.jobmgr     jcxxxxxx        50:46:46 R normal
    147203.jobmgr              NarrowBoxParent  jcxxxxxx        140:35:0 R normal
    147206.jobmgr              NarrowBoxParent  jcxxxxxx        150:53:4 R normal
    147207.jobmgr              NarrowBoxParent  jcxxxxxx        125:10:5 R normal
    147212.jobmgr              NarrowBoxParent  jcxxxxxx        151:57:4 R normal
    147213.jobmgr              NarrowBoxParent  jcxxxxxx        164:20:4 R normal
    147215.jobmgr              NarrowBoxParent  jcxxxxxx        160:11:4 R normal
    147216.jobmgr              NarrowBoxParent  jcxxxxxx        157:02:1 R normal
    147217.jobmgr              NarrowBoxParent  jcxxxxxx        163:39:0 R normal
    147218.jobmgr              NarrowBoxParent  jcxxxxxx        156:11:5 R normal
    147219.jobmgr              NarrowBoxParent  jcxxxxxx        140:37:4 R normal
    147220.jobmgr              NarrowBoxParent  jcxxxxxx        149:46:1 R normal
    147221.jobmgr              NarrowBoxParent  jcxxxxxx        141:12:4 R normal
    147222.jobmgr              NarrowBoxParent  jcxxxxxx        136:42:2 R normal
    147224.jobmgr              NarrowBoxParent  jcxxxxxx        132:20:5 R normal
    147225.jobmgr              NarrowBoxParent  jcxxxxxx        153:33:0 R normal
    147274.jobmgr              NarrowBox        jcxxxxxx        138:18:2 R normal
    147275.jobmgr              NarrowBox        jcxxxxxx        142:08:0 R normal
    147318.jobmgr              ..._bfw0_PBS_1.2 jcxxxxxx        39:59:20 R normal
    147325.jobmgr              NarrowBox        jcxxxxxx        148:00:2 R normal
    147331.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        248:00:3 R normal
    147332.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        229:03:3 R normal
    147391.jobmgr              NarrowBox        jcxxxxxx        126:32:5 R normal
    147398.jobmgr              NarrowBox        jcxxxxxx        125:48:5 R normal
    147470.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        232:50:4 R normal
    147471.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        196:34:1 R normal
    147472.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        415:56:4 R normal
    147494.jobmgr     jcxxxxxx        25:36:45 R normal
    147495.jobmgr     jcxxxxxx        26:03:45 R normal
    147496.jobmgr     jcxxxxxx        26:04:44 R normal
    147498.jobmgr     jcxxxxxx        25:59:16 R normal
    147499.jobmgr     jcxxxxxx        11:43:42 R normal
    147502.jobmgr              ...fw0_PBS_CG1.3 jcxxxxxx        131:28:4 R normal
    147576.jobmgr              NarrowBox        jcxxxxxx        60:24:11 R normal
    147577.jobmgr              NarrowBox        jcxxxxxx        60:36:39 R normal
    147584.jobmgr              NarrowBox        jcxxxxxx        52:12:36 R normal
    147590.jobmgr              NarrowBox        jcxxxxxx        49:06:43 R normal
    147616.jobmgr              NarrowBox        jcxxxxxx        33:08:43 R normal
    147619.jobmgr              NarrowBox        jcxxxxxx        30:28:18 R normal
    147621.jobmgr              NarrowBox        jcxxxxxx        30:33:22 R normal
    147629.jobmgr     jcxxxxxx        07:04:15 R normal
    147630.jobmgr     jcxxxxxx        07:15:13 R normal
    147631.jobmgr     jcxxxxxx               0 Q normal
    147632.jobmgr     jcxxxxxx               0 Q normal
    147634.jobmgr     jcxxxxxx               0 Q normal
    147635.jobmgr     jcxxxxxx               0 Q normal
    147636.jobmgr     jcxxxxxx               0 Q normal
    147637.jobmgr     jcxxxxxx        07:02:52 R normal
    147647.jobmgr              NarrowBox        jcxxxxxx        13:28:24 R normal
    147651.jobmgr          jcxxxxxx        21:46:18 R normal
    147652.jobmgr              NarrowBox        jcxxxxxx        19:52:35 R normal
    147657.jobmgr              NarrowBox        jcxxxxxx        18:04:53 R normal
    147665.jobmgr              NarrowBox        jcxxxxxx        16:17:01 R normal
    147702.jobmgr              NarrowBox        jcxxxxxx        13:03:45 R normal
    147820.jobmgr              NarrowBox        jcxxxxxx        10:16:38 R normal
    148021.jobmgr              NarrowBox        jcxxxxxx        08:41:26 R normal
    148022.jobmgr              NarrowBox        jcxxxxxx        08:42:34 R normal
    148087.jobmgr              NarrowBox        jcxxxxxx        07:45:33 R normal
    148095.jobmgr              NarrowBox        jcxxxxxx        07:03:12 R normal
    148096.jobmgr              NarrowBox        jcxxxxxx        07:10:18 R normal
    148099.jobmgr              NarrowBox        jcxxxxxx        06:54:12 R normal
    148100.jobmgr              NarrowBox        jcxxxxxx        06:38:54 R normal
    148101.jobmgr              NarrowBox        jcxxxxxx        06:46:31 R normal
    148108.jobmgr              NarrowBox        jcxxxxxx        05:57:17 R normal
    148124.jobmgr              NarrowBox        jcxxxxxx        05:11:52 R normal
    148126.jobmgr              NarrowBox        jcxxxxxx        04:49:25 R normal
    148138.jobmgr              NarrowBox        jcxxxxxx        04:30:48 R normal
    148150.jobmgr              NarrowBox        jcxxxxxx        03:46:22 R normal
    148154.jobmgr              NarrowBox        jcxxxxxx        03:19:45 R normal
    148179.jobmgr              NarrowBox        jcxxxxxx        00:10:33 R normal
    148180.jobmgr              NarrowBox        jcxxxxxx        00:10:57 R normal
    148182.jobmgr              NarrowBox        jcxxxxxx        00:30:19 R normal

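Because qstat prints plain columns, its listing can be post-processed with standard shell tools. As a sketch, here is one way to count your jobs by the state column; the two sample lines are copied from the listings above, since qstat itself only exists on the cluster, where you would pipe the real command instead (e.g. `qstat -u jcxxxxx | awk ...`).

```shell
# Count jobs by the state column (field 5: Q = queued, R = running).
# The sample lines stand in for real qstat output.
printf '%s\n' \
  '148122.jobmgr  ML_analysis  jcxxxxx  0         Q  normal' \
  '132501.jobmgr  DDWoodpca1L  jcxxxxx  581:37:3  R  normal' |
awk '{count[$5]++} END {for (s in count) print s, count[s]}'
```

This prints one count per state (here, one queued job and one running job).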

Deleting a job - qdel

If you need to delete your job, you can use the qdel command:

No Format
-bash-4.1$ qdel 148122

Your Job's Output

Different programs have different ways of outputting their data. If a program writes data directly to a file, then your results will be in whatever file you specified. If, however, the results are printed to standard output (as is the case for this example), then PBS captures them into a file for you.

No Format
-bash-4.1$ cat ML_analysis.o148122




Job Resources

It is important to match the resources requested with the PBS directives in your script to the actual resource usage of your job. There can be consequences for incorrectly specifying these resource requirements:

  • Walltime: your job will be killed if it exceeds the specified walltime.
  • Memory: overusing memory can push the compute node's memory into swap space, slowing down all jobs on that node. In the past this has also killed compute nodes, destroying all jobs running on them.
  • CPUs: using more CPUs than requested oversubscribes the node, slowing down every job sharing it.
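The resource requirements above are expressed as PBS directives at the top of your job script. The fragment below is only an illustration of the syntax; every value is a placeholder to be tuned to your job's measured needs, not a recommendation.

```shell
# Illustrative resource requests (tune each value to your own job):
#PBS -l walltime=24:00:00    # the job is killed if it runs longer than this
#PBS -l mem=4gb              # stay under this to avoid pushing the node into swap
#PBS -l nodes=1:ppn=2        # request exactly the CPUs your program will use
```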