HPC Cluster

  1. Submitting jobs to the compute nodes using qsub is recommended (see the example job script below).
  2. If you must work interactively, learn how to use the screen command in Linux.
  3. Be aware of how much of each resource your jobs require, particularly the amount of memory (RAM).  Overconsuming memory on a server can cause it to fail, and all running jobs on that server will need to be re-run (not just yours).
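
A minimal PBS-style job script is sketched below.  The job name, resource values, and program are placeholders, and the exact directives accepted vary between schedulers and sites, so check the HPC documentation before using it.

    #!/bin/bash
    #PBS -N my_analysis            # job name (placeholder)
    #PBS -l nodes=1:ppn=4          # one node, four cores (placeholder values)
    #PBS -l mem=8gb                # request memory explicitly
    #PBS -l walltime=02:00:00      # maximum run time
    cd $PBS_O_WORKDIR              # start in the directory qsub was run from
    ./my_analysis input.dat > output.log

Submit the script with qsub (e.g., qsub myjob.pbs) and check its status with qstat.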

Windows

  1. Identify the resource requirements for your work.
  2. Request a resource (virtual machine); HPC will be able to cater for most requirements.  Use Remote Desktop software to work on that resource (see the example below).
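
To connect from a Windows machine, the built-in Remote Desktop client can be launched from a command prompt; the hostname below is a placeholder for whatever name HPC gives your virtual machine.

    mstsc /v:your-vm-hostname

Alternatively, open "Remote Desktop Connection" from the Start menu and enter the same hostname.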

Research Data Protection

  1. Keep one copy of all research data on HPC.  Feel free to have other copies elsewhere (e.g., on your desktop/laptop computer).
  2. Establish a regular (e.g., daily) synchronisation of your data.  The process should transfer only files that have changed (see the rsync sketch after this list).
  3. Avoid deleting your files, or think long and hard before doing so.  HPC storage is not backed up.
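
A one-way, changes-only copy from a desktop or laptop to HPC can be done with rsync; the paths, username, and hostname below are placeholders to adapt to your own account and project area.

    # Copy new and changed files from the local project folder to HPC storage
    rsync -av --partial ~/projects/my_study/ username@hpc.example.edu:/data/my_study/

Scheduling this command (e.g., with cron) makes the synchronisation automatic, and rsync will only transfer files that have changed since the previous run.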

All data placed on HPC is replicated to an off-site location, but this does not protect against accidental deletion.

Backups

Unfortunately, HPC does not have sufficient funding to provide a regular backup target for research data.

For small amounts of data, you could back up to OneDrive (see the sketch below).
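
One way to do this from the command line is rclone, which supports OneDrive as a backend.  The remote name "onedrive" and the paths below are assumptions; you would first set up the remote with rclone config.

    # Mirror a local results folder to OneDrive, copying only changed files
    rclone sync ~/projects/my_study/results onedrive:backups/my_study/results

rclone sync makes the destination match the source (so deletions are propagated); use rclone copy instead if you only want to add and update files.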

There are many external options in this space with varying levels of cost.  Be aware of data transfer costs before signing up: many providers offer free uploads, but downloading (recovering) your data can be very expensive.
