eResearch HPC

High Performance Computing (HPC) can be accessed by UTS researchers via the eResearch HPCC (High Performance Computing Cluster).

The goals are:

  • provide a shared resource across the UTS research community
  • provide a test-bed for larger HPCC projects destined for Intersect/NCI.

Access to the Cluster

The eResearch team will need to give you access: simply email us introducing yourself and your requirements. Once you have access, read the HPC Getting Started page.
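
Once access is granted, connecting is typically done over SSH from a terminal. The line below is only a rough sketch; both the username and the hostname are placeholders, not the cluster's actual details:

    ssh your_uts_id@<login-node-address>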

Cluster Hardware

The HPCC consists of:

  • Seventeen compute nodes, one login node, and one head node.
  • The number of cores per node ranges from 28 to 64, for a total of a little over 600 cores.
  • Most nodes have 256 GB of RAM; some have 512 GB for applications that require more memory. Total distributed memory is about 4 TB.
  • Some nodes contain GPUs: either dual NVIDIA Tesla K80s or a single Tesla P100.
  • Most nodes have at least 3 TB of locally attached disk; some have 6 TB.
  • 700 TB of Isilon storage shared with other eResearch infrastructure.
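
Compute jobs on clusters like this are normally submitted through a batch scheduler rather than run directly on the login node. The sketch below assumes a PBS-style scheduler, which may not match this cluster's actual setup; the job name, resource figures, and program are placeholders, and the directives will differ if the cluster uses another scheduler such as Slurm:

    #!/bin/bash
    #PBS -N example_job                  # placeholder job name
    #PBS -l select=1:ncpus=28:mem=100gb  # one node, 28 cores, 100 GB of RAM
    #PBS -l walltime=02:00:00            # two-hour wall-clock limit

    cd $PBS_O_WORKDIR                    # run from the directory the job was submitted from
    ./my_program                         # placeholder for your executable

Requesting resources that fit within a node's actual configuration (for example, 28 cores and comfortably less than 256 GB of memory) helps the scheduler place the job efficiently.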

Acknowledging use of the HPCC

We would appreciate the following text, or similar, being used for acknowledgement:

"Computational facilities were provided by the UTS eResearch High Performance Computer Cluster."