eResearch HPC

UTS researchers can access High Performance Computing (HPC) through the eResearch HPCC (High Performance Computing Cluster).

The goals are:

  • provide a shared resource across the UTS research community
  • provide a test-bed for larger HPCC projects destined for Intersect/NCI

Access to the Cluster

The eResearch team will need to grant you access. Simply email us to introduce yourself and describe your requirements.

Cluster Hardware

The HPCC consists of:

  • Fourteen compute nodes, one login node, and one head node.
  • The number of cores per node ranges from 28 to 64, for a total of about 540 cores.
  • Most nodes have 256 GB of RAM, but some have 512 GB for applications that require more memory. Total distributed memory is about 3.8 TB (one possible breakdown is sketched after this list).
  • Some nodes contain GPUs: either dual Nvidia M2090 or newer dual Nvidia K80 cards.
  • Most nodes have at least 6 TB of locally attached disk.
  • A 73 TB Panasas ActiveStor14 SSD-accelerated storage system dedicated to the HPC.
  • 700 TB of Isilon storage shared with other eResearch infrastructure.
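
As a rough sanity check on the totals above, here is a minimal Python sketch that sums cores and RAM over a per-node breakdown. The breakdown itself is hypothetical, chosen only to land near the quoted figures; the actual node mix is not published here and may differ.

    # Sanity-check the quoted aggregate figures (~540 cores, ~3.8 TB RAM)
    # from a HYPOTHETICAL breakdown of the 14 compute nodes.
    # The real node mix may differ; this only illustrates the arithmetic.
    node_groups = [
        # (node count, cores per node, RAM per node in GB)
        (7, 28, 256),   # hypothetical: standard compute nodes
        (6, 48, 256),   # hypothetical: mid-size compute nodes
        (1, 64, 512),   # hypothetical: large-memory node
    ]

    total_cores = sum(n * cores for n, cores, _ in node_groups)
    total_ram_tb = sum(n * ram for n, _, ram in node_groups) / 1024

    print(f"Total cores: {total_cores}")              # 548 -- "about 540"
    print(f"Distributed RAM: {total_ram_tb:.2f} TB")  # 3.75 TB -- "about 3.8 TB"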