Access to Regional HPC Clusters

EPSRC have funded several Regional Tier 2 HPC Clusters, designed to address the gulf in capability between local University Tier 3 HPC Clusters and the National Tier 1 Service (ARCHER).

QMUL are consortium members of the following Tier 2 clusters, launched in Q2 2017:


Name                                Short Name   Host    Cores    Nodes   RAM / Node   Scheduler
Materials and Molecular Modelling   Thomas       UCL     17,280   720     128 GB       SGE
HPC Midlands Plus                   Athena       Lboro   14,336   512     128 GB       Slurm

Current status

Thomas and Athena are accepting a small number of pilot users. During this phase, documentation will be a little sparse. We will obtain further information on Jade soon (likely summer 2017).

Who can access these clusters?

QMUL Academics may apply to use a cluster free of charge if:

  • they are performing predominantly EPSRC-funded research
  • jobs are an appropriate size for a Tier 2 service, e.g. parallel jobs running on 500+ cores (the QMUL Apocrita service remains appropriate for many smaller workloads)
  • jobs are well-tested and known to run successfully
  • the scope and size of the work is stated in advance as part of an application to use the cluster
  • the work fits the designated application areas for the cluster

Gaining access to the service

Processes for the live service are still being formulated; however, access is always provided via the Point of Contact at the member institution (QMUL). In the first instance, please send an email explaining the research you are performing, the applications you use, and the typical job size and runtime required.

Loughborough provide account access via SAFE (Service Administration From EPCC). UCL aim to use the same SAFE portal, but are using a manual, email-based process in the meantime.

Specific information

Thomas

Further information, including available software packages, is available on the UCL wiki, which also includes some example job submission scripts.

Users are given 50GB of home space, and 200GB of scratch space.

Maximum wallclock time is 48 hours; maximum job size is 864 cores.

Thomas uses Intel Omni-Path Architecture.
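The example scripts on the UCL wiki are authoritative; purely to illustrate the general shape of an SGE batch script within Thomas's limits, here is a hedged sketch (the parallel environment name, memory request, module name, and executable are all assumptions, not confirmed Thomas values):

```shell
#!/bin/bash -l
# Sketch of an SGE submission script for Thomas. The PE name ("mpi"),
# per-core memory value, and module/executable names are assumptions;
# consult the UCL wiki examples for the definitive versions.
#$ -N example_job            # job name
#$ -l h_rt=48:0:0            # wallclock request (Thomas maximum: 48 hours)
#$ -l mem=4G                 # per-core memory request (assumed value)
#$ -pe mpi 48                # 48 cores = two 24-core Thomas nodes (assumed PE name)
#$ -cwd                      # run the job from the submission directory

module load my_application                     # hypothetical module name
mpirun -np $NSLOTS my_application input.dat    # $NSLOTS is set by SGE to the core count
```

The script would be submitted with `qsub`, and job size should stay within the 864-core maximum noted above.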

Athena

Documentation is still being prepared; in the meantime, a package list is available on the Midlands Plus pages. The bulk of applications are being added during May 2017.

Maximum wallclock time is 100 hours.

Athena uses Mellanox EDR Infiniband interconnects.
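For comparison with the SGE workflow on Thomas, Athena uses Slurm. As a hedged sketch only (the module and executable names are assumptions, and any required partition or account options are omitted; check the Midlands Plus documentation for the definitive examples):

```shell
#!/bin/bash
# Sketch of a Slurm submission script for Athena. Module and executable
# names are assumptions; partition/account settings, if required, are
# site-specific and not shown here.
#SBATCH --job-name=example_job
#SBATCH --time=100:00:00           # wallclock request (Athena maximum: 100 hours)
#SBATCH --nodes=2                  # request two nodes
#SBATCH --ntasks-per-node=28       # Athena nodes have 28 cores each (14,336 / 512)

module load my_application         # hypothetical module name
srun my_application input.dat      # launch one MPI rank per task
```

The script would be submitted with `sbatch`, and queued jobs can be inspected with `squeue`.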