UK HPC - Hardware
The Lipscomb HPC Cluster (dlx.uky.edu) was built for UK by
Dell Inc. and is rated at just over 40 Teraflops.
Basic Nodes
- 376 Nodes (4512 cores); ~39.34 Teraflops
- Intel Xeon X5650 (Westmere) @ 2.66 GHz.
- 2 sockets/node x 6 cores/socket = 12 cores/node.
- 36 GB/node.
- 250 GB local (internal) SAS disk.
- Linux OS
Hi-Mem (Fat) Nodes
- 8 Nodes (256 cores); ~0.95 Teraflops
- Intel Xeon X7560 (Nehalem-EX) @ 2.26 GHz.
- 4 sockets/node x 8 cores/socket = 32 cores/node.
- 512 GB/node.
- 1 TB mirrored local (internal) SAS disk.
- Linux OS
Login Nodes
- 2 Nodes (24 cores)
- Intel Xeon X5650 (Westmere) @ 2.66 GHz.
- 2 sockets/node x 6 cores/socket = 12 cores/node.
- 36 GB/node.
- 250 GB local (internal) SAS disk.
- Linux OS
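The per-node and total core counts quoted above follow from the socket and core figures by simple multiplication. A minimal Python sketch that reproduces them (the dictionary layout is illustrative, not from the source):

```python
# Node types as listed above: count, sockets per node, cores per socket.
node_types = {
    "Basic":  {"nodes": 376, "sockets": 2, "cores_per_socket": 6},
    "Hi-Mem": {"nodes": 8,   "sockets": 4, "cores_per_socket": 8},
    "Login":  {"nodes": 2,   "sockets": 2, "cores_per_socket": 6},
}

for name, n in node_types.items():
    cores_per_node = n["sockets"] * n["cores_per_socket"]
    total_cores = n["nodes"] * cores_per_node
    print(f"{name}: {cores_per_node} cores/node, {total_cores} cores total")
# Basic: 12 cores/node, 4512 cores total
# Hi-Mem: 32 cores/node, 256 cores total
# Login: 12 cores/node, 24 cores total
```

These totals match the figures in the lists above (4512, 256, and 24 cores).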
Interconnect
- Mellanox Quad Data Rate Infiniband switch
- 2:1 over-subscription.
Global cluster filesystem
- Panasas ActiveScale.
- 260 TB raw with 208 TB usable.
- 7.8 GB/s throughput and 79,300 IOPS.
Other Information
- Fills most of 9 equipment racks.
- Uses about 180 kW when loaded.
- Dedicated TSM/HSM node for fast access to near-line storage.
[Coming soon.]
Pictures
Basic Nodes (front)
Basic Nodes (back)
Hi-Mem (Fat) Nodes
Login and Admin Nodes
IB Switches (front)
IB Switches (back)
Panasas Disk Store
Panasas Disk Store (more disks)
Racks
Past HPC Equipment
For comparison, here is some information about older hardware.