TSUBAME 2.5 - Cluster Platform SL390s G7, Xeon X5670 6C 2.930GHz, Infiniband QDR, NVIDIA K20x

Site: GSIC Center, Tokyo Institute of Technology
System URL: http://www.gsic.titech.ac.jp/en/tsubame
Manufacturer: NEC/HPE
Cores: 74,358
Memory: 74,358 GB
Processor: Xeon X5670 6C 2.93GHz
Interconnect: Infiniband QDR
Performance
Linpack Performance (Rmax): 2,843 TFlop/s
Theoretical Peak (Rpeak): 5,609.4 TFlop/s
Nmax: 1,760,000
Power Consumption
Power: 1,398.61 kW (Submitted)
Software
Operating System: SUSE Linux Enterprise Server 11
Compiler: GCC
Math Library: Intel MKL
MPI: OpenMPI 1.7.2
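The Rmax, Rpeak, and submitted power figures above imply an HPL efficiency of roughly 51% and an energy efficiency of about 2.0 GFlop/s per watt. A minimal Python sketch of that arithmetic follows; the values are copied from this page and the variable names are illustrative only.

    # Derived metrics for TSUBAME 2.5 from the figures listed above.
    rmax_tflops = 2843.0    # Linpack performance (Rmax)
    rpeak_tflops = 5609.4   # Theoretical peak (Rpeak)
    power_kw = 1398.61      # Submitted power

    # Fraction of theoretical peak achieved by the HPL run
    hpl_efficiency = rmax_tflops / rpeak_tflops

    # TFlop/s per kW is numerically equal to GFlop/s per watt
    gflops_per_watt = rmax_tflops / power_kw

    print(f"HPL efficiency:    {hpl_efficiency:.1%}")                  # ~50.7%
    print(f"Energy efficiency: {gflops_per_watt:.2f} GFlop/s per W")   # ~2.03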
List Rank System Vendor Total Cores Rmax (TFlop/s) Rpeak (TFlop/s) Power (kW)
11/2017 48 Cluster Platform SL390s G7, Xeon X5670 6C 2.93GHz, Infiniband QDR, NVIDIA K20x NEC/HPE 76,032 2,785.0 5,735.7 1,398.61
06/2017 44 Cluster Platform SL390s G7, Xeon X5670 6C 2.93GHz, Infiniband QDR, NVIDIA K20x NEC/HPE 76,032 2,785.0 5,735.7 1,398.61
11/2016 40 Cluster Platform SL390s G7, Xeon X5670 6C 2.93GHz, Infiniband QDR, NVIDIA K20x NEC/HPE 74,520 2,785.0 5,617.5 927.86
06/2016 31 Cluster Platform SL390s G7, Xeon X5670 6C 2.93GHz, Infiniband QDR, NVIDIA K20x NEC/HPE 74,520 2,785.0 5,617.5 927.86
11/2015 24 Cluster Platform SL390s G7, Xeon X5670 6C 2.93GHz, Infiniband QDR, NVIDIA K20x NEC/HPE 76,032 2,785.0 5,735.7 1,398.61
06/2015 21 Cluster Platform SL390s G7, Xeon X5670 6C 2.93GHz, Infiniband QDR, NVIDIA K20x NEC/HPE 76,032 2,785.0 5,735.7 1,398.61
11/2014 15 Cluster Platform SL390s G7, Xeon X5670 6C 2.93GHz, Infiniband QDR, NVIDIA K20x NEC/HPE 76,032 2,785.0 5,735.7 1,398.61
06/2014 13 Cluster Platform SL390s G7, Xeon X5670 6C 2.93GHz, Infiniband QDR, NVIDIA K20x NEC/HPE 76,032 2,785.0 5,735.7 1,398.61
11/2013 11 Cluster Platform SL390s G7, Xeon X5670 6C 2.930GHz, Infiniband QDR, NVIDIA K20x NEC/HPE 74,358 2,843.0 5,609.4 1,398.61