The Montblanc computing cluster
Montblanc is the computing cluster of the PMC group of the CPHT laboratory, funded as part of the ERC Synergy grant Frontiers in Quantum Materials Control. All the nodes of the cluster are interconnected by a high-speed QDR InfiniBand network.
Important
In 2022, all Montblanc nodes were integrated into the Cholesky cluster on a dedicated QDR InfiniBand network.
User resources
- 2 login front-end nodes of the Cholesky cluster (DNS Round Robin for load sharing and fault tolerance)
- A BeeGFS parallel file system providing access to the HOME directories and the Software Modules over the Cholesky Ethernet network (see the sketch after this list)
- An NFS network file system for the WORK directories: 88 TB of usable space
- An Intel QLogic InfiniBand QDR (Quad Data Rate) interconnect network at 40 Gb/s
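The software stack shared through the Software Modules tree is typically used from the login nodes with the standard `module` commands. The sketch below assumes the usual Environment Modules interface; the module name shown is a hypothetical placeholder, and the real names should be taken from `module avail`.

```bash
# List the software modules shared over the BeeGFS tree
module avail

# Load a toolchain before compiling or submitting jobs
# (the module name below is a hypothetical placeholder)
module load intel-mpi
```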
CPU resources
Nb of Nodes | Nb of CPUs per node | DELL server model | CPU reference | CPU generation | Max memory | Max reserved memory
---|---|---|---|---|---|---
124 | 2 | PowerEdge C6220 II | Intel(R) Xeon(R) CPU E5-2650 v2, 8 cores @ 2.60 GHz | Ivy Bridge EP | 64 GB | 62 GB
SLURM partitions
These are the partitions for CPU computing on Montblanc nodes.
Partition Name | Nodes | Cores per node | Max. memory per node | Max. Walltime |
---|---|---|---|---|
montblanc_short | montblanc[001-024] | 16 | 64 GB | 24:00:00 |
montblanc_qmac | montblanc[025-124] | 16 | 64 GB | unlimited |
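As an illustration, a job can be submitted to one of these partitions with a batch script along the following lines. This is a minimal sketch: the job name, module name, and executable (`my_app`) are hypothetical placeholders, and the memory request is chosen to stay below the 62 GB of reservable memory per node listed above.

```bash
#!/bin/bash
#SBATCH --job-name=montblanc_test
#SBATCH --partition=montblanc_short   # 24:00:00 walltime limit (see table above)
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=16          # 16 cores per node
#SBATCH --mem=60G                     # below the 62 GB reservable per node
#SBATCH --time=24:00:00

# Module name and executable are placeholders to adapt
module load intel-mpi
srun ./my_app
```

The script is then submitted from a login node with `sbatch job.sh`; jobs longer than 24 hours should target the `montblanc_qmac` partition instead.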