How to work interactively on Ingrid Cluster
The main purpose of the Ingrid cluster is to serve as half of the Belgian Tier-2 computing facility for the CMS/LHC experiment located at CERN, Switzerland. Ingrid is principally a "grid cluster" connected to the WLCG grid infrastructure. We also provide interactive access to the cluster for physicists working in Belgian High Energy labs involved in the CMS experiment. Within the collaboration with UCL/CISM, we provide access to Ingrid for CISM users. See below for information and good practices for Ingrid usage.
Details about the composition of the cluster are available here
All the boxes are connected by a 1 Gb/s network. Note that this cluster is dedicated to running sequential jobs. If you have to run heavy parallel jobs (for example using an MPI library), you should consider using the CISM clusters, which are optimized for such use cases.
Shell access to Ingrid is done with an SSH client. Every good Linux/Unix distribution includes one by default. For MS Windows users, we recommend PuTTY. The information needed for the connection is:
- hostname: ingrid-ui1.cism.ucl.ac.be (SL6), ingrid-ui2.cism.ucl.ac.be (SL5)
- port: 22
- protocol: ssh v2
- your username and password
Credentials are the same as for the other CP3 computers. You just need to ask the CP3 support team to activate them.
Example of connection from a Linux CP3 workstation:
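(a minimal sketch; `jdoe` is a placeholder, replace it with your own username — SSH protocol 2 is the default in any recent client)

```shell
# Connect to the SL6 user interface; use ingrid-ui2 instead for SL5
ssh jdoe@ingrid-ui1.cism.ucl.ac.be
```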
As batch scheduler we use Condor. For more information on how to use it, see this page
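As an illustration, a minimal Condor submit description file could look like the following sketch. The file and executable names are hypothetical; check the Condor page linked above for the options actually supported on Ingrid.

```
# job.sub -- hypothetical minimal Condor submit file
universe   = vanilla
executable = myanalysis.sh
output     = job.out
error      = job.err
log        = job.log
queue
```

You would then submit it with `condor_submit job.sub` and monitor it with `condor_q`.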
When you use Ingrid, you have access to several file systems of different sizes:
- On each worker node and from the UI, you have access to a shared home directory (2 TB in total)
- On each worker node and from the UI, you have access to a shared scratch directory (/nfs/scratch) of 20 TB (a scratch link exists in your home)
- On each worker node and from the UI, you have access to a user directory (/nfs/user) of 40 TB (a storage link exists in your home)
- On each worker node, you have access to a local /scratch partition of 200 GB
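Before writing large outputs, you can check how full the shared volumes listed above are with `df` (paths as given above; the local /scratch line only makes sense on a worker node):

```shell
df -h /nfs/scratch /nfs/user   # shared scratch and user volumes
df -h /scratch                 # node-local scratch, on a worker node
```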
If you need more storage space on Ingrid, ask the administrator team. If the request is justified and space is available, we can enlarge your quota.
- We are not a storage center: at the end of your computations, move your results out to dedicated storage facilities. Data on scratch is not safe.
- The only storage volume that is backed up is the home directories; the scratch volume and the large storage are not and will never be backed up.
- The code you develop should be versioned (see the CMS git repository, or use our GIT repository with your CP3 credentials).
If you need specific software (compiler, library, tool, ...), first check that it is not already available. A large number of software packages, in various versions, are already installed, for example:
- python (all versions above 2.5)
- NA62 software
Most of these can be used by loading the appropriate module:
> module avail
------------------------------ /nfs/soft/modules -------------------------------
cms/cmssw              grid/grid-environment      python/python27_sl6
crab/crab_2_9_0        matlab/matlab              root/5.34.05-sl5_gcc41
geant/geant_sl5        python/beanstalk-client    root/5.34.09-sl6_gcc44
geant/geant_sl6        python/python27_sl5
module load root/5.34.09-sl6_gcc44
module unload root
If there is no module for the software you need, you can add the executable's path to your $PATH variable by adding the following to your .bashrc config file:
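For example (the ~/mysoft path is just a placeholder for wherever you installed the software):

```shell
# Append -- never prepend -- your own bin directory to PATH
export PATH="$PATH:$HOME/mysoft/bin"
```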
Do not add the new path at the beginning of $PATH, as doing so can slow down your whole session. If you want to replace an existing command, please do it with an alias instead, as in:
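For instance, to shadow the system `root` command with your own build (the path is a placeholder):

```shell
# The alias takes precedence over the command found in PATH
alias root="$HOME/mysoft/bin/root"
```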
If the desired software is not available, you can either:
- install this software in your home
- ask the administrator team
For the second option, as for storage, the request must be justified and technically possible! For High Energy CMS users, CMSSW and the classic CERN tools are already installed. See below for usage.
Working with CMSSW on ingrid
- make sure your environment is ready:
module load cms/cmssw
- list all available cmssw releases:
scram list CMSSW
- check which is the recommended release here. In case of doubt, double-check with other users.
- get a working directory:
scram project CMSSW_X_Y_Z_patchW
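Putting the steps above together, a typical first session might look like the sketch below. CMSSW_X_Y_Z_patchW stands for whichever release you picked, and `cmsenv` is the usual CMSSW alias for setting up the release runtime via scram:

```shell
module load cms/cmssw              # prepare the CMSSW environment
scram list CMSSW                   # list the available releases
scram project CMSSW_X_Y_Z_patchW   # create the working directory
cd CMSSW_X_Y_Z_patchW/src
cmsenv                             # set up the runtime for this release
```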