Contact
Name
Andres Tanasijczuk
Position
Physicist, engineer or computer scientist
Address
Centre for Cosmology, Particle Physics and Phenomenology - CP3
Université catholique de Louvain
2, Chemin du Cyclotron - Box L7.01.05
B-1348 Louvain-la-Neuve
Belgium
Phone
+32 10 473036
Projects
Research directions:
Cosmology and General Relativity
Data analysis in HEP, astroparticle and GW experiments
Detector commissioning, operation and data processing
Experiments and collaborations:
Active projects
Virgo - computing
Giacomo Bruno, Andres Tanasijczuk
The existing UCLouvain/CP3 computing cluster has been augmented with GW-dedicated computing and storage resources. The cluster is integrated into the International Gravitational-Wave Observatory Network (IGWN) Computing Grid, leveraging the infrastructure CP3 already has in place for serving the cluster to the Worldwide LHC Computing Grid (WLCG). The UCLouvain GW group has also taken on the responsibility of maintaining at the UCLouvain cluster a service that hosts Virgo data and serves it to GW analysis jobs submitted over the Grid by the LIGO, Virgo and KAGRA collaborations.
Worldwide LHC Computing Grid: the Belgian Tier2 project
Giacomo Bruno, Jérôme de Favereau, Pavel Demin, Vincent Lemaitre, Andres Tanasijczuk
The Worldwide LHC Computing Grid (WLCG) is a globally distributed computing infrastructure, coordinated by software middleware, that allows seamless use of shared storage and computing resources.
About 10 PB of data are produced every year by the experiments running at the LHC collider. These data must be processed (through iterative, progressively refined calibration and analysis) by a large scientific community that is widely distributed geographically.
Instead of concentrating all the necessary computing resources in a single location, the LHC experiments decided to set up a network of computing centres distributed all over the world.
The overall WLCG computing resources needed by the CMS experiment alone in 2016 amounted to about 1500 kHS06 (HEP-SPEC06) of computing power, 90 PB of disk storage and 150 PB of tape storage. Working in the context of the WLCG translates into seamless access to shared computing and storage resources. End users do not need to know where their applications run: the choice is made by the underlying WLCG software on the basis of resource availability, the demands of the user application (CPU, input and output data, etc.) and the privileges held by the user.
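This job-placement model can be sketched with a minimal HTCondor-style submit description (a sketch only: the executable, file names and resource figures are hypothetical, and real WLCG submissions go through experiment-specific middleware such as CRAB for CMS). The user declares what the job needs; the grid middleware decides where it runs:

```
# Hypothetical HTCondor submit description for a grid analysis job.
# The user states requirements; the site where the job lands is chosen
# by the middleware, not by the user.
universe              = vanilla
executable            = analyze_events.sh   # user's analysis wrapper (hypothetical)
arguments             = input.root
transfer_input_files  = input.root          # or read directly from grid storage
request_cpus          = 1
request_memory        = 2 GB
request_disk          = 4 GB
output                = job.out
error                 = job.err
log                   = job.log
queue
```

HTCondor is widely used as the workload management system at WLCG sites and also underpins the IGWN Computing Grid, which is why the same cluster infrastructure can serve both communities.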
Back in 2005 UCL proposed the WLCG Belgian Tier2 project, involving the six Belgian universities participating in CMS. The Tier2 project consists of contributing to the WLCG by building two computing centres, one at UCL and one at the IIHE (ULB/VUB).
The UCL site of the WLCG Belgian Tier2 is deployed in a dedicated room close to the cyclotron control room of the IRMP Institute and is currently a fully functional component of the WLCG.
The UCL Belgian Tier2 project also aims to integrate other scientific computing projects, bring them onto the Grid, and share resources with them. The projects currently integrated in the UCL computing cluster are MadGraph/MadEvent, NA62 and Cosmology.
External collaborators: CISM (UCL), Pascal Vanlaer (Belgium, ULB), Lyon computing centre, CERN computing centre.
Non-active projects
Publications in IRMP
All my publications on Inspire
Number of publications as IRMP member: 22
Last 5 publications
2021
More publications