We take advantage of the large dataset recorded by the CMS experiment in Run 2 to launch a systematic study of angular asymmetries in the ttW process, which are potentially highly sensitive to non-SM effects.
In synergy with the CP3 phenomenology group, we aim to report our results in a form that can be easily translated into EFT constraints.
The CMS detector at the LHC can be used to identify particles via the measurement of their ionization energy loss. The sub-detectors that have so far provided useful information for this experimental technique are the silicon strip tracker and the pixel detectors. Both the identification of low-momentum hadrons and the detection of new exotic massive long-lived charged particles have benefited from this method. Members of UCL pioneered this technique in the early days of the LHC and have been developing the tools for its use and calibration. Since 2010, particle identification with ionization energy loss has been the basis of the CMS inclusive search for new massive long-lived charged particles, which has provided the most stringent and model-independent limits to date on any model of new physics predicting such particles.
External collaborators: CMS collaboration.
A resonance consistent with the standard model Higgs boson, with a mass of about 125 GeV, was discovered in 2012 by the CMS and ATLAS experiments at the LHC. Using the available dataset (2011+2012 LHC runs), evidence was later found for the SM-predicted decay into a pair of tau leptons. The CP3 Louvain group has been involved in the channel where the Higgs boson is produced in association with a Z boson and decays into a pair of tau leptons.
A search for additional Higgs bosons in the general framework of models with two Higgs doublets (2HDM) was then performed by the same CP3 group using the same final state and the full Run-1 data. Models with two Higgs doublets feature a pseudoscalar boson, A, two charged scalars (H±) and two neutral scalars (h0 and H0), one of which is identified with the 125 GeV SM-like Higgs resonance. In some scenarios the most favored decay chain for the discovery of the additional neutral bosons is H0 → ZA → llττ (or llbb). The search was carried out in collaboration with another CP3 group that studies the llbb final state.
An update of both the SM search and the exotic one is expected with the Run-2 dataset, using more advanced techniques and adding the llee and llµµ channels.
Many well-motivated new-physics extensions of the SM include new particles whose decay width is very small and which hence have a macroscopic decay length. One very attractive and minimal extension of the standard model adds right-handed neutrinos with Majorana masses below the electroweak scale (low-scale see-saw). This addition can generate both the light neutrino masses and the baryon asymmetry of the universe via low-scale leptogenesis. In what is probably the most studied model invoking the low-scale see-saw, the Neutrino Minimal Standard Model, one of the three right-handed neutrinos is a dark matter candidate. A large allowed region of parameter space for right-handed neutrinos spans masses between 1 and 50 GeV, with corresponding lifetimes (cτ) ranging from 10^3 to 10^-4 m. For higher masses the right-handed neutrino decays essentially promptly, while for lower masses the probability that it decays within the detector volume is virtually zero, giving rise to missing transverse momentum in the detector. These two extreme cases can be captured by standard searches at the general-purpose LHC experiments, while the intermediate case is the natural target of so-called “displaced” searches: highly peculiar and challenging analyses at the LHC that require dedicated data reconstruction tools to extend their sensitivity. We intend to search for long-lived sterile neutrinos decaying at displaced vertices into a charged light lepton and hadrons. A fundamental ingredient of this search is the identification of charged tracks emerging from highly displaced vertices.
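The interplay between mass, momentum and lifetime described above follows from the exponential decay law: the probability that a particle decays between lab-frame distances L1 and L2 is exp(-L1/βγcτ) - exp(-L2/βγcτ), with βγ = p/m. A minimal sketch (all numbers below are purely illustrative, not taken from any analysis):

```python
import math

def decay_probability(p_gev, mass_gev, ctau_m, l_min_m, l_max_m):
    """Probability that a particle decays at a lab-frame distance
    between l_min and l_max, from the exponential decay law.
    The mean lab-frame decay length is beta*gamma*c*tau = (p/m)*ctau."""
    mean_length = (p_gev / mass_gev) * ctau_m
    return math.exp(-l_min_m / mean_length) - math.exp(-l_max_m / mean_length)

# Illustrative numbers only: a 10 GeV sterile neutrino with ctau = 1 m
# and momentum 50 GeV, required to decay between 1 cm and 1 m of radius.
p = decay_probability(50.0, 10.0, 1.0, 0.01, 1.0)  # ~0.18
```

For very large or very small cτ this probability collapses to zero, which is the quantitative version of the two extreme cases (prompt decay and missing transverse momentum) mentioned above.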
The CMS detector at the LHC is used to search for as-yet-unobserved heavy (mass > 100 GeV/c$^2$), long-lived (lifetime > 1 ns), electrically charged particles, generically called HSCPs (heavy stable charged particles).
HSCPs can be distinguished from Standard Model particles by exploiting their unique signature: very high momentum and low velocity. These features are a consequence of their high mass and the relatively limited LHC collision energy. Two experimental techniques are used to identify such hypothetical heavy and low-velocity particles: the measurement of the ionization energy loss rate using the all-silicon tracker detector and the time-of-flight measurement with the muon detectors.
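The dE/dx technique rests on the fact that at low velocity the ionization loss grows roughly as 1/β² ≈ m²/p², so a simultaneous measurement of momentum and dE/dx yields a mass estimate. A minimal sketch of such an estimator, with illustrative (not CMS-calibrated) values for the empirical constants K and C:

```python
import math

def hscp_mass(p_gev, dedx_mev_per_cm, k=2.55, c=2.95):
    """Mass estimate from the empirical low-velocity relation
    dE/dx = K * m^2 / p^2 + C used in dE/dx-based searches. K and C
    are detector-specific calibration constants; the values here are
    illustrative, not the CMS-calibrated ones. Returns None when
    dE/dx is too low to yield a mass estimate."""
    if dedx_mev_per_cm <= c:
        return None
    return p_gev * math.sqrt((dedx_mev_per_cm - c) / k)

# A candidate with 1 TeV momentum and an anomalously large dE/dx of
# 5.5 MeV/cm yields a mass of about 1 TeV with these constants.
m = hscp_mass(1000.0, 5.5)
```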
UCL members have developed the ionization energy loss identification technique and have led the CMS HSCP search since 2010, when the first HSCP paper became one of the first published LHC search papers. Updated results using the 2011 dataset were then published, followed by a comprehensive paper, based on the full CMS Run-1 dataset, that also includes searches for fractionally and multiply charged particles. The results obtained by analysing the 2015 Run-2 data at 13 TeV have also been published.
The analysis, which is very inclusive, has found no evidence of HSCPs. It currently excludes, among various models, quasi-stable gluinos, predicted by certain realizations of supersymmetry, with masses below about 1.3 TeV, and Drell-Yan-produced staus with masses below about 350 GeV. These and the other limits set by the analysis are the most stringent to date. The CMS HSCP papers have collected more than 300 citations to date.
Development of the "Phase-II" upgrade of the CMS silicon strip tracker.
More precisely, we are involved in the development of the uTCA-based DAQ system and in the testing and validation of the first prototype modules. We take an active part in the various test-beam campaigns (CERN, DESY, ...).
This activity will potentially make use of the UCL cyclotron, the probe stations and the SYCOC setup (SYstème de mesure de COllection de Charge, a charge-collection measurement system) to test the module response to laser light, radioactive sources and beams.
The final goal is to take a leading role in the construction of part of the CMS Phase-II tracker.
External collaborators: CRC and CMS collaboration.
The Worldwide LHC Computing Grid (WLCG) is a worldwide-distributed computing infrastructure, controlled by software middleware, that allows seamless usage of shared storage and computing resources.
About 10 PB of data are produced every year by the experiments running at the LHC collider. These data must be processed (through iterative and refined calibration and analysis) by a large scientific community that is widely distributed geographically.
Instead of concentrating all necessary computing resources in a single location, the LHC experiments have decided to set up a network of computing centres distributed all over the world.
The overall WLCG computing resources needed by the CMS experiment alone in 2016 amount to about 1500 kHepSpec06 of computing power, 90 PB of disk storage and 150 PB of tape storage. Working in the context of the WLCG translates into seamless access to shared computing and storage resources: end users do not need to know where their applications run. The choice is made by the underlying WLCG software on the basis of resource availability, the demands of the user application (CPU, input and output data, ...) and the privileges owned by the user.
Back in 2005, UCL proposed the WLCG Belgian Tier2 project, involving the six Belgian universities participating in CMS. The Tier2 project consists of contributing to the WLCG by building two computing centres, one at UCL and one at the IIHE (ULB/VUB).
The UCL site of the WLCG Belgian Tier2 is deployed in a dedicated room close to the cyclotron control room of the IRMP Institute and is currently a fully functional component of the WLCG.
The UCL Belgian Tier2 project also aims to integrate, bring onto the Grid, and share resources with other scientific computing projects. The projects currently integrated in the UCL computing cluster are MadGraph/MadEvent, NA62 and Cosmology.
External collaborators: CISM (UCL), Pascal Vanlaer (Belgium, ULB), Lyon computing centre, CERN computing centre.
The discovery of the 125 GeV Higgs boson by the LHC experiments has finally opened a new era in the exploration of the TeV scale. The physics programs of CMS and ATLAS aim far beyond the simple discovery, vigorously pursuing the full characterization of the newly discovered state and the full exploration of the TeV scale in search of new phenomena. A key lesson drawn from the first two years of LHC running is that, most probably, first the discovery and then the identification of new states/interactions will not be easy. On the one hand, model-independent searches in simple topologies, such as single/multi-lepton at high transverse momenta, have not shown any hint of new physics so far. On the other hand, topologies with jets and/or missing transverse energy, which are much more challenging experimentally, depend strongly on the underlying theoretical models, so that efficiently identifying signal-enriched regions of phase space is quite involved. In this context, multivariate techniques have become more and more central in the analysis of data from hadron collider experiments, in order to maximally exploit the available information on the signal and on the backgrounds. Amongst the most advanced techniques, and certainly the most powerful one from the theoretical point of view, is the so-called matrix element method. The main goal of this proposal is to advance the use and scope of the matrix element method so as to significantly extend its range of physics applications at the LHC to the search for new physics. First, we aim to provide the experimental HEP community with complete and automatic simulation tools, such as MadWeight/MoMEMta and Delphes, that overcome the technical limitations of the method. Second, we propose to test and apply the new tools to current analyses in signatures that involve final-state leptons and b-jets. Finally, we will explore new and original applications of the method to both model-dependent and model-independent searches for new physics at the LHC.
The so-called Magnet Test Cosmic Challenge (MTCC) was the first comprehensive operational and functional test of the CMS experiment. The MTCC took place in the first months of 2006 and was a slice test in which a small fraction of all the CMS detection equipment was operated in the 4 T solenoid of the experiment. Cosmic rays detected in the muon chambers were used to trigger the readout of all detectors in the global CMS data acquisition system. Prior to data taking, the detectors and their readout electronics were tuned and synchronized with dedicated software procedures. Local reconstruction was carried out online and offline in all sub-detectors for event selection and monitoring purposes. Global reconstruction, linking different sub-detectors, was performed mainly offline. A number of monitoring and visualization tools were also used for validation and monitoring purposes. One of the main goals of the MTCC was the validation of the hardware alignment system functionality.
At the MTCC, UCL had a leading role in the preparation, operation and offline data analysis related to the silicon strip tracker detector.
FROG is a generic framework dedicated to the visualization of events produced in particle collisions and detected by particle detectors.
It is written in C++ and uses the cross-platform OpenGL libraries. It can be adapted to any particular physics experiment or detector design. The code is very light and very fast and runs on various operating systems. Moreover, FROG is self-contained and does not require the installation of ROOT or experiment software (e.g. CMSSW) libraries on the user's computer.
It includes many features based on a unique and powerful principle. Some of the functionalities are listed below:
3D and 2D visualization, a graphical user interface, a mouse interface, configuration files, the production of pictures in various formats, and the integration of personal objects.
One of the FROG applications is to display events for one of the most complex physics experiments, the CMS experiment. It works just as well, and even faster, with smaller experiments like the Gastof detector.
The final state containing two Z bosons decaying into a pair of leptons and a pair of neutrinos has been exploited by the CMS experiment at the LHC to produce a number of results related to the Higgs boson, including measurements of related standard model cross sections.
Constraints have been set on the total width of the 125 GeV Higgs boson, using its relative on-shell and off-shell production and decay rates to a pair of Z bosons, where one Z boson decays to an electron or muon pair, and the other to an electron, muon, or neutrino pair. The analysis is based on the data collected by the CMS experiment at the LHC in 2011 and 2012. A simultaneous maximum likelihood fit to the measured kinematic distributions near the resonance peak and above the Z-boson pair production threshold leads to an upper limit on the Higgs boson width of 22 MeV at the 95% confidence level, which is 5.4 times the value expected in the standard model at the measured mass of 125.6 GeV.
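The logic of the constraint can be summarized as follows: the on-shell rate is insensitive to the total width, while the off-shell rate is not, so the ratio of the off-shell to on-shell signal strengths measures Γ/Γ_SM. A schematic illustration (the SM width value below is approximate, and the real result comes from the full likelihood fit, not this multiplication):

```python
# Schematic scaling behind the constraint (not the actual fit): an
# upper limit on the off-shell/on-shell signal-strength ratio
# translates directly into an upper limit on the total width.
GAMMA_SM_MEV = 4.1  # approximate SM prediction at m_H ~ 125.6 GeV

def width_limit(mu_ratio_limit, gamma_sm_mev=GAMMA_SM_MEV):
    """Upper limit on the total width implied by an upper limit on the
    off-shell/on-shell signal-strength ratio."""
    return mu_ratio_limit * gamma_sm_mev

limit = width_limit(5.4)  # ~22 MeV, matching the quoted result
```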
A search for heavy Higgs bosons in the H → ZZ → 2l2ν decay channel, where l = e or µ, has also been performed using data collected in 2015 and 2016 at a center-of-mass energy of 13 TeV. No significant excess is observed above the background expectation. The results are interpreted to set exclusion limits on a number of extensions of the standard model scalar sector: models with an additional electroweak singlet, as well as Type-I and Type-II two-Higgs-doublet models.
The amount and distribution of the material composing a particle detector that measures the trajectories of charged particles must be known with high accuracy for two main reasons: 1) to avoid any bias in the measurement of the momentum of charged particles, and 2) to provide an accurate Monte Carlo simulation of the detector.
A novel method for measuring the material of a generic tracking apparatus has been developed. It exploits the multiple scattering experienced by charged particles as they traverse the detector, and relies on the precise position measurement of the crossing points provided by the tracking detectors. The method is completely general and can be applied to any experiment equipped with detectors of sufficiently good spatial resolution.
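The principle can be sketched with the standard PDG (Highland) parameterization of the RMS multiple-scattering angle, θ0 = (13.6 MeV / βcp) √(x/X0) [1 + 0.038 ln(x/X0)]: measuring the width of the scattering-angle distribution at known momentum determines the traversed material x/X0. The actual method is considerably more refined; this is only an illustration of the underlying relation:

```python
import math

def highland_theta0(p_gev, beta, x_over_x0):
    """RMS plane-projected multiple-scattering angle for a unit-charge
    particle (PDG Highland parameterization; 13.6 MeV = 0.0136 GeV)."""
    return (0.0136 / (beta * p_gev)) * math.sqrt(x_over_x0) * (
        1.0 + 0.038 * math.log(x_over_x0))

def material_from_scattering(theta0, p_gev, beta=1.0):
    """Estimate the traversed material x/X0 from a measured scattering
    width by numerically inverting the Highland formula (bisection;
    the formula is monotonic over the range probed here)."""
    lo, hi = 1e-6, 10.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if highland_theta0(p_gev, beta, mid) < theta0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

theta0 = highland_theta0(10.0, 1.0, 0.1)        # 10 GeV track, 10% of X0
x_est = material_from_scattering(theta0, 10.0)  # recovers ~0.1
```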
The material of the CMS Silicon Strip Tracker has been measured with this technique to a precision at the level of 10%.
The current experimental program of the CMS experiment contains many analyses that look for a τ lepton in the final state. The decay of the Higgs boson into τ leptons is one of the few decay channels that can be used to observe or exclude a low-mass neutral scalar boson, predicted by the Standard Model as well as by many beyond-SM scenarios. Additionally, the observation of a charged Higgs boson, which for masses below 200 GeV preferentially decays into a τ lepton and a neutrino, would represent a unique clue to both the origin of mass and the deeper symmetries of Nature.
The τ lepton is also useful in many other analyses beyond the Higgs sector, e.g. to test lepton-flavour universality.
Being the heaviest lepton, the τ can decay either to a muon or an electron (``leptonic τ'') or to lighter hadrons (``hadronic τ''). Most τ leptons decay hadronically (65%). In hadronic decays there is an odd number of charged hadrons (due to charge conservation), possibly accompanied by neutral hadrons, forming together a so-called τ jet. Finally, there is always at least one neutrino (two for leptonic modes) among the decay products.
Given that the bulk of τ decays are non-leptonic, the efficient reconstruction and identification of τ jets is of crucial importance for the CMS physics program.
At CMS, the τ decay products are reconstructed from Particle Flow (PF) objects. In the PF approach, the information from all sub-detectors is combined to identify and reconstruct all particles from the collision, namely charged and neutral hadrons, photons, muons and electrons. The τ reconstruction starts from PF jets.
The main τ reconstruction algorithm at CMS is "Hadron plus strips" (HPS). It combines PF electromagnetic particles into strips (accounting for the broadening of calorimeter deposits from photon conversions) in order to reconstruct neutral-pion candidates. These are combined with charged hadrons to reconstruct the visible τ decay products.
Several identification criteria are applied to the τ candidates: isolation (how much momentum is carried by jet constituents that cannot be associated with the τ decay products) and rejection against electrons and muons. All discriminators exist in cut-based or MVA-based form and have several working points with different values of reconstruction efficiency and rejection against fake τ candidates.
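A cut-based isolation discriminator of the kind described above amounts to a threshold on a pT sum. The sketch below is a deliberately simplified illustration (the particle representation, function names, and threshold are invented for the example; the CMS discriminators additionally weight neutral contributions and correct for pileup):

```python
def isolation_sum(cone_particles, signal_particles):
    """Scalar pT sum of the particles in the isolation cone that are
    not among the reconstructed tau decay products. Particles are
    (id, pt) pairs; a simplified sketch, not the CMS implementation."""
    signal_ids = {pid for pid, _ in signal_particles}
    return sum(pt for pid, pt in cone_particles if pid not in signal_ids)

def passes_isolation(cone_particles, signal_particles, max_sum_gev=2.0):
    """A working point is simply a threshold on the isolation sum;
    tighter thresholds trade efficiency for better fake rejection."""
    return isolation_sum(cone_particles, signal_particles) < max_sum_gev

# Toy example: one charged hadron assigned to the tau decay, two
# leftover particles carrying 1.5 + 0.3 = 1.8 GeV of isolation pT.
cone = [("h1", 10.0), ("h2", 1.5), ("g1", 0.3)]
signal = [("h1", 10.0)]
isolated = passes_isolation(cone, signal)  # True at the 2 GeV threshold
```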
The aim of this project is to maintain and improve the performance of the CMS tau reconstruction and identification algorithms.
The detection of TeV muons is a fundamental ingredient of a number of key analyses performed by the CMS experiment at the LHC collider, such as the search for new high-mass resonances decaying into two muons or into one muon and one neutrino. Muons with an energy of a few hundred GeV or more undergo catastrophic energy losses in the material they traverse. These losses have a very significant negative impact on the most important parameters of the muon energy measurement distribution: central value, resolution, and tails.
In order to mitigate these effects, a new muon reconstruction algorithm, called DYnamic Truncation (DYT), has been developed. The DYT identifies the muon position measurements that are produced after a catastrophic energy loss, since including such measurements in the muon track fit degrades the muon energy measurement. The identification is based on the level of incompatibility between a position measurement and the position expected from the previous measurements.
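The core idea — drop measurements that are incompatible with the extrapolation from earlier ones — can be sketched as follows. This is a toy illustration with invented names and a fixed χ² threshold; the real DYT operates on the full CMS track model with a dynamically adjusted compatibility threshold:

```python
def dyt_select_hits(hits, predict, chi2_threshold=10.0):
    """Toy sketch of the truncation idea: accept muon position
    measurements in order while each is compatible with the position
    extrapolated from the hits already accepted; at the first failure
    (as after a catastrophic energy loss) truncate all later hits.
    'hits' are (position, variance) pairs; 'predict' maps the accepted
    hits to an (expected_position, variance) at the next layer.
    Names and the fixed chi2 threshold are invented for illustration."""
    accepted = []
    for position, variance in hits:
        if accepted:
            expected, pred_variance = predict(accepted)
            chi2 = (position - expected) ** 2 / (pred_variance + variance)
            if chi2 > chi2_threshold:
                break  # truncate: this hit and all later ones are dropped
        accepted.append((position, variance))
    return accepted

# Toy predictor: expect the next hit where the last accepted one was.
predict_last = lambda acc: (acc[-1][0], acc[-1][1])
hits = [(0.0, 1.0), (0.1, 1.0), (5.0, 1.0), (5.1, 1.0)]
kept = dyt_select_hits(hits, predict_last)  # keeps only the first two hits
```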
A search for a yet-unobserved baryon-number-violating top quark decay has been performed using data collected in 2011 and 2012 by the CMS experiment at the LHC. This search was motivated by a theoretical work from the UCL-CP3 phenomenology group, which noticed that the existence of physics beyond the standard model would imply, under certain conditions, baryon number violation both in the production of top quarks and in their decay. In the latter case top quarks would decay with a certain branching fraction into a lepton and two jets. The CP3 Louvain experimental group has searched for such decays in a final state containing a pair of top quarks, where the second top quark undergoes a SM hadronic decay. No evidence of such an exotic decay has been found, and limits at the per-mille level have been set on the branching fraction of the top quark.
More recently, the CP3 Louvain group has been preparing a new search for boosted same-sign top quark pairs, possibly accompanied by additional light-flavor jets. This is also a signature of baryon number violation. Notable models in which such topologies can be realized are supersymmetric models with R-parity violation.
The CMS experiment is used to study the di-muon invariant mass spectrum, which allows searches for high-mass unstable particles (resonances) to be performed in a yet-unexplored high-mass range.
High-mass resonances decaying into muon pairs are predicted in a number of models beyond the Standard Model of the fundamental interactions. Notable examples are heavy neutral gauge bosons predicted by grand unification theories, as well as gravitons arising in the Randall-Sundrum model of extra dimensions.
The first search for high-mass resonances was published in JHEP by CMS using the data acquired in 2010. Updated results were produced in Summer 2011 using part of the 2011 dataset. By combining di-electron and di-muon data, CMS excluded the existence of resonances predicted by a number of theoretical models with masses below about 2 TeV. These limits are the most stringent to date.
The UCL CP3 group contributed to these two early CMS publications by being one of the three teams of the CMS Collaboration that regularly analyzed new data, optimising the muon isolation criteria and conducting a full study of a mild excess observed in the low-mass region (at ~120 GeV) in both the di-electron and the di-muon channels.
Since 2012 the activity of the UCL-CP3 group has been limited to the exploration of a matrix-element approach to this search. Preliminary results show that exploiting the full kinematic information of the di-muon events can yield sizable improvements over the classical approach, which uses just the di-muon invariant mass. In addition, the group is developing a new algorithm for measuring the energy of TeV muons (for details, see the dedicated project). This algorithm is expected to bring improvements in both the di-muon and single-muon+missing-energy searches starting from 2014.
The Tracker Simulation group is responsible for the Geant-based simulation of the Pixel and Strip Tracker response, material budget and geometry description.
Members from CP3 are concentrating on various aspects of the validation with data. We also share the convenership of the group.
External collaborators: CMS tracker collaboration.
The matrix element reweighting method attempts to compute the full likelihood of an observed event under a given theoretical model. The method thereby measures the degree of compatibility of the event with the model, using as much of the available information as possible. MadWeight is a tool that fully automates the computation of the event likelihood for any model implemented in MadGraph, performing the phase-space integration and providing a framework for taking into account the experimental resolution on the observed final-state objects.
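Schematically, the weight of an observed event x under a model is P(x) ∝ ∫ |M(y)|² W(x|y) dy, where y runs over parton-level phase space and W is a transfer function encoding the detector resolution. A one-dimensional toy version of this integration (all names and numbers are invented for illustration; MadWeight performs an optimized mapping of the full multi-body phase space, unlike the naive uniform sampling here):

```python
import math
import random

def event_weight(x_obs, matrix_element_sq, transfer_sigma,
                 n_samples=200000, seed=12345):
    """Toy matrix-element weight: Monte Carlo integral of the squared
    matrix element over a single parton-level variable y, folded with a
    Gaussian transfer function W(x_obs | y) modelling the detector
    resolution. y is sampled uniformly on [0, 1] for simplicity."""
    rng = random.Random(seed)
    norm = transfer_sigma * math.sqrt(2.0 * math.pi)
    total = 0.0
    for _ in range(n_samples):
        y = rng.random()  # uniform phase-space point on [0, 1]
        w = math.exp(-0.5 * ((x_obs - y) / transfer_sigma) ** 2) / norm
        total += matrix_element_sq(y) * w
    return total / n_samples

# Flat matrix element: the weight reduces to the smeared probability
# density of observing x_obs = 0.5, which is close to 1 here.
w = event_weight(0.5, lambda y: 1.0, transfer_sigma=0.1)
```

Comparing such weights computed under competing hypotheses (e.g. signal vs. background matrix elements) is what turns the method into a per-event discriminator.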
This project aims at validating the matrix element reweighting technique implemented in MadWeight on a number of benchmark searches. In some cases, the final goal is the efficient identification of background events. The final states that are being considered are: Zbb, single top, ttbar resonances and dimuon resonances.