LIP
Laboratório de Instrumentação e Física Experimental de Partículas
LIP's research groups offer a wide variety of research topics for PhD and master's theses.

This page lists the master's thesis topics available in LIP's research groups. More information about the groups' work and the contacts of the researchers in charge can be found in the research section. For examples of shorter projects and research internship topics, see the LIP Internship Programme page. You can also contact the advanced training group directly (training@lip.pt).

This page is being updated!

    Detection of terrestrial gamma-ray flashes for aviation safety

    Research line: Radiation environments and applications for space missions
    Node: LIP-Coimbra, Host institution: Universidade de Coimbra
    Supervisor(s): Rui Miguel Curado da Silva, Filomena Pinto dos Santos
    Contact: rui.silva@coimbra.lip.pt, filomena.santos@coimbra.lip.pt
    Start: --

    TGFs (Terrestrial Gamma-ray Flashes) are gamma-ray emissions produced at the top of cumulonimbus clouds. These emissions were discovered in 1994 and have since been recorded by scientific space missions dedicated to high-energy astrophysics, such as the AGILE (Astrorivelatore Gamma ad Immagini LEggero) mission of the Italian Space Agency and NASA's Fermi mission. TGFs are produced at the top of clouds several kilometres tall by avalanches of electrons that are accelerated along the electric field generated in these clouds during thunderstorms and then abruptly decelerated in the atmosphere when they reach the cloud top. With energies between a few keV and a few tens of MeV, these emissions are the most energetic natural phenomenon produced on Earth, and they can pose a significant risk to the proper operation of aircraft, in particular to electronic systems, as well as to the crew and passengers of commercial flights.

    Work plan: The goal of this thesis is to evaluate the probability that a commercial flight is exposed to a TGF emission for different types of commercial routes, to determine the typical TGF flux at commercial flight altitudes (~10 km), and to estimate the dose potentially absorbed by crew and passengers exposed to TGFs.

    1. The student will simulate the interaction of TGFs with a model composed of a commercial aircraft and its crew and passengers, using the MEGAlib simulation toolkit; the TGF emission model will be based on measurements by the AGILE and Fermi satellites (a minimal dose-estimation sketch is given after this list);
    2. The dose absorbed by passengers and crew will be calculated for different distances and different characteristics of the initial TGF emissions from the clouds;
    3. The result of this work will be of great importance for configuring and proposing a product based on radiation detectors with the potential to be installed on commercial aircraft whose routes carry some risk of TGF exposure, allowing the health of crew and passengers and the proper operation of electronic systems to be monitored.
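
    As a minimal sketch of step 2, the conversion from simulated energy deposits to an absorbed and equivalent dose could look like the snippet below. The deposit values and the 70 kg phantom mass are illustrative placeholders, not outputs of an actual MEGAlib run.

```python
# Minimal sketch: converting simulated energy deposits into absorbed dose.
# All numerical inputs below are illustrative placeholders.

KEV_TO_JOULE = 1.602176634e-16  # 1 keV in joules

def absorbed_dose_gray(deposits_kev, mass_kg):
    """Absorbed dose D = total deposited energy / mass (Gy = J/kg)."""
    total_joule = sum(deposits_kev) * KEV_TO_JOULE
    return total_joule / mass_kg

# Example: energy deposits (keV) scored in a 70 kg human phantom
# for one simulated TGF exposure (hypothetical numbers).
deposits = [512.0, 87.3, 1300.0, 45.1, 2200.0]
dose = absorbed_dose_gray(deposits, mass_kg=70.0)

# A radiation weighting factor w_R = 1 for photons gives an equivalent dose in Sv.
print(f"Absorbed dose: {dose:.3e} Gy, equivalent dose: {dose * 1.0:.3e} Sv")
```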

    Notes: Funding may be available through a Research Initiation Grant (Bolsa de Iniciação à Investigação).

    Looking for low-mass Dark Matter in the LUX-ZEPLIN detector

    Research line: Neutrinos and dark matter
    Node: LIP-Coimbra, Host institution: Universidade de Coimbra
    Supervisor(s): Cláudio Silva, Alexandre Lindote
    Contact: claudio@coimbra.lip.pt
    Start: --

    Cosmological evidence obtained from galaxy dynamics, weak gravitational lensing, and cosmic microwave background observations has revealed that approximately 85% of the total mass in the universe exists in the form of Dark Matter, an exotic form of matter. Understanding the nature of Dark Matter is one of the most prominent unresolved questions in physics, with its detection representing a crucial scientific pursuit. Direct-detection experiments aim to identify potential Dark Matter particles, such as Weakly Interacting Massive Particles (WIMPs) or Axion-like particles (ALPs).

    Liquid xenon time projection chambers (TPCs) have emerged as the leading technology in searches for a large variety of DM particle candidates. These experiments aim to detect the nuclear recoil produced by collisions between DM particles and xenon nuclei. The recoiling nucleus can deposit energy through excitations and ionisations that are subsequently detected: prompt scintillation (S1) is emitted at the interaction position, and delayed electroluminescence (S2) is produced by the drifted ionisation electrons in a thin xenon gas layer above the liquid. This is a well-established and scalable technology, proven from tens of kilograms of target mass up to the current multi-tonne detector LUX-ZEPLIN (LZ).

    The LZ experiment has been operating since 2021 and achieved a significant milestone in July 2022 by setting world-leading limits on dark matter interactions. The experiment is housed in the Davis Cavern at the Sanford Underground Research Facility (SURF) in Lead, South Dakota, USA, at a depth of 1485 meters. LZ is the largest liquid xenon detector currently in existence, with a total mass of 10 tonnes. This pioneering experiment offers a fascinating opportunity to work at the forefront of dark matter research.

    In this work, we aim to develop techniques to explore the low-energy region of two-phase xenon TPCs. These detectors have typical thresholds of a few keV, limiting their sensitivity to low-mass WIMPs, other light DM candidates, or even neutrino signals. Below that range, only a small number of ionisation electrons are extracted to the gas region (leading to very small electroluminescence, S2, signals), and the number of scintillation (S1) photons is so low that in the most extreme cases, no photon is detected. We aim to characterise these small signals better and discriminate them from the much larger backgrounds due to, e.g. random coincidences of multiple single electrons (SEs) occasionally extracted to the gas, low energy interactions from radioactivity in the electrode grids, and random pile-up of dark noise on the PMTs which can mimic scintillation signals.

    In the LZ detector, the presence of radioactivity within the electrode grids gives rise to low-energy interactions that pose challenges to signal detection and analysis. These interactions, situated in close proximity to the grids, suffer signal degradation. The scintillation signals, for instance, may be obstructed by the wires, resulting in reduced or even eliminated S1 signals. Moreover, non-uniformities in the local electric field cause incomplete collection and irregular trajectories of ionization electrons, leading to distortions in the S2 signals. While these events can be effectively excluded when both the S1 and S2 signals are detected, they become a prominent background in the low-energy regime when only the S2 signal is present.

    To address this issue, you will explore the application of machine learning (ML) algorithms to distinguish these events from normal interactions occurring in the bulk of the detector, relying solely on the shape of the S2 pulses. Two potential methods of analysis will be investigated: i) a neural network (NN) utilizing the arrival times of individual photons in the photomultiplier tubes (PMTs); and ii) a convolutional neural network (CNN) directly employing the raw PMT waveforms. Both networks will be trained using populations of grid events for which both the S1 and S2 signals are detected. Subsequently, these trained networks will be applied to cases where only the S2 signal is available, aiding in the identification and discrimination of grid-related events. This exploration into machine learning techniques will offer valuable insights and pave the way for improved analysis methods in the LZ detector.
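
    As a minimal sketch of option (ii), a 1D convolutional network over raw PMT waveforms could be set up as below (PyTorch). The waveform length, the use of a single summed channel, and the layer sizes are assumptions for illustration only, not the LZ analysis code.

```python
# Minimal sketch (PyTorch) of a 1D CNN that classifies S2 pulses as
# "grid event" vs "bulk event" from raw PMT waveforms. The waveform
# length (1000 samples) and single summed channel are assumptions.
import torch
import torch.nn as nn

class S2PulseCNN(nn.Module):
    def __init__(self, n_samples=1000):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=9, padding=4), nn.ReLU(), nn.MaxPool1d(4),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * (n_samples // 16), 64), nn.ReLU(),
            nn.Linear(64, 1),  # logit: grid event vs bulk event
        )

    def forward(self, x):  # x: (batch, 1, n_samples)
        return self.classifier(self.features(x))

model = S2PulseCNN()
waveforms = torch.randn(8, 1, 1000)         # placeholder batch of S2 waveforms
labels = torch.randint(0, 2, (8, 1)).float()
loss = nn.BCEWithLogitsLoss()(model(waveforms), labels)
loss.backward()  # one illustrative training step (optimizer omitted)
```

    In practice the network would be trained on the population of grid events with both S1 and S2 present, as described above, and then applied to S2-only events.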

    Measurement of the time-resolved photoluminescence of PTFE for dark matter research

    Research line: Neutrinos and dark matter
    Node: LIP-Coimbra, Host institution: Universidade de Coimbra
    Supervisor(s): Cláudio Silva, Vladimir Solovov
    Contact: claudio@coimbra.lip.pt
    Start: --

    Cosmological evidence obtained from galaxy dynamics, weak gravitational lensing, and cosmic microwave background observations has revealed that approximately 85% of the total mass in the universe exists in the form of Dark Matter, an exotic form of matter. Understanding the nature of Dark Matter is one of the most prominent unresolved questions in physics, with its detection representing a crucial scientific pursuit. Direct-detection experiments aim to identify potential Dark Matter particles, such as Weakly Interacting Massive Particles (WIMPs) or Axion-like particles (ALPs). Liquid xenon technology, extensively employed in low-background experiments investigating phenomena like neutrinoless double beta decay (0νββ) and Dark Matter particles (WIMPs), offers a viable approach. Furthermore, this technology enables the construction of highly sensitive neutrino detectors through coherent elastic neutrino scattering, carrying potential applications in nuclear non-proliferation and reactor monitoring. The LUX-ZEPLIN (LZ) detector, boasting a record-setting active mass of 7 tons, represents a significant advancement in xenon-based detection and is poised to lead the field of direct Dark Matter searches in the forthcoming years. 

    The emission of light up to 1 second following a particle interaction within the LZ detector constitutes a significant source of background that can impact its sensitivity, particularly for low-mass Weakly Interacting Massive Particles (WIMPs). This emitted light arises from the luminescence (combining fluorescence and phosphorescence) of the detector walls, which are constructed using Teflon material. Understanding and mitigating this luminescent emission is crucial for optimizing the detector's performance and enhancing its sensitivity to detect and discriminate potential WIMP signals.

    At LIP-Coimbra, we have a dedicated setup for measuring the Time-Resolved Photoluminescence (TRPL) of various materials, including Teflon. TRPL is a powerful technique employed in the study of a material's temporal dynamics of light emission subsequent to excitation, allowing for a comprehensive understanding of its electronic and optical properties over time. Our existing setup consists of a light source, specifically a xenon proportional counter, capable of generating photons with a wavelength of λ=175 nm. The emitted light is collimated and directed towards the sample. Subsequently, the reflected and luminescent light is efficiently collected using a parabolic mirror and a converging lens, ultimately focused onto a Photomultiplier Tube (PMT) for detection and analysis.

    This work entails conducting measurements of Time-Resolved Photoluminescence (TRPL) using the aforementioned setup, focusing specifically on multiple samples of Teflon. You will be responsible for acquiring the data and performing analysis across various time scales. Moreover, you will utilize different optical filters to selectively capture and examine specific desired wavelengths. Through these experiments, valuable insights into the temporal behavior of Teflon's photoluminescence will be obtained, contributing to a comprehensive understanding of its optical properties.
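
    A minimal sketch of the kind of decay-curve analysis involved is shown below: fitting a time histogram with a sum of exponential components plus a flat background. The two lifetimes, the time axis, and the synthetic data are illustrative assumptions, not measured Teflon properties.

```python
# Minimal sketch: fitting a TRPL decay curve with two exponential components
# plus a flat background. The synthetic "data" stands in for a real PMT
# time histogram; lifetimes and amplitudes are placeholders.
import numpy as np
from scipy.optimize import curve_fit

def decay_model(t, a1, tau1, a2, tau2, bkg):
    """Two-component luminescence decay: fast + slow exponential + background."""
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2) + bkg

t = np.linspace(0.0, 500.0, 1000)                     # time after excitation (arbitrary units)
truth = decay_model(t, 100.0, 5.0, 10.0, 80.0, 0.5)   # assumed lifetimes: 5 and 80
data = np.random.poisson(truth)                       # counting statistics

p0 = [80.0, 3.0, 5.0, 50.0, 1.0]                      # initial guesses
popt, pcov = curve_fit(decay_model, t, data, p0=p0)
perr = np.sqrt(np.diag(pcov))
for name, val, err in zip(["a1", "tau1", "a2", "tau2", "bkg"], popt, perr):
    print(f"{name} = {val:.2f} +/- {err:.2f}")
```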

    Study of the optical performance of the future Dark Matter detector XLZD

    Research line: Neutrinos and dark matter
    Node: LIP-Coimbra, Host institution: Universidade de Coimbra
    Supervisor(s): Cláudio Silva
    Contact: claudio@coimbra.lip.pt
    Start: --

    Cosmological evidence obtained from galaxy dynamics, weak gravitational lensing, and cosmic microwave background observations has revealed that approximately 85% of the total mass in the universe exists in the form of Dark Matter, an exotic form of matter. Understanding the nature of Dark Matter is one of the most prominent unresolved questions in physics, with its detection representing a crucial scientific pursuit. Direct-detection experiments aim to identify potential Dark Matter particles, such as Weakly Interacting Massive Particles (WIMPs) or Axion-like particles (ALPs). Liquid xenon technology, extensively employed in low-background experiments investigating phenomena like neutrinoless double beta decay (0νββ) and Dark Matter particles (WIMPs), offers a viable approach. Furthermore, this technology enables the construction of highly sensitive neutrino detectors through coherent elastic neutrino scattering, carrying potential applications in nuclear non-proliferation and reactor monitoring. The LUX-ZEPLIN (LZ) detector, boasting a record-setting active mass of 7 tons, represents a significant advancement in xenon-based detection and is poised to lead the field of direct Dark Matter searches in the forthcoming years.

    Future endeavours are underway to develop even more sensitive experiments in the field to push the boundaries of our understanding. The forthcoming XLZD detector is set to be a groundbreaking advancement, featuring a 60-tonne liquid xenon time projection chamber (LXe TPC) at its core. This detector holds the potential to explore the existence of dark matter down to the realm of neutrino fog, where interactions from solar neutrinos become dominant. 

    One of the major technical challenges facing XLZD is optimising its optical performance. In a liquid xenon detector, the scintillation light is emitted at 175 nm, deep in the ultraviolet. The emitted light must travel a substantial distance before being detected and, due to its short wavelength, it scatters multiple times along its path. Consequently, there is an increased likelihood of absorption by impurities or by the detector's walls. This reduction in the overall light collection efficiency impacts the detector sensitivity and energy resolution. Optimising the light collection is therefore one of the main research areas identified by APPEC for the future 3G detector.

    This project encompasses the development of optical simulations for the future XLZD experiment. The primary objective is to enhance the current reflectance model utilised in XLZD. This improvement will involve evaluating the impact of different detector configurations on light collection efficiency, thereby examining how the geometry and reflectance of materials influence the detector's light collection capabilities. Additionally, you will measure the enhancement in light collection efficiency for xenon scintillation photons when a wavelength shifter is incorporated into the detector. These simulations give valuable insights regarding optimising light collection in the XLZD detector setup.
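
    To illustrate the type of question these simulations answer, the toy Monte Carlo below shows how wall reflectance, photon absorption length, and photosensor coverage combine into a light collection efficiency. All parameters and the single-path-length geometry are illustrative assumptions; the actual study would use a full optical simulation of the XLZD geometry.

```python
# Toy Monte Carlo sketch: light collection efficiency as a function of wall
# reflectance, absorption length, and photosensor coverage. The detector is
# reduced to a single mean path length per bounce; all numbers are assumptions.
import math
import random

def collection_efficiency(n_photons=100_000, reflectance=0.95,
                          absorption_length_cm=500.0, mean_path_cm=150.0,
                          sensor_coverage=0.15):
    survival = math.exp(-mean_path_cm / absorption_length_cm)  # per flight segment
    detected = 0
    for _ in range(n_photons):
        while True:
            if random.random() > survival:          # absorbed in the liquid
                break
            if random.random() < sensor_coverage:   # hits a photosensor
                detected += 1
                break
            if random.random() > reflectance:       # absorbed by the PTFE wall
                break
            # otherwise the photon is reflected and makes another pass
    return detected / n_photons

print(f"Collection efficiency: {collection_efficiency():.1%}")
```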

    Machine learning for Earth Observation

    Research line: Competence Centres
    Node: LIP-Lisboa, Host institution: Universidade de Lisboa
    Supervisor(s): Inês Ochoa and João Pinelo
    Contact: miochoa@lip.pt, joao.pinelo@aircentre.org
    Start: --

    Earth Observation combines space and ground-based data collection for the understanding of the surface of the Earth and all its systems. It provides crucial information to help address a variety of challenges, from forest fire monitoring to the management of natural resources.

    LIP is partnering with AIR Centre to combine remote sensing technologies and advanced data analysis methods, bringing together data science and space expertise in a bid to contribute to some of today's biggest challenges.


    PhD projects with LIP and AIR Centre (hosted by the Universities of Lisbon, Coimbra, Minho or Évora) will offer students the opportunity to apply machine learning methods to Earth Observation data (such as that collected by satellites) and extract valuable insights on social, physical and ecological dynamics on Earth.

    The Radiation Environment at the Lunar Surface and at the Lunar Gateway

    Research line: Radiation environments and applications for space missions
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico, Universidade de Lisboa
    Supervisor(s): Patrícia Gonçalves and Luisa Arruda
    Contact: patricia@lip.pt
    Start: --

    Objectives:

    A comprehensive understanding of the lunar radiation environment is essential for preparing future human exploration of the Moon. The student will use the galactic cosmic ray (GCR) fluxes expected during transit to the Moon at different epochs (solar maximum and minimum), taking into account the possible occurrence of a solar event during the journey, to estimate the:

    • Dose to astronauts, and to assess the radiological risk to which they will be exposed;

    • Ambient Dose Equivalent (ADE) and Effective Dose (ED)

    • Linear Energy Transfer (LET) spectra

    The results will be compared with several advanced in-situ experiments measuring the lunar radiation environment. The spectrometer on board Lunar Prospector (Feldman et al., 2004) measured albedo neutron and gamma-ray spectra in lunar orbit during the first half of 1999 (Adams et al., 2007; Maurice et al., 2000). The RADiatiOn Monitor (RADOM, Dachev et al., 2011) on board the Chandrayaan-1 mission measured the dose and LET spectrum on a 100 km circular polar orbit from October 22, 2008 to August 31, 2009. The Cosmic Ray Telescope for the Effects of Radiation (CRaTER, Spence et al., 2010) experiment on the Lunar Reconnaissance Orbiter has been providing dose data and LET spectra since June 2009. Most recently, the Lunar Lander Neutron and Dosimetry instrument (LND, Wimmer-Schweingruber et al., 2020) on board Chang'E-4 has been measuring the radiation environment on the lunar surface (Xu et al., 2020; Zhang et al., 2020). The same assessment will be done for the lunar surface and for the Gateway orbit. GCR fluxes for different periods will be extracted from the OLTARIS (Singleterry et al., 2011) web-based user interface.
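
    As a minimal sketch of how a LET spectrum connects to a dose equivalent, the snippet below folds an absorbed-dose spectrum with the ICRP 60 quality factor Q(L). The LET bins and dose values are illustrative placeholders, not OLTARIS output.

```python
# Sketch: folding a LET spectrum with the ICRP 60 quality factor Q(L) to turn
# an absorbed dose into a dose equivalent. Input values are placeholders.
import math

def icrp60_quality_factor(let_kev_per_um):
    """ICRP 60 quality factor as a function of unrestricted LET in water."""
    L = let_kev_per_um
    if L < 10.0:
        return 1.0
    if L <= 100.0:
        return 0.32 * L - 2.2
    return 300.0 / math.sqrt(L)

# (LET [keV/um], absorbed dose in that bin [mGy]) - hypothetical GCR spectrum
let_spectrum = [(0.5, 0.20), (5.0, 0.10), (30.0, 0.05), (150.0, 0.01)]

dose_mGy = sum(d for _, d in let_spectrum)
dose_equivalent_mSv = sum(icrp60_quality_factor(L) * d for L, d in let_spectrum)
print(f"Absorbed dose: {dose_mGy:.2f} mGy, dose equivalent: {dose_equivalent_mSv:.2f} mSv")
```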

    Overview

    Accurately predicting the radiation environment on the Moon is a challenge that becomes increasingly relevant with the renewed interest in lunar exploration and the foreseen crewed exploration missions included in NASA's Artemis program. On planetary bodies like the Moon, with no atmosphere and no magnetosphere, the ionising radiation environment is dominated by the continuous flux of Galactic Cosmic Rays (GCR) and, sporadically, by Solar Energetic Particle (SEP) events, which are difficult to predict and have short time spans. The radiation environment on the Moon includes primary space radiation and secondary radiation induced in the lunar soil. Both primary and secondary radiation may pose severe health risks to future crews on the Moon.

    The radiological risk to human explorers on the surface of the Moon will be assessed for chosen benchmark scenarios. These scenarios shall also consider exposure inside spacecraft or shelters and during extra-vehicular activities on the lunar surface. The possibility of introducing shielding layers and materials shall be considered, and these will be used to simulate the chosen scenarios. The same will be done for the Lunar Orbital Platform-Gateway, a space station on a cis-lunar orbit planned by NASA. As a vital component of NASA's Artemis program, the Gateway will serve as a multi-purpose outpost orbiting the Moon, providing essential support for a long-term human return to the lunar surface and serving as a staging point for deep space exploration.

    Development of tools to detect the onset of failure conditions in complex instruments

    Research line: Neutrinos and dark matter
    Node: LIP-Coimbra, Host institution: Universidade de Coimbra
    Supervisor(s): Francisco Neves
    Contact: francisco.neves@coimbra.lip.pt
    Start: --

    LUX-ZEPLIN (LZ) is a 2nd-generation dark matter direct detection experiment designed to be the most sensitive WIMP search in the upcoming years. After commissioning of the detector, LZ is currently finishing its 1st science run and preparing its final acquisition period of 3 years. The LIP Dark Matter group developed the LZ Underground Performance Monitor (UPM), a versatile analysis framework capable of analysing the raw signals from the detector in real time. The UPM produces sets of parameters that can be used in higher-level analyses to infer the detector performance and health. With this project we propose to test and implement machine learning algorithms that use the UPM data to automatically detect the onset of detector conditions leading to potential instabilities and failures (e.g. rate bursts, electrical discharges, PMT or electronics failures). The ability to detect such events ahead of time would allow operators, for instance, to turn off the relevant circuitry (e.g. high voltages), protecting equipment and maximising acquisition time. Notably, the results of this work may find application in other areas, namely space applications, where human access for repairs of complex instruments may be expensive or even impossible. The work will be carried out at LIP-Coimbra and the student will be integrated into the Dark Matter group, working within the LZ Collaboration.
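
    One possible starting point, sketched below under stated assumptions, is an unsupervised anomaly detector (an Isolation Forest) trained on monitoring parameters during normal running. The feature names and toy data are hypothetical and do not reflect the actual UPM data format or variables.

```python
# Sketch of one possible approach: an unsupervised anomaly detector over
# slow-monitoring parameters (e.g. trigger rate, baseline RMS, HV current).
# Feature names and data are hypothetical, not the UPM interface.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# columns: [trigger rate (Hz), mean baseline RMS (ADC), HV current (uA)]
normal_running = rng.normal([40.0, 2.5, 1.0], [3.0, 0.2, 0.05], size=(5000, 3))

detector = IsolationForest(contamination=0.01, random_state=0)
detector.fit(normal_running)

# A rate burst accompanied by rising HV current should score as anomalous.
new_samples = np.array([[41.0, 2.6, 1.02],     # looks like normal running
                        [250.0, 2.7, 3.50]])   # rate burst + HV current spike
print(detector.predict(new_samples))           # +1 = normal, -1 = anomaly
```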

    Dark Matter search at the LHC

    Research line: LHC experiments and phenomenology
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico
    Supervisor(s): Michele Gallinaro, João Varela, Nicola De Filippis
    Contact: michgall@cern.ch
    Start: --

    The subject of this thesis is the search for Dark Matter (DM) at the Large Hadron Collider (LHC) using multivariate analysis tools. The thesis is placed in the context of the Portuguese participation in the CMS experiment at the LHC, and it is linked to the Beyond the Standard Model (BSM) searches in the more general context of the searches for New Physics processes at the LHC. The case to be studied is DM produced in association with a Higgs boson. The work plan will focus on the mono-Higgs search, where the SM Higgs boson decay into four leptons will be studied in events with large missing transverse momentum that could be due to DM particles. The LIP/CMS group is engaged in the study of SM and BSM processes to fully exploit the opportunities of the unparalleled energy of the LHC collisions. Searches for BSM processes have been carried out in Run 1 and Run 2 at centre-of-mass energies of 7, 8, and 13 TeV. Run 3 is expected to start in 2022 and will offer excellent opportunities for major discoveries in this domain by analysing the large amount of data expected to be collected.

    Dark matter (DM) is one of the most compelling pieces of evidence for physics beyond the standard model (SM). Strong astrophysical and cosmological evidence suggests the existence of DM and indicates that it makes up approximately 26% of the total mass-energy of the Universe. A number of BSM theories predict a particle origin for DM, and several types of particle candidates have been proposed. Some of the most popular models propose DM in the form of stable, electrically neutral, weakly interacting massive particles (WIMPs) with a mass in the range from a few GeV to a few TeV, which opens up the possibility of searches at a particle collider. A search for DM at a collider involves looking for a recoil against visible SM particles. Because of their lack of electric charge and their small interaction cross section, DM particles produced in proton-proton collisions are expected to interact with the detectors only very rarely, and their production can instead be sought via an imbalance in the total momentum transverse to the beams. Thus, many searches for DM at the LHC involve missing transverse momentum (MET), where a SM particle, X, is produced against the missing transverse momentum associated with the DM particles escaping the detector, in the so-called "MET+X" or "mono-X" final states. In the searches performed at colliders, X may be a jet, a heavy-flavour jet, a photon, a W or Z boson, or a Higgs boson.

    The main goal of the research programme is to explore the available data and search for DM candidate events. In particular, the goal of the work to be performed in this thesis is the search for DM produced at the LHC in association with a Higgs boson, using advanced analysis tools such as machine learning and deep neural networks. With the large data samples collected in Run 2 and those expected to be collected in Run 3 and beyond, these tools will be used to better understand the data and extend the sensitivity of DM searches to regions so far unexplored.
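
    To make the analysis idea concrete, the sketch below trains a gradient-boosted classifier to separate a signal-like from a background-like population using two event-level variables (missing transverse momentum and the four-lepton invariant mass). The variables chosen and the toy distributions are illustrative assumptions, not CMS data or the actual analysis.

```python
# Sketch: signal/background separation with a gradient-boosted classifier on
# toy kinematic variables (MET, four-lepton mass). All inputs are placeholders.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
bkg = np.column_stack([rng.exponential(60.0, 20000), rng.normal(125.0, 15.0, 20000)])
sig = np.column_stack([rng.exponential(250.0, 2000), rng.normal(125.0, 3.0, 2000)])
X = np.vstack([bkg, sig])
y = np.concatenate([np.zeros(len(bkg)), np.ones(len(sig))])

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)
clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
clf.fit(X_train, y_train)

auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"ROC AUC on the test sample: {auc:.3f}")
```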

    The thesis is placed in the context of the Portuguese participation in the CMS experiment at the LHC, and it is linked to the Beyond the Standard Model (BSM) searches in the more general context of the searches for New Physics processes at the LHC. The research will be carried out in the context of the Portuguese participation in the CMS experiment at the LHC, in the framework of the activities of one of the outstanding research groups in High-Energy Physics in Portugal. Citing the Report of the recent Institutional Evaluation performed by an international review panel nominated by FCT: “The LIP-CMS group, while small in size, is really outstanding and world-class”. The candidate is expected to work in a team with a group of researchers. A strong motivation and a keen curiosity are essential requirements.

    Development of high-performance timing detectors for the CMS forward proton spectrometer

    Research line: LHC experiments and phenomenology
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico
    Supervisor(s): Jonathan Hollar, Michele Gallinaro, Joao Varela
    Contact: jonathan.hollar@cern.ch
    Start: --

    A high Precision Proton Spectrometer (PPS) has been taking data as a subsystem of the CMS experiment. It uses a timing detector (either diamond- or silicon-based) to measure the time of flight (ToF) of the two forward protons produced at the interaction point of the CMS detector. Considering central exclusive production processes (pp → pXp), a detector with 10 ps time resolution would provide a 2.1 mm spatial resolution on the vertex position and improve the pileup background suppression by a factor of about 25.
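
    A quick worked check of the quoted figure: the vertex longitudinal position follows from the two proton arrival times as z = c(t1 - t2)/2, so two uncorrelated arms with resolution sigma_t each give sigma_z = c*sigma_t/sqrt(2), about 2.1 mm for 10 ps.

```python
# Worked check of the figure quoted above:
#   z = c * (t1 - t2) / 2  =>  sigma_z = c * sigma_t / sqrt(2)
import math

C = 299_792_458.0     # speed of light (m/s)
sigma_t = 10e-12      # per-arm time resolution (10 ps)

sigma_z = C * sigma_t / math.sqrt(2)
print(f"Vertex resolution: {sigma_z * 1e3:.2f} mm")   # ~2.1 mm
```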

    One of the major challenges of operating such detectors at the HL-LHC is "pileup", the additional collisions occurring in the same proton bunch crossing as the collision of interest. By precisely measuring their time of flight, forward protons produced in these collisions can be associated to the correct collision vertex, enabling rare photon-photon interactions to be reconstructed even at the maximum pileup foreseen for the HL-LHC. In order to fully resolve all collisions with forward protons, timing precisions of ~20 ps or better will be of paramount importance and will enhance the sensitivity to these processes.

    During Run 2 of the LHC, proton fast timing detectors were already operated as a proof of principle, achieving resolutions of ~100ps with high efficiency. While these detectors were very resistant to radiation, they were limited, particularly by the TDC and related electronics, to timing resolutions of ~30-40ps per plane. For the HL-LHC, new technologies will be required to cope with the pileup, radiation, and event rates, both in terms of sensors and readout electronics. This project will help to address that challenge by exploring the latest developments in timing detectors and electronics, with the smallest possible segmentation and under extreme radiation conditions.

    One of the leading technology candidates is Low Gain Avalanche Detector (LGAD) silicon detectors. These detectors are already being implemented on a much larger scale for upgrades of the central CMS detectors. They have achieved excellent time resolution in testbeams, including the electronics readout chain. In particular, a dedicated ASIC design of a TDC has obtained resolutions of ~6ps. The sensors can potentially survive radiation levels up to ~5x10^15 neutron equivalents/cm^2, sufficient for the proposed forward proton detectors.

    In this project, the development of LGAD detectors and readout systems for the forward proton detector upgrade of the CMS experiment will be pursued. This will consist of evaluating the candidate sensors and electronics in laboratory tests, their response to the realistic radiation conditions expected at the LHC, and their response to real hadron beams. The approaches used will include tests with laser signals and in high-irradiation facilities. The final result will be a full prototype of the detector and electronics to be used at the LHC, which will be tested in high-energy proton beams.

    The candidate is expected to work in a team with a group of researchers. A strong motivation and a keen curiosity are essential requirements.

    Development of High-Precision Timing Detector for the CMS experiment at HL-LHC

    Research line: LHC experiments and phenomenology
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico
    Supervisor(s): Joao Varela, Tahereh Niknejad, Michele Gallinaro
    Contact: joao.varela@cern.ch
    Start: --

    The HL-LHC requires challenging new particle detectors in order for the experiment to handle the huge proton collision rate. An average of two hundred proton-proton collisions will occur at each bunch crossing, creating an enormous background to rare events, in particular those where Higgs bosons are produced. In order to cope with this challenge, a precise measurement of the production time of the charged particles is required: the timing information provides a powerful discrimination against background. The current CMS detectors can achieve a time resolution of the order of 500-1000 picoseconds, which is insufficient to maintain the sensitivity to rare physics processes in the HL-LHC era. A strong R&D programme towards precise timing of charged particles is now underway, aiming at a time resolution of the order of 30-50 picoseconds, a huge improvement relative to the state of the art. This work focuses on the development of a new Timing Detector based on LYSO scintillating crystals, silicon photomultipliers (SiPM) and dedicated ASIC microelectronics developed in Portugal. The technologies used are at the forefront of R&D in particle physics detectors. It is expected that the developed technologies will be transferred to commercial applications such as Light Detection And Ranging (LiDAR) and time-of-flight positron emission tomography.
    The programme of work will cover a broad range of topics in detector physics and technology, including the evaluation of detector prototypes with particle beams at CERN, as well as simulation studies of the impact of the Timing Detector on the CMS physics programme.

    Flavor Anomalies — first hints of New Physics at the LHC

    Research line: LHC experiments and phenomenology
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico
    Supervisor(s): A. Boletti, N. Leonardo
    Contact: nuno.leonardo@cern.ch
    Start: --

    The LHC physics programme so far has been extremely successful. It has established the standard model (SM) of particle physics as a superb theory. The SM cannot, however, be the ultimate theory of Nature, and a major goal of the LHC for the coming years is to detect the new physics (NP) that lies beyond the SM. The most significant and exciting indications of NP in all of the current collider data lie in what is referred to as the Flavour Anomalies (FA). These have persistently emerged from the data of various experiments, and their significance has recently been enhanced considerably at the LHC. Several experimental measurements of b-quark decays present discrepancies with the SM predictions. As of today, the angular analysis of flavour-changing neutral current (FCNC) decays shows the most significant anomaly. Such processes are highly sensitive to the presence of NP particles, like new gauge bosons (Z'), leptoquarks (LQ), and scenarios with extended Higgs sectors (2HDM). The thesis project consists of the exploration of the FCNC b → s mu mu transition, which lies at the core of the FA, with LHC data collected by the CMS experiment. The results obtained will contribute to a clarification of the anomalies, which is a main priority in the field of particle physics today.

    The CMS detector has accumulated, from 2016 to 2018, a very large data sample dedicated to the angular analysis of several FCNC decays with a pair of muons in the final state. The analysis of this sample allows for measurements with world-leading precision. In this Thesis project the student will take part in the analysis of this dataset, with the angular analysis of FCNC decays, and help in the preparation of the data collection during the upcoming LHC run, developing machine-learning algorithms to identify these rare decays in the harsh environment of the LHC collisions.

    Higgs boson property measurements using Machine Learning

    Research line: LHC experiments and phenomenology
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico
    Supervisor(s): Michele Gallinaro, Pedro Silva
    Contact: michgall@cern.ch
    Start: --

    The subject of this thesis is the search for double Higgs production, decaying into taus and b-jets, at the Large Hadron Collider (LHC). The work plan includes the study of double Higgs production, with the two Higgs bosons subsequently decaying into pairs of taus and pairs of b-jets. Advanced machine learning (ML) techniques will be used to separate signal and background events. Searches for new physics in this channel can be significantly improved with the additional data and with improved analysis techniques.

    The discovery of the Higgs boson in 2012 was a major step towards improving the understanding of the mechanism of electroweak symmetry breaking (EWSB). With the value of the mass of the H boson now experimentally measured, the structure of the Higgs scalar field potential and the strength of the Higgs boson self-couplings are precisely predicted in the SM. There are, however, compelling reasons to believe the SM is not complete. The LIP/CMS group is engaged in the study of SM and BSM processes to fully exploit the opportunities of the unparalleled energy of the LHC collisions. While the measured properties are so far consistent with the SM predictions, measuring the Higgs boson self-coupling provides an independent test of the SM and allows a direct measurement of the scalar sector properties. The self-coupling of the Higgs boson can be extracted from the measurement of the Higgs boson pair production cross section. A large dataset of approximately 150/fb was collected in Run 1 and Run 2 and is available to study this process. With the upcoming Run 3, an additional 300/fb of data may become available in the next few years, and there will be excellent opportunities for major discoveries in this domain.

    The main goal of the research programme is to analyse the data and search for the production of Higgs boson pairs via gluon-gluon and vector boson fusion processes in final states with two bottom quarks and two tau leptons. Advanced analysis tools, such as machine learning and deep neural networks, will be used to enhance the sensitivity of the study.

    The research will be carried out in the context of the Portuguese participation in the CMS experiment at the LHC, in the framework of the activities of one of the outstanding research groups in High-Energy Physics in Portugal. Citing the Report of the recent Institutional Evaluation performed by an international review panel nominated by FCT: “The LIP-CMS group, while small in size, is really outstanding and world-class”.

    The candidate is expected to work in a team with a group of researchers. A strong motivation and a keen curiosity are essential requirements.

    LFU — Lepton Flavour Universality tests at the LHC

    Research line: LHC experiments and phenomenology
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico
    Supervisor(s): A. Boletti, N. Leonardo
    Contact: nuno.leonardo@cern.ch
    Start: --

    Lepton flavour universality (LFU) is deeply ingrained in the symmetry structure of the standard model (SM). Indeed, the SM gauge group SU(3)xSU(2)xU(1) is one and the same for all three generations of fermions. However, recent data provide strong evidence for departures from the LFU principle. Such departures are reported in b-quark decays to charged leptons of different flavours (electron, muon, tau). The definitive observation of LFU violation, contrary to the SM, would have far-reaching implications, as it would require the presence of new particles beyond the SM (BSM), such as new gauge bosons (Z') or leptoquarks (LQ). The thesis project explores LFU observables in the most sensitive b → s l l quark transitions using CMS data. The thesis project has the potential to deliver the first observation of New Physics at the LHC.

    The CMS detector has accumulated one of the largest heavy-flavour datasets ever recorded. A dedicated data sample designed to facilitate the investigation of the LFU anomalies was collected by CMS during LHC Run 2. In this thesis project the student will take part in the analysis of this dataset, carrying out measurements of LFU in B meson decays. In particular, the b → s tau tau decay involving third-generation fermions will be explored and then compared against the b → s mu mu decay. This will result in the measurement of the LFU ratio R_K* (tau/muon), a highly NP-sensitive observable that is robust against both experimental systematics and theoretical uncertainties. The lepton-flavour-violating (LFV) b → s tau mu decay, also foreseen in relevant BSM models, may be probed in addition, using the same tools that will be developed. The work will involve the development of new techniques based on machine-learning (ML) algorithms for the reconstruction of these decays. There will also be the opportunity to take part in the preparation for data taking in the upcoming LHC Run 3, starting in 2022, and in the inspection of that early data. The results obtained will contribute to probing a fundamental principle of the SM.
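
    As a minimal sketch of how an efficiency-corrected yield ratio such as R(tau/mu) might be formed, with only naive statistical error propagation, consider the snippet below. The yields and efficiencies are placeholders, not measured values, and a real measurement would include efficiency uncertainties and systematic effects.

```python
# Sketch: efficiency-corrected yield ratio with naive error propagation.
# All yields and efficiencies are placeholders.
import math

def corrected_ratio(n_tau, eff_tau, n_mu, eff_mu):
    """R = (N_tau / eps_tau) / (N_mu / eps_mu), statistical error only."""
    r = (n_tau / eff_tau) / (n_mu / eff_mu)
    rel_err = math.sqrt(1.0 / n_tau + 1.0 / n_mu)   # Poisson yields, fixed efficiencies
    return r, r * rel_err

r, err = corrected_ratio(n_tau=40, eff_tau=0.005, n_mu=2.0e4, eff_mu=0.12)
print(f"R(tau/mu) = {r:.3f} +/- {err:.3f}")
```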

    Probing the Standard Model with Forward Proton Tagging at the LHC

    Research line: LHC experiments and phenomenology
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico
    Supervisor(s): Michele Gallinaro, Jonathan Hollar
    Contact: michgall@cern.ch
    Start: --

    The subject of this thesis is the search for New Physics in exclusive processes at the Large Hadron Collider (LHC). These processes occur with relatively high cross sections in photon-mediated interactions at the LHC.

    Photon-photon collisions may provide the conditions to study particle production with masses at the electroweak scale. By tagging the leading protons from the hard interaction, the Precision Proton Spectrometer (PPS) provides an increased sensitivity to exclusive processes and probes the standard model. PPS is a detector system located approximately 210 m from the interaction point of the CMS detector, designed to perform measurements of, e.g., the quartic gauge couplings and to search for rare exclusive processes. Since 2016, PPS has been taking data in standard high-luminosity proton-proton LHC collisions.

    Central exclusive production (CEP) in high-energy proton-proton collisions provides a unique method to access a variety of physics topics, such as new physics via anomalous production of W and Z boson pairs, high transverse momentum (pT) jet production, top quark pairs, and possibly the production of new resonances. These studies can be carried out in particularly clean experimental conditions thanks to the absence of proton remnants.

    CEP of an object X may occur in the process pp → p + X + p, where "+" indicates the "rapidity gaps" adjacent to the state X. Rapidity gaps are regions without primary particle production. In CEP processes, the mass of the state X can be reconstructed from the fractional momentum losses ξ1 and ξ2 of the scattered protons. At the LHC, the mass reach of the system X, MX, is significantly larger than at previous colliders because of the larger collision energy. The scattered protons can be observed mainly thanks to their momentum loss, which causes a horizontal deviation from the beam trajectory. For the first time, proton-proton collisions at the LHC provide the conditions to study particle production with masses at the electroweak scale through photon-photon fusion. At the LHC energies of Run 2 and Run 3, values of MX above 300 GeV can be probed. CEP processes at these masses have small cross sections, typically of the order of a few fb, and thus can be studied in normal high-luminosity fills.
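
    The mass reconstruction described above follows a simple relation, MX = sqrt(s * ξ1 * ξ2), sketched below. The ξ values chosen are merely illustrative of the typical PPS acceptance.

```python
# Sketch of the CEP mass reconstruction: for pp -> p + X + p, the mass of X
# follows from the fractional momentum losses of the two tagged protons,
#   M_X = sqrt(s * xi1 * xi2) = sqrt(xi1 * xi2) * sqrt(s)
import math

def central_mass_gev(xi1, xi2, sqrt_s_gev=13_000.0):
    return math.sqrt(xi1 * xi2) * sqrt_s_gev

# Illustrative proton momentum losses
print(f"M_X = {central_mass_gev(0.04, 0.08):.0f} GeV")
```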

    The goal of this research project is to select and identify, for the first time in data, the extremely rare exclusive production of heavy particles (including top quarks and W bosons) identified with proton tagging. These are pure QED processes and provide the conditions to study particle production with masses at the electroweak scale.

    The research will be carried out in the context of the Portuguese participation in the CMS experiment at the LHC in the more general context of the searches for New Physics processes at the LHC, in the framework of the activities of one of the outstanding research groups in High-Energy Physics in Portugal. Citing the Report of the recent Institutional Evaluation performed by an international review panel nominated by FCT: “The LIP-CMS group, while small in size, is really outstanding and world-class”. The candidate is expected to work in a team with a group of researchers.

    QGP flavour — Probing the primordial quark gluon plasma with heavy flavour

    Research line: LHC experiments and phenomenology
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico
    Supervisor(s): N. Leonardo
    Contact: nuno.leonardo@cern.ch
    Start: --

    At the LHC we recreate droplets of the primordial medium that permeated the universe in its first microseconds. This hot, dense, coloured medium, the quark-gluon plasma (QGP), is produced in ultra-relativistic heavy-ion collisions. The highest energies attained at the LHC and its state-of-the-art detectors are enabling tremendous advances in our understanding of the strong interaction and of QCD matter under extreme conditions. Such data-driven advances also highlight unexpected behaviour, and the study of the QGP medium is fostered by novel probes made possible by the large datasets being collected. One such probe is provided by heavy quarks. Bottom quarks are particularly interesting probes, as they are produced early in the collision and thus experience the full evolution of the hot medium. The aim of the thesis project is the detection and study of b-quark hadrons in heavy-ion collisions, with particular focus on the still-to-be-detected B0 meson, the rarer Bc meson, and possible exotic hadrons. These novel probes give unique information on the flavour and mass dependence of the energy loss mechanisms experienced by particles traversing the medium, and on underlying quark-recombination mechanisms yet to be observed at these scales. The thesis project aims at making unique contributions to further our understanding of the primordial medium, by exploring novel probes at unprecedented LHC energies.

    The exceptional capabilities of the CMS detector (in particular the muon and tracking systems) make it possible to identify and fully reconstruct b-quark states for the first time ever in ion collisions. The project starts by gaining familiarity with the baseline tools developed in the context of the analysis of lighter (and more abundant) B meson states. The B0 and Bc channels will then be probed first in proton-proton (pp) data. Machine learning algorithms will then be deployed to detect these meson states in the more challenging (busier) PbPb collision data. The signal yields are finally compared between the two channels (B0 and Bc) and collision systems (pp and PbPb). This results in a robust observable sensitive to the properties of the medium, and specifically to the interplay of the underlying mechanisms of energy loss and recombination. In addition, so-called exotic hadron states (e.g. X(3872)) may be probed in the data, shining light on their still unknown nature. The thesis project involves the analysis of datasets collected by the CMS experiment in heavy-ion and proton-proton collisions at the LHC. The project is developed within the CMS working group formed by researchers from LIP and MIT.
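
    The final yield comparison can be summarised as a double ratio, sketched below under simple assumptions: efficiency-corrected Bc and B0 yields in PbPb and pp combined into one number sensitive to energy loss and recombination. All yields and efficiencies are placeholders.

```python
# Sketch: efficiency-corrected Bc/B0 yield ratio in PbPb divided by the same
# ratio in pp. All numerical inputs are placeholders, not measured yields.
def corrected_yield(n_candidates, efficiency):
    return n_candidates / efficiency

def double_ratio(n_bc_pbpb, eff_bc_pbpb, n_b0_pbpb, eff_b0_pbpb,
                 n_bc_pp, eff_bc_pp, n_b0_pp, eff_b0_pp):
    pbpb = corrected_yield(n_bc_pbpb, eff_bc_pbpb) / corrected_yield(n_b0_pbpb, eff_b0_pbpb)
    pp = corrected_yield(n_bc_pp, eff_bc_pp) / corrected_yield(n_b0_pp, eff_b0_pp)
    return pbpb / pp

# hypothetical candidate counts and efficiencies: (Bc, B0) in PbPb and pp
print(double_ratio(12, 0.02, 300, 0.10,
                   150, 0.05, 5000, 0.20))
```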

    Rare Higgs — Search for rare Higgs decays and coupling to (light) quarks

    Research line: LHC experiments and phenomenology
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico
    Supervisor(s): N. Leonardo, S. Fonseca, E. Melo
    Contact: nuno.leonardo@cern.ch
    Start: --

    Since the discovery of the Higgs boson at the LHC in 2012 by the CMS and ATLAS experiments, the focus has been placed on measuring its properties, in particular on determining how it couples to the other standard model (SM) particles. In 2018, both CMS and ATLAS detected the couplings of the Higgs to the heaviest quarks: the top (ttH) and the bottom (H->bb). Evidence for the muonic decay was also recently reported. The next big, and natural, step in this endeavour is to access the Higgs couplings to the lighter quarks. The thesis project consists of the search for Higgs boson decays to a quarkonium state and a photon, using the large dataset collected by CMS at the LHC. The motivation for the study is twofold: to experimentally constrain the Higgs-quark couplings, and to help decide whether our new scalar is indeed the particle at the core of the SM, or (even more excitingly) a first new particle beyond the SM.

    The very high level of background (light quarks are produced abundantly at the LHC, via QCD) renders the signals of direct Higgs decays to light quarks (H->qq) challenging to detect with existing data. But there's a way out (!): the quark-antiquark pairs originating from the Higgs decay may sometimes bind together, thus forming a meson state (which then decays to a lepton pair). Experimentally, this gives rise to a striking, clean signature: an energetic photon back-to-back to a dilepton resonance, H->Q+gamma. The thesis project consists in the exploration of such rare Higgs decays. These may give privileged access to the Higgs couplings to quarks (the lighter quarks may be inaccessible by other means). Their rareness however requires the development of dedicated algorithms for optimised trigger and selection, involving the exploration of machine learning methods.

    Top quark physics and search for physics beyond the Standard Model at the LHC

    Research line: LHC experiments and phenomenology
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico
    Supervisor(s): Michele Gallinaro, Pedro Silva, João Varela
    Contact: michgall@cern.ch
    Start: --

    Recent hints of lepton flavour universality violation have sparked renewed interest in measurements involving tau leptons, owing to a potential disagreement with SM predictions. The work will focus on studying the properties of the top quark dilepton final state and on measuring the tau and heavy-flavour content of top quark events. Studies of final states including third-generation leptons and quarks, such as tau leptons and b-jets, produced in association with top quark pair events may provide first hints of New Physics processes and shed light on the anomalies in lepton flavour universality measurements. Anomalous flavour production would be directly "visible" in this study, and deviations from SM predictions would indicate evidence for New Physics.

    The research will be carried out in the context of the Portuguese participation in the CMS experiment at the LHC, in the framework of the activities of one of the outstanding research groups in High-Energy Physics in Portugal. Citing the Report of the recent Institutional Evaluation performed by an international review panel nominated by FCT: “The LIP-CMS group, while small in size, is really outstanding and world-class”.

    The candidate is expected to work in a team with a group of researchers. A strong motivation and a keen curiosity are essential requirements.

    Vector Boson Scattering processes at the Large Hadron Collider

    Research line: LHC experiments and phenomenology
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico
    Supervisor(s): Michele Gallinaro, Pedro Silva, Jonathan Hollar
    Contact: michgall@cern.ch
    Start: --

    The high-energy scattering of massive electroweak bosons, known as vector boson scattering (VBS), is a sensitive probe of new physics. VBS signatures will be thoroughly and systematically investigated with the large data samples available and those that will be collected in the near future at the LHC. Searches for deviations from Standard Model (SM) expectations in VBS will be performed with the goal of testing the Electroweak Symmetry Breaking (EWSB) mechanism. Current state-of-the-art tools and theory developments, together with the latest experimental results and the studies foreseen for the near future will be studied, implemented, and integrated in the research program. New data analysis strategies to understand the interplay between models and the effective field theory paradigm for interpreting experimental results will be developed with the goal of probing existing Beyond the SM (BSM) models.

    The observation of a Higgs boson by the ATLAS and CMS collaborations confirmed the standard model (SM) of elementary interactions. The presence of the Higgs boson with gauge couplings compatible with those predicted for the SM provides evidence that contributions from the exchange of this boson may be responsible for preserving unitarity at high energies. However, new phenomena may be present in the electroweak symmetry breaking (EWSB) sector, where a sensitive probe of new physics is naturally given by the study of the scattering of massive electroweak bosons (known as vector boson scattering, VBS) at high energy. Any deviation in the SM coupling of the Higgs boson to the gauge bosons breaks this delicate cancellation, thus permitting a test of the EWSB mechanism.

    The proposed programme foresees the study of VVjj (V=W,Z) final states produced via VBS in proton-proton collisions at the LHC in Run 2. The study will be performed in the leptonic final states, including the tau lepton, in the decays of the V bosons. Thanks to the correlation between different final states and the inclusion of tau leptons, the sensitivity to anomalous quartic gauge couplings can be significantly enhanced over the current results.

    The main goal of the research programme is to explore the available data and search for VBS candidate events. In particular, the goal of the work to be performed in this thesis is the search for VBS processes in a final state that includes at least one tau lepton. The study is performed using advanced analysis tools, such as machine learning and deep neural networks. With the large data samples collected in Run 2 and those expected to be collected in Run 3 and beyond, these tools will be used to better understand the data and extend the sensitivity to VBS processes to regions so far unexplored.

    The research will be carried out in the context of the Portuguese participation in the CMS experiment at the LHC, in the framework of the activities of one of the outstanding research groups in High-Energy Physics in Portugal. Citing the Report of the recent Institutional Evaluation performed by an international review panel nominated by FCT: “The LIP-CMS group, while small in size, is really outstanding and world-class”.

    The candidate is expected to work in a team with a group of researchers. A strong motivation and a keen curiosity are essential requirements.

    Effective Field Theory phenomenology with vector and Higgs bosons at the LHC

    Research line: LHC experiments and phenomenology
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico
    Supervisor(s): Inês Ochoa
    Contact: ines.ochoa@cern.ch
    Start: --

    There are many observed phenomena in Nature which the Standard Model of Particle Physics (SM), despite its successes, is not able to describe. The Large Hadron Collider (LHC) data may yet lead to discoveries that will contribute to a complete understanding of the Universe. However, since the Higgs discovery, the LHC has seen no signs of new phenomena, which could indicate that they occur at much higher energies and only indirect signs of their presence can be detected in the proton-proton collision data. 

    If that is the case, it is plausible that effects of new physics can be captured via an effective field theory (EFT). EFTs are theoretical tools that approximate the effects of very high-energy phenomena at the energies probed by the LHC, framing the SM as a low energy description of Nature that is only valid below a given cut-off energy scale. 

    Events with vector bosons (V) and Higgs bosons (H) are particularly good probes of new physics given their connection to the electroweak symmetry breaking mechanism. In this project, the student will join the LIP ATLAS and phenomenology teams to study the sensitivity of very high-energy VH production to new physics operators in the EFT formalism. 
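
    To illustrate the kind of dependence such a sensitivity study exploits, the sketch below uses the usual quadratic parameterisation of a cross section in a single dimension-6 Wilson coefficient c, sigma(c) = sigma_SM + c*sigma_int + c^2*sigma_quad. The three component values are placeholder numbers, not results for any specific VH operator.

```python
# Sketch: quadratic dependence of a cross section on a single dimension-6
# Wilson coefficient c. The three components below are placeholders.
def eft_cross_section(c, sigma_sm=1.00, sigma_int=0.15, sigma_quad=0.40):
    """sigma(c) = sigma_SM + c * sigma_int + c^2 * sigma_quad (arbitrary units)."""
    return sigma_sm + c * sigma_int + c * c * sigma_quad

for c in (-1.0, 0.0, 0.5, 1.0):
    print(f"c = {c:+.1f}: sigma = {eft_cross_section(c):.2f}")
```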

    This work is expected to examine the fully hadronic decay channels of the V and H bosons and determine the feasibility of a search in this final state, previously unexplored. The student will work with state-of-the-art event generators, detector simulations and machine learning methods to augment the simulated information. The student is expected to participate in group meetings within the ATLAS Collaboration and present his/her results either at CERN or by videoconference.

    Machine learning methods to improve boosted Higgs boson identification at ATLAS

    Research line: LHC experiments and phenomenology
    Node: LIP-Lisboa, Host institution: Instituto Superior Técnico
    Supervisor(s): Inês Ochoa
    Contact: ines.ochoa@cern.ch
    Start: --

    Since the discovery of the Higgs boson by the ATLAS and CMS experiments in 2012, the Large Hadron Collider at CERN has provided vast amounts of high-energy proton-proton collision data that may hold the answer to some of the deepest mysteries in particle physics. The development of new and improved techniques to explore the available data is critical for the success of the experiments. 

    In particular, the phenomena governing the production of Higgs bosons can provide access to effects of new physics which so far have remained elusive. For example, measuring the cross-section for Higgs bosons produced with very high momentum is an important goal of the LHC and can lead to the discovery of new particles. This and other measurements can benefit from advanced methods that precisely identify Higgs bosons and distinguish them from other particles that create similar detector signatures. 

    In this project, the student will join the LIP ATLAS team to work on new approaches to identify Higgs boson decays to pairs of b-quarks, in the high momentum regime. New advanced methods will be evaluated, taking advantage of deep learning techniques and variables sensitive to the decay topology of the Higgs boson. These techniques will be evaluated using ATLAS full detector simulation with the goal of enhancing the sensitivity of analyses measuring H->bb production at high momentum. 

    This work is expected to provide new insights into the performance of these novel techniques to reject different sources of background phenomena, as well as into their dependence on the particular details of the simulation. Both aspects are crucial for applying these algorithms to real collision data and for guiding their development in the coming years. 

    The work will also be integrated in an international environment, with frequent contact with a group of experts in the area. The student is expected to participate in international meetings within the ATLAS Collaboration and present his/her results either at CERN or by videoconference.

    Measurement of boosted H->bb decays at ATLAS

    Linha de Investigação:
    Experiências LHC e Fenomenologia
    --
    10
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: Instituto Superior Técnico
    Orientador(es): Patricia Conde Muíño
    Contacto: patricia.conde.muino@cern.ch
    início: --

    Since the discovery of the Higgs boson, the precise measurement of its properties has become a fundamental part of the ATLAS physics programme. The observation of Higgs boson decays to b-quarks and of the associated production of the Higgs boson with top quarks, achieved in 2018 by the ATLAS and CMS collaborations at CERN, directly probes the coupling of the Higgs to quarks and constituted an important step forward in the understanding of the Higgs mechanism. As the LHC continues to take data and more luminosity is accumulated, more precise measurements of the Higgs boson properties become possible, opening the door to searches for new physics in the Higgs sector. Along these lines, the study of high transverse momentum (high-pT) Higgs production in the associated production channel with a W boson is sensitive to new physics in the HWW vertex and is one of the measurements to be carried out in the near future by the ATLAS collaboration. 
    This project focuses on the study of the high-pT Higgs boson production cross section in the associated production channel with a W boson, when the W decays to a lepton and a neutrino and the Higgs to b-quarks.  
    The ATLAS Portuguese team has contributed to the observation of Higgs boson decays to b-quarks in this channel and is currently involved in the measurement of the production cross section at high-pT. This work requires dedicated reconstruction techniques for the identification of the two highly collimated b-jets produced in the decay of the Higgs boson. Machine learning techniques can also be used to improve the performance of these methods. 
    The student will be part of the ATLAS Portuguese team participating in this analysis. The work will be developed in an international collaboration and the results obtained will be presented at CERN.

    Search for Exclusive HWW production at the LHC

    Linha de Investigação:
    Experiências LHC e Fenomenologia
    --
    10
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: Instituto Superior Técnico
    Orientador(es): Patricia Conde Muíño
    Contacto: patricia.conde.muino@cern.ch
    início: --

    By detecting the intact protons that are diffracted through the beam pipe at the Large Hadron Collider (LHC) at CERN, the ATLAS Forward Proton (AFP) detectors turn the LHC into a photon-photon collider [1], opening the door to enriching the ATLAS physics programme with a large variety of topics. This research project proposes a feasibility study of a very interesting physics process that has not been studied up to now: the photo-production of W boson pairs in association with a Higgs boson. The interest of this channel is twofold: on the one hand, it can provide an independent measurement to constrain the total Higgs boson width and will complement the measurement of the Higgs branching ratios, reducing the current systematic uncertainties. On the other hand, it might be sensitive to new physics effects in the interaction vertex of the two W bosons and the Higgs, which would be a major discovery at the LHC.

    The photo-production of WWH suffers, however, from a very small production cross section, meaning that an adequate analysis strategy has to be designed in order to efficiently select the signal events while rejecting the huge background processes with similar signatures. The exclusivity of the process (i.e. all the energy lost by the protons goes into the WWH system) provides clean experimental signatures that can be used effectively to reject the main backgrounds.
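    As a rough illustration of how such an exclusivity requirement can be applied (a minimal sketch with assumed numbers, not the analysis selection itself), the mass of the centrally produced system can be reconstructed from the fractional energy losses xi1 and xi2 of the two tagged protons and required to match the WWH system:

        import math

        SQRT_S = 13e3  # centre-of-mass energy in GeV (Run-3 value is 13.6 TeV; illustrative)

        def central_mass(xi1, xi2, sqrt_s=SQRT_S):
            """Mass of the centrally produced system from the fractional proton energy losses."""
            return math.sqrt(xi1 * xi2) * sqrt_s

        def is_exclusive(xi1, xi2, m_central, rel_tol=0.05):
            """Require the AFP-reconstructed mass to be consistent with the central system mass."""
            return abs(central_mass(xi1, xi2) - m_central) / m_central < rel_tol

        # Example: protons losing 3% and 4% of their energy -> m_X ~ 450 GeV
        print(central_mass(0.03, 0.04))
        print(is_exclusive(0.03, 0.04, 450.0))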

    This project focuses on the development of an analysis strategy to measure WWH photo-production at the LHC with the Run-3 data to be collected from 2022 to 2025. This feasibility study will contribute to the definition of the ATLAS physics programme for Run 3.

    The student selected will join the ATLAS Portuguese team and will work in an international environment, in contact with experts on this subject. Frequent presentations at CERN (either physically or via videoconferencing) are expected.

    FIPs — Search for Feebly Interacting Particles

    Linha de Investigação:
    Neutrinos e matéria escura
    --
    40
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: Instituto Superior Técnico
    Orientador(es): N. Leonardo, P. Bordalo, S. Ramos
    Contacto: nuno.leonardo@cern.ch
    início: --

    The discovery of the Higgs boson in the past decade (in 2012) completed the Standard Model of elementary particles. However, the search for new particles is not over: several well-established observational phenomena -- dark matter, the matter-antimatter asymmetry in the early Universe, neutrino masses and oscillations, inflation -- cannot be explained with known particles alone. The fact that no definitive signs of new particles have been found so far indicates either that they are very heavy (above the TeV scale for direct production in LHC collisions) or that, having possibly lower masses (e.g. of the order of a GeV), they interact very feebly with Standard Model particles. The search for so-called Feebly Interacting Particles (FIPs) is therefore an important path, complementary to the LHC, in the search for new physics. 

    SHiP (Search for Hidden Particles) is an experiment being designed to search for very weakly interacting, relatively light (masses below 5 GeV), long-lived particles at the so-called intensity frontier. The experiment will use a high-intensity (4 x 10^{13} protons/s) 400 GeV proton beam from the SPS accelerator, making it possible to accumulate reasonable statistics of rare events. With this beam intensity, SHiP will explore, with unprecedented sensitivity, the so-called Hidden Sector of elementary particles in a region of phase space not accessible to the LHC experiments. It is an experiment with the potential to revolutionize particle physics since, for example, the discovery of heavy right-handed neutrinos (HNLs) would provide a natural explanation for the nature of dark matter, for the origin of the baryon asymmetry of our Universe and for the origin of neutrino masses. SHiP also includes a neutrino detector capable of recording direct collisions of low-mass dark matter particles with electrons, with a sensitivity that goes beyond the limit set by the correct dark matter abundance in our Universe. 

    In this project, the student will develop simulations involving the production of exotic particles belonging to the Hidden Sector of elementary particles, such as HNLs (Heavy Neutral Leptons), ALPs (Axion-Like Particles), DPs (Dark Photons) and the sgoldstino (super-partner of the goldstino, the longitudinal component of the gravitino). The objective is to optimize the selection of these particles using Machine Learning algorithms. Since the experiment is being designed for zero background contamination in the region where the exotic particles will be detected, studies of contamination from other processes (namely neutrino interactions) are of crucial importance. As the LIP group will be responsible for building a detector that measures the particles' time of flight, with world-leading resolution, the student will have the opportunity to modify (in the simulation) the detector properties in order to optimize the rejection of the background introduced by neutrinos in the various Hidden Sector physics channels.
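    As a rough illustration of why a precise time-of-flight measurement is so powerful for background rejection (a minimal sketch with assumed momentum, flight path and timing, not the SHiP detector parameters), the particle mass can be reconstructed from momentum and velocity:

        import math

        C = 299792458.0  # speed of light in m/s

        def tof_mass(p_gev, length_m, time_ns):
            """Particle mass (GeV) from momentum, flight path and time of flight."""
            beta = length_m / (time_ns * 1e-9 * C)
            if beta >= 1.0:
                return 0.0  # consistent with a massless / ultra-relativistic particle
            return p_gev * math.sqrt(1.0 / beta**2 - 1.0)

        # Illustrative: a 1 GeV/c particle over a 10 m flight path. A pion (0.14 GeV)
        # and a proton (0.94 GeV) arrive clearly separated in time, which a detector
        # with tens-of-ps resolution distinguishes easily.
        for m in (0.14, 0.94):
            e = math.sqrt(1.0 + m**2)
            beta = 1.0 / e  # = p/E with p = 1 GeV
            t_ns = 10.0 / (beta * C) * 1e9
            print(f"m = {m:.2f} GeV -> t = {t_ns:.2f} ns, "
                  f"reconstructed m = {tof_mass(1.0, 10.0, t_ns):.3f} GeV")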

    (SND@LHC, based on a prototype of the SHiP neutrino detector, will be installed as a new LHC experiment, with data taking foreseen for 2022. Besides FIPs, neutrinos will be detected for the first time at the LHC, at the highest energy experimentally accessible in the laboratory. There is thus an opportunity to take part in the preparation of the newest LHC experiment!) 

    Astrofísica Multi-mensageira e Polarimetria com o Telescópio AMEGO da NASA

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: LIP-Coimbra, Departamento de Física da Universidade de Coimbra
    Orientador(es): Rui Miguel Curado da Silva
    Contacto: rui.silva@coimbra.lip.pt
    início: --

    Framework (scientific area): The detection in 2017 of gravitational waves generated by extreme phenomena in the Universe, such as the merger of compact objects (neutron stars, black holes, etc.), is currently one of the most relevant topics in astrophysics. In particular, the detection of these waves by ground-based facilities (the LIGO/Virgo experiments), simultaneously with the detection in space of intense gamma-ray emissions (GRBs: Gamma-Ray Bursts) generated by the same objects, opened a new scientific window in the field of multi-messenger astrophysics.

    Objectives: The goal of this work is to determine the scientific potential of the AMEGO telescope [1] in the fields of multi-messenger astrophysics and polarimetry, through the analysis of the set of AMEGO scientific white papers. These white papers discuss the potential of the AMEGO gamma-ray telescope in multi-messenger astrophysics, in particular for the most important phenomena and objects of high-energy astrophysics, such as supernovae, mergers of compact objects (neutron stars, black holes, etc.), the gamma-ray emission excess at the centre of galaxies, blazars, etc.

    Work plan: The student shall simulate the response of the AMEGO telescope to the main celestial emissions associated with multi-messenger astrophysics, in particular GRBs (Gamma-Ray Bursts), as well as its polarimetric potential. The celestial emissions shall be implemented in the MEGAlib package [2], together with the mass model of the AMEGO scientific instruments. The AMEGO performance will first be simulated for several objects, such as the Crab Nebula or GRBs, without any change to the AMEGO mass model: sensitivity, angular resolution, effective area and polarization sensitivity. Subsequently, some optimization of the AMEGO telescope configuration will be carried out, changing the original configuration and assessing its influence on the instrument performance; configuration changes will then be proposed to our partners in the AMEGO consortium.
    After the optimization of the main instruments, the student shall simulate the gamma-ray sky objects to be studied, such as supernovae, mergers of compact objects (neutron stars, black holes, etc.), the gamma-ray emission excess at the centre of galaxies and blazars, taking into account the information contained in the mission's scientific white papers, and determine which observations will be possible and which scientific questions can be answered, in particular for multi-messenger astrophysics and taking into account the possibility of performing gamma-ray polarimetry.
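    As a hedged illustration of the polarimetric figure of merit that such performance studies typically quote (illustrative rates only, not AMEGO performance figures), the minimum detectable polarization at 99% confidence follows from the modulation factor and the source and background count rates:

        import math

        def mdp99(mu100, source_rate, background_rate, obs_time):
            """Minimum detectable polarization at 99% confidence (standard formula),
            given the modulation factor for a 100% polarized beam (mu100), the source
            and background count rates (counts/s) and the observation time (s)."""
            return (4.29 / (mu100 * source_rate)) * math.sqrt((source_rate + background_rate) / obs_time)

        # Illustrative numbers only:
        print(f"MDP99 = {mdp99(mu100=0.3, source_rate=0.5, background_rate=2.0, obs_time=1e6):.3f}")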

    The mission proposal will be submitted to NASA for approval as a Probe-category mission, most likely in mid-2022. If the AMEGO mission is approved, we will have access to the data and to the science the mission can provide, including topics such as multi-messenger astrophysics and gamma-ray polarimetry, AMEGO being the first space instrument dedicated to polarization.


    Referências:
    [1] J. McEnery et al., “All-sky Medium Energy Gamma-ray Observatory: Exploring the Extreme Multimessenger Universe”, arXiv:1907.07558 [astro-ph.IM].
    [2] A. Zoglauer et al., “MEGAlib – The Medium Energy Gamma-ray Astronomy Library”, New Astron. Rev. 50 (2006) 629.

    Constelação de CUBESATS para Astrofísica Multi-Mensageira

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: LIP-Coimbra, Departamento de Física da Universidade de Coimbra
    Orientador(es): Rui Miguel Curado da Silva, Jorge Manuel Maia Pereira
    Contacto: rui.silva@coimbra.lip.pt
    início: --

    Framework (scientific area): The detection in 2017 of gravitational waves generated by extreme phenomena in the Universe, such as the merger of compact objects (neutron stars, black holes, etc.), is currently one of the most relevant topics in astrophysics. In particular, the detection of these waves by ground-based facilities (the LIGO/Virgo experiments), simultaneously with the detection in space of intense gamma-ray emissions (GRBs: Gamma-Ray Bursts) generated by the same objects, opened a new scientific window in the field of multi-messenger astrophysics.

    Description and Objectives: The detection of the electromagnetic counterpart of the phenomena that generate gravitational waves, detected by ground-based facilities (LIGO/Virgo), has been carried out by the Fermi (NASA) and INTEGRAL (ESA) space telescopes, since gamma rays are absorbed by the Earth's atmosphere. To study the Universe in the gamma-ray domain, for example GRBs in conjunction with gravitational waves, a space observatory capable of observing rare and fast phenomena that can occur anywhere in the sky is needed. Current space observatories are based on large monolithic satellites, with little capability to observe the whole sky simultaneously or too slow to point at a GRB while it is still ongoing (10 s to 2 min). A space observatory consisting of a constellation of CubeSats, each equipped with gamma-ray detectors, would make it possible to observe the whole sky simultaneously and also to improve the signal-to-noise ratio, thanks to the better spatial resolution of the trajectories of the successive interactions generated by Compton scattering. A CubeSat-based observatory would also significantly lower the budget of a space observatory.

    Work plan: Within the European project AHEAD2020 [1], the student shall determine the optimal configuration for a constellation of CubeSats operating in formation, equipped with semiconductor detectors, allowing observation of the whole sky in the gamma-ray domain with good spectral and spatial resolution, as well as the design of the detection unit to be installed in each CubeSat.
    1- The student shall build a mass model of the CubeSat constellation using the MEGAlib simulation package (based on Geant4) [2], starting from the COMCUBE demonstration prototype developed within the AHEAD2020 project (Fig. 1);
    2- The student shall analyse the scientific performance of the constellation (resolution, detection efficiency, sensitivity, polarization, etc.) for several configurations and several detection materials (Si, CdTe, etc.), under observation conditions similar to those found in orbit (flux of the celestial source to be observed, cosmic background, orbital environment, etc.);
    3- Through successive simulations, the student shall determine an optimal configuration and the optimal design of each COMCUBE unit to be installed in each CubeSat, taking into account realistic physical constraints (mass, volume, power consumption, cost, etc.).
    The result will serve as the basis for a future mission proposal to a space agency within the network of collaborations of the LIP i-Astro group (Italian Space Agency, ESA, NASA, etc.).
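    As a minimal sketch of the Compton-kinematics reconstruction on which the performance of such a constellation relies (assuming idealized two-site events, with made-up energy deposits), the scattering angle follows directly from the two measured energies:

        import math

        M_E_C2 = 511.0  # electron rest energy in keV

        def compton_scatter_angle(e1_kev, e2_kev):
            """Compton scattering angle from a two-site event: e1 is the energy deposited
            at the scatter site, e2 the energy of the absorbed scattered photon."""
            cos_theta = 1.0 - M_E_C2 * (1.0 / e2_kev - 1.0 / (e1_kev + e2_kev))
            if abs(cos_theta) > 1.0:
                return None  # kinematically inconsistent event, rejected in the analysis
            return math.degrees(math.acos(cos_theta))

        # Example: a 511 keV photon depositing 170 keV in the scatterer and 341 keV in the absorber.
        print(compton_scatter_angle(170.0, 341.0))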

    [1] Integrated Activities for the High Energy Astrophysics Domain: AHEAD2020, project ID 871158.
    [2] A. Zoglauer et al., “MEGAlib – The Medium Energy Gamma-ray Astronomy Library”, New Astron. Rev. 50 (2006) 629.

    Effects on CZT Detectors exposed to the International Space Station Orbital Environment

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: LIP-Coimbra, Departamento de Física da Universidade de Coimbra
    Orientador(es): Rui Miguel Curado da Silva e Jorge Manuel Maia Pereira
    Contacto: rui.silva@coimbra.lip.pt
    início: --

    Framework
    The knowledge of LEO environment effects on the sensors of future gamma-ray telescopes is essential to design instruments with adequate radiation hardness over the mission lifetime and an optimized signal-to-noise ratio, by minimizing the orbital background noise. Here, we intend to analyze radiation damage and ageing effects on the operational performance of scientific instrument materials in real in-orbit conditions, making it possible to estimate the effective lifetime of instrument components developed for future high-energy astrophysics space missions, such as the Advanced Surveyor of Transient Events and Nuclear Astrophysics (ASTENA) [1, 2] and the All-sky Medium Energy Gamma-ray Observatory (AMEGO) [3].

    Objectives
    The Gamma-ray Laue Optics and Solid State detectors (GLOSS) project was selected by ESA in 2020 in the Euro Material Ageing (EMA) call. In this project, we will test several crystal samples for future gamma-ray space telescopes on the International Space Station (ISS) Bartolomeo platform. These are Ge and Si crystals for Laue lenses and CdZnTe (CZT) crystals for sensor elements, intended respectively for the ASTENA mission concept and the AMEGO calorimeter. The Ge and Si crystals are 10 × 10 × 2 mm³, to fit the Euro Material Ageing facility requirements. The CZT sensor samples are small crystals of 10 × 10 mm² in area and 2 mm thickness.
    Within this GLOSS task, our contribution will focus on estimating the orbital environment effects on CZT, as well as characterizing its pre-flight properties as a gamma-ray detector. The analysis of the exposure of CZT to the LEO ISS environment, in particular the combined effects of the proton radiation field and the in-orbit thermal cycles, will be fundamental to define new strategies for the design of future scientific CZT-based telescope instruments.

    Work plan
    The student shall perform two distinct tasks:
    1- Perform radiation and thermal simulations to estimate the orbital protons' displacement damage dose in the crystals and the temperature gradient inside the samples. Two types of orbital environment simulations will be performed in this task: a) orbital protons' energy deposition along the depth of the crystal materials; b) thermal profile inside the material during the orbital cycles. Simulations of the orbital protons' effects, i.e. the displacement damage dose, will be performed with two simulation toolkits: OMERE and SRIM/TRIM. OMERE allows estimating the proton radiation field in the ISS orbit, while SRIM/TRIM provides the displacement damage dose in each material (a minimal sketch of this bookkeeping is given below). Crystal samples are 10 × 10 × 2 mm³. Simulations of the temperature gradient inside the sample crystals and support materials will be performed with COMSOL Multiphysics (Heat Transfer Module), covering the orbital-cycle temperature range from -150 ºC, when the ISS is hidden from the sunlight by the Earth's disk, up to +120 ºC, when the ISS is facing the sunlight.
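    A minimal sketch of the displacement-damage calculation behind this task (with made-up spectrum and NIEL values; the real inputs come from OMERE and SRIM/TRIM):

        import numpy as np

        # Illustrative proton spectrum behind shielding (differential fluence, protons/cm2/MeV)
        energy_mev = np.logspace(0, 2, 50)                # 1 - 100 MeV
        fluence = 1e8 * energy_mev**-2.0                  # made-up power-law spectrum
        niel_mev_cm2_g = 5e-3 * energy_mev**-0.8          # made-up NIEL curve for CZT

        # Displacement damage dose: DDD = integral of fluence(E) * NIEL(E) dE   [MeV/g]
        ddd = np.trapz(fluence * niel_mev_cm2_g, energy_mev)
        print(f"Displacement damage dose ~ {ddd:.2e} MeV/g")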
    2- Analyze the properties of the CZT detectors in order to establish their pre-flight condition, to be compared with the samples' properties after the flight (this last step is outside the scope of this thesis). The characterization tests will be implemented at the LIP UC and UBI facilities on CZT backup crystals with the same characteristics as the samples delivered to ESTEC for testing and flight, 10 × 10 mm² in area and 2 mm thick. These tests will be performed during the pre-launch and ISS experiment flight phases, using the following methodologies:
    a) Measurement of detector leakage currents: both the bulk and surface leakage current components will be measured with a high-precision electrometer, as a function of the electrodes' bias voltage;
    b) Charge transport properties analysis: the mobility-lifetime products for electrons and holes will be measured by recording alpha-particle (Am-241 source) pulse-height spectra at increasing bias voltage, with opposite bias polarities to probe electrons and holes separately (a minimal analysis sketch is given after this list);
    c) Measurement of the spectroscopic properties of the detectors: the energy resolution measurement and the photopeak shape analysis will be carried out by recording gamma-ray energy spectra with gamma-ray sources covering the dynamic operation band of the described missions (100 keV up to 1 MeV).
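    Charge-transport measurements of the kind described in b) are commonly analysed with the Hecht relation; a minimal sketch, assuming a planar 2 mm detector and synthetic pulse-height data (illustrative values only), fits the charge collection efficiency versus bias to extract the mobility-lifetime product:

        import numpy as np
        from scipy.optimize import curve_fit

        D_CM = 0.2  # detector thickness: 2 mm

        def hecht(v_bias, mu_tau, q0=1.0):
            """Single-carrier Hecht relation for a planar detector:
            CCE = (mu*tau*V/d^2) * (1 - exp(-d^2/(mu*tau*V)))."""
            x = mu_tau * v_bias / D_CM**2
            return q0 * x * (1.0 - np.exp(-1.0 / x))

        # Synthetic pulse heights vs. bias, standing in for the Am-241 alpha measurements.
        bias = np.linspace(50.0, 600.0, 12)
        true_mu_tau = 3e-3  # cm^2/V, illustrative value for electrons in CZT
        data = hecht(bias, true_mu_tau) * (1.0 + 0.01 * np.random.default_rng(1).normal(size=bias.size))

        popt, _ = curve_fit(hecht, bias, data, p0=[1e-3, 1.0])
        print(f"Fitted mu*tau = {popt[0]:.2e} cm^2/V")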

    References
    [1] C. Guidorzi et al., “A deep study of the high-energy transient sky”, White Paper submitted to the ESA Voyage 2050 Call, 2019.
    [2] F. Frontera et al., “Understanding the origin of the positron annihilation line and the physics of the supernova explosions”, White Paper submitted to the ESA Voyage 2050 Call, 2019.
    [3] J. McEnery et al., “All-sky Medium Energy Gamma-ray Observatory: Exploring the Extreme Multimessenger Universe”, arXiv:1907.07558 [astro-ph.IM].

     

    Polarímetro Espacial para Raios X Solares

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: LIP-Coimbra e Observatório Geofísico e Astronómico, UC.
    Orientador(es): Rui Miguel Curado da Silva e João Fernandes
    Contacto: rui.silva@coimbra.lip.pt
    início: --

    Framework (scientific area): Following the ESA call for space missions in collaboration with China, LIP and UC submitted the proposal “A Compact Solar Hard X-ray Polarimeter” (http://sci.esa.int/cosmic-vision/53072-esa-and-casplanning-for-a-joint-mission/?fbodylongid=2237). It is a polarimeter for solar X-rays, consisting of an instrument based on CdTe detectors capable of imaging and spectroscopy of solar emissions in the X-ray band.

    Description and Objectives: In astrophysics, the study of the polarization of X-ray emissions has seen little development, given the complexity required to design a polarimeter. To date, in the X-ray domain, the Universe has been studied almost exclusively through spectral analysis and time variability. However, by determining the angle and degree of polarization of the radiation, it will be possible to double the number of observational parameters and consequently to distinguish between the various models describing the physics of celestial objects. Polarized emissions reveal important information about the geometry, magnetic field, composition and emission mechanisms associated with celestial gamma-ray sources such as black holes, solar flares, active galactic nuclei and pulsars. To date, no polarization measurements of solar emissions have ever been performed in the X-ray or gamma-ray domains. The objective of the work to be developed within this consortium is the optimization of the preliminary polarimeter concept, by simulating with the GEANT4 and MEGAlib packages the interaction of photons from solar emissions, in order to rigorously quantify the response and the expected performance of the solar polarimeter.

    Work plan: In this work, the student shall quantify and optimize the configuration of the concept of a future solar polarimeter, to be proposed in the next ESA S-class (small missions) call. This optimization requires the analysis of the potential solar regions emitting polarized X-rays and the simulation of the instrument response to solar radiation. The student shall carry out an exhaustive review of the literature on all solar physical phenomena with the potential to emit polarized radiation. After compiling and analysing the possible polarized emissions, the student shall optimize an instrument with the most suitable configuration to respond to the polarization level of the observable solar emissions. To this end, the profile of the polarized solar emissions (flux, energy band, spatial distribution, etc.) shall be simulated together with the mass model of the polarimeter, based on CdTe detectors. This instrument will consist of detector modules (Caliste) [1] with 500 µm pixels, for a total sensitive area of about 25 cm². The field of view will be defined by a plastic scintillator collimator and a coded tantalum mask. The configuration of this instrument shall allow sensitivity to polarization levels as low as 5%, for a flux equivalent to 10 mCrab and an observation time of 10^6 s.
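    As a hedged sketch of how the polarimetric response is typically quantified (synthetic counts and an illustrative modulation factor; the real inputs would come from the GEANT4/MEGAlib simulations), the azimuthal distribution of scattered events is fitted with a modulation curve:

        import numpy as np
        from scipy.optimize import curve_fit

        def modulation(phi_deg, amplitude, mu, phi0_deg):
            """Azimuthal scattering distribution N(phi) = A * (1 + mu*cos(2*(phi - phi0)))."""
            phi = np.radians(phi_deg - phi0_deg)
            return amplitude * (1.0 + mu * np.cos(2.0 * phi))

        # Synthetic azimuthal histogram, standing in for simulation output.
        phi_bins = np.arange(0.0, 360.0, 15.0)
        rng = np.random.default_rng(2)
        counts = rng.poisson(modulation(phi_bins, amplitude=1000.0, mu=0.2, phi0_deg=30.0))

        popt, _ = curve_fit(modulation, phi_bins, counts, p0=[900.0, 0.1, 20.0])
        mu_fit = popt[1]
        mu_100 = 0.3  # modulation factor for a 100% polarized beam, from simulation (illustrative)
        print(f"Fitted modulation = {mu_fit:.3f} -> polarization degree ~ {mu_fit / mu_100:.2f}")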

    [1] O. Limousin et al., “Caliste-256: A CdTe imaging spectrometer for space science with a 580 µm pixel pitch”, Nucl. Instr. Meth. Phys. Res. A 647 (2011) 46.

    Deep Neural Networks applications in experimental physics data analyses

    Linha de Investigação:
    Experiências LHC e Fenomenologia
    --
    10
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: FCUL
    Orientador(es): Helena Santos (LIP researcher and professor at FCUL)
    Contacto: helena@lip.pt
    início: --

    Nucleus-nucleus collisions at the Large Hadron Collider (LHC/CERN) provide an excellent opportunity to create the Quark-Gluon Plasma (QGP) in the laboratory at the energy frontier. The ATLAS experiment provides essential capabilities to study it, namely large acceptance, high-granularity calorimeters, tracking detectors and muon spectrometers. A major goal of the LHC Heavy Ion Programme is the understanding of the effects of the QGP on heavy-flavour jets (collimated sprays of particles originating from the hadronization of bottom (b) quarks). The b-quark is a sensitive probe because the hard-scattering process takes place at the earliest stages of the collision, so the quark experiences the entire evolution of the plasma. Understanding the nature of its energy loss (collisional or gluon-radiative), and comparing it with the energy loss suffered by lighter jets, is crucial to infer the properties of the QGP.

    Deep Neural Networks to tag these heavy-flavour jets in the demanding Pb+Pb collision environment are under development. The student will contribute to these efforts by implementing a hyperparameter scanner with the goal of optimising the DNN. The exploitation of new variables to increase the performance of the DNN is also foreseen. The student will join the ATLAS LIP group and will work within an international collaboration.
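    As a toy illustration of such a hyperparameter scan (a minimal sketch with a synthetic dataset and scikit-learn; the actual work would use the group's DNN framework and ATLAS Pb+Pb simulation):

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import GridSearchCV
        from sklearn.neural_network import MLPClassifier

        # Toy dataset standing in for b-jet vs light-jet discriminating variables.
        X, y = make_classification(n_samples=5000, n_features=10, n_informative=6, random_state=0)

        param_grid = {
            "hidden_layer_sizes": [(32,), (64, 32), (128, 64)],
            "alpha": [1e-4, 1e-3],            # L2 regularisation strength
            "learning_rate_init": [1e-3, 1e-2],
        }

        scan = GridSearchCV(
            MLPClassifier(max_iter=300, random_state=0),
            param_grid,
            scoring="roc_auc",
            cv=3,
            n_jobs=-1,
        )
        scan.fit(X, y)
        print("Best AUC:", scan.best_score_)
        print("Best hyperparameters:", scan.best_params_)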

    Requirements: The student should have some computing skills, namely in Python programming. 

    Search for dark matter in photoproduction in ATLAS

    Linha de Investigação:
    Experiências LHC e Fenomenologia
    --
    10
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: Universidade de Lisboa
    Orientador(es): Helena Santos (DF-FCUL e LIP) e Patricia Conde Muino (IST e LIP)
    Contacto: helena@lip.pt, pconde@lip.pt
    início: --

    Resumo: The understanding of the nature of dark matter, which makes up around 25% of the Universe, is one of the most relevant questions in physics nowadays. The ATLAS and CMS experiments at the Large Hadron Collider (CERN) can shed some light on the question, since dark matter particles may be produced directly in proton-proton collisions at the highest energy ever reached in a collider (13 TeV). Direct searches performed up to now have not shown any evidence of dark matter or new particles. However, favoured theoretical models, such as Supersymmetry (SUSY) in compressed mass scenarios, foresee that the lightest SUSY particle (the dark matter candidate) is nearly degenerate in mass with the next-to-lightest charged SUSY particle, leading to decays that are very hard to select and reconstruct at the LHC [1], and that might have escaped detection up to now.

    The ATLAS trigger system is responsible for the real-time analysis and filtering of the 40 MHz event rate produced at the LHC, selecting around 1000 events/second for storage and further analysis. Events that are rejected are lost forever, so the trigger must be robust and flexible enough to keep any potentially interesting events. The current trigger system, however, is not prepared to select the kind of events mentioned above. The objective of this project is to design an innovative trigger strategy able to select dark matter decays when the dark matter particles are produced in the interaction of two photons. The student will be part of the ATLAS Portuguese team and his/her work will be integrated in the international team working on photoproduction searches. Frequent presentations at meetings at CERN are expected, either in person or via videoconference.

    Reference: [1] L.A. Harland-Lang, V.A. Khoze, M.G. Ryskin, M. Tasevsky, LHC Searches for Dark Matter in Compressed Mass Scenarios: Challenges in the Forward Proton Mode, arXiv:1812.04886 [hep-ph].

    Orientadores: Helena Santos (DF-FCUL e LIP) e Patricia Conde Muíño (IST e LIP)

    À procura dos decaimentos mais raros do Universo.

    Linha de Investigação:
    Neutrinos e matéria escura
    --
    40
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: Universidade de Coimbra
    Orientador(es): Alexandre Lindote, Cláudio Silva
    Contacto: alex@coimbra.lip.pt, claudio@coimbra.lip.pt
    início: --

    Keywords: Rare decays, physics beyond the Standard Model, neutrino physics, data analysis, sensitivity studies
    Orientador(es): Alexandre Lindote (alex@coimbra.lip.pt), Cláudio Silva (claudio@coimbra.lip.pt)
    Local: Universidade de Coimbra, LIP-Coimbra

    Resumo: Recently, the XENON1T experiment observed the double electron capture (2ν2EC) decay of Xe-124. The measured half-life of 1.8x10^22 years (at 4.4 sigma significance) exceeds the age of the Universe by 12 orders of magnitude, making this the rarest decay ever directly detected. With its larger mass, the LZ detector can reach a >5 sigma discovery of this decay within a few months of operation. The high Q-value of Xe-124 also opens the 2ν2β+ and 2νECβ+ channels. While the half-life of the former is too long for LZ to observe (10^27 years), the latter (with predictions spanning 10^23 - 10^27 years) could generate tens to hundreds of events in 1000 days of detector operation. This decay has a unique signature, allowing simultaneous detection of the two gammas from positron annihilation, the positron recoil, and the atomic de-excitation. It is therefore virtually background-free and well suited to an analysis exploring the event topology, and it could become the rarest process ever observed in a laboratory. Furthermore, the neutrinoless mode of the double electron capture (0ν2EC) offers an alternative approach to understanding neutrinos, complementing 0ν2β in Xe-136. Although its half-life is expected to be orders of magnitude longer than that of 0ν2β, a resonance between the initial and final atomic states could enhance the decay rate by up to a factor of 10^6, enabling its observation in LZ. In this project, the student will analyze LZ data to search for 2ν2EC and 2νECβ+ signals and measure their respective half-lives. Additionally, the detector's sensitivity to the neutrinoless 0ν2EC decay will be projected, incorporating various nuclear models to simulate the resulting event topologies and assess the relevant backgrounds.
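    As a rough illustration of how an observed signal count translates into a half-life (a minimal sketch with illustrative numbers, not LZ exposure or efficiency values):

        import math

        N_A = 6.022e23  # Avogadro's number

        def half_life_years(n_signal, fiducial_mass_kg, isotope_abundance,
                            molar_mass_g, live_time_yr, efficiency):
            """T1/2 = ln(2) * N_isotope * efficiency * t / N_signal."""
            n_isotope = fiducial_mass_kg * 1e3 / molar_mass_g * N_A * isotope_abundance
            return math.log(2) * n_isotope * efficiency * live_time_yr / n_signal

        # Illustrative: 100 observed Xe-124 decays (abundance ~0.1%) in 5.6 t of xenon over 1 year.
        print(f"T1/2 ~ {half_life_years(100, 5600, 1e-3, 131.3, 1.0, 0.8):.2e} yr")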

    Leituras sugeridas: XENON Collaboration, Nature volume 568 (2019), pp. 532–535, https://doi.org/10.1038/s41586-019-1124-4, arXiv:1904.11002. LUX Collaboration, “Search for two neutrino double electron capture of 124Xe and 126Xe in the full exposure of the LUX detector”, J.Phys.G 47 (2020) 10, 105105, https://doi.org/10.1088/1361-6471/ab9c2d, arXiv:1912.02742. C. Wittweg et al., “Detection prospects for the second-order weak decays of 124Xe in multi-tonne xenon time projection chambers”, EPJ-C 80 (2020), 1161, https://doi.org/10.1140/epjc/s10052-020-08726-w,arXiv:2002.04239

    Cálculo das previsões do efeito Migdal em átomos leves

    Linha de Investigação:
    Neutrinos e matéria escura
    --
    40
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: Universidade de Coimbra
    Orientador(es): Elías López Asamar
    Contacto: elias.asamar@coimbra.lip.pt
    início: --

    Área: Física de partículas/Física nuclear
    Keywords: Direct detection of dark matter, Migdal effect
    Orientador(es): Elías López Asamar (elias.asamar@coimbra.lip.pt)
    Local: Universidade de Coimbra, LIP-Coimbra

    Resumo: The presence of dark matter (DM) in the universe is considered one of the clearest indications of the existence of new elementary particles not yet discovered. In this context, direct detection experiments search for DM particles that reach the Earth and interact with a target of ordinary matter. One of the most promising directions is based on the predicted Migdal effect, namely the expected emission of an atomic electron when the atomic nucleus is perturbed. To date, the theoretical predictions of the Migdal effect have been calculated for the atoms most commonly used in direct detection experiments, namely xenon and germanium. However, lighter elements, which might allow a further increase in sensitivity to DM particles, have not yet been considered. The purpose of this project is to study the theoretical predictions of the Migdal effect in such lighter elements, in particular hydrogen (H) and helium (He). From the theoretical point of view, H is interesting because it is a simple system. First, the theoretical predictions will be calculated using an approach similar to that of Ibe et al. (see bibliography). Second, these predictions will be used to assess the sensitivity of such atoms to DM particles.
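    For orientation, the quantity computed in the approach of Ibe et al. is (schematically) the ionization probability obtained from the overlap between the initial electronic state and the final state boosted by the nuclear recoil velocity v_A, roughly of the form

        p(nl -> E_e)  ≈  | < Ψ_{E_e} | exp( i m_e v_A · Σ_i r_i ) | Ψ_{nl} > |²

    where m_e is the electron mass and the sum runs over the atomic electrons; for hydrogen the sum reduces to a single term and exact hydrogenic wavefunctions can be used, which is part of what makes it an attractive test case.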

    Leituras sugeridas: - M. Ibe et al., “Migdal Effect in Dark Matter Direct Detection Experiments”, JHEP 03, 194 (2018) [arXiv:1707.07258]

    Caracterização da reflectância do PTFE em função da temperatura (Measuring the reflectance at low temperatures for Dark Matter detection)

    Linha de Investigação:
    Neutrinos e matéria escura
    --
    40
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: Universidade de Coimbra
    Orientador(es): Cláudio Silva, Francisco Neves
    Contacto: claudio@coimbra.lip.pt
    início: --

    Cosmological evidence obtained from galaxy dynamics, weak gravitational lensing, and cosmic microwave background observations has revealed that approximately 85% of the total mass in the universe exists in the form of Dark Matter, an exotic form of matter. Understanding the nature of Dark Matter is one of the most prominent unresolved questions in physics, with its detection representing a crucial scientific pursuit. Direct-detection experiments aim to identify potential Dark Matter particles, such as Weakly Interacting Massive Particles (WIMPs) or Axion-like particles (ALPs). Liquid xenon technology, extensively employed in low-background experiments investigating phenomena like neutrinoless double beta decay (0νββ) and Dark Matter particles (WIMPs), offers a viable approach. Furthermore, this technology enables the construction of highly sensitive neutrino detectors through coherent elastic neutrino scattering, carrying potential applications in nuclear non-proliferation and reactor monitoring. The LUX-ZEPLIN (LZ) detector, boasting a record-setting active mass of 7 tons, represents a significant advancement in xenon-based detection and is poised to lead the field of direct Dark Matter searches in the forthcoming years.

    Future endeavours are underway to develop even more sensitive experiments in the field to push the boundaries of our understanding. The forthcoming XLZD detector is set to be a groundbreaking advancement, featuring a 60-tonne liquid xenon time projection chamber (LXe TPC) at its core. This detector holds the potential to explore the existence of dark matter down to the realm of neutrino fog, where interactions from solar neutrinos become dominant. Achieving a comprehensive understanding and optimal performance of the detector's optical characteristics is critical for data analysis and simulation efforts. Among these optical parameters, the reflectance of the internal surfaces of the detector stands out as the most crucial factor.

    Xenon scintillates at deep-ultraviolet wavelengths (λ = 175 nm), which are typically absorbed by the atmosphere. Moreover, in the low-temperature environment (-100 °C) in which these detectors operate, the reflectance of PTFE at these wavelengths is not well characterized. This project offers an opportunity to investigate and quantify the impact of temperature on PTFE reflectance. To accomplish this, a xenon scintillation source excited by an alpha source will provide the 175 nm photons, while a total-integrating sphere with internal Teflon-coated surfaces will be used to measure the reflectance. A photodetector will collect the light reflected from the sample, and a cold finger attached to the reverse side of the sample will allow temperature control. The outcome of this project will be integrated into the main optical simulations of dark matter detectors, playing a crucial role in understanding and optimizing their optical performance.

    Leituras sugeridas: D. S. Akerib et al. (LUX-ZEPLIN Collaboration), “Projected WIMP sensitivity of the LUX-ZEPLIN dark matter experiment”, Phys. Rev. D 101, 052002 (2020); M. Schumann, “Direct Detection of WIMP Dark Matter: Concepts and Status”, J. Phys. G 46 (2019) 103003.

    Desenvolvimento de métodos de análise de dados para a confirmação experimental do efeito Migdal

    Linha de Investigação:
    Neutrinos e matéria escura
    --
    40
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: Universidade de Coimbra
    Orientador(es): Elías López Asamar, Francisco Neves
    Contacto: elias.asamar@coimbra.lip.pt, neves@coimbra.lip.pt
    início: --

    Área: Física de partículas/Física nuclear
    Keywords: Direct detection of dark matter, Migdal effect, machine learning

    Resumo: The presence of dark matter (DM) in the universe is considered one of the clearest indications of the existence of new elementary particles not yet discovered. In this context, direct detection experiments search for DM particles that reach the Earth and interact with a target of ordinary matter. One of the most promising directions is based on the predicted Migdal effect, namely the expected emission of an atomic electron when the atomic nucleus is perturbed. Given the importance of this predicted effect, a team of collaborating institutes that includes LIP-Coimbra is developing an experiment to confirm its existence. This experiment will use fast neutrons to induce the Migdal effect in the atoms of a gaseous target, and aims to detect the two ionization tracks caused by the recoiling nucleus and by the ejected Migdal electron, respectively. The purpose of this project is to contribute to the data analysis of this experiment. The data provided by the detector will consist of 2D images of the particle tracks, plus the ionization charge collected at the anode, which contains further position information. This requires constructing algorithms to achieve the following goals: (i) identify separate tracks in the 2D images, combining information from the collected ionization charge; (ii) discriminate between nuclear recoil tracks and electron tracks; (iii) reconstruct the track length and, ultimately, the particle energy. These algorithms will be based on machine learning methods. One option currently being considered is the DBSCAN clustering method. In addition, this project foresees studying the potential of other machine learning approaches.
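    As a toy illustration of the clustering step (a minimal sketch with invented hit positions and scikit-learn's DBSCAN; the real data format and clustering parameters would differ):

        import numpy as np
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(3)

        # Toy 2D hit positions (mm): a short, dense nuclear-recoil track near the origin,
        # a longer and sparser electron track, plus a few isolated noise hits.
        nuclear_recoil = rng.normal([0.0, 0.0], [0.3, 0.3], size=(40, 2))
        electron_track = np.column_stack([np.linspace(2.0, 12.0, 60),
                                          rng.normal(0.0, 0.4, 60)])
        noise = rng.uniform(-5.0, 15.0, size=(10, 2))
        hits = np.vstack([nuclear_recoil, electron_track, noise])

        labels = DBSCAN(eps=0.8, min_samples=5).fit_predict(hits)
        for label in sorted(set(labels)):
            name = "noise" if label == -1 else f"track {label}"
            print(name, "->", np.sum(labels == label), "hits")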

    Leituras sugeridas: - M. Ibe et al., “Migdal Effect in Dark Matter Direct Detection Experiments”, JHEP 03, 194 (2018) [arXiv:1707.07258]

    Estudo da sensibilidade de um detector de xenon de 3ª geração a 0v2ß em Xe-136

    Linha de Investigação:
    Neutrinos e matéria escura
    --
    40
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: Universidade de Coimbra
    Orientador(es): Alexandre Lindote, Paulo Brás
    Contacto: alex@coimbra.lip.pt, pbras@coimbra.lip.pt
    início: --

    Resumo: Neutrinoless double beta decay (0v2ß) is one of the most interesting topics in modern particle physics. Its observation would be proof of new physics beyond the Standard Model (SM), showing that neutrinos are their own antiparticles — and thus Majorana particles. It would also show violation of lepton number conservation, hinting that leptons play a part in the observed matter/antimatter asymmetry of the Universe. Explaining 0v2ß requires SM extensions which can also explain the extremely small mass of the neutrinos (6 orders of magnitude lighter than the electron). If 0v2ß occurs through the exchange of a virtual light neutrino, its rate can be associated with a weighted average of the 3 neutrino masses (the so-called effective Majorana mass), which can be used to probe the neutrino mass hierarchy (normal or inverted). A 3rd generation xenon detector, with 50-100 tonnes of xenon, will serve as a Global Rare Event Observatory by the end of this decade, with extreme sensitivity to many rare physics phenomena, including 0v2ß in Xe-136. It can improve the current best limits by two orders of magnitude, completely excluding the inverted neutrino mass hierarchy region and starting to probe the normal hierarchy scenario, competing with planned dedicated experiments while delivering a much broader scientific programme. In this project the student will participate in the early stages of the design of this detector, working on its characterisation, the mitigation of backgrounds and the optimisation of its sensitivity to 0v2β. This will include the development of a detailed Monte Carlo simulation of the detector to correctly characterise its response (e.g. energy and position resolution), modelling the major background sources and developing realistic strategies to minimize them, in order to maximize the sensitivity to observe this decay.
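    For orientation, a hedged statement of the usual background-limited figure of merit that drives this kind of optimisation (a generic scaling, not a formula specific to this project):

        T1/2 (sensitivity)  ∝  a · ε · sqrt( M·t / (B·ΔE) )

    where a is the Xe-136 isotopic abundance, ε the signal efficiency, M·t the exposure, B the background index in the region of interest and ΔE the energy resolution; hence the emphasis on background modelling and detector response in the work plan.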

    Leituras sugeridas: S. Dell'Oro et al., “Neutrinoless Double Beta Decay: 2015 Review”, Advances in High Energy Physics 2016, 2162659, https://doi.org/10.1155/2016/2162659; M. Dolinski et al., “Neutrinoless Double-Beta Decay: Status and Prospects”, Ann. Rev. Nucl. Part. Sci. 69 (2019) 219-251, https://doi.org/10.1146/annurev-nucl-101918-023407, arXiv:1902.04097; LZ Collaboration, “Projected sensitivity of the LUX-ZEPLIN experiment to the 0νββ decay of 136Xe”, Physical Review C 102 (2020) 014602, https://doi.org/10.1103/PhysRevC.102.014602, arXiv:1912.04248; “Snowmass2021 LoI: sensitivity of a 3rd generation liquid xenon TPC dark matter experiment to neutrino properties: magnetic moment and 0νββ decay of 136Xe”, https://www.snowmass21.org/docs/files/summaries/NF/SNOWMASS21-NF5_NF0_Matthew_Szydagis-156.pdf

    Exploração de algoritmos de Machine Learning para discriminação entre acontecimentos de 0v2ß e de fundo em detectores de xenon líquido

    Linha de Investigação:
    Neutrinos e matéria escura
    --
    40
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: Universidade de Coimbra
    Orientador(es): Francisco Neves, Alexandre Lindote
    Contacto: neves@coimbra.lip.pt, alex@coimbra.lip.pt
    início: --

    Área: Física de partículas/Física nuclear
    Keywords: Physics beyond the Standard Model, neutrino physics, machine learning, backgrounds, sensitivity studies

    Resumo: Neutrinoless double beta decay (0v2ß) is one of the most interesting topics in modern particle physics. Its observation would be proof of new physics beyond the Standard Model (SM), showing that neutrinos are their own antiparticles — and thus Majorana particles. It would also show violation of lepton number conservation, hinting that leptons play a part in the observed matter/antimatter asymmetry of the Universe. Explaining 0v2ß requires SM extensions which can also explain the extremely small mass of the neutrinos (6 orders of magnitude lighter than the electron). If 0v2ß occurs through the exchange of a virtual light neutrino, its rate can be associated with a weighted average of the 3 neutrino masses (the so-called effective Majorana mass), which can be used to probe the neutrino mass hierarchy (normal or inverted). Two-phase xenon time projection chambers (TPCs) such as LZ (and larger future-generation detectors) are world leaders in the direct search for dark matter, but can also search for 0vßß in Xe-136, competing with dedicated experiments. The absence of neutrinos in this decay implies that the electrons carry the full energy of the decay, resulting in a mono-energetic peak -- at Q=2458 keV in the case of Xe-136. The main backgrounds for this signal come from interactions of high-energy γ-rays from the decay chains of U-238 and Th-232 in detector materials and even in the laboratory walls. But 0vßß decays have a unique topology, which can be exploited to distinguish them from the background interactions: the two electrons, sharing the decay energy Q, produce two “blobs” at the ends of their tracks, where the ionisation density is highest. On the other hand, single high-energy electrons near Q (generated by the photoelectric effect of γ-rays) produce a single “blob” at the end of their ~3 mm range. Discriminating these two topologies, even if not fully efficiently, can enable a significant gain in sensitivity by removing most of the backgrounds. This is extremely challenging given the very short range of electrons in liquid xenon, but machine learning algorithms are expected to be able to explore the subtleties of the different signals and provide some degree of discrimination. In this project the student will study the discrimination efficiency of various machine learning algorithms in separating simulated 0vßß decay and background signals in xenon TPC detectors, both in the temporal (z) and spatial (xy) coordinates. The potential gains of using high-granularity silicon photomultipliers (SiPMs) as light sensors in future detectors, instead of the traditional photomultiplier tubes (PMTs), will also be studied.

    Leituras sugeridas: S. Dell'Oro et al., “Neutrinoless Double Beta Decay: 2015 Review”, Advances in High Energy Physics 2016, 2162659, https://doi.org/10.1155/2016/2162659; M. Dolinski et al., “Neutrinoless Double-Beta Decay: Status and Prospects”, Ann. Rev. Nucl. Part. Sci. 69 (2019) 219-251, https://doi.org/10.1146/annurev-nucl-101918-023407, arXiv:1902.04097; LZ Collaboration, “Projected sensitivity of the LUX-ZEPLIN experiment to the 0νββ decay of 136Xe”, Physical Review C 102 (2020) 014602, https://doi.org/10.1103/PhysRevC.102.014602, arXiv:1912.04248; “Snowmass2021 LoI: sensitivity of a 3rd generation liquid xenon TPC dark matter experiment to neutrino properties: magnetic moment and 0νββ decay of 136Xe”, https://www.snowmass21.org/docs/files/summaries/NF/SNOWMASS21-NF5_NF0_Matthew_Szydagis-156.pdf

    Life prospection on Mars - Simulation of the radiation environment in the Mars subsoil

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: --
    Orientador(es): Patrícia Gonçalves, Luísa Arruda
    Contacto: patricia@lip.pt, luisa@lip.pt
    início: --

    Objectivos: The student will model the Linear Energy Transfer (LET) spectrum in water of the ionizing field created by incident cosmic rays (GCR) and Solar Energetic Particles (SEP) at different depths of the Mars subsoil, down to 3 m, using the detailed Martian Energetic Radiation Environment Model (dMEREM), a Geant4 simulation developed at LIP for ESA. In addition, the student will simulate the early radiation-induced damage on a simplified model of the DNA molecule due to the most significant components of GCR. The student shall: learn the basics of radiobiology and related concepts (relative biological effectiveness (RBE), quality factors and radiation weighting factors); learn dosimetric and microdosimetric concepts and the relation between them; implement Monte Carlo simulations with Geant4, including the recent Geant4-DNA extension, for simulating particle track structure in simple models of the DNA molecule; compute microdosimetric quantities from simulated track lengths and deposited-energy distributions in a given volume; simulate radiation-induced damage to DNA using the Geant4-DNA extension; and establish the relation between the yields of single- and double-strand breaks in the DNA (from energy-deposition clustering criteria) and microdosimetric quantities.
    Requisitos: Entrevista/Interview
    Localização: LIP-Lisboa
    Observações: The characterization of the Martian radiation environment is essential to understand whether the planet can sustain life and, ultimately, whether its human exploration is feasible. The major components of the radiation environment in Mars orbit are Galactic Cosmic Rays (GCRs) and Solar Energetic Particle (SEP) events. In this work we aim to model the LET spectrum of the ionizing field created by incident cosmic rays and SEP events at different depths underground. Different scenarios of soil composition, corresponding to different locations on Mars, and epoch-dependent input spectra will be defined to assess the subsurface radiation environment at regular depths. This work will focus on the astrobiological implications of the radiation field at regular depths down to 2 m, which is the ExoMars drill depth, or as far as 3 m in total. The ExoMars mission will be launched in 2022 and is expected to reach, in June 2023, an ancient location interpreted to have strong potential for past habitability and for preserving physical and chemical biosignatures. The mission will deliver a lander with instruments for atmospheric and geophysical investigations and a rover tasked with searching for signs of extinct life. The ExoMars rover will be equipped with a drill to collect material from outcrops and at depths down to 3 m, to access chemical biosignatures. The results will be analysed and interpreted in the context of the survivability of both microbial cells and molecular biosignatures. In particular, the use of the Geant4 extension for modelling biological damage induced by ionising radiation at the DNA scale, Geant4-DNA [Inc10], will be investigated to provide information on the biological damage resulting from the predicted radiation field, and to establish the relation between the yields of single- and double-strand breaks in the DNA (from energy-deposition clustering criteria) and microdosimetric quantities. On the other hand, the Perseverance rover (https://mars.nasa.gov/mars2020) arrived on Mars on 18 February 2021 to seek signs of past habitable conditions and of past microbial life in the Martian subsoil.
It is equipped with a full suite of state-of-the-art scientific instruments for acquiring information about Martian geology, atmosphere, environmental conditions and potential biosignatures. It carries SHERLOC, an arm-mounted deep-UV resonance Raman and fluorescence spectrometer capable of fine-scale detection of minerals, organic molecules and potential biosignatures, and SuperCam, a remote-sensing instrument that uses remote optical measurements and laser spectroscopy to detect the presence of organic compounds in rocks and regolith from a distance. Given the appropriate scenarios, the simulation can be used to assess the effects of the Martian radiation field over time, also in terms of the conditions for past microbial life in the Martian subsoil, and the expected effect of the Mars radiation field on the compounds and biosignatures that could be detected by the Perseverance instruments. Additionally, the return of samples from Mars to Earth, with the Mars Sample Return mission proposed by ESA and NASA, will provide an unprecedented opportunity for detailed characterization of the composition and distribution of organic compounds and other potential biosignatures in terrestrial laboratories, using state-of-the-art instrumentation with multiple complementary methodologies. Although the Mars Sample Return mission is outside the timeline of this thesis, the work developed here can in the future be a valuable tool to support the analysis of these samples from the radiochemical and radiobiological perspective.
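    As a minimal sketch of the microdosimetric bookkeeping mentioned above (illustrative energy deposits only; the real inputs would be Geant4-DNA energy depositions in the target volume), the lineal energy is the energy imparted divided by the mean chord length of the site, which for a spherical site of diameter d is 2d/3:

        import numpy as np

        def lineal_energy_kev_per_um(energy_deposits_kev, site_diameter_um):
            """Lineal energy y = eps / mean chord length, with l_bar = 2*d/3 for a sphere."""
            mean_chord = 2.0 * site_diameter_um / 3.0
            return np.asarray(energy_deposits_kev) / mean_chord

        # Toy energy deposits (keV) in a 1 um water sphere, standing in for simulation output.
        deposits = np.array([0.5, 1.2, 3.4, 0.8, 2.1])
        y = lineal_energy_kev_per_um(deposits, site_diameter_um=1.0)
        print("Frequency-mean lineal energy:", y.mean(), "keV/um")
        print("Dose-mean lineal energy:", np.sum(y**2) / np.sum(y), "keV/um")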

    [Inc10] S. Incerti, G. Baldacchino, M. Bernal, R. Capra, C. Champion, Z. Francis, S. Guatelli, P. Guèye, A. Mantero, B. Mascialino, P. Moretto, P. Nieminen, A. Rosenfeld, C. Villagrasa and C. Zacharatou, “The Geant4-DNA project”, Int. J. Model. Simul. Sci. Comput. 1 (2010) 157–178

    Orientadores: Patrícia Gonçalves patricia@lip.pt Luísa Arruda luisa@lip.pt

    This work will be performed with the LIP "Space Radiation Environment and Effects Group", in the context of its activities with the European Space Agency (ESA)

    Machine Learning Techniques applied to Particle Energy Spectra Reconstruction of two ESA radiation monitors: MFS and BERM

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: Instituto Superior Técnico, Universidade de Lisboa
    Orientador(es): Patrícia Gonçalves, Marco Pinto, Luisa Arruda
    Contacto: patricia@lip.pt
    início: --

    Objectives: Develop a new unfolding method, based on neural networks, to obtain the particle fluxes measured by the European Space Agency (ESA) Multi-Functional Spectrometer (MFS) in GEO orbit and by the BepiColombo Radiation Monitor (BERM) on its way to Mercury.

    The student shall:

    • Learn about radiation environment and effects
    • Learn about radiation monitors and radiation effect monitors for space
    • Follow the whole process of real in-flight data analysis, from the spacecraft output data to the mapping of events.
    • Correlate in-flight data with ground based test data and simulation results.
    • Use the GEANT4 simulation to obtain reliable response functions of the instruments.
    • Develop an unfolding method using machine learning techniques to obtain the in-flight measured fluxes of protons, electrons, helium nuclei and heavier ions from the MFS and BERM count rates (a minimal sketch of the idea is given after this list)
    • Extensively apply the reconstruction method to space flight data.
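    As a toy illustration of the unfolding idea (a minimal sketch with an invented response matrix and scikit-learn; the real response functions would come from the Geant4 simulations of MFS and BERM), a regressor can be trained on simulated (spectrum, count-rate) pairs and then applied to measured count rates:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(4)

        # Toy response matrix R (channels x energy bins), standing in for the
        # Geant4-derived instrument response.
        n_channels, n_bins = 8, 5
        R = rng.uniform(0.0, 1.0, size=(n_channels, n_bins))

        # Simulated training set: random input spectra and their expected count rates.
        spectra = rng.uniform(0.0, 10.0, size=(5000, n_bins))
        counts = spectra @ R.T
        counts += rng.normal(0.0, 0.01 * counts.std(), counts.shape)  # measurement noise

        model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
        model.fit(counts, spectra)  # learn the inverse mapping: count rates -> spectrum

        true_spectrum = rng.uniform(0.0, 10.0, size=(1, n_bins))
        measured_counts = true_spectrum @ R.T
        print("True:    ", np.round(true_spectrum, 2))
        print("Unfolded:", np.round(model.predict(measured_counts), 2))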

    Observation: This thesis will cover the data analysis of two ESA radiation monitors. The first is MFS, on board AlphaSAT, the largest ESA telecom satellite, in geostationary orbit (GEO) since July 2013. MFS was built by EFACEC S.A. and is a particle spectrometer capable of measuring electron, proton and ion spectra. LIP is responsible, together with EFACEC, for its data analysis.

    On the other hand, there is BERM, on board BepiColombo, Europe's first mission to Mercury, launched on October 20th, 2018. The journey to Mercury will take about seven years and the spacecraft will enter orbit around Mercury by the end of 2025. BepiColombo consists of two spacecraft: the Mercury Planetary Orbiter (MPO), provided by ESA, and the Mercury Magnetospheric Orbiter (MMO), provided by JAXA. The MPO carries the BepiColombo Radiation Monitor (BERM), designed to measure electrons, protons and ions near the spacecraft. BERM was built by EFACEC and IDEAS. LIP joined the effort to validate the first data and is co-responsible, with PSI, for in-flight data validation, cross-calibration of BERM with the SIXS instrument, and the development of high-level data products.

    BERM was designed to study the high energy particle environment in the inner Solar System.

    Machine Learning techniques for data analysis of the Heavy Ion Counter instrument aboard the Galileo Spacecraft

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: --
    Orientador(es): Patrícia Gonçalves, Marco Pinto, Elias Roussos (Max Planck Institute for Solar System Physics)
    Contacto: patricia@lip.pt, mpinto@lip.pt
    início: --

    Objectives: In this work, the student will develop new algorithms to analyse data from the Heavy Ion Counter (HIC) instrument aboard the Galileo spacecraft.

    The work includes:

    • Obtaining the HIC detector response functions using Geant4
    • Using k-Nearest Neighbor and/or other machine learning techniques for particle identification and flux reconstruction (see the sketch below)
    • Analysis of the flight data regarding Galactic Cosmic Ray propagation in the Solar System and/or the Jovian system
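    As a flavour of the particle-identification step, the toy sketch below trains a k-Nearest-Neighbour classifier on two invented energy-deposit features; in the thesis the features and labels would come from the Geant4-based HIC response simulations and calibration data.

```python
# Toy k-Nearest-Neighbour particle identification (synthetic features only).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

def toy_species(n, de1_mean, de2_mean, label):
    de1 = rng.normal(de1_mean, 0.1 * de1_mean, n)  # deposit in first layer (a.u.)
    de2 = rng.normal(de2_mean, 0.1 * de2_mean, n)  # deposit in second layer (a.u.)
    return np.column_stack([de1, de2]), np.full(n, label)

x_o, y_o = toy_species(1000, 30.0, 45.0, "oxygen")   # invented scales
x_s, y_s = toy_species(1000, 70.0, 95.0, "sulfur")
X = np.vstack([x_o, x_s])
y = np.concatenate([y_o, y_s])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
knn = KNeighborsClassifier(n_neighbors=15).fit(X_train, y_train)
print("toy PID accuracy:", knn.score(X_test, y_test))
```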

    Requirements: CV and Interview

    Location: LIP-Lisboa

    Observations:

    The Galileo spacecraft was the first dedicated mission to the Jovian system. Launched in November 1989, it arrived at Jupiter in December 1995 and remained active until September 2003, improving our knowledge of the largest planet in the Solar System, including its harsh radiation environment. The mission carried the Energetic Particle Detector (EPD) and the Heavy Ion Counter (HIC). While the former led to many publications, the latter has yet to be fully explored. The HIC is optimized to measure the energy spectra and charge composition of oxygen, sulfur, and sodium in the Jovian magnetosphere from ~5 MeV/nucleon to ~200 MeV/nucleon. HIC also allowed measuring Galactic Cosmic Rays during the cruise phase and in Jovian orbit. While HIC response functions are available, these are based on simplified detector models which are not publicly archived. Furthermore, current state-of-the-art particle transport toolkits such as Geant4 make it possible to increase the precision of the response functions significantly. Combining these with advanced analysis techniques, such as k-Nearest Neighbor machine learning for particle identification, will allow studying the Galactic Cosmic Ray gradient measured during the cruise phase, the magnetic attenuation of GCR fluxes in the Jovian system, and other aspects that are critical for future space missions such as the European Space Agency (ESA) JUpiter ICy moons Explorer (JUICE) mission, as well as for astrobiology studies of the icy moons of Jupiter.

    Supervisors: Patrícia Gonçalves: patricia@lip.pt; Marco Pinto: mpinto@lip.pt; Elias Roussos (Max Planck Institute for Solar System Physics)

    This work will be performed with the LIP "Space Radiation Environment and Effects Group", in the context of its activities with the European Space Agency (ESA)

    Potential to study neutrino portals to dark matter in the context of direct detection

    Linha de Investigação:
    Neutrinos e matéria escura
    --
    40
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: Universidade de Coimbra
    Orientador(es): Elías López Asamar
    Contacto: elias.asamar@coimbra.lip.pt
    início: --

    Area: Particle physics / Nuclear physics
    Keywords: Dark matter, neutrino portal, direct detection
    Orientador(es): Elías López Asamar (elias.asamar@coimbra.lip.pt)

    Abstract: The presence of dark matter (DM) in the universe is considered one of the clearest indications of the existence of new elementary particles not yet discovered. In this context, direct detection experiments aim to search for DM particles that reach the Earth and interact with a target of ordinary matter. Direct detection experiments are designed to measure recoils of atomic nuclei (nuclear recoils, NR) or atomic electrons (electron recoils, ER) that would be caused by the interaction of DM particles in the target of ordinary matter. For this reason, these experiments have traditionally provided constraints on the coupling between DM particles and nucleons, or electrons in some cases. However, direct detection experiments have not yet addressed scenarios where DM particles couple only to neutrinos (neutrino portals). In this case, DM particles can still produce NRs through neutrino loops. The purpose of this project is to assess, for the first time, the potential of direct detection experiments to study neutrino portals to DM using this approach. First, it will be necessary to obtain the equation relating the DM-neutrino coupling to the observed NR rate, at first order, by evaluating the corresponding neutrino-loop diagram; higher-order corrections will then be considered. Second, this result will be used to conduct one of the following studies: (i) calculate the sensitivity of direct detection experiments to DM particles assuming neutrino portals; or (ii) use existing data to obtain constraints on neutrino portals.
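    For orientation, the standard direct-detection starting point (not specific to this project) is the differential nuclear-recoil rate below; in this work, the neutrino-loop-induced DM-nucleus cross section would enter through dσ/dE_R:

```latex
% Generic direct-detection recoil rate for a DM particle of mass m_\chi
% scattering off a nucleus of mass m_N, with local DM density \rho_\chi and
% velocity distribution f(\vec{v}); \mu_{\chi N} is the DM-nucleus reduced mass.
\begin{equation}
\frac{dR}{dE_R} = \frac{\rho_\chi}{m_\chi\, m_N}
\int_{v>v_{\min}} v\, f(\vec v)\, \frac{d\sigma}{dE_R}\, \mathrm{d}^3 v,
\qquad
v_{\min} = \sqrt{\frac{m_N E_R}{2\,\mu_{\chi N}^{2}}} .
\end{equation}
```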

    Preparation of the Proto-Flight Model of the Radiation Hard Electron Monitor for the ESA JUICE mission

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: --
    Orientador(es): Patrícia Gonçalves, Marco Pinto
    Contacto: patricia@lip.pt, mpinto@lip.pt
    início: --

    Objectives: The student will participate in the calibration and in the development of algorithms in anticipation of RADEM's launch in 2022 aboard the ESA JUICE mission. The main tasks include:

    • Analysis of beam calibration data of the RADEM detectors
    • Detection channel optimization and computation of the response functions of the detectors using the Geant4 toolkit
    • Development of flux reconstruction techniques (optional; a toy example is sketched at the end of this topic)

    Requirements: CV and Interview

    Location: LIP-Lisboa

    Observations: The Jovian system is known to be outstandingly complex, with an extremely hazardous and highly dynamic radiation environment. Its rigorous and accurate exploration, as well as its profound understanding, is therefore enormously valuable for answering questions on planet formation and the emergence of life. One of the biggest challenges for the ESA JUICE mission is to measure and handle the compound, intense and highly penetrating radiation environment of Jupiter and its active moons. Based on previous data, it has long been understood that the Jupiter radiation field plays a decisive role in radiation damage scenarios for the whole spacecraft and all its payloads. Due to its extreme features, such as very high fluxes and a wide range of energies, it also drives the detection principles for science instruments, in particular for radiation monitors. In this context, comprehensive, reliable and accurate monitoring of the radiation on board the JUICE mission is a major challenge and a high-priority task: it is crucial for the safe operation and continuity of the mission, as well as for scientific data analysis support.

    RADEM is the Radiation Hard Electron Monitor for JUICE (JUpiter Icy moons Explorer), the next European Space Agency Large mission. It is currently under development by an international consortium led by EFACEC (a Portuguese industrial partner), in which LIP, a group from the Paul Scherrer Institute (PSI) in Switzerland, and IDEAS, a company from Norway, take part under a contract with the European Space Agency. RADEM consists of four detector heads: the Electron Detector Head (EDH), which will measure electrons from 0.3 to 40 MeV; the Proton Detector Head (PDH), which will measure protons from 5 to 250 MeV; the Heavy Ion Detector Head, which will separate ions up to 670 MeV; and the Directional Detector Head (DDH), which will measure electron angular distributions. Currently the flight model is under construction and its calibration should be performed at the end of 2020 / beginning of 2021. The JUICE mission is scheduled for launch in 2022. As a housekeeping instrument, RADEM will be on during the interplanetary cruise phase, which will take ~7.5 years. This provides an incredible opportunity to study the interplanetary radiation environment, including Solar Energetic Particles (SEPs), stochastic high fluxes of radiation and Galactic Cosmic Rays (GCRs), as well as to intercalibrate with other radiation monitors measuring the Van Allen radiation belts during the Earth flybys.
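    As a toy illustration of the flux-reconstruction task, the sketch below recovers a spectrum from channel counts with non-negative least squares, given a response matrix; the matrix and spectrum are placeholders, whereas the real RADEM response functions come from the Geant4 and beam-calibration work described above.

```python
# Toy flux reconstruction from channel counts via non-negative least squares.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(2)
n_channels, n_bins = 10, 6
R = rng.uniform(0.0, 1.0, (n_channels, n_bins))    # counts per unit flux (invented)
true_flux = np.array([50.0, 30.0, 18.0, 10.0, 5.0, 2.0])   # arbitrary units

counts = rng.poisson(R @ true_flux).astype(float)  # simulated measured counts
reco_flux, residual = nnls(R, counts)              # enforce physical (>= 0) fluxes

print("true flux:", true_flux)
print("reco flux:", np.round(reco_flux, 1), " residual:", round(residual, 2))
```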

    Suggested reading:
    [1] M. Pinto et al., "Development of a Directionality Detector for RADEM, the Radiation Hard Electron Monitor Aboard the JUICE Mission", IEEE Transactions on Nuclear Science, vol. 66, no. 7, pp. 1770-1777, July 2019. DOI: 10.1109/TNS.2019.2900398
    [2] M. Pinto et al., "Beam test results of the RADEM Engineering Model", Nucl. Instrum. Meth. A, vol. 958, 2020, 162795. DOI: 10.1016/j.nima.2019.162795

    This work will be performed with the LIP "Space Radiation Environment and Effects Group", in the context of its activities with the European Space Agency (ESA)

    Radiological risk assessment for Human Space Flight to Mars

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: Instituto Superior Técnico, Universidade de Lisboa
    Orientador(es): Patrícia Gonçalves, Luísa Arruda
    Contacto: patricia@lip.pt, luisa@lip.pt
    início: --

    Objectives 

    The student will use the galactic cosmic ray fluxes expected in transit to Mars for different mission scenarios and epochs, taking into account the occurrence of a possible solar event during the space travel, to estimate the:

    • Dose in astronauts, in order to assess the radiological risk to which they will be exposed;

    • Ambient Dose Equivalent (ADE) and Effective Dose (ED) from dMEREM, a Geant4-based model;

    • Critical organ equivalent dose, also using phantoms implemented with the G4PhantomBuilder class;

    • Radiological risk for male and female astronauts, based on the risk of exposure-induced death (REID) model currently adopted by NASA for stochastic effects.

    In addition, the same radiological quantities will be estimated for the astronauts' stay on the Martian surface. In this case, the fluxes will be extracted from the detailed Martian Energetic Radiation Environment Model (dMEREM) developed at LIP for ESA.

    Observations 

    The characterization of the Martian radiation environment that astronauts will find during a future mission to Mars is essential to understand and plan future crewed missions to the red planet. The major components of the radiation environment in Mars orbit are Galactic Cosmic Rays (GCRs) and Solar Energetic Particle (SEP) events.

    Mars has an almost negligible magnetic field and a thinner atmosphere compared to the Earth's. As a result, the planet's surface is highly exposed to those primary sources, as well as to secondary particles originating both in the atmosphere and in the shallow layers of its soil.

    In this work, the absorbed dose and the corresponding radiation risk to human explorers will be estimated both in transit and on the surface of Mars, due to the exposure to radiation induced by GCRs and, sporadically, by SEP events resulting from transient solar events such as flares and coronal mass ejections. For this, known SEP events and forecast models will be used to characterize the events. The ground-level radiation environment will be simulated with the dMEREM application at possible landing and human exploration sites, while the GCR fluxes and possible SEPs during transit will be extracted from known models for different mission epochs.

    The Ambient Dose Equivalent (ADE) will be calculated for input GCR and SEP particle spectra for different solar cycle conditions. Calculations of the critical organ equivalent dose will be performed using phantoms implemented with the G4PhantomBuilder class. This class allows for detailed Monte Carlo simulations with phantoms based on the MIRD anthropomorphic model, with the possibility of distinguishing between female and male phantoms. These values will be compared with the deterministic-effect limits currently adopted by NASA for the exposure of space crews. A distinct risk analysis for male and female astronauts will be made based on the risk of exposure-induced death (REID) model currently adopted by NASA for stochastic effects.

    The relation between ADE and ED will make it possible to establish a dependence between ambient dose measurements and the astronaut exposure risk. The shielding protection required for future habitats can then be estimated.
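    To make the dose bookkeeping concrete, the toy sketch below converts an LET-binned absorbed dose into a dose equivalent using the ICRP 60 quality factor Q(L); the LET spectrum and doses are invented, and in the thesis they would come from the dMEREM/Geant4 simulations.

```python
# Toy conversion of LET-binned absorbed dose to dose equivalent with ICRP 60 Q(L).
import numpy as np

def icrp60_quality_factor(let):
    """Q(L) from ICRP Publication 60; let is the unrestricted LET in keV/um."""
    let = np.asarray(let, dtype=float)
    return np.where(let < 10.0, 1.0,
           np.where(let <= 100.0, 0.32 * let - 2.2, 300.0 / np.sqrt(let)))

let_bins = np.array([1.0, 5.0, 20.0, 80.0, 200.0])     # keV/um (assumed binning)
absorbed_dose = np.array([2.0, 1.0, 0.3, 0.1, 0.02])   # mGy per bin (invented)

dose_equivalent = np.sum(icrp60_quality_factor(let_bins) * absorbed_dose)  # mSv
print("total absorbed dose:", absorbed_dose.sum(), "mGy")
print("dose equivalent    :", round(dose_equivalent, 2), "mSv")
```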

    [1] Borak, T. B. et al., Quality factors for space radiation: A new approach, Life Sciences in Space Research, Vol. 1, 2014, pp. 96-102, https://doi.org/10.1016/j.lssr.2014.02.005

    [2] https://humanresearchroadmap.nasa.gov/evidence/reports/Cancer.pdf

    Design and study of a tissue-equivalent plastic dosimeter for applications in hadron therapy and space

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: Instituto Superior Técnico
    Orientador(es): Patrícia Gonçalves, Jorge Sampaio
    Contacto: patricia@lip.pt, jsampaio@lip.pt
    início: --

    Introduction: Microdosimetry studies the effects of radiation at the cellular level in order to infer its effects at the scale of organs and tissues. In experimental microdosimetry it is therefore necessary to know the energy distribution at the scale of a cell, at which scale quantities such as the energy deposited per unit length and per unit mass are stochastic. The design of an ideal microdosimeter shall take the following criteria into account: (i) the sensitive volume (SV) shall be well defined, so that the average chord length of that volume can be accurately determined; (ii) the detector walls must have a density similar to that of the SV, in order to avoid secondary radiation contributions that do not exist in actual tissue; (iii) both the SV and the walls must be made of materials with similar, tissue-equivalent compositions; (iv) the dosimeter shall have excellent spatial resolution, in order to allow the paths of μm-scale energy deposition to be followed.

    The most common type of detector used in microdosimetry is the tissue-equivalent proportional counter (TEPC). In these proportional counters, a tissue-equivalent gas is used, at a pressure (density) chosen so that the energy lost in the gas volume matches that lost in a cellular-scale volume of tissue. Although existing TEPCs fulfil many of the above requirements, they have some important limitations: (i) they are too large to model a cell matrix (limited spatial resolution); (ii) they require the supply of a gas, which is expensive; (iii) they are affected by wall effects, due to the difference in density between the SV gas and the tissue-equivalent plastic used in the walls; (iv) they require high operating voltages; (v) due to polarization effects, the stopping power of the gas depends on its density, which makes equivalence with the energy deposited at the cellular scale impossible.

    Since the late 1990s, Si-based detectors have been studied for the construction of microdosimeters that avoid some of the problems of TEPCs: (i) they operate at low voltages; (ii) they do not require a gas supply; and, above all, (iii) they can be implemented with SVs at the μm scale, in matrices that mimic a set of cells. Although there have been several advances in recent years in the design, implementation and testing of Si-based microdosimeters for medical and space applications, there are still important limitations: (i) the SV cannot be as well defined as necessary; (ii) the charge collection efficiency limits the quality of the detector response, due to diffusion in the depletion regions; (iii) the conversion of the energy deposited in Si to tissue equivalent is much more complex, due to the strong energy dependence of the ratio between the stopping powers in Si and in tissue.

    Objectives: The objective of this work is to explore the possibility of using optical fibres and plastic scintillators in microdosimetry, since together they have characteristics that can overcome the limitations of tissue-equivalent proportional counters and Si-based microdosimeters. These characteristics are: (i) low-voltage operation; (ii) no gas supply is required; (iii) the sensitive volume dimensions can be well characterized; (iv) their density (~1 g/cm3) and composition are favourable in terms of wall effects and tissue equivalence; (v) the optical fibres and scintillators can be assembled into fibre bundles or plastic scintillator matrices, allowing for good spatial resolution.
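    For reference, the standard microdosimetric quantities underlying the design criteria above (general definitions, not specific to this project) can be summarised as:

```latex
% Lineal energy y, specific energy z, and the Cauchy mean chord length \bar{l}
% of a convex sensitive volume of volume V, surface S and mass m, for an
% energy deposit \epsilon.
\begin{equation}
y = \frac{\epsilon}{\bar{l}}, \qquad
z = \frac{\epsilon}{m}, \qquad
\bar{l} = \frac{4V}{S}
\;\;\left(= \tfrac{2}{3}d \ \text{for a sphere of diameter } d\right).
\end{equation}
```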

    Requirements: Good programming skills (C++, GEANT4, etc.) are an advantage for this thesis

    Development of a high voltage system for the ATLAS Tilecal calorimeter and performance tests with high energy particles

    Linha de Investigação:
    Experiências LHC e Fenomenologia
    --
    10
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: Universidade de Lisboa
    Orientador(es): Guiomar Evans, Agostinho Gomes
    Contacto: --
    início: --

    This thesis proposal focuses on the development, implementation and integration of a high voltage (HV) system to be used in the Tilecal calorimeter for operation in the future High Luminosity LHC (HL-LHC) environment (upgrade Phase II). Tilecal, the ATLAS hadron calorimeter in the barrel region, is a key sub-detector used for the measurement of jet energies and missing transverse energy, for the identification of muons with low transverse momentum, and for other physics tasks. All the Tilecal electronics is being upgraded for HL-LHC operation.

    The current HV system is based on an HV distributor system located in the detector, which receives a single HV per module as input and regulates the individual voltages to be applied locally to each photomultiplier tube (PMT). This concept has been in use in the Tilecal, and it has the important advantage of not requiring a large number of 100-150 m long cables from the HV regulator to the PMTs. In the upgrade, the HV system will be replaced, since it has two drawbacks: radiation damage, and the impossibility of replacing boards if a problem occurs, because access is difficult and only possible during LHC shutdowns. The new HV system for the upgrade Phase II is a remote one. The HV boards will fit in crates that provide low voltage and that are located outside the ATLAS main cavern. They use standard electronics components, since they are not located in the areas exposed to radiation inside the detector, but this solution has the drawback of requiring a large number of long cables. A new control and monitoring system is being developed.

    The HV regulation and the respective control system designs are being finalized, and the prototypes will be tested in the calorimeter environment in setups similar to that of the ATLAS detector, using beams of high energy pions, electrons and muons at CERN, and likely also in a special module that will operate in the ATLAS detector. These tests will allow assessing the performance of the new Tilecal electronics being developed for the HL-LHC. Linearity, stability and noise of the developed system will be measured in the tests with beams of high energy particles and, whenever possible, in the ATLAS detector during the LHC Run 3 that will start in 2021. A special prototype called the demonstrator, featuring the new electronics for Phase II, will be produced and is expected to be inserted in the ATLAS detector for regular operation during Run 3. This full integration in ATLAS will allow testing the system under normal operating conditions before the Phase II upgrade takes place, and will give useful information about its performance for future operation in Phase II.

    Development of a Silicon Stack Based Charged Particle Spectrometer

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: --
    Orientador(es): --
    Contacto: --
    início: --

    Development of fast parallel trigger algorithms for the ATLAS experiment at CERN

    Linha de Investigação:
    Experiências LHC e Fenomenologia
    --
    10
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: Universidade de Lisboa
    Orientador(es): Patricia Conde Muino
    Contacto: --
    início: --

    The LHC is the highest energy particle accelerator ever built. The gigantic ATLAS experiment records proton and ion collisions produced by the LHC to study the most fundamental matter particles and the forces between them. At the nominal rate of the LHC, the proton bunches cross 40 million times per second, producing up to 40 collisions per bunch crossing. This event rate is processed in real time by the ATLAS trigger and data acquisition system, which analyses the results of the collisions, as registered by the 10^8 electronic channels of the detector, to select a maximum rate of about 1000 events/second for further offline storage and analysis.

    In 2025-26, the LHC will be upgraded to increase the collision rate by up to a factor of 7, allowing a huge amount of data to be acquired and pushing the limits of our understanding of Nature. The collision rate and data volume after the LHC upgrade will impose extreme demands on the ATLAS trigger system. The estimated increase in trigger rates and event size leads to larger data volumes and much longer event reconstruction times, which are not matched by the slower expected growth in computing power at fixed cost. This requires a change in paradigm, increasing parallelism in computer architecture, using concurrency and multithreading and/or hardware accelerators, such as GPUs or FPGAs, for handling suitable algorithmic code.

    The objective of this thesis program is to develop novel calorimeter clustering and jet reconstruction algorithms using accelerators such as FPGAs or GPUs, for the Phase II Upgrade of the ATLAS trigger system. The student will work in close collaboration with researchers from the LIP computing centre and particle physics groups, as well as with colleagues from CERN and other institutions involved in the ATLAS experiment. This work will be developed in an international environment. The student will contribute to the improvement of the jet trigger system of the ATLAS experiment at CERN. Presentations at ATLAS Collaboration meetings are expected, either by video-conferencing or at CERN.
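    Purely as an illustration of the clustering concept (and not of the actual ATLAS topological clustering algorithm or of its GPU/FPGA implementation, which are the subject of the thesis), the toy sketch below groups calorimeter-like cells above a noise threshold into connected clusters on a 2D grid.

```python
# Toy calorimeter cell clustering: connected cells above a noise threshold.
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
noise_sigma = 50.0                                  # MeV per cell (assumed)
cells = rng.normal(0.0, noise_sigma, (64, 64))      # toy calorimeter grid
cells[20:23, 30:33] += 2000.0                       # inject a fake energy deposit
cells[45:47, 10:12] += 800.0                        # and a smaller one

seed_mask = cells > 4.0 * noise_sigma               # keep cells above 4 sigma
labels, n_clusters = ndimage.label(seed_mask)       # connected-component clusters
energies = ndimage.sum(cells, labels, index=np.arange(1, n_clusters + 1))
print(f"{n_clusters} clusters, energies [MeV]:", np.round(energies))
```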

    Heavy Flavour Jets Production in Heavy Ion Collisions at the High Luminosity LHC with the ATLAS Detector

    Linha de Investigação:
    Experiências LHC e Fenomenologia
    --
    10
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: Universidade de Lisboa
    Orientador(es): Helena Santos, Patricia Conde Muino
    Contacto: --
    início: --

    Context: Ultrarelativistic nucleus-nucleus collisions at the Large Hadron Collider (LHC) provide a unique opportunity to recreate the Quark Gluon Plasma (QGP) in the laboratory at the energy frontier. This plasma of quarks and gluons, which is known to behave as a nearly perfect liquid, was the prevailing state of the Universe shortly after the Big Bang. The capabilities of ATLAS, namely its large acceptance and high granularity calorimeters, afford excellent handles for QGP studies. The ATLAS experiment is strongly committed to the LHC heavy-ion (HI) program, and there are great expectations that the Upgrade will bring a deeper understanding of the nature of the QGP.

    A major goal of the Heavy Ion Program of the LHC is the understanding of the effects of the QGP on jets, namely the study of the nature of the energy loss suffered by quarks and gluons while crossing the QGP. The bottom quark, in particular, constitutes an excellent probe. Due to its large virtuality Q, it has a formation time Δt ≈ 1/Q ≈ 0.1 fm/c, much smaller than the formation time of the QGP at the LHC (≈ 10 fm/c). The understanding of the nature of the energy loss (either collisional or gluon-radiative), in turn, is crucial to infer the properties of the QGP. The HI ATLAS/LIP group is contributing with important developments preceding the b-jet physics analysis, namely on b-jet reconstruction, b-tagging and b-jet triggers in HI collisions.

    Objectives: The goal of the proposed thesis is the prospective study of heavy flavour jet production in the HL-LHC phase (expected to start in 2027), benefiting from the order-of-magnitude increase in luminosity foreseen for the Pb+Pb runs. The most important ATLAS upgraded components for the proposed project are the calorimeter front-end electronics and the new tracker detector (ITk). Currently, jets are reconstructed using the transverse energy of calorimeter towers (piled cells) as input signals, after subtracting the QCD underlying event. The new readout electronics of the calorimeters will provide support for more sophisticated detector signal processing. The remaining part of jet reconstruction concerns the identification of the b-jets, i.e. b-tagging. This technique aims at the highest possible efficiency for tagging b-jets, with the largest possible rejection of light jets. In ATLAS, the most used techniques take advantage of the relatively long lifetime of hadrons containing bottom quarks (cτ ≈ 450 μm), as well as of the hard fragmentation and the high mass of these hadrons. These properties lead to tracks in the ITk with large impact parameters (i.e., transverse and longitudinal distances of closest approach of the track to the primary and secondary vertices), contrary to the tracks from light jets. Such a feature allows disentangling heavy flavour jets from light jets, but it requires excellent impact parameter resolution, which is ensured by the ITk. Machine learning techniques using the properties of both the impact parameters and the secondary vertices have proven to significantly increase the b-tagging performance in pp collisions, and the development in Pb+Pb is ongoing. Analysis of data taken in Run 2 and Run 3 of the LHC will provide not only results on heavy flavour jets in HI collisions per se, but will also contribute to the validation of the prospective study at the HL-LHC.

    Requirements: This is an experimental PhD program. The work will be developed at LIP – Laboratorio de Instrumentacao e Fisica Experimental de Particulas. The student should have solid computing skills, namely in C++ programming. Furthermore, she/he will concurrently participate in the technical activities in which the ATLAS/LIP group is involved, namely in the Tile calorimeter and/or in the trigger systems.

    Measurement of top quark rare decay t->sW at ATLAS

    Linha de Investigação:
    Experiências LHC e Fenomenologia
    --
    10
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: Universidade de Coimbra
    Orientador(es): Filipe Veloso, Ricardo Gonçalo
    Contacto: --
    início: --

    The top quark is the heaviest elementary particle known, with a mass close to the electroweak symmetry breaking scale, and it can provide clues about the symmetry breaking and Higgs mechanisms. It is thus an excellent object to test the Standard Model of Particle Physics (SM). There is an important effort to study the top quark properties, such as its mass, production cross-sections, electric charge, spin, decay asymmetries, rare decays, etc. Deviations from the SM predictions of the production and decay properties of the top quark provide model-independent tests for physics beyond the SM. According to the SM, the top quark decays nearly 100% of the time to a W boson and a b quark. The Cabibbo-Kobayashi-Maskawa (CKM) quark mixing matrix is related to the rates of the Flavour Changing Charged Current (FCCC) decay modes. Some of its elements, e.g. Vts, have not yet been directly measured, but are determined from the unitarity conditions of the matrix. A direct measurement of these elements puts strict conditions on the assumptions behind the matrix properties, such as the existence of only three families in the SM. This research program will be developed within the Portuguese ATLAS group. It aims to measure the decay rate of the top quark into a W boson plus an s quark with LHC data collected by the ATLAS detector, using computational tools such as machine learning techniques. This result will then be used to measure the amplitude of the CKM element Vts. In addition, the student will participate in the maintenance and operation of the ATLAS detector, namely in the calibration of the TileCal hadronic calorimeter. Short periods at CERN may also be required in order to collaborate in working meetings and/or shifts.
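    As background, the standard three-generation relations that connect this measurement to the CKM matrix (general textbook relations, not specific to this analysis) read:

```latex
% Ratio of top-quark branching fractions in terms of CKM elements; under
% three-generation unitarity the denominator equals 1, so a direct
% determination of B(t -> W s) probes |V_{ts}| without that assumption.
\begin{equation}
R_q \equiv \frac{\mathcal{B}(t \to W q)}{\sum_{q'=d,s,b} \mathcal{B}(t \to W q')}
      = \frac{|V_{tq}|^{2}}{|V_{td}|^{2} + |V_{ts}|^{2} + |V_{tb}|^{2}},
\qquad
|V_{td}|^{2} + |V_{ts}|^{2} + |V_{tb}|^{2} \stackrel{\text{unitarity}}{=} 1 .
\end{equation}
```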

    Neutrino Physics with SNO+: Background characterisation of scintillator data

    Linha de Investigação:
    Neutrinos e matéria escura
    --
    40
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: --
    Orientador(es): Valentina Lozza, José Maneira, Nuno Barros, Sofia Andringa
    Contacto: vlozza@lip.pt, maneira@lip.pt, nbarros@lip.pt, sofia@lip.pt
    início: --

    Overview of SNO+
    SNO+ is an underground multi-purpose neutrino detector located in Sudbury, Canada. It is the successor of the Sudbury Neutrino Observatory (SNO), which demonstrated flavor change in solar neutrinos, leading to the 2015 Nobel Prize in Physics. The main goal of SNO+ is the search for neutrinoless double-beta decay, using Te-130, in order to determine if neutrinos are Majorana particles and to gain information about their absolute mass. Other topics are reactor neutrino oscillations, solar and geo-neutrino detection, Supernova neutrinos and dark matter searches. The SNO+ detector is located in a mine in Sudbury at a depth of 2000 m. It has about 800 tons of liquid scintillator enclosed in a spherical acrylic vessel, and the light is detected by 9300 photomultiplier tubes (PMTs) at 8.5 m from the centre. The initial SNO+ phase, with ultra-pure water in the inner volume, started in 2017 and took data for about two years. The replacement of the water inside the vessel with liquid scintillator started in July 2019 and was completed as of March 2021. The follow-up operation is the recirculation of the liquid scintillator to fully remove all water traces (drying), add the fluor to increase the light output, and improve its purity. The double-beta decay phase will follow, with the loading of the scintillator with several tons of Tellurium.

    Research work at SNO+ with the LIP group
    The LIP group has several responsibilities within the SNO+ experiment, being involved both in calibration and in physics analyses. At present, the group is involved in the calibration of the detector with optical and gamma sources, and in the analysis of both the completed water phase and the ongoing scintillator phase. We currently focus on backgrounds, antineutrino and solar physics.

    Proposed thesis topic: The background characterisation of the scintillator data is extremely important for all the physics topics of the SNO+ experiment. Constant monitoring of the background rates allows a prompt reaction in case variations, or unexpected events, are detected. In this way the corresponding source can be identified and possibly reduced. Additionally, the intrinsic contaminants in the scintillator are measured and compared to expectations, checking the efficiency of the scintillator purification plant. In this thesis topic it is proposed to analyse the scintillator data of SNO+ to understand the spatial and time distribution of the two most common sources of background in a liquid scintillator experiment: Radon-222 and Po-210. We encourage the students to contact the proponents, in order to draft a work plan according to their preference.
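    As a toy illustration of the rate-monitoring part of this topic, the sketch below fits an exponential decay to a simulated Po-210 candidate rate; the data and binning are invented, and the real analysis would use the SNO+ scintillator-phase event selection.

```python
# Toy fit of a decaying background rate (simulated Po-210-like data).
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(4)
t_days = np.arange(0.0, 300.0, 10.0)                # 10-day bins
true_half_life = 138.4                              # days (Po-210)
true_rate0 = 500.0                                  # counts/day at t=0 (invented)
expected = true_rate0 * np.exp(-np.log(2) * t_days / true_half_life)
observed = rng.poisson(expected * 10.0) / 10.0      # counts/day with Poisson noise

def decay(t, rate0, half_life):
    return rate0 * np.exp(-np.log(2) * t / half_life)

popt, pcov = curve_fit(decay, t_days, observed, p0=[400.0, 100.0])
print(f"fitted half-life: {popt[1]:.1f} d (input {true_half_life} d)")
```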

    Neutrino Physics with SNO+: Searching for antineutrinos in the scintillator data

    Linha de Investigação:
    Neutrinos e matéria escura
    --
    40
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: --
    Orientador(es): Sofia Andringa, Valentina Lozza, José Maneira, Nuno Barros
    Contacto: vlozza@lip.pt, maneira@lip.pt, nbarros@lip.pt, sofia@lip.pt
    início: --

    Overview of SNO+
    SNO+ is an underground multi-purpose neutrino detector located in Sudbury, Canada. It is the successor of the Sudbury Neutrino Observatory (SNO), which demonstrated flavor change in solar neutrinos, leading to the 2015 Nobel Prize in Physics. The main goal of SNO+ is the search for neutrinoless double-beta decay, using Te-130, in order to determine if neutrinos are Majorana particles and to gain information about their absolute mass. Other topics are reactor neutrino oscillations, solar and geo-neutrino detection, Supernova neutrinos and dark matter searches. The SNO+ detector is located in a mine in Sudbury at a depth of 2000 m. It has about 800 tons of liquid scintillator enclosed in a spherical acrylic vessel, and the light is detected by 9300 photomultiplier tubes (PMTs) at 8.5 m from the centre. The initial SNO+ phase, with ultra-pure water in the inner volume, started in 2017 and took data for about two years. The replacement of the water inside the vessel with liquid scintillator started in July 2019 and was completed as of March 2021. The follow-up operation is the recirculation of the liquid scintillator to fully remove all water traces (drying), add the fluor to increase the light output, and improve its purity. The double-beta decay phase will follow, with the loading of the scintillator with several tons of Tellurium.

    Research work at SNO+ with the LIP group
    The LIP group has several responsibilities within the SNO+ experiment, being involved both in calibration and in physics analyses. At present, the group is involved in the calibration of the detector with optical and gamma sources, and in the analysis of both the completed water phase and the ongoing scintillator phase. We currently focus on backgrounds, antineutrino and solar physics.

    Proposed thesis topic: SNO+ is expected to detect hundreds of electron antineutrinos per year, coming from the Earth's crust and mantle and from nuclear reactors, namely those in Canada. The first will give information for modelling the Earth, and the second will allow measuring fundamental neutrino oscillation parameters. These antineutrinos create a clear signal, a coincidence between a positron and a neutron event in the detector, and so they can be identified in the presence of high backgrounds. In this thesis topic we propose to analyse the first scintillator data of the fully filled SNO+ detector to search for antineutrinos and to understand the possible variation of background sources presenting similar coincidences. It should lead to a first measurement of the reactor antineutrino flux in SNO+.
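    The core of the antineutrino selection is a prompt-delayed coincidence; the toy sketch below shows the shape of such a selection on an invented, time-ordered event list (the cut values are illustrative only, not the SNO+ analysis cuts).

```python
# Toy prompt-delayed coincidence search (invented events and cut values).
import numpy as np

# toy event list, time-ordered: (time [s], energy [MeV], x, y, z [m])
events = np.array([
    [10.0000, 3.1, 0.5, 0.2, -1.0],   # prompt-like candidate
    [10.0002, 2.2, 0.6, 0.3, -0.9],   # delayed-like candidate, 200 us later, nearby
    [55.0000, 1.0, 3.0, 0.0,  0.0],   # isolated background event
])

def find_coincidences(ev, dt_max=1e-3, dr_max=1.5, e_delayed=(1.8, 2.6)):
    pairs = []
    for i in range(len(ev)):
        for j in range(i + 1, len(ev)):
            dt = ev[j, 0] - ev[i, 0]
            if dt > dt_max:
                break                                # later events are even further away
            dr = np.linalg.norm(ev[j, 2:5] - ev[i, 2:5])
            if dr < dr_max and e_delayed[0] < ev[j, 1] < e_delayed[1]:
                pairs.append((i, j, dt, dr))
    return pairs

print(find_coincidences(events))                     # -> one prompt-delayed pair
```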

    Next-generation Neutrino Physics: Development of the DUNE laser-based Calibrations

    Linha de Investigação:
    Neutrinos e matéria escura
    --
    40
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: FCUL
    Orientador(es): José Maneira, Nuno Barros
    Contacto: maneira@lip.pt, barros@lip.pt
    início: --
    Overview of DUNE
    DUNE – the Deep Underground Neutrino Experiment – plans to build a set of 4 large detectors based on liquid argon (LAr) at the Sanford lab in the USA, to measure oscillations of high-intensity neutrino and anti-neutrino beams directed from Fermilab, 1300 km away. Its main goal is the discovery of leptonic charge-parity (CP) violation, a major priority in Particle Physics today, contributing to shed light on the matter/anti-matter asymmetry of the Universe. The large detector mass (40 ktons) will also provide an interesting possibility for non-beam observations: namely, the detectors should be able to detect lower energy neutrinos from Supernova explosions or signals of nucleon decay. The expected start date for the Fermilab beam data-taking is 2026. However, two smaller scale DUNE prototypes (ProtoDUNEs) were built at CERN and are being used to further understand the technology and to guide the design of the large scale far detectors. Each of these prototypes implements a different approach to the liquid argon TPC (LArTPC) technology and aims to demonstrate how each technology performs, by using particles from charged particle beams to emulate the final states of neutrino interactions, cosmic rays and, in the future, calibration sources.

    Research work at DUNE with the LIP group
    The LIP group has responsibilities in both the far detectors and the prototypes at CERN, being involved both in instrumentation and in physics analyses. Presently, the group is involved in the implementation of the laser and neutron calibration systems for ProtoDUNE and DUNE, in the trigger system of ProtoDUNE, and in the analysis of existing ProtoDUNE data. DUNE is a very complex detector, using a technology that is not fully characterized, and with a very broad physics program spanning accelerator neutrino oscillations, atmospheric neutrinos, nucleon decay, and sensitivity to Supernovae. This allows ample opportunities for research topics leading to Master's theses, both in technology and instrumentation and in physics analyses.

    Development of the DUNE laser-based calibrations
    For this thesis topic, we propose the development of new methods for detector calibration, i.e., to fully characterize the response of the detector to known signals, so that we can better understand the neutrino events. The discovery of CP violation in neutrino oscillations requires knowing the neutrino event's energy with very good precision, about 2%. We measure energy in DUNE by collecting electrons in wire plane assemblies after they have drifted for several meters in ultra-pure liquid argon in an intense electric field. Together with other institutes – Los Alamos in the USA, CERN – we propose to use intense laser beams to characterize the electric field and the electron transport in the argon, providing an essential ingredient for the simulation and reconstruction of the neutrino events. Before building the final setup, the design of these calibration systems needs to be tested and developed at the ProtoDUNE prototypes at CERN, and this is the activity in which this Master's work plan is integrated. Since the installation of the system at CERN will start in early 2022, the work plan will consist of three phases: implementation of the laser beams in the ProtoDUNE simulations; development of the data analysis methods to measure the electric field and electron transport in the argon with the laser beam simulations; and installation and preparation of the laser setup at CERN, in preparation for the first running.
This work plan will integrate the student in the largest ever Neutrino Physics collaboration, a very international and stimulating scientific environment, with activities in two major world-class Particle Physics laboratories: CERN and Fermilab.

    Operating the ESA Radiation Hard Electron Monitor across the Solar System

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: --
    Orientador(es): --
    Contacto: --
    início: --

    Probing the Electroweak Vacuum with di-Higgs production at the ATLAS experiment at CERN

    Linha de Investigação:
    Experiências LHC e Fenomenologia
    --
    10
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: Universidade de Coimbra
    Orientador(es): Ricardo Gonçalo, Konstantinos Nikolopoulos, Filipe Veloso
    Contacto: --
    início: --

    Since its discovery, the Higgs boson has become a prime tool to search for physics beyond the Standard Model (SM). At the current level of precision, the Higgs boson is compatible with SM expectations. A number of open questions suggest the existence of new physics that could be unveiled as we explore the LHC data. A wealth of experimental results from the ATLAS and CMS experiments probe the region around the minimum of the Higgs potential, or vacuum. But the shape of this potential is not constrained experimentally. This shape is intimately connected to the breaking of the electroweak gauge symmetry, which resulted in the fundamental forces we experience today. To experimentally constrain this shape we must measure the Higgs boson self-coupling, which is accessible at the LHC through the simultaneous production of two Higgs bosons. The selected student will join the Portuguese ATLAS team, working in close collaboration with theorists and integrated into our international collaboration. He or she will be able to contribute to enhancing our current knowledge in this important area, which will become one of the most important measurements of the LHC experiments. The student will also employ the latest theory developments and the most recent advances in reconstruction techniques: from boosted object identification to machine learning. Part of this research will be done at the University of Birmingham, in the United Kingdom, in co-supervision with a colleague from the ATLAS collaboration. The successful student will be able to participate in the operation of the ATLAS experiment during the LHC Run 3, starting in 2021, and travel to CERN will be required.
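    For context (standard SM relations, not a result of this project): expanding the Higgs potential around its minimum shows that di-Higgs production gives access to the trilinear self-coupling λ3, which the SM fixes in terms of the Higgs mass and the vacuum expectation value v:

```latex
% Higgs potential expanded around the electroweak vacuum; lambda_3 controls the
% h h h vertex probed by double-Higgs production.
\begin{equation}
V(h) = \tfrac{1}{2} m_H^{2} h^{2} + \lambda_3\, v\, h^{3} + \tfrac{1}{4}\lambda_4\, h^{4},
\qquad
\lambda_3^{\mathrm{SM}} = \lambda_4^{\mathrm{SM}} = \frac{m_H^{2}}{2 v^{2}} .
\end{equation}
```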

    Signals from Beyond: Search for Anomalies in the LHC Data with the ATLAS Experiment at CERN

    Linha de Investigação:
    Experiências LHC e Fenomenologia
    --
    10
    LIP-Minho
    Pólo: LIP-Minho, Instituição de ensino: Universidade do Minho
    Orientador(es): Rute Pedro, Nuno Castro
    Contacto: --
    início: --

    The Standard Model (SM) of Particle Physics is notably descriptive and predicted new particles well in advance, of which the Higgs boson discovered at CERN’s Large Hadron Collider (LHC) is a remarkable recent case. However, there is paramount evidence for the need for beyond-Standard-Model (BSM) physics, namely to provide dark matter candidates, explain the matter/dark-matter asymmetry, address the hierarchy problem, and others. The LHC has a rich program of searches for New Physics (NP), but clues of new particles or interactions have not yet been found. Typical searches are guided by specific BSM candidates and would benefit from a complementary model-independent strategy, augmenting the scope of searches to signs of NP not even framed by theory. This proposal is to perform a novel generic search for NP within the ATLAS/LHC experiment using anomaly detection (AD) techniques based on generative Deep Learning (DL). The DL model will learn SM physics from simulated data and then look for anomalous, non-SM-like events in the real collision data. Detector-effect anomalies can mislead the NP detection, and a detailed study of this background will be considered to construct a high-fidelity AD. Moreover, the impact of sources of theoretical and experimental uncertainties on the AD performance will be assessed. Benchmark NP signals will be used as tests throughout the AD development. The project will be integrated into the ATLAS Portuguese group, and collaboration with several international groups is foreseen. Synergies with the LIP Competence Centre for Simulation and Big Data and with the LIP Phenomenology group will be explored, namely to investigate approaches for experimental result interpretability and recasting into theory exclusion limits.
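    A minimal sketch of the anomaly-detection idea is given below: an autoencoder-style network (here simply an MLP trained to reconstruct its input) is fitted on toy "SM-like" events, and its reconstruction error is used as an anomaly score. The features, event samples and architecture are invented; the thesis would instead train a generative DL model on simulated SM events and apply it to ATLAS collision data.

```python
# Toy autoencoder-style anomaly detection via reconstruction error.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)
sm = rng.normal(0.0, 1.0, (5000, 10))               # toy "SM" feature vectors
signal = rng.normal(3.0, 1.0, (20, 10))             # toy anomalous events

scaler = StandardScaler().fit(sm)
sm_s, signal_s = scaler.transform(sm), scaler.transform(signal)

autoencoder = MLPRegressor(hidden_layer_sizes=(6, 3, 6),   # bottleneck layer
                           max_iter=500, random_state=0)
autoencoder.fit(sm_s, sm_s)                          # learn to reconstruct SM events

def anomaly_score(x):
    return np.mean((autoencoder.predict(x) - x) ** 2, axis=1)

print("median SM score    :", round(float(np.median(anomaly_score(sm_s))), 3))
print("median signal score:", round(float(np.median(anomaly_score(signal_s))), 3))
```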

    Simulation of Galactic Cosmic Rays (GCR) induced damage on DNA molecules

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: Instituto Superior Técnico
    Orientador(es): Patrícia Gonçalves, Jorge Sampaio
    Contacto: patricia@lip.pt, jsampaio@lip.pt
    início: --
    Objectives: The student will simulate the early radiation-induced damage on a simplified model of the DNA molecule due to the most significant components of GCR. The student shall:
    • Learn the basics of radiobiology and related concepts: relative biological effectiveness (RBE), quality factors, and radiation weighting factors;
    • Learn dosimetric and microdosimetric concepts and the relation between them;
    • Implement Monte Carlo simulations with Geant4, including the recent Geant4-DNA extension, for simulating the particle track structure in simple models of the DNA molecule;
    • Compute microdosimetric quantities from simulated track lengths and deposited energy distributions in a given volume;
    • Simulate radiation-induced damage on DNA using the Geant4-DNA extension;
    • Establish the relation between the yields of single- and double-strand breaks in the DNA (from energy deposition clustering criteria) and microdosimetric quantities.

    Requirements: Interview

    Location: LIP-Lisboa

    Observations: The major source of uncertainty in radiological risk assessment for manned space missions is the insufficient knowledge of the effective biological response (RBE) of organs and tissues in the energy range and for the ions that make up the space radiation environment. This difficulty arises from the complexity of the physical, chemical and biological processes resulting from the interaction of secondary radiation at the cellular and subcellular level, which are still poorly understood. Current RBE values result from a consensus based on a large number of radiobiology experiments (on cells, microorganisms and animals) performed for various radiation qualities (i.e., particles and energies). However, despite the large number of experiments carried out in recent decades, and even considering those planned for current and future particle accelerators, it is not and will not be possible to comprehensively cover the particle spectrum and range of energies in space radiation. In recent years, Monte Carlo applications have been developed for the simulation of radiation-induced damage at the cellular and subcellular level, namely at the level of the DNA molecule. These tools may play a very useful role in the evaluation of the risk of exposure to radiation in space.

    Supervisors: Patrícia Gonçalves, patricia@lip.pt; Jorge Sampaio, jsampaio@lip.pt
    This work will be performed with the LIP "Space Radiation Environment and Effects Group", in the context of its activities with the European Space Agency (ESA)
    Courses: MEFT
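    As a much-simplified stand-in for the energy-deposition clustering criteria mentioned above, the toy sketch below groups invented hit positions with DBSCAN and classifies clusters by multiplicity; the grouping scale and the hit list are purely illustrative, and the real analysis would use Geant4-DNA track structures.

```python
# Toy clustering of energy-deposition positions into SSB-like / DSB-like groups.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(6)
# invented energy-deposition positions along a track, in nanometres
hits = np.vstack([
    rng.normal([5.0, 0.0, 0.0], 1.0, (4, 3)),       # dense group (DSB-like)
    rng.normal([60.0, 2.0, 0.0], 1.0, (1, 3)),      # isolated hit (SSB-like)
    rng.normal([120.0, -1.0, 0.0], 1.0, (2, 3)),
])

clustering = DBSCAN(eps=3.2, min_samples=1).fit(hits)   # ~nm-scale grouping (assumed)
sizes = np.bincount(clustering.labels_)
print("cluster sizes:", sizes)
print("isolated (SSB-like):", int(np.sum(sizes == 1)),
      " multi-hit (DSB-like):", int(np.sum(sizes >= 2)))
```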

    Study of the radiation field of the CTN Cobalt-60 irradiation facility

    Linha de Investigação:
    Ambientes de radiação e aplicações para missões espaciais
    --
    70
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: Instituto Superior Técnico
    Orientador(es): Patrícia Gonçalves
    Contacto: patricia@lip.pt
    início: --
    Objectives: Perform a detailed study of the radiation field of the CTN Co-60 irradiation facility using the Geant4 simulation toolkit (C++), and compare and validate the simulation results with on-site dosimetric measurements. This characterization is of extreme importance for the planning of radiation testing of EEE components, materials and systems using these facilities.
    The student shall:
    - Learn about radiation interactions with matter and about dosimetry
    - Develop a Geant4 simulation model for the facilities
    - Map the radiation field of the Co-60 facility
    - Validate the simulation results with on-site dosimetric data

    Requirements: --
    Location: LIP-Lisboa and CTN (Campus Tecnológico Nuclear)
    Observations: Scope: Radiation Physics, radiation effects, Geant4 simulation, radiation testing of EEE components, materials and systems. This work will be performed with the LIP "Space Radiation Environment and Effects Group" and with CTN, and it will contribute to ongoing contracts with the European Space Agency (ESA).
    Supervisors: Patrícia Gonçalves; patricia@lip.pt
    Courses: MEFT

    The Heavy Weights: Measuring Higgs and Top-Quark Associated Production at the ATLAS LHC Experiment to Probe Beyond the Standard Model

    Linha de Investigação:
    Experiências LHC e Fenomenologia
    --
    10
    LIP-Coimbra
    Pólo: LIP-Coimbra, Instituição de ensino: Universidade de Coimbra
    Orientador(es): Ricardo Gonçalo, Yvonne Peters
    Contacto: --
    início: --

    After an intense search, the associated production of the Higgs boson with a top quark pair was finally observed in 2018. This production channel provides the best way to directly measure the coupling between the Higgs boson and the top quark, the heaviest fundamental particles in the Standard Model (SM). But it also provides a way to look beyond the SM, in particular to search for signs of a non-standard Higgs boson leading to CP violation in the Higgs sector. Such a component is well justified in scenarios like the two-Higgs-doublet model, and finding it would constitute a major discovery. In particular, it could lead to understanding why there is a huge asymmetry between matter and antimatter in the Universe. In this leading-edge research, the selected student will analyse data collected by the ATLAS experiment during the LHC Run 3, starting in 2021. He or she will have the opportunity to participate in the operation of the experiment at CERN. The student will join the Portuguese ATLAS team and will use recent techniques developed in our group to enhance the experimental sensitivity of the analysis. He or she will be co-supervised by a colleague from the University of Manchester, where the student will spend part of the time of the PhD, as foreseen in the grant. Besides the Manchester group, the student will work in a vibrant international team within the ATLAS collaboration.

    Seeds for the Next Frontier Detectors: Lessons from the TileCal/ATLAS Operation and R&D on Emergent Scintillating Materials

    Linha de Investigação:
    Experiências LHC e Fenomenologia
    --
    10
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: Universidade de Lisboa
    Orientador(es): Rute Pedro, Amélia Maio
    Contacto: training@lip.pt
    início: --

    Currently, the field of Particle Physics is planning the next generation of experiments, with options for CERN-based accelerators, namely the FCC circular collider, under consideration in the update of the European Strategy for collider HEP. On the other hand, the main technological challenges in the R&D for the future detectors are already identified and are input for the decision. Calorimeters are indispensable instruments to measure the energy of the collision products. For sampling hadronic calorimeters, choices relying on organic scintillators and wavelength-shifting (WLS) fibres read out by photodetectors are successful due to their low cost and are strong options for the future. Their operation under the expected harsher radiation conditions must meet crucial requirements of high light yield, fast response and radiation hardness. Although recent developments in organic scintillators/WLS indicate a breakthrough in light emission and time response, these emergent materials are lacking in R&D to scrutinise their radiation tolerance. This proposal includes R&D on the new organic scintillators/WLS, with the characterization of the light yield, attenuation length and resistance to ionising radiation. The work will be carried out at the LIP Laboratory of Optics and Scintillating Materials (LoMAC), and collaboration with national and CERN partners is expected. This research also exploits the current operation of the ATLAS Tile hadronic calorimeter to model the radiation damage of scintillators and WLS fibres using calibration data acquired in the real experimental environment. Several factors contribute to the total light output of scintillator+WLS fibre calorimeters, such as fibre length, scintillator plate/tile sizes, dose, dose rate and others. The plan will explore how these factors correlate with the light yield degradation, using regression techniques based on modern machine learning, and build tools to optimise the design of future detectors.
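    To illustrate the regression part of the plan, the toy sketch below fits a gradient-boosting model to an invented light-yield degradation law as a function of dose, dose rate and fibre length; the real study would use TileCal calibration data and laboratory measurements.

```python
# Toy regression of light-yield loss on exposure-related features.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(7)
n = 3000
dose = rng.uniform(0.0, 10.0, n)                    # kGy (assumed range)
dose_rate = rng.uniform(0.01, 1.0, n)               # kGy/h (assumed range)
fibre_length = rng.uniform(0.5, 2.5, n)             # m (assumed range)

# invented degradation law plus noise, only to exercise the regression machinery
light_loss = (0.05 * dose + 0.02 * dose * fibre_length
              + 0.01 * np.log1p(dose / dose_rate) + rng.normal(0.0, 0.01, n))

X = np.column_stack([dose, dose_rate, fibre_length])
X_tr, X_te, y_tr, y_te = train_test_split(X, light_loss, random_state=0)

reg = GradientBoostingRegressor(random_state=0).fit(X_tr, y_tr)
print("R^2 on held-out toy data:", round(reg.score(X_te, y_te), 3))
```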

    Search for nucleon decay in DUNE: analysis of kaon decay modes in ProtoDUNE-SP beam data

    Linha de Investigação:
    Neutrinos e matéria escura
    --
    40
    LIP-Lisboa
    Pólo: LIP-Lisboa, Instituição de ensino: --
    Orientador(es): Nuno Barros, José Maneira
    Contacto: barros@lip.pt, maneira@lip.pt
    início: --
    Overview of the DUNE experiment
    DUNE – the Deep Underground Neutrino Experiment – plans to build a set of 4 large detectors based on liquid argon (LAr) at the Sanford lab in the USA, to measure oscillations of high-intensity neutrino and anti-neutrino beams directed from Fermilab, 1300 km away. Its main goal is the discovery of leptonic charge-parity (CP) violation, a major priority in Particle Physics today, contributing to shed light on the matter/anti-matter asymmetry of the Universe. The large detector mass (40 ktons) will also provide an interesting possibility for non-beam observations: namely, the detectors should be able to detect lower energy neutrinos from Supernova explosions or signals of nucleon decay. The expected start date for the Fermilab beam data-taking is 2026. However, two smaller scale DUNE prototypes (ProtoDUNEs) were built at CERN and are being used to further understand the technology and to guide the design of the large scale far detectors. Each of these prototypes implements a different approach to the liquid argon TPC (LArTPC) technology and aims to demonstrate how each technology performs, by using particles from charged particle beams to emulate the final states of neutrino interactions, cosmic rays and, in the future, calibration sources.

    Activities by the LIP group in DUNE
    The LIP group has responsibilities in both the far detectors and the prototypes at CERN, being involved both in instrumentation and in physics analyses. Presently, the group is involved in the implementation of the laser and neutron calibration systems for ProtoDUNE and DUNE, in the trigger system of ProtoDUNE, and in the analysis of existing ProtoDUNE data. DUNE is a very complex detector, using a technology that is not fully characterized, and with a very broad physics program spanning accelerator neutrino oscillations, atmospheric neutrinos, nucleon decay, and sensitivity to Supernovae. This allows ample opportunities for research topics leading to Master's theses, both in technology and instrumentation and in physics analyses. For more details about this or other possible thesis topics with the neutrino group at LIP, prospective students are encouraged to contact the proponent.

    Study of kaon events in the ProtoDUNE-SP phase I dataset
    This thesis will focus on the analysis of the existing dataset taken with the ProtoDUNE single-phase prototype during the first beam run, which occurred between October 2018 and January 2019. During this period, a beam was used to provide a mixture of electrons, muons, pions, kaons and protons at selected momenta in the range 0.5 to 7 GeV. This allowed accumulating a large sample of physics events that can now be used to evaluate the detector response to different particles and momenta. An electronics trigger was put in place that, using measurements from the beam instrumentation, could identify the beam particle interacting in ProtoDUNE at each event, providing very pure samples of events of each particle type. This work will focus on the analysis of the data sample of kaon interactions. In the context of DUNE, kaons are one of the most interesting signatures of certain nucleon decay modes (p → ℓK and n → ℓK) where DUNE, due to its LArTPC technology, has a higher sensitivity than other experiments. The search for nucleon decay is one of the most direct ways to test baryon number conservation and a unique probe of Grand Unified Theories (GUTs), whose scale physics lies around 10^14-10^16 GeV.

    The goals of this work are to study the event signatures of different kaon interactions and to determine the detector response to these kinds of events. This will imply the analysis of both beam data and simulation, and the implementation of methods that can discriminate the signatures of kaon events from the remaining ProtoDUNE events (mostly cosmic rays). At a later stage, the aim is to use these results towards re-evaluating the sensitivity of the full DUNE experiment to the detection of nucleon decay through the kaon channels. This work plan will integrate the student into the largest ever Neutrino Physics collaboration, a very international and stimulating scientific environment, with activities in two major world-class Particle Physics laboratories: CERN and Fermilab.


