New Physics from Multi-Higgs Models
Domain: Theoretical Particle Physics Supervisor: Joao Silva Co-Supervisor: Jorge Romão Institution: Instituto Superior Técnico Host Institution: Instituto Superior Técnico
Abstract
After CERN announced the discovery of a particle with the Higgs-like properties predicted in 1964, Higgs and Englert received the 2013 Nobel Prize in Physics. This particle corresponds to a spin-zero field, necessary to give mass to the remaining particles in the Standard Model of electroweak interactions. There is no fundamental argument determining the number of scalar particles. Thus, as one probes further into the properties of the particle discovered, one must look for the differences that would arise if there is more than one scalar: the so-called multi-Higgs models. In this project, we wish to explore the consequences for the LHC and ILC of the presence of more than one Higgs field. Conversely, we wish to understand how current and upcoming data affect these models. The project will be co-supervised by João P. Silva and Jorge C. Romão and can have a larger interface with phenomenology or be more theoretical, depending on the interests and abilities of the student. We have a large publication record and impact in this field; the last students working with us on similar topics had two articles in about one year.
Decays of the Higgs boson in two tau leptons, and measurement of Higgs couplings
Domain: Experimental Particle Physics Supervisor: Pedrame Bargassa Co-Supervisor: Joao Varela Institution: Laboratório de Instrumentação e Física Experimental de Partículas Host Institution: CERN
Abstract
The Higgs boson discovered at the LHC could be part of a theory beyond the Standard Model (SM) of particle physics in which several Higgs bosons exist at higher masses. In the absence of a direct observation of heavier Higgs bosons, measurements of the properties of the discovered boson are crucial to determine whether it belongs to the SM, possibly opening the gate to new physics. The plan is to measure the decays of the Higgs boson into pairs of tau leptons through the different production modes of the Higgs, selecting those with the highest signal purity. To this end, multivariate analysis approaches will be used. This will lead to a precise measurement of the couplings of the Higgs boson to SM fermions and bosons, ultimately constraining its properties.
Analytic Methods for Astrophysical Defect Fingerprinting
Domains: Theoretical Particle Physics | Cosmology | Astroparticle Physics Supervisor: Carlos Martins Institution: Universidade do Porto Host Institution: CAUP / IA-Porto
Abstract
Cosmic strings arise naturally in many proposed theories of new physics beyond the standard model unifying the electroweak and strong interactions, as well as in many superstring-inspired inflation models. In the latter case, fundamental superstrings produced in the very early universe may have stretched to macroscopic scales, in which case they are known as cosmic superstrings. If observed, these objects thus provide a unique window into the early universe and possibly string theory. Recent progress in CMB polarization and gravitational wave detection shows how some of these scenarios can be constrained by high-resolution data. However, to fully exploit the potential of ESA facilities such as CORE and LISA, one needs matching progress both in high-resolution HPC numerical simulations of defect networks and in the analytic modelling of the key physical mechanisms underlying their evolution. This thesis will address the latter, using a series of novel mathematical and statistical techniques to develop more accurate analytic models for general defect evolution (building upon the successes of the current canonical VOS model), as well as for their astrophysical fingerprints, able to match the sensitivity of ongoing and future observational searches.
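For reference, the canonical velocity-dependent one-scale (VOS) model mentioned above evolves the network correlation length L and root-mean-squared velocity v through, in one common form (with H the Hubble rate, c-tilde a loop-chopping efficiency and k(v) a momentum parameter),

    \frac{dL}{dt} = H L \left(1 + v^{2}\right) + \frac{\tilde{c}}{2}\, v ,
    \qquad
    \frac{dv}{dt} = \left(1 - v^{2}\right)\left[\frac{k(v)}{L} - 2 H v\right] .

Extensions of this system (additional degrees of freedom, scale-dependent parameters, fingerprint observables) are the kind of analytic modelling the thesis would build upon.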
Astrophysical and Local Tests of the Einstein Equivalence Principle
Domains: General Relativity | Cosmology | Astrophysics Supervisor: Carlos Martins Institution: Universidade do Porto Host Institution: CAUP / IA-Porto
Abstract
The Einstein Equivalence Principle (EEP, which Einstein formulated in 1907) is the cornerstone of General Relativity (only formulated in 1915) but also of a broader class known as metric theories of gravity. Although they are often confused, the two are conceptually distinct, and different experiments optimally constrain one or the other. Recent developments, including quantum interferometric tests and dedicated space missions, promise to revolutionize the field of local tests of the EEP and dramatically improve their current sensitivity. This thesis will explore new synergies between these imminent new local tests of the EEP and ongoing or planned astrophysical and cosmological tests: some of these directly test the EEP, while others only test GR on various scales. We will explore relevant paradigms (including scenarios with and without screening mechanisms), develop a taxonomy for various model classes, and study how they are further constrained by experiments such as MicroSCOPE and ACES, in combination with astrophysical data from ESPRESSO, ALMA and other facilities. The work will also be directly relevant for the science case of several ELT instruments, as well as Euclid and the SKA.
New Maps of the Dark Side: Euclid and beyond
Domains: Cosmology | Astrophysics Supervisor: Carlos Martins Institution: Universidade do Porto Host Institution: CAUP / IA-Porto
Abstract
The growing amount of observational evidence for the recent acceleration of the universe unambiguously demonstrates that canonical theories of cosmology and particle physics are incomplete, if not incorrect, and that new physics is out there, waiting to be discovered. The most fundamental task for the next generation of astrophysical facilities is therefore to search for, identify and ultimately characterise this new physics. The acceleration is seemingly due to a dark component whose low-redshift gravitational behaviour is very similar to that of a cosmological constant. However, currently available data provide very little information about the high-redshift behaviour of this dark sector or its interactions with the rest of the degrees of freedom in the model. It is becoming increasingly clear that tackling the dark energy enigma will entail significantly extending the redshift range over which its behaviour can be accurately mapped. A new generation of ESA and ESO facilities, such as Euclid, the ELT, and the SKA, has dark energy characterization as a key science driver and, in addition to significantly increasing the range and sensitivity of current observational probes, will allow for entirely new tests. The goal of this thesis will be to carry out a systematic exploration of the landscape of physically viable dark energy paradigms and to provide optimal discriminating observational tests. The work will initially focus on Euclid (whose launch is fast approaching) and will gradually broaden to explore synergies and probe combinations with the SKA and relevant ELT-HIRES instruments.
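As one illustration of what "mapping the dark energy behaviour over an extended redshift range" means in practice, a widely used (though by no means unique) phenomenological parameterization of the dark energy equation of state is the CPL form,

    w(z) = w_{0} + w_{a}\,\frac{z}{1+z}, \qquad w(a) = w_{0} + w_{a}\,(1-a),

with a cosmological constant recovered for w_0 = -1 and w_a = 0; part of the thesis work would be to move beyond such parameterizations towards physically motivated model classes and their optimal observational discriminators.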
Fundamental cosmology from precision spectroscopy: from ESPRESSO to the ELT
Domains: Cosmology | Astrophysics Supervisor: Carlos Martins Institution: Universidade do Porto Host Institution: CAUP / IA-Porto
Abstract
ESPRESSO is a next-generation spectrograph, combining the efficiency of a modern Echelle spectrograph with extreme radial-velocity and spectroscopic precision, and including improved stability thanks to a vacuum vessel and wavelength calibration done with a Laser Frequency Comb. ESPRESSO has been installed in the Combined Coudé Laboratory of the VLT and linked to the four Unit Telescopes (UT) through optical Coudé trains, allowing operations either with a single UT or with up to four UTs for about a 1.5 magnitude gain. One of the key science drivers of ESPRESSO is to perform improved tests of the stability of nature’s fundamental couplings, and in particular to confirm or rule out the recent indications of dipole-like variations of the fine-structure constant, alpha. In this thesis the student will be directly involved in the analysis and scientific exploitation of the ESPRESSO fundamental physics GTO data, and in the preparation of any follow-up observations. Apart from its obvious direct – and very significant – impact on cosmology and fundamental physics, the ESPRESSO data will also be important as the first reliable precursor of analogous high-resolution spectrographs for the next generation of Extremely Large Telescopes, and in particular of ELT-HIRES (in whose Phase B we are directly involved). A second goal of the thesis is to use the ESPRESSO data for detailed realistic simulations to assess the cosmology and fundamental physics impact of ELT-HIRES, also including tests beyond the sensitivity of ESPRESSO, such as redshift drift measurements and molecular tests of composition-dependent forces.
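For orientation, the redshift drift test mentioned above can be summarized by the standard (Sandage-Loeb) expressions for the slow change of a source redshift in an expanding universe,

    \dot{z} \equiv \frac{dz}{dt_{0}} = (1+z)\,H_{0} - H(z),
    \qquad
    \Delta v = \frac{c\,\Delta z}{1+z} \simeq c\,H_{0}\,\Delta t \left[1 - \frac{E(z)}{1+z}\right],

where E(z) = H(z)/H_0 and Delta t is the duration of the monitoring campaign. The expected signal is of order centimetres per second per decade, which is what sets the extreme stability requirements on ELT-HIRES-class spectrographs.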
Big data, Machine Learning and object classification in high energy hadronic collisions
Domains: Theoretical Particle Physics | Experimental Particle Physics Supervisor: José Guilherme Milhano Co-Supervisor: Nuno Castro Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Machine Learning (ML) has become pervasive in today's world. Web search engines, image and voice recognition, targeted advertising and personalized music/film recommendations, credit card fraud detection and email spam filters all rely heavily on neural network architectures. Efficient training of an artificial neural network typically requires very large amounts of data: input/output pairs from which the network will learn to predict the output for new inputs. The extremely large amounts of data generated by particle colliders make the use of ML both a necessity and a potentially very fruitful path to follow. ML techniques are used extensively in many areas of high energy particle physics, with applications ranging from low-level tasks, such as the identification of physical objects in collider data (top quarks, W and Z bosons, taus, ...), to high-level physics analyses discriminating between specific and rare signals and known backgrounds. More recently, ML has proved to be a powerful physics discovery tool, allowing the identification of important properties of physical objects (e.g. QCD jets) from 'detector-level' information that had escaped the imagination of theorists. This thesis will have a dual focus: (i) the application of ML to the efficient identification of physical objects in proton-proton, proton-nucleus and nucleus-nucleus collisions at LHC and future collider energies; (ii) the development of systematic methods to learn Physics from the ML, that is, to identify what is learnt by the machine and either match it to existing analytical calculations or carry out those calculations. All work will be carried out using both Monte-Carlo generated samples and Open (publicly available) LHC data. The selected candidate will develop both strong and highly transferable computational skills and solid competence in Quantum Chromodynamics. The thesis will be co-supervised by an experimentalist (Nuno Castro) and a phenomenologist/theorist (Guilherme Milhano), and the selected candidate will divide her/his time between Braga and Lisbon.
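As a minimal, purely illustrative sketch of the supervised-learning workflow described above (the dataset, feature values and network size below are invented for illustration and are not taken from any LHC analysis), a small feed-forward network can be trained to separate "signal" objects from background:

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.neural_network import MLPClassifier
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(0)

    # Toy dataset: each row is one reconstructed object described by a few
    # high-level features (e.g. mass, number of constituents, a substructure
    # variable); labels mark signal (1, e.g. boosted top jets) vs background (0).
    n = 20000
    background = rng.normal(loc=[60.0, 15.0, 0.30], scale=[20.0, 5.0, 0.10], size=(n, 3))
    signal = rng.normal(loc=[170.0, 30.0, 0.55], scale=[25.0, 8.0, 0.12], size=(n, 3))
    X = np.vstack([background, signal])
    y = np.concatenate([np.zeros(n), np.ones(n)])

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

    # Small fully connected network; real analyses use larger architectures,
    # calibrated inputs and systematic-aware training.
    clf = MLPClassifier(hidden_layer_sizes=(64, 64), max_iter=300, random_state=0)
    clf.fit(X_train, y_train)

    scores = clf.predict_proba(X_test)[:, 1]
    print("ROC AUC on toy data:", roc_auc_score(y_test, scores))

The second focus of the thesis, "learning Physics from the ML", would then ask which combinations of inputs such a trained model actually exploits and whether they map onto known or new analytical observables.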
Searching for Dark Matter using top-quarks at the ATLAS experiment at the LHC
Domain: Experimental Particle Physics Supervisor: Nuno Castro Institution: Universidade do Minho Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The origin of Dark Matter (DM) is one of the current open questions in Particle Physics. The search for DM at colliders can provide valuable information, complementary to that obtained from direct and indirect detection experiments. In the current proposal, the proton-proton data collected by the ATLAS experiment during the second operation phase of the Large Hadron Collider (2015-2018) will be used. Monotop signatures, where a boosted top quark is produced in association with a large amount of missing transverse energy, will be explored. In the scope of this thesis, a dedicated analysis sensitive to monotop signatures will be developed, including the definition of the signal and control regions, the detailed study of the sources of systematic uncertainty and the statistical interpretation of the obtained results. Advanced data analysis techniques, such as deep learning, will be used to tag the top quarks, and a close collaboration with the LIP competence center on simulation and big data is foreseen. The phenomenological interpretation will also be done. The selected student will join the ATLAS Collaboration, and technical work relevant for the consolidation of the student's expertise in experimental particle physics will be done. Travel to CERN in this context is expected.
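As a hedged illustration of the "statistical interpretation" step (a toy single-bin counting experiment with invented yields, not the full ATLAS profile-likelihood machinery with systematic uncertainties), one can scan a Poisson likelihood in the signal strength mu to obtain an approximate upper limit:

    import numpy as np
    from scipy.stats import poisson

    # Toy inputs (invented numbers): expected background b, expected signal s
    # for mu = 1, and the observed event count n_obs in the signal region.
    b, s, n_obs = 50.0, 12.0, 55

    def nll(mu):
        """Negative log-likelihood of a single-bin counting experiment."""
        return -poisson.logpmf(n_obs, mu * s + b)

    mu_hat = max(0.0, (n_obs - b) / s)            # best-fit signal strength
    q = lambda mu: 2.0 * (nll(mu) - nll(mu_hat))  # profile-likelihood test statistic

    # Approximate one-sided 95% CL upper limit: smallest mu with q(mu) > 2.71
    # (asymptotic approximation; a real analysis would use the CLs prescription
    # with systematic uncertainties profiled as nuisance parameters).
    mu_scan = np.linspace(mu_hat, 10.0, 2000)
    upper = next(mu for mu in mu_scan if q(mu) > 2.71)
    print(f"mu_hat = {mu_hat:.2f}, approximate 95% CL upper limit on mu = {upper:.2f}")

In the actual analysis the signal and control regions enter a combined likelihood, and the deep-learning top tagger feeds into the definition of those regions.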
Uncovering the mass assembly history of late-type galaxies in different environments with IFS data
Domain: Astrophysics Supervisor: Jean Michel Gomes Co-Supervisor: Catarina Lobo Institution: Universidade do Porto Host Institution: Universidade do Porto
Abstract
Galaxy clusters and groups are gravitationally bound structures that contain hundreds to thousands of galaxies embedded in a dense medium. They are excellent laboratories for studying the impact of the environment on the mass assembly history of their member galaxies. As such, there is abundant literature on how several processes such as ram pressure stripping, strangulation, tidal effects, and merging can lead to significant changes in the secular evolution of galaxies. A combination of these mechanisms is expected to explain some of the morphological transformations of galaxies and the quenching of their star formation activity. Pinning down the dominant mechanism responsible for these evolutionary trends is one of the key questions in extragalactic astronomy, and it is likely linked to the process of infall of galaxies into clusters occurring within the framework of the hierarchical growth of large-scale structure. The cold gas medium of galaxies is expected to be tightly connected with their capacity to keep forming stars and is extremely sensitive, on short timescales, to the mechanisms mentioned above. It should thus provide indications of the relevant processes acting on galaxies located at different clustercentric radii and in the field. Recent studies showing abnormally strong and frequent signatures of cold-gas interstellar-medium interactions in late-type galaxies in merging clusters indicate that this phase of the galactic content carries the signatures of the physical processes occurring in different environments. This project proposes a systematic study of 2D integral field spectroscopy (IFS) from the MaNGA (Mapping Nearby Galaxies at APO) survey combined with HI and molecular gas ancillary data for a large sample of galaxies inhabiting different environments. In order to reconstruct the spatially resolved star-formation and chemical enrichment history of galaxies in distinct environments, the student will make use of the FADO population synthesis code and the Porto3D post-processing IFS analysis pipeline. This project provides an excellent combination of astrophysical theory with observations, and it will lead to valuable expertise in the field of spectral synthesis and the environment of galaxies, i.e. cluster as compared to field galaxies. Several publications will support the future career of the student. Previous knowledge of one of the following computing languages is desirable: ESO-MIDAS script language, Fortran 77/2008+, C, C++, IDL or Python.
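A minimal sketch of the core idea behind spectral population synthesis (not the actual FADO or Porto3D algorithms, which additionally model nebular emission, extinction and kinematics; the template arrays below are invented): the observed spectrum is decomposed as a non-negative combination of simple stellar population (SSP) templates, from which light fractions and hence a star-formation history are inferred.

    import numpy as np
    from scipy.optimize import nnls

    # Toy SSP base: columns are template spectra of different ages/metallicities
    # sampled on a common wavelength grid (invented shapes, for illustration only).
    wavelength = np.linspace(3500.0, 7000.0, 500)
    young = np.exp(-(wavelength - 4000.0) ** 2 / (2 * 800.0 ** 2))
    old = np.exp(-(wavelength - 6000.0) ** 2 / (2 * 1200.0 ** 2))
    base = np.column_stack([young, old])

    # "Observed" spectrum: a known mixture of the templates plus noise.
    true_weights = np.array([0.3, 0.7])
    observed = base @ true_weights + np.random.default_rng(1).normal(0.0, 0.01, wavelength.size)

    # Non-negative least squares gives the light fraction of each population.
    weights, residual = nnls(base, observed)
    print("recovered light fractions:", weights / weights.sum())

In the IFS context this decomposition is repeated spaxel by spaxel, which is what yields spatially resolved star-formation and chemical enrichment histories.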
Spatial distribution of α-elements in galaxies
Domain: Astrophysics Supervisor: Jean Michel Gomes Co-Supervisor: Polychronis Papaderos Institution: Universidade do Porto Host Institution: Universidade do Porto
Abstract
A long-standing puzzle in extragalactic research concerns the anomalous abundances of the so-called α-elements (e.g., C, N, O, Ne, Si, S, Mg, and Na) relative to iron (Fe) in early-type galaxies (ETGs). These elements are generally enhanced relative to Fe by an “enhancement ratio” [E/Fe] that correlates with the stellar velocity dispersion (hence, the total stellar mass) of an ETG. The dominant physical mechanism responsible for this trend is still unknown, yet it is fundamental to the understanding of the chemo-dynamical evolution of ETGs across their entire mass spectrum. Three main scenarios have been proposed for these discrepancies: a) a varying star-formation efficiency in massive ETGs, b) a non-universality of the stellar initial mass function (IMF), in the sense of a “top-heavy” IMF, and c) selective loss of elements due to galactic winds. All these scenarios attempt to reproduce the observed [E/Fe] ratios as essentially the result of chemical enrichment by Type II and Type Ia Supernovae, each acting on different timescales, and with a relative frequency closely linked to the galaxy star formation history. Studies of stellar populations in galaxies have dramatically advanced in the last decade. Instead of using a few hand-picked Lick indices, fluxes and integral colors to constrain the star formation and chemical enrichment history of galaxies, modern spectral synthesis codes and computing facilities now permit detailed modeling of the full optical spectrum of a galaxy in a pixel-by-pixel approach. These modeling tools and the availability of high-quality data sets (e.g., the 2dF, 6dF, SDSS, and GAMA surveys) offer a promising avenue for a better understanding of how galaxies form and evolve through time. However, all spectral synthesis studies carried out over the past decade on the basis of these single-fiber spectroscopic surveys lack the necessary spatial coverage and resolution to study radial trends in galaxies (cf. e.g. Gomes et al. 2016b&c). Only recently has spatially resolved data from Integral Field Spectroscopy (IFS) become available, permitting the study of the radial abundance patterns of α-elements in galaxies with unprecedented detail. An innovative aspect of this Ph.D. research project is the use of both the IFS data for 667 local Hubble-type galaxies from the Calar Alto Legacy Integral Field spectroscopy Area survey (CALIFA - http://califa.caha.es) and the Mapping Nearby Galaxies at APO (MaNGA) survey (last release ~5k galaxies) to determine the 2D α-element distribution in a spatially resolved manner. This observational input will be combined with the derived star-formation histories and structural properties of ETGs from CALIFA with the goal of developing new evolutionary diagnostics for ETGs and shedding light on the origin of the α-element enhancement in these systems.
Top quark physics and search for physics beyond the Standard Model at the Large Hadron Collider
Domain: Experimental Particle Physics Supervisor: Michele Gallinaro Co-Supervisor: Joao Varela Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Top quarks are primarily produced in pairs at a hadron collider. In the framework of the Standard Model (SM), each top quark decays into a W boson and a b quark. Top quark pair events can be characterized by the decays of the two W bosons. Top quarks are abundantly produced in proton-proton collisions at the LHC and constitute the main background to searches for New Physics. A good understanding of top quark events will allow a more sensitive reach into the realm of searches for BSM processes. The work will focus on studying the properties of the top quark dilepton final state and on measuring the tau and heavy-flavour content of top quark events. Studies of final states including third-generation leptons and quarks, such as tau leptons and b-jets, produced in association with top quark pair events may provide hints of New Physics processes and shed light on the anomalies in Lepton Flavor Universality measurements. The link between tau lepton and b-quark content and top quark events, and the possibly large heavy-flavour content in BSM final states, will be used to further characterize the data, with the goal of discriminating between SM and BSM processes. Anomalous heavy-flavour production would be directly “visible” in this study. Deviations from SM predictions would indicate evidence for New Physics.
Search for dark matter at the LHC
Domain: Experimental Particle Physics Supervisor: Michele Gallinaro Co-Supervisor: Joao Varela Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The subject of this thesis is the search for New Physics in events with jets, or with a W or Z boson, produced in association with large missing transverse energy. Dark matter (DM) is one of the most compelling pieces of evidence for physics beyond the standard model (SM). In many theories, pair production of DM particles in hadron collisions proceeds through a boson mediator of either spin-0 or spin-1. DM particles can be produced in pairs in association with jets or with a vector boson V (where V is either a W or a Z boson) and recoil with large missing transverse energy. This results in the ‘MET+X’ final state. The thesis is placed in the context of the Portuguese participation in the CMS experiment at the LHC, and it is linked to the Beyond the Standard Model (BSM) searches in the more general context of the searches for New Physics processes at the LHC. The candidate is expected to work as part of a team of researchers.
Phenomenology of Standard Model extensions with enlarged scalar sectors
Domain: Theoretical Particle Physics Supervisor: Pedro Ferreira Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Faculdade de Ciências - Universidade de Lisboa
Abstract
The Standard Model of particle physics is a very successful theory, but it leaves many questions unanswered, such as the nature of Dark Matter and the matter-antimatter asymmetry observed in the universe. Models with enlarged scalar sectors (in which more than one Higgs boson is predicted) provide answers to some of those questions. One such model is the Two Higgs Doublet Model (2HDM), first proposed in 1973, but which is currently being thoroughly analysed thanks to the existence of experimental data from the LHC, which is already constraining enormously the available parameter space of the theory. Phenomenological studies of the model (that is, predicting values for observables to be tested at the LHC) necessitate a thorough theoretical understanding of the model, and a deepening of procedures which have thus far been applied but which, with the increasing precision emerging from experiments, need to be updated and refined. One such procedure is the relationship between the Higgs potential parameters (in the most used version of the 2HDM, eight of them) and physical observables (such as the five physical masses and the angles alpha and beta which characterize all couplings of scalars to fermions and gauge bosons). Thus far, most phenomenological analyses use tree-level relationships to obtain the potential's parameters from physical quantities, but one-loop precision is beginning to be necessary. The thesis will therefore concentrate on obtaining the one-loop expressions for the scalar masses - and other observables - and from them extracting the quadratic and quartic couplings of the 2HDM scalar potential, confronting these results with those obtained from the tree-level approach. Then, those potential parameters will be confronted with theoretical requirements that the potential needs to obey - namely, that the potential be bounded from below (it must have a stable minimum) and obey unitarity (the total probability of a given process cannot exceed unity). To begin with, a simple version of the 2HDM will be analysed - the Inert Model, which provides dark matter candidates. A re-analysis of all dark matter observable constraints will then be undertaken. The Inert Model is currently very constrained by experimental data, and a possible reason for this is the use of tree-level relations between the potential and physical observables. With what is learned from the Inert case, the student will then progress to the full 2HDM. Further models (such as extensions of the Standard Model with extra singlets, complex or real, or with three doublets) will then also be considered. The necessary calculations will be a mix of paper-and-pencil deductions (to obtain the theoretical expressions from the literature, or re-calculate them) and computer numerical scans to probe the models' parameter space. The use of internationally available and renowned computer codes for particle physics phenomenology (such as the codes ScannerS for analyses of the scalar sector, Madgraph and SUSHI for computation of cross sections, and MICROmegas for dark matter analyses) will be a crucial part of the work of the PhD student. Attendance of international schools for PhD students and presentation of results at international conferences will be encouraged and expected. The project will include an ongoing collaboration with experimental physicists.
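To make the tree-level parameter-to-observable dictionary concrete, in the Inert Model (writing the potential in terms of the doublets H_1, H_2, quartic couplings lambda_1 ... lambda_5, the inert mass parameter mu_2^2 and v ~ 246 GeV) one common convention gives the tree-level scalar masses as

    m_{h}^{2} = 2\lambda_{1} v^{2}, \qquad
    m_{H^{\pm}}^{2} = \mu_{2}^{2} + \tfrac{1}{2}\lambda_{3} v^{2}, \qquad
    m_{H}^{2} = \mu_{2}^{2} + \tfrac{1}{2}\left(\lambda_{3}+\lambda_{4}+\lambda_{5}\right) v^{2}, \qquad
    m_{A}^{2} = \mu_{2}^{2} + \tfrac{1}{2}\left(\lambda_{3}+\lambda_{4}-\lambda_{5}\right) v^{2},

so that inverting these relations yields the quartic couplings from measured masses. The thesis would replace this tree-level dictionary by its one-loop counterpart and re-impose boundedness-from-below and unitarity on the couplings so obtained.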
Stellar and Orbital evolution during the Red-giant phase
Domain: Astrophysics Supervisor: Alexandre Correia Institution: Universidade de Coimbra Host Institution: Universidade de Coimbra
Abstract
In the next 5 billion years, the Sun will start to cool and expand, becoming a red-giant star. During this process, the Sun's radius will increase hundreds of times and will engulf the orbits of the inner planets. However, during this process the orbits of the planets will also expand, and there is a possibility that they are never caught by the solar envelope. Nevertheless, orbital scattering may also give rise to close encounters between the planets, which can cause collisions between them or ejections from the Solar System. During this Ph.D. we want to study the simultaneous solar and orbital evolution of the Solar System during the last stages of the Sun's life. In particular, we wonder what the fate of the inner planets will be and, especially, the final destiny of the Earth. We will also apply our model to the already detected planets around red-giant stars.
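The basic competition at play can be captured by a simple, standard result (quoted here only as a rough guide, not as the full model to be developed): for slow, isotropic mass loss from a star of mass M_* with a planet of mass m_p, the orbit expands adiabatically so that

    a(t)\,\left[M_{\ast}(t) + m_{p}\right] \simeq \text{const}
    \quad\Longrightarrow\quad
    \frac{\dot a}{a} = -\frac{\dot M_{\ast}}{M_{\ast} + m_{p}},

while the stellar radius grows and tidal drag acts in the opposite direction; which effect wins, together with planet-planet scattering, decides whether a given planet is engulfed, ejected or survives.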
Dynamics of the Uranian system
Domain: Astrophysics Supervisor: Alexandre Correia Institution: Universidade de Coimbra Host Institution: Universidade de Coimbra
Abstract
Uranus is the most mysterious planet in the Solar System, in part because it is difficult to observe and has only been visited by one spacecraft. However, the truly intriguing features are related to dynamical aspects that challenge all the established formation mechanisms. In particular: the spin axis of Uranus is tilted by 98 degrees; some of the satellites are highly cratered while others are not; their orbits are very compact and contain no resonant pairs; and the rings do not completely encircle the planet. All these observations are probably related, but so far all attempts to understand them have treated them separately and have never come up with a satisfactory answer. In this PhD thesis we will try to address all these questions as a whole. We will rely on new dynamical approaches developed for the study of exoplanets, together with modern computers that allow us to model more detailed and realistic scenarios.
Quarks in a hot medium: propagators, confinement and chiral symmetry restoration
Domain: Theoretical Particle Physics Supervisor: Orlando Oliveira Co-Supervisor: Paulo Silva Institution: Universidade de Coimbra Host Institution: Universidade de Coimbra
Abstract
Quantum Chromodynamics (QCD) describes the interactions between quarks and gluons. One of the puzzling properties of QCD is that its fundamental particles, i.e. quarks and gluons, are not observed in nature but appear as constituents of mesons and baryons. Two of the main open questions in strong interactions are the understanding of confinement (why there are no free quarks and gluons) and of the mechanism for the generation of mass scales that sets in for quarks and gluons and prevents infrared divergences. The current belief is that confinement and chiral symmetry breaking are interlaced. There are indications that for sufficiently high temperatures quarks and gluons behave essentially as a gas of free, non-interacting particles, i.e. they seem to behave as deconfined particles. The formulation of QCD on a space-time lattice enables a first-principles determination of the quark and gluon propagators. From the propagators one accesses information about confinement/deconfinement and about the generation of mass scales, namely the running quark mass. The knowledge of the mass functions is crucial for the understanding of modern heavy-ion experimental programs and for the history of the Universe. In this project we aim to compute, using lattice QCD simulations, the quark propagator at finite temperature. The main goal is to help understand chiral symmetry breaking and the confinement/deconfinement properties of QCD as a function of the temperature by studying the properties of the various form factors that define the propagator and by looking at its spectral representation. The knowledge of the spectral representation allows one to investigate transport properties that are of paramount importance for the dynamical description of heavy-ion experimental programs. The simulations will be performed using the supercomputer facilities at the University of Coimbra. The candidate will join a team with large experience in lattice QCD simulations.
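For orientation, at zero temperature the (Euclidean, Landau-gauge) quark propagator is usually parameterized by two form factors, the wave-function renormalization Z and the running mass M,

    S^{-1}(p) = i\,\gamma\cdot p\, A(p^{2}) + B(p^{2})\,\mathbb{1},
    \qquad
    S(p) = \frac{Z(p^{2})}{\,i\,\gamma\cdot p + M(p^{2})\,},
    \qquad Z = 1/A, \quad M = B/A .

At finite temperature the heat bath breaks O(4) invariance, so separate temporal and spatial structures (evaluated at the fermionic Matsubara frequencies) appear; it is this richer set of form factors, and the spectral functions behind them, that the project would extract as a function of temperature.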
Black hole spontaneous scalarisation: non-linear dynamics and gravitational-wave signatures
Domains: Astrophysics | General Relativity Supervisor: Carlos Herdeiro Co-Supervisor: Nicolas Sanchis-Gual Institution: Instituto Superior Técnico Host Institution: Instituto Superior Técnico
Abstract
Gravitational Wave (GW) Astronomy allows unprecedented tests of strong gravity systems and, in particular, of black holes (BHs). Vacuum General Relativity (GR) yields the Kerr solution as the only possible BH, but beyond vacuum GR new possibilities emerge. Dynamical viability is then a tight filter for alternative BH models, so that their phenomenological viability may be assessed. This proposal focuses on the non-linear dynamics and GW phenomenology of a dynamical mechanism known to yield modified BHs with respect to the standard (electro-)vacuum GR ones: spontaneous scalarisation. Both models with charged BHs (as toy models) and models with higher curvature corrections will be considered. Fully non-linear numerical relativity techniques will be used, complemented with perturbation theory studies. Formation scenarios, stability analyses and binary systems shall be tackled, with the goal of testing the viability of scalarised BHs and extracting GW templates of relevance for LIGO/Virgo searches.
Exploration of the physics potential of the MARTA Engineering Array
Domain: Experimental Particle Physics Supervisor: Ruben Conceição Co-Supervisor: Raul Sarmento Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas; thesis to be done partly at Instituto Superior Técnico and partly at Universidade do Minho
Abstract
The nature and arrival directions of cosmic rays at the highest energies can only be inferred indirectly through the analysis of the air showers induced by their interaction with the atmosphere. The understanding of the shower development relies on our knowledge of the hadronic interactions that occur at energies well above those reachable at accelerators on Earth. Muons, being long-lived particles, carry important information about these hadronic interactions that rule the shower development. Therefore, their detection at the ground is an essential tool to understand the physics of extensive air showers and particle interactions at extreme energies. However, the measurement of the highest energy extensive air showers, and in particular of the produced muons, poses several challenges, as it has to be performed in an outdoor environment, using detectors covering a vast area. To meet this challenge, the LIP group is leading the MARTA project, which proposes an innovative concept for muon detection in air-shower experiments. MARTA (Muon Array of RPCs for Tagging Air showers) consists basically of robust RPCs (Resistive Plate Chambers) deployed under a Water Cherenkov Tank, which is sensitive to all kinds of charged particles and is also used as an absorber of the shower electrons and gammas. This array will measure the muons on an event-by-event basis and will collect shower events produced mainly at a center-of-mass energy comparable to those currently reached at the Large Hadron Collider, the LHC. Several full-scale MARTA prototypes are already installed and taking data at the Pierre Auger Observatory - currently the biggest cosmic ray observatory in the world - situated in Argentina. A MARTA Engineering Array (EA), consisting of seven MARTA stations, is planned to start being deployed in Auger during 2019. The successful operation of the MARTA EA will be of the utmost importance as a proof of concept and a pathfinder for future experiments. The selected candidate will be involved in the activities of the LIP/Auger group, in particular: participation in the commissioning of the MARTA Engineering Array; validation of the detector concept and performance; development of data analysis tools to reconstruct showers; and extraction of information from the showers measured with the MARTA EA, combining it with LHC data to further constrain hadronic interaction properties.
Reconstruction of the mass assembly history of Active Galactic Nuclei with FADO
Domain: Astrophysics Supervisor: Jean Michel Gomes Co-Supervisor: Polychronis Papaderos Institution: Universidade do Porto Host Institution: Universidade do Porto
Abstract
Fitting Analysis using differential evolution Optimization (FADO) is a conceptually novel, publicly available spectral population synthesis (PS) code (www.spectralsynthesis.org), which employs for the first time genetic optimization and artificial intelligence to identify the star-formation and chemical evolution history (SFH and CEH, respectively) that self-consistently reproduces the main nebular characteristics of star-forming (SF) galaxies. This unique concept allows us to alleviate, and even overcome, degeneracies in spectral synthesis studies, thereby opening new avenues to the investigation of galaxy formation and evolution. However, a large fraction of emission-line galaxies hosts an Active Galactic Nucleus (AGN) powered by accretion of matter onto a central super-massive black hole of several million solar masses. Depending on our viewing angle to the galaxy nucleus and its surrounding obscuring torus, the strong non-stellar radiation from the AGN can provide an important fraction of, or even outshine, the spectral continuum of the underlying host galaxy. Even a low-to-moderate (~20%) contribution of the AGN to the optical continuum emission of a galaxy can strongly bias conclusions drawn from state-of-the-art purely-stellar PS codes, as demonstrated in Cardoso, Gomes & Papaderos (2016, 2017). The work tasks and main objectives of this Ph.D. thesis are to use FADO in order to: 1 - Quantify its accuracy in retrieving the SFH & CEH in the presence of an AGN through benchmark tests. Additionally, compare the results from FADO with those from purely stellar codes (e.g., STARLIGHT, STECKMAP, ULySS, FIREFLY). In this task, the student will make use of synthetic datasets created with the REBETIKO evolutionary synthesis code in the presence of an AGN; 2 - Disentangle the star-forming component from the non-thermal AGN component and estimate the AGN luminosity in various spectral bands; 3 - Test distinct recipes for modeling the spectral energy distribution of the AGN, like the inclusion of a multi-component continuum (e.g., Ferland et al. 2017 - Big Blue Bump and distinct power-law slopes in the X-ray, UV and optical). The student will make use of the FADO AGN module; 4 - Fit galaxy spectra from the SDSS & GAMA surveys as well as Integral Field Spectroscopy data from MUSE, MEGARA and MaNGA to investigate the SFH & CEH of galaxies hosting an AGN. This project provides an excellent combination of astrophysical theory with observations, and it will lead to valuable expertise and several publications that will support the future career of the student. Preferable computing languages are Fortran, IDL and/or Python.
Role of the magnetic field in the formation and evolution of star-forming hub-filament systems
Domain: Astrophysics Supervisor: Doris Arzoumanian Co-Supervisor: Nanda Kumar Institution: Universidade do Porto Host Institution: Centro de Astrofisica da Universidade do Porto
Abstract
Understanding how stars form is one of the most important and wide-ranging research topics in astrophysics. Low-mass stars, like our Sun, may host planets where life could emerge, and the most massive stars govern the physics and chemical enrichment of the interstellar medium of galaxies. Although the evolutionary sequence of solar-type stars is relatively well described today, the physics of the early stages of star formation is still largely unknown, especially that of high-mass stars. Recent observations of the interstellar medium (the gas and dust filling the space between stars) have revealed the impressive organization of the matter into complex networks of filaments. The densest filaments are now identified as the main birthplaces of stars, and the hubs formed by the intersection of several filaments host embedded clusters of stars from low to high masses. Much progress has been accomplished in recent years on the characterization of the density and velocity structures of filaments and hubs. However, the structure of the magnetic field and its role in their formation and evolution is still unexplored. The goal of the proposed PhD project is to unveil the detailed structure of the magnetic field in star-forming hub-filament complexes. The PhD candidate is expected to analyse observations of thermal dust emission in total and polarized intensity to study the density and magnetic field structures of star-forming hub-filament systems. The data will be obtained with state-of-the-art international telescopes. The candidate will be involved in writing the observational proposals, carrying out the observations, and reducing and analysing the data. The observational results will be compared to theoretical models and numerical simulations to infer the role of the magnetic field in the physics at play in hub-filament systems and its interplay with turbulence and gravity leading to fragmentation and the formation of stars. The outcome of the PhD project will provide new insights on the role of the magnetic field in the formation of hub-filament systems and their star-formation activity. These results may also be a building block to understand the global star formation process in galaxies. The candidate will be based at IA (Porto University) and integrated into the activities of the star-formation group of CAUP. To reach the proposed scientific objectives, the PhD thesis will be co-supervised by an expert in high-mass star-formation studies from the observatory in Bordeaux (France). Several visits to Bordeaux will be planned. Close interactions with theorists in Japan are also foreseen. Thanks to the proposed project, the candidate will gain deep knowledge in the physics of the interstellar medium and star-formation studies, building strong skills in observations and developing a keen sense for theoretical interpretation. The expertise acquired during the PhD will provide key opportunities for future research in star-formation studies, especially in polarization astronomy, which will be highly valued in the coming years with the numerous polarization instruments newly installed on telescopes or planned for the near future.
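One standard tool likely to be used when turning polarization maps into field-strength estimates is the Davis-Chandrasekhar-Fermi relation, which in its simplest form (CGS units, often quoted with a calibration factor xi of order 0.5) reads

    B_{\rm pos} \approx \xi \,\sqrt{4\pi\rho}\;\frac{\sigma_{v}}{\sigma_{\psi}},

where rho is the gas density, sigma_v the line-of-sight velocity dispersion, sigma_psi the dispersion of polarization angles, and B_pos the plane-of-sky magnetic field strength; comparing this estimate with turbulent and gravitational energies is one way to quantify the role of the field in hub-filament systems.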
Self-consistent spectral modeling of quasars and its implications for the mass assembly history of galaxies
Domain: Astrophysics Supervisor: Jean Michel Gomes Co-Supervisor: Luis Vega Institution: Universidade do Porto Host Institution: Universidade do Porto
Abstract
Quasars are thought to be powered by a supermassive black hole (SMBH) capable of producing an energy release, due to matter accretion, that easily outshines the whole host galaxy, leading to their featureless, quasi-stellar appearance. This class of extremely luminous Active Galactic Nuclei (AGN) is an excellent laboratory for furnishing tight constraints on the formation and evolution of galaxies, since they can be observed at all redshifts. They are also associated with the most massive and luminous galaxies in the early Universe. Over the past few years, several quasars have been discovered even at early times (redshift ~7). We propose a new spectral fitting code for fitting the UV-optical range of quasars in a self-consistent manner. This code will include a standard accretion disk model (Shakura & Sunyaev 1973) and a more realistic UV-optical model from, e.g., Kubota & Done (2018). We will tie these prescriptions together in order to energetically reproduce both the observed continuum and the emission lines in quasars, taking internal attenuation into account. This new fitting code will be publicly available and will additionally be applied to ~500 000 quasars from the SDSS DR15. We will produce a full database catalog for the astronomical community. This is preparatory work for the MOONS spectrograph, which IA co-leads; it will also add value when the modules for fitting quasars are incorporated into the population synthesis code FADO (Gomes & Papaderos 2017). This project provides an excellent combination of astrophysical theory with observations, and it will lead to valuable expertise in the field of spectral synthesis and the AGN phenomenon. Several publications will support the future career of the student. Preferable computing languages are Fortran, Python or IDL.
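As a reference point for the accretion-disk component mentioned above, the standard Shakura & Sunyaev (1973) disc has the radial effective-temperature profile

    T_{\rm eff}(r) = \left\{ \frac{3\,G M \dot{M}}{8\pi \sigma\, r^{3}} \left[ 1 - \left(\frac{r_{\rm in}}{r}\right)^{1/2} \right] \right\}^{1/4},

with M the black-hole mass, M-dot the accretion rate, r_in the inner disc radius and sigma the Stefan-Boltzmann constant; summing blackbody emission over radii gives the disc continuum that the new code would fit, before adding more realistic prescriptions such as those of Kubota & Done (2018).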
Impact of gravitational dark matter on properties of stars
Domains: Theoretical Particle Physics | Cosmology | Astrophysics Supervisor: Ilidio Lopes Co-Supervisor: Grigorios Panotopoulos Institution: Instituto Superior Técnico Host Institution: Instituto Superior Técnico
Abstract
Current data show that the non-relativistic matter in the Universe is dominated by dark matter, the nature and origin of which remain a mystery. The Lagrangian of the Standard Model of Particle Physics, slightly extended to include dark matter, must be coupled to Einstein's General Relativity. If the dark matter particle is a scalar field, it may be nonminimally coupled to gravity (Jordan frame). One can perform a conformal transformation to go to the more familiar Einstein frame. There one finds that, even if the dark matter particle does not have any direct coupling to the fields of the SM in the Jordan frame, the conformal factor induces a non-vanishing coupling between the SM particles and the scalar field that plays the role of DM in the Universe. These interactions are expected to modify the properties of stars. Using the powerful tool of asteroseismology, we may constrain the nonminimal coupling.
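Schematically, in the standard scalar-tensor language (quoted here only as a guide to the mechanism, not as the specific model to be studied): if the Jordan-frame metric g, to which matter couples minimally, is related to the Einstein-frame metric by a conformal factor A(phi),

    g_{\mu\nu} = A^{2}(\varphi)\,\tilde{g}_{\mu\nu},
    \qquad
    m(\varphi) = A(\varphi)\, m_{0},
    \qquad
    \alpha(\varphi) \equiv \frac{d\ln A(\varphi)}{d\varphi},

then in the Einstein frame particle masses depend on the scalar through A(phi), with coupling strength alpha(phi), even without any direct Jordan-frame coupling. It is this induced coupling that modifies stellar structure and oscillation frequencies, which asteroseismology can then constrain.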
Properties of the Compact Astrophysical Objects Constrained by the Nuclear Physics, Astrophysics and Gravitational Wave Physics Data
Domains: Theoretical Particle Physics | Astrophysics Supervisor: Violetta Sagun Co-Supervisor: Ilidio Lopes Institution: Universidade de Coimbra Host Institution: CENTRA, Instituto Superior Técnico / CFisUC, Universidade de Coimbra
Abstract
Compact astrophysical objects, i.e. neutron stars (NSs) and the hypothetical hybrid stars (HSs) and quark stars (QSs), are the densest physical objects accessible to direct observation. The 2017 Nobel Prize in Physics for the detection of gravitational waves (GWs) from black hole mergers, along with the subsequent detection of a NS merger, illustrates the importance of this hot topic. Despite the flourishing of astrophysical observations, the particle composition of the interior of compact stars is still very poorly known. Moreover, the physical processes inside hypothetical objects like HSs and QSs, in which matter is expected to go through a phase transition from nuclear matter to a plasma of strongly interacting quarks, are also very poorly understood. In particular, this limitation comes from the fact that QCD and its lattice formulation have very limited applicability at large baryonic densities and as such do not allow a reliable equation of state (EoS) to be obtained. Detection of a QS or HS could become another scientific breakthrough and prove the existence of quark matter, which is the main quest of the largest research collaborations, such as ALICE at the Large Hadron Collider (LHC) at CERN. This thesis will pursue two goals: (i) the development of a realistic quark-hadron EoS which is consistent with the known properties of nuclear matter, experimental data on nucleus-nucleus collisions and observational data on compact stars and on the binary merger GW170817; (ii) the application of the formulated EoS to the modelling of compact star evolution, while at non-vanishing temperatures it will be applied to NS mergers and to the prediction of GW waveforms. The selected candidate will establish a solid bridge between experimental and theoretical astrophysics, gravitation and nuclear physics. The thesis will be supervised by Violetta Sagun, a theoretical physicist, in Coimbra and Ilídio Lopes, an expert in astrophysics and cosmology, in Lisbon.
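Once an EoS P = P(epsilon) is specified, static star models follow from the Tolman-Oppenheimer-Volkoff equations (written here in units G = c = 1, as a reminder of the standard cold, spherically symmetric workflow rather than of the finite-temperature extensions the thesis targets):

    \frac{dm}{dr} = 4\pi r^{2}\,\varepsilon(r),
    \qquad
    \frac{dP}{dr} = -\,\frac{\left[\varepsilon(r)+P(r)\right]\left[m(r)+4\pi r^{3}P(r)\right]}{r\left[r-2m(r)\right]},

integrated from the centre out to the radius where P drops to zero; the resulting mass-radius relation is what gets confronted with pulsar mass measurements and with the tidal-deformability constraints from GW170817.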
NLO study of Lorentz violation: extended non-abelian models and Lorentz-violating effective QED
Domain: Theoretical Particle Physics Supervisor: Brigitte Hiller Co-Supervisor: Marcos Rodrigues-Sampaio Institution: Universidade de Coimbra Host Institution: Universidade de Coimbra
Abstract
In [1] we studied n-point one-loop functions in the photonic sector of Extended Quantum Electrodynamics and showed that no gauge anomalies are present, just as in standard QED, which is particularly tricky given that gamma-5 matrices can lead to spurious breaking due to regularization issues. The extended QED Lagrangian has been shown to be multiplicatively renormalizable at one-loop order in [2], where the divergent structure of both the fermion and photon 2- and 3-point functions was studied in order to compute the renormalization constants and thereupon the beta-function. However, reference [2] left out the explicit complete evaluation of the 2- and 3-point functions, as it contained regularization-dependent parameters. Some important and immediate issues must be explored: 1. An important point addressed in [1], [3] was the careful treatment of the gamma-5 matrix in divergent amplitudes. In the light of [4], and using Implicit Regularization, we intend to extend the work by Colladay and McDonald in [5] and generalise the analysis of [2] to non-abelian theories without fermions. Standard Model extensions have motivations as mentioned in [5], where neutrino oscillation physics and large radiative corrections due to strong coupling in QCD, to mention a few, can lead to good bounds on Lorentz violation. 2. CPT-even and CPT-odd terms appear, so that the radiative corrections can be calculated separately. The CPT-odd term is topological, which means that it is defined up to large gauge transformations. The question of whether we want to impose gauge symmetry at the level of the Lagrangian or of the action becomes important. It may lead to a quantisation of coupling constants when radiative corrections are taken into account [6]. Besides, its Riemann-tensor-like symmetries may bring some other constraints on (arbitrarily defined) surface terms. This study is important because CPT-odd terms behave analogously to the usual strong coupling, while the CPT-even parameter increases with the energy scale. Thus CPT-even effects in the strong interactions may increase significantly at higher energy scales and become observable. Bibliography: [1] A. R. Vieira, A. L. Cherchiglia and Marcos Sampaio, Phys. Rev. D93 (2016) 025029. [2] V. A. Kostelecký, C. D. Lane and A. G. Pickering, Phys. Rev. D 65, 056006 (2002). [3] A. Viglioni, A. L. Cherchiglia, A. R. Vieira, Brigitte Hiller, Marcos Sampaio, Phys. Rev. D94 (2016) no.6, 065023; J. S. Porto, A. R. Vieira, A. L. Cherchiglia, Marcos Sampaio, Brigitte Hiller, Eur. Phys. J. C78 (2018) no.2, 160. [4] A. M. Bruque, A. L. Cherchiglia, M. Perez-Victoria, JHEP 08 (2018) 109. [5] D. Colladay and P. McDonald, Phys. Rev. D75 (2007) 105002. [6] G. Dunne, Lectures at the 1998 Les Houches Summer School, hep-th/9902115v1. [7] M. Haghighat et al., Int. J. Mod. Phys. A28 (2013) 1350115.
Phenomenological implications of Little Higgs models.
Domains: Theoretical Particle Physics | Experimental Particle Physics Supervisor: Jose Santiago Co-Supervisor: Nuno Castro Institution: Universidade do Minho Host Institution: LIP and Universidad de Granada
Abstract
The discovery of the Higgs boson by the ATLAS and CMS collaborations in 2012 represented the experimental confirmation of the Standard Model of particle physics. Since then, the main emphasis of the particle physics community has shifted towards the search for physics beyond the Standard Model (BSM). Supersymmetry and composite Higgs models are the two leading candidates for a natural theory of BSM physics. Among the latter, Little Higgs (LH) models with T-parity are particularly well motivated, as the ingenious idea of collective symmetry breaking together with T-parity guarantees the calculability of the models at one-loop order. Due to their rich spectrum, LH models can be efficiently probed in a variety of experimental directions, including the LHC and future particle colliders, flavor physics and astrophysical and cosmological experiments. Many of the phenomenological implications of LH models have been studied in the past. However, most of these studies were done before the LHC actually started and therefore have to be updated using the real LHC data that we now have at our disposal. Furthermore, little effort has so far been devoted to drawing a comprehensive picture of LH models and their phenomenology. Usually the phenomenological studies focus on one particular aspect without taking full advantage of other experimental probes that can be complementary to the ones under consideration. The point of view adopted in this thesis is the opposite. We will try to build a comprehensive view of the status of LH models by considering all possible experimental and theoretical tests of the models, with emphasis on their complementarity. In order to do that we will focus on the Littlest Higgs model with T-parity (LHT). This model is strongly constrained by flavor experiments, constraints that can currently only be evaded by means of fine-tuning. Our first step will consist of studying possible flavor symmetry structures that naturally protect the model from these very stringent constraints. Once we have a successful theory of flavor for the LHT, we will study its implications for the vacuum structure of the model. We will then proceed to study its main phenomenological signatures in LHC and dark matter searches. This will be done in a coordinated way, taking into account simultaneously the constraints from both collider and astrophysical/cosmological probes. A more detailed account of this working plan is as follows: - Flavor models of the LHT: we will consider different flavor symmetries that protect the LHT from large flavor violation. We will start with the lepton sector and extend it later to the quark sector of the model. Ideally these flavor symmetries will also predict the observed structure of fermion masses and mixing angles in the Standard Model. - Vacuum structure of the model: once we have established a successful theory of flavor for the LHT, we will study its implications for the Coleman-Weinberg potential giving mass (and vacuum expectation value) to the scalars in the theory. This in turn induces correlations among the different particles in the spectrum. - Once the spectrum of the model is fixed by the flavor symmetries, we will proceed to study its phenomenological implications in collider and astrophysical and cosmological experiments. In order to do that we will start with general-purpose phenomenological studies based on parameterizations of LHC experimental searches. We will analyze the current results and select the most promising experimental searches to constrain the model.
We will then consider these searches in detail, along with their implications for the spectrum of the model, always correlating these results with the ones from cosmological and astrophysical probes, in particular with the successful explanation of the observed dark matter relic abundance in the context of the LHT. - Design of new analyses and extrapolation to future experiments: in a final step we will propose new experimental searches, both at the LHC and at future colliders, that increase the sensitivity to the LHT. We will use these optimized analyses to estimate the reach of future and planned collider experiments, as well as of other possible astrophysical and cosmological probes, like gravitational waves. This working plan will be carried out at the Universities of Minho (Portugal) and Granada (Spain), profiting from the expertise in theoretical and experimental high energy physics of the two groups (LIP and CAFPE).
Open Questions in Theoretical Neutrino Physics and their Impact on Experiment
Domain: Theoretical Particle Physics Supervisor: Margarida Nesbitt Rebelo Co-Supervisor: Gustavo Castelo Branco Institution: Instituto Superior Técnico Host Institution: Instituto Superior Técnico
Abstract
Neutrino Physics is a field of intense activity, with several experiments under way and several new neutrino facilities being planned. CERN is involved in the future DUNE experiment at Fermilab, and the CERN Neutrino Platform is an example of the present great effort to bring together both theoretical and experimental physicists working in this field. Both the Supervisor and the Co-supervisor of this Thesis Proposal are members of the CERN Neutrino Platform. There are many fundamental open questions in Neutrino Physics related to the origin of neutrino masses, leptonic mixing and the possibility of having CP violation in the leptonic sector. CP violation in the leptonic sector may play a crucial role in the explanation of the baryon asymmetry of the Universe. There are several experimental anomalies requiring a theoretical explanation. The existence of light sterile neutrinos remains an open question and could help solve some of the recent puzzles. This thesis will be done in the context of model building, with emphasis on the leptonic sector. Symmetries will play an important role in these studies. The theoretical analysis will try to explain the observed anomalies and predict new phenomena that could be observed in present and future experiments.
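One benchmark for the model building described above (a standard mechanism, quoted as an example rather than as the specific models to be constructed) is the type-I seesaw, in which heavy right-handed neutrinos with Majorana mass matrix M_R and a Dirac mass matrix m_D generate light neutrino masses

    m_{\nu} \simeq -\, m_{D}\, M_{R}^{-1}\, m_{D}^{T}, \qquad M_{R} \gg m_{D},

with the leptonic mixing matrix obtained from the diagonalization of m_nu; the same heavy states can provide the CP violation needed for leptogenesis, linking the neutrino sector to the baryon asymmetry of the Universe.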
Higgs couplings to light quarks
Domain: Experimental Particle Physics Supervisor: Nuno Leonardo Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The couplings of the Higgs boson to the third fermion family have recently been detected. During 2018, ATLAS and CMS reported the observation of the couplings to the heaviest fermions, the top and bottom quarks. The important next step is to investigate the couplings to the lighter quarks. This goal is nonetheless challenging, for two reasons: as the Higgs couples in proportion to the particle’s mass, the relevant processes are rarer; also, QCD backgrounds produce quark pairs considerably more abundantly than Higgs decays to quark pairs. There is an alternative approach, however. The quark-antiquark pairs originating from the Higgs decay may bind together (thus forming a meson state). Experimentally, this gives rise to a striking, clean signature: an energetic photon back-to-back with a dilepton resonance. In this thesis project the student will investigate such exclusive decays of the Higgs boson using data just collected by CMS during LHC Run 2. The motivation for the work is two-fold: to experimentally constrain the Higgs-quark couplings and to detect the effects of potential new physics that may be revealed in these sensitive processes.
Flavour anomalies in LHC data
Domain: Experimental Particle Physics Supervisor: Nuno Leonardo Co-Supervisor: Joao Varela Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The LHC physics program so far has been extremely successful. It has established the standard model (SM) of particle physics as a superb theory. The SM cannot, however, be the ultimate theory of Nature, and a major goal of the LHC for the coming years is to detect the new physics (NP) that lies beyond the SM. The most significant and exciting indications of NP, in all of current collider data, lie in what is referred to as the “flavour anomalies”. These have persistently emerged from the data of various experiments, at the LHC and elsewhere. The anomalies have been detected in recent measurements of B meson decays. Such processes are highly sensitive to the presence of NP particles: even if those NP particles happen to be too heavy to be detected directly in the LHC collisions, they contribute as intermediary states and their presence becomes experimentally accessible in measurements of these sensitive B decays. The CMS detector has accumulated one of the largest heavy flavour datasets ever recorded. A dedicated data sample designed to facilitate the investigation of the above anomalies was collected during 2018 by CMS. In this thesis project the student will investigate lepton flavour universality variables that sit at the core of the current anomalies. The results obtained will lead to a clarification of the anomalies, which is a main current priority in the field of particle physics.
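An example of the lepton-flavour-universality variables referred to above are the ratios measured in b -> s l+ l- transitions, such as

    R_{K} = \frac{\mathcal{B}(B^{+}\to K^{+}\mu^{+}\mu^{-})}
                 {\mathcal{B}(B^{+}\to K^{+}e^{+}e^{-})}\bigg|_{q^{2}\ \text{bin}},

which the SM predicts to be very close to unity (up to small QED corrections), so that a significant deviation would be a clean sign of new physics coupling differently to muons and electrons.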
Novel hard probes of the primordial QGP
Domain: Experimental Particle Physics Supervisor: Nuno Leonardo Co-Supervisor: Joao Seixas Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Ultra-relativistic collisions of heavy nuclei at the LHC recreate droplets of the primordial medium that permeated the universe microseconds after the big bang. Particles are expected to lose energy while traversing this hot and dense soup of quarks and gluons (QGP). Traditionally, this phenomenon has been explored in an inclusive fashion (e.g. “jet quenching”). For the first time, however, with LHC Run 2 data collected by the CMS detector, we are able to fully reconstruct individual hadron decays, despite the very challenging environment. In this thesis project the student will explore the large data set of lead-lead collisions collected in the last LHC run (November 2018), leading to the first observation of heavy-flavour (B) mesons and of their suppression in the QGP. The observation of these rare B signals in the busy collision environment will be achieved by developing dedicated machine-learning algorithms. These novel probes will provide unique information on the flavour dependence of energy loss and on the underlying properties of the QGP medium. |
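As a hedged illustration of how such suppression measurements are usually quantified (the thesis may adopt other observables), the nuclear modification factor compares yields in lead-lead and proton-proton collisions,

R_{AA}(p_T) = \frac{\mathrm{d}N_{AA}/\mathrm{d}p_T}{\langle T_{AA}\rangle \, \mathrm{d}\sigma_{pp}/\mathrm{d}p_T},

with R_{AA} < 1 indicating suppression of the probe in the QGP.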
Higgs and top: ttH associated production to probe beyond the Standard Model
Domain: Experimental Particle Physics Supervisor: Ricardo Gonçalo Co-Supervisor: Patricia Conde Muino Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
After an intense search, the associated production of the Higgs boson with a top quark pair was finally observed last year. This production channel provides the best way to directly measure the top Yukawa coupling, the largest Higgs coupling in the Standard Model (SM). But it also provides a way to look beyond the SM, in particular to search for signs of a possible pseudo-scalar component of the Higgs boson. Such a component is well justified in scenarios like the two-Higgs-doublet model, and finding it would constitute a major discovery. The selected student will join a Portuguese team already developing this analysis within the ATLAS experiment, using new techniques developed at LIP, and working in close collaboration with groups in the UK and the USA. The student will take part in the ongoing analysis using LHC Run 2 data and will be able to use the data from LHC Run 3, to start in 2021, to achieve the best sensitivity in this measurement. As part of the work, the student will also be able to participate in the operation of the ATLAS experiment during Run 3, as well as in the development of instruments and methods towards the LHC upgrade. Frequent trips to CERN may be required to participate in Control Room shifts and collaboration meetings. |
Gravity at the Extreme
Domains: General Relativity | Astrophysics Supervisor: Daniele Vernieri Co-Supervisor: Vitor Cardoso Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: CENTRA - Instituto Superior Técnico / Faculdade de Ciências - Universidade de Lisboa
Abstract
The detection of gravitational waves opened a new window onto the universe and provided a tool to understand the most outrageous predictions of General Relativity. Gravitational waves are a powerful new probe of dynamical astrophysical phenomena such as collisions of black holes and of compact objects like neutron stars, thus promising to turn gravity research into a data-driven field and potentially providing much-needed insights about fundamental physics. The goal of the project will be to exploit the phenomenology and the full nonlinear dynamics of gravity theories by means of analytical and numerical analysis, probing the spacetime geometry under extreme conditions in the cosmos. |
Lattice Study of Fundamental QCD Gluon Vertices
Domain: Theoretical Particle Physics Supervisor: Paulo Silva Co-Supervisor: Orlando Oliveira Institution: Universidade de Coimbra Host Institution: Universidade de Coimbra
Abstract
Quantum Chromodynamics (QCD) describes the interactions between quarks and gluons. At high energies the standard approach to solving QCD relies on perturbation theory. On the other hand, at small momenta, where non-perturbative effects lead to the formation of bound states such as protons and neutrons, solving QCD is a highly non-trivial task that requires different approaches. The non-perturbative regime of the color interaction is handled either through its formulation on a space-time lattice (lattice QCD) or through an infinite tower of Green’s functions, such as the Dyson-Schwinger equations (DSE), which must be truncated to be solved. Recent studies show that higher-order Green functions play a major role in the non-perturbative solution of QCD. Indeed, solving the quark gap equation simultaneously with the corresponding (approximate) equation for the three-gluon Green function clearly improves our understanding of the color dynamics, in the sense that the solution of the quark DSE and the quark-gluon vertex become closer to the available lattice results. Furthermore, the description of the three-gluon and possibly higher-order gluon Green functions encodes clear information on the non-perturbative nature of the color interaction, such as the ghost dominance at small energy scales. Recall that knowledge of all QCD Green functions completely solves the theory. The aim of this project is to perform higher-precision studies of higher-order gluon Green functions using lattice QCD simulations. This requires the generation of large statistical ensembles of gauge configurations and the development of techniques to control the lattice artefacts in the description of the various Green functions. The goal is also to identify and quantify those properties that are connected with the infrared dynamics of the color interaction and, hopefully, to provide input to improve the continuum approach to QCD. The simulations will be performed using the supercomputer facilities at the University of Coimbra. The candidate will join a team with large experience in lattice QCD simulations. |
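For orientation, the lattice observables in question are momentum-space correlators of the gluon field; schematically (conventions vary between studies),

D^{ab}_{\mu\nu}(q) = \langle \tilde{A}^{a}_{\mu}(q)\, \tilde{A}^{b}_{\nu}(-q) \rangle , \qquad G^{abc}_{\mu\nu\rho}(q_1,q_2,q_3) = \langle \tilde{A}^{a}_{\mu}(q_1)\, \tilde{A}^{b}_{\nu}(q_2)\, \tilde{A}^{c}_{\rho}(q_3) \rangle , \quad q_1+q_2+q_3=0,

averaged over the gauge-configuration ensembles, with the three-gluon vertex extracted after amputating the external propagators.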
Neutrinos @ the LHC
Domain: Theoretical Particle Physics Supervisor: Filipe Joaquim Co-Supervisor: Juan Antonio Aguilar-Saavedra Institution: Instituto Superior Técnico Host Institution: Instituto Superior Técnico
Abstract
The generation of the light neutrino masses is an open problem of the Standard Model that requires an extension of its particle content. Associated with this mass generation mechanism, direct observable effects are predicted at the LHC in a wide class of theoretical models. In the last decade, these predictions have been cast in terms of simplified scenarios, and the searches at the LHC experiments have subsequently been based on these simplifying assumptions. Yet, no direct signals of new physics associated with the neutrino mass generation have been observed at colliders. In this proposal we aim to explore novel scenarios with non-minimal matter content and/or flavour assumptions, and their predictions for the LHC. |
The MARTA Engineering Array at the Pierre Auger Observatory
Domain: Experimental Particle Physics Supervisor: Pedro Assis Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The Pierre Auger Observatory has established several important results in the field of Ultra High Energy Cosmic Rays, such as the strong suppression of the flux compatible with the GZK effect. However, the results pose a new puzzle that can only be understood with more and better information. Auger has therefore planned an upgrade of its detector based on the installation of scintillator detectors on top of the existing Water Cherenkov Tanks, in order to disentangle the electromagnetic and muonic components of air showers. LIP has led a development that aims at a direct measurement of the muonic component. This program, named MARTA (Muon Array of RPCs for Tagging Air showers), is a set of RPC detectors to be placed underneath the existing Water Cherenkov tank. The tank acts as shielding, removing most of the electromagnetic component, whereas the muons are not attenuated and reach the RPCs. The detector is based on sturdy, low-power-consumption RPCs developed at LIP-Coimbra, which can be operated in harsh environments such as the Argentinian pampa. MARTA was approved to deploy an engineering array consisting of a unitary cell with 8 stations (a hexagon with a twin station at the center). From 2019 the detectors will be installed in the pampa and several tests are to be conducted, among them the cross-calibration with an underground scintillator detector that is already installed. The candidate will be involved in the development of the detector and its ancillary instrumentation, including electronics, firmware and software. He/She will also participate in the development of methods to characterize the detectors, pre- and post-installation, with an emphasis on the performance and stability of the RPCs. Moreover, the candidate will take part in the commissioning of the MARTA system, paying special attention to the interface with the other detectors of the Pierre Auger Observatory. As this is the first time that RPC systems are operated in such adverse conditions, the successful operation of the MARTA engineering array is expected to provide important data, not just for Auger, but also paving the way for future RPC-based Cosmic Ray detectors. |
A Taste of the Flavour Problem: is Symmetry the Missing Ingredient?
Domain: Theoretical Particle Physics Supervisor: Gustavo Castelo Branco Co-Supervisor: Ivo de Medeiros Varzielas Institution: Instituto Superior Técnico Host Institution: Instituto Superior Técnico
Abstract
The flavour problem is one of the most Fundamental Problems in Particle Physics left unaddressed by the Standard Model. It originates from the observed fact that each type of fundamental fermion appears replicated in 3 generations, distinguished only by their masses. It results in a proliferation of free parameters that cannot be predicted from theory and have to be measured independently, and it is intrinsically linked to the puzzles of neutrino masses and CP violation - a necessary ingredient for the generation of the Baryon Asymmetry of the Universe and ultimately needed to explain our own existence. In this project we will propose Theories of Flavour which attempt to address the Flavour Problem, provide an explanation for the existence and smallness of neutrino masses and mixing, and explore the origin of CP violation. The project will be supervised by Gustavo Castelo Branco and Ivo de Medeiros Varzielas. It is expected that the student will spend periods abroad, benefiting from existing collaborations with other international scientists, e.g. at the University of Valencia, the University of Basel and the University of Southampton. The student will also be encouraged to attend some of the top International Schools in Particle Physics, such as Trieste, Corfu, Les Houches and, of course, those organised in the framework of IDPASC. |
Gravity waves: a key process in Venus and Mars middle/upper atmosphere
Domain: Astrophysics Supervisor: Pedro Machado Co-Supervisor: Gabriella Gilli Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Faculdade de Ciências - Universidade de Lisboa
Abstract
Recent observations by the ESA satellite Venus Express (VEx) (2006-2014) [Drossart2007, Titov2008, Limaye2017] and on-going ground-based campaigns, also led by IA Team Members, made it possible to carry out an unprecedented characterization of the winds [Machado2014, Machado2017, Widemann2008, Peralta2017] and atmospheric temperatures of Venus [Gilli2015, Piccialli2015, Mahieux2015]. Those new measurements significantly improved our comprehension of Venus atmospheric circulation and provided valuable new constraints on the atmospheric dynamics of planets with superrotation. At the same time, they put in evidence the high variability of the Venus atmosphere and opened new scientific questions, such as: what processes control the transition region (70-120 km) between the retrograde superrotating zonal flow and the day-to-night circulation? How does the interplay of planetary and small-scale waves control the circulation features? Recent advancements in observational techniques are expected to bring new constraints on Venus atmospheric models at cloud-top level, including the capability to track short-term variations. In particular, the temporal and spatial variability of winds needs to be better quantified, and the role of waves and the mechanisms that allow the topography to influence the upper cloud motions need to be better addressed with the help of 3D models. An important new result led by the IA team using the Doppler velocimetry technique is the evidence of a symmetrical, poleward meridional Hadley flow in both hemispheres of Venus [Machado2017]. A complete interpretation of the gravity-wave (GW) induced temporal and spatial variations in the atmospheres of planets is possible with the application of 3D models. Sophisticated modeling tools have been developed by Team Members, such as Global Circulation Models (GCM) for Mars [Forget1999, Bougher2000] and Venus [Lebonnois2010, Brecht2012]. They are unique tools to support exploration by remote sounding. The current improved versions of the Venus and Mars GCMs developed at the Laboratoire de Meteorologie Dynamique (LMD), described in Gilli2017a and GonzalezGalindo2005 respectively, are nowadays the only ground-to-thermosphere GCMs available for those terrestrial planets. In order to address key measurements in the middle/upper atmosphere of Venus and Mars and to interpret the large observed variability, a non-orographic GW parameterization, following the formalism developed for the Earth LMD-GCM [LottGuez2012], was recently implemented in those models [Gilli2017a, Gilli2017b]. The preliminary results are very promising: the inclusion of this key physical mechanism could partially explain data-model biases at mesospheric layers. However, given the uncertainty in the basic wave characteristics, excitation mechanisms and sources of GW, it is difficult to quantify the GW effects using a unique set of parameters, and detailed sensitivity studies are required. Scientific Goal: In this PhD project we propose a systematic characterization and classification of the waves apparent on Venus using remote sensing images from Venus Express (VMC and VIRTIS-M) (Svedhem et al. 2007) and the Akatsuki cameras (UVI, IR1, IR2 and LIR) (Nakamura et al. 2007). In the case of remote sensing images, the wave amplitudes can be derived by means of dedicated Radiative Transfer models. Of critical importance for the GCMs is the systematic characterization and classification of atmospheric waves on Venus, which will allow accurate estimates of the energy transport in these atmospheres.
Proposed Strategy: This PhD project is intended to combine synergistically space- and ground-based observations with model simulations to improve the understanding of the physical and dynamical processes in the atmospheres of terrestrial planets. After a systematic search for and characterization of the waves present in the imagery data sets of Venus (VEx/VMC and VIRTIS-M, Akatsuki/UVI, IR1, IR2 and LIR), the PhD student will build dispersion graphs to enable a classification of the real nature of these waves and of the restoring forces responsible for them (Peralta et al. 2014a; 2014b). Their wave amplitudes will be estimated for the first time using different Radiative Transfer Models (RTM): (a) the amplitude of waves in 280-nm images from VIRTIS-M and UVI will be obtained with retrievals of SO2 abundance using the RTM NEMESIS (Irwin et al. 2008); (b) 1.74, 2.26, and 2.3-μm images from VIRTIS-M and IR2 can be used to estimate normalized temperature amplitudes of waves from clouds’ opacity using the RTM by McGouldrick & Toon (2008); (c) to estimate the amplitude of the mesoscale stationary waves apparent on the nightside upper clouds of Venus, we will estimate the brightness temperature using the 4.3-µm CO2 band (ranges 4.24–4.54 µm and 4.77–5.01 µm) observable in the spectra of the VIRTIS-M cubes (Peralta et al. 2017b) and the RTM by García-Muñoz et al. (2013); (d) the 10-μm images from Akatsuki/LIR will make it possible to constrain the amplitudes of the main giant stationary waves apparent in the brightness temperature of Venus’s upper clouds (Kouyama et al. 2017). On the modeling side, the candidate will make use of sophisticated theoretical tools such as Global Circulation Models (GCM). In particular, we plan to use a current improved version of the Venus GCM developed at the Laboratoire de Meteorologie Dynamique (LMD) (hereinafter LMD-VGCM) [Lebonnois2016, Gilli2017a] and the latest improved version of the Mars GCM, also developed at LMD (hereinafter LMD-MGCM) [Forget2017]. Model validation and inter-comparison will be performed in collaboration with partners at the University of Michigan and the NASA Ames Research Center, world references for the Venus and Mars thermosphere GCMs (hereinafter VTGCM and MTGCM) [Bougher2000, Brecht2012]. |
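As a simplified reference for the dispersion diagrams mentioned above (the actual classification will follow the full wave theory of Peralta et al. 2014a,b), non-rotating internal gravity waves in the Boussinesq approximation obey

\hat{\omega}^2 = \frac{N^2 k_h^2}{k_h^2 + m^2},

where \hat{\omega} is the intrinsic frequency, N the Brunt-Väisälä frequency, k_h the horizontal wavenumber and m the vertical wavenumber.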
Search for new physics with forward proton detectors at ATLAS
Domain: Experimental Particle Physics Supervisor: Patricia Conde Muino Co-Supervisor: Nuno Castro Institution: Instituto Superior Técnico Host Institution: LIP and Universidade do Minho
Abstract
The non-abelian structure of the gauge theory in the SM implies the existence of triple and quartic gauge boson couplings (TGC and QGC, respectively), fully constrained by the gauge symmetry. The measurement of the QGC provides a window into the electroweak symmetry breaking mechanism, given that the longitudinal modes of the W and Z bosons are Goldstone bosons. Deviations from the SM predictions can appear in new physics theories due to the exchange of new particles, integrated out in the effective interaction. Models with a new heavy scalar singlet interacting with the Higgs sector can modify the quartic gauge boson couplings but not the triple ones. It is, therefore, essential to probe this missing part of the SM by measuring both the TGC and the QGC. The ATLAS sensitivity to anomalous couplings in the γγWW, γγγγ and γγZZ vertices can be improved by two orders of magnitude by using the ATLAS Forward Proton tagging detectors (AFP) [1]. AFP effectively converts the LHC into a photon-photon collider: the two scattered protons each emit a photon, and the two photons interact to produce a pair of vector bosons (two W’s, for instance). The protons, which stay intact after the interaction, are scattered through very small angles and are detected at the AFP stations. Since there is no underlying event, the two vector bosons are the only particles produced centrally. If they decay to leptons they can be easily triggered and identified. The invariant mass of the vector boson pair can be measured precisely by determining the proton energy loss with the AFP detectors, even in the case of neutrinos in the final state. The presence of anomalous quartic gauge boson couplings would be observed as an increase in the number of detected vector boson pairs with large invariant masses. This project proposes the search for anomalous couplings of the type γγWW using the AFP detectors. The same final state can be used to search for dark matter in photon-induced processes, also exploiting the capability of the forward proton tagging detectors [2]. The search for dark matter is challenging due to the low transverse momentum of the leptons produced, so an adequate strategy for triggering this kind of process is needed. It implies combining the proton tagging information with the muon/electron triggers reconstructed with the ATLAS central detectors, already at the first level trigger and probably making use of the topological trigger processors. The development and optimisation of such a trigger strategy is also an objective of this project. References: [1] E. Chapon, C. Royon, and O. Kepka, Anomalous quartic WWγγ, ZZγγ, and trilinear WWγ couplings in two-photon processes at high luminosity at the LHC, Phys. Rev. D81 (2010) 074003, arXiv:0912.5161 [hep-ph]. [2] L.A. Harland-Lang, V.A. Khoze, M.G. Ryskin and M. Tasevsky, LHC Searches for Dark Matter in Compressed Mass Scenarios: Challenges in the Forward Proton Mode, arXiv:1812.04886 [hep-ph]. |
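For orientation, when both protons are tagged the invariant mass of the centrally produced system follows directly from the fractional proton energy losses measured in AFP,

M_X = \sqrt{\xi_1\, \xi_2\, s}, \qquad \xi_i = \frac{\Delta E_i}{E_{\mathrm{beam}}},

which is what allows a precise mass measurement even with neutrinos in the final state.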
Probing Quark Gluon Plasma with Heavy Flavour Jets
Domain: Experimental Particle Physics Supervisor: Helena Santos Co-Supervisor: Patricia Conde Muino Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Ultra-relativistic heavy-ion (HI) collisions at LHC energies are performed in order to produce strongly interacting QCD matter at extreme temperatures and densities. In such conditions, matter undergoes a phase transition from ordinary hadronic matter to a plasma of quarks and gluons, the quark-gluon plasma (QGP). High transverse momentum quarks and gluons generated in hard-scattering processes constitute powerful probes of the QGP: partons may lose energy in interactions or through induced gluon radiation while traversing the QCD medium, providing sensitivity to its transport properties. This phenomenon of energy loss is commonly referred to as “jet quenching”, and its expected signatures are modifications of the jet production at a given transverse momentum and of the fragmentation functions. In particular, the study of heavy-flavour jet production contributes to a deeper understanding of the QGP produced at the LHC, namely regarding energy-loss mechanisms. Because the heavy quark mass inhibits medium-induced gluon radiation, it is expected that the energy loss (el) of the different quark flavours follows the pattern elb < elc < elq, where the subscripts b, c and q stand for bottom, charm and light (up, down or strange) quarks respectively. Hence, the comparison with light-jet production will be valuable for understanding the nature of the energy loss. This topic is crucial for the ATLAS/LHC HI Program and special efforts are expected for Run 3 of the LHC. |
Cosmic and Gamma rays new-generation detectors
Domain: Experimental Particle Physics Supervisor: Pedro Assis Co-Supervisor: Bernardo Tomé Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The main aim of this thesis is to prepare the future of Cosmic- and Gamma-ray experiments. The thesis consists in developing new detectors and their associated instrumentation, and also in devising new ways to use existing technology. Typically, the techniques used to detect Cosmic Rays and Gamma Rays rely on the detection either of the particles in the shower front at ground level or of the UV light produced in the atmosphere by fluorescence or by the Cherenkov effect. The field has reached such maturity that a change of paradigm is necessary for further advances. On the detector-technology side, it is necessary to come up with a detector that can either enhance the detection capabilities or reduce the cost of each station, allowing bigger arrays to be deployed. For instance, LIP has adapted and developed the RPC (Resistive Plate Chamber) so that it can be used in Cosmic Ray arrays, enhancing their detection capability for the muon component of Air Showers, which directly impacts the sensitivity to the mass of the primary and to hadronic interactions at the 100 TeV scale. The applicant will need to understand the standard techniques and the current plans for the field, and identify the needs and opportunities for improvement. Moreover, the applicant is expected to follow the current discussions about the next steps in the area. The program comprises a large experimental component as well as simulation to test the proposed designs. |
Search for New Physics in the associated production channel of a Higgs and a W boson
Domain: Experimental Particle Physics Supervisor: Patricia Conde Muino Co-Supervisor: Ricardo Gonçalo Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Since the discovery of the Higgs boson, the precise measurement of its properties has become a fundamental part of the ATLAS Physics programme. The recent observations of the Higgs decay to b-quarks and of the associated production of the Higgs with top quarks, announced by the ATLAS and CMS collaborations at CERN, directly probe the coupling of the Higgs to quarks and constituted an important step forward in the understanding of the Higgs mechanism. As the LHC continues to take data and more luminosity is accumulated, more precise measurements of the Higgs boson properties become possible, opening the door to searches for new physics in the Higgs sector. In this line, the study of high transverse momentum (high-pT) Higgs production in the associated production channel with a W boson is sensitive to new physics in the HWW vertex and constitutes one of the measurements to be done in the near future in the ATLAS collaboration. This project focuses on the study of the high-pT Higgs boson production cross section in the associated production channel with a W boson, when the W decays to a lepton and a neutrino and the Higgs to b-quarks. The ATLAS Portuguese team has contributed to the observation of the Higgs boson decays to b-quarks in this channel and is currently involved in the measurement of the production cross section at high-pT. This work requires the use of dedicated reconstruction techniques for the identification of the two highly collimated b-jets produced in the decay of the Higgs boson. Machine Learning techniques will be developed to improve the performance of these methods. The student will be part of the ATLAS Portuguese team participating in this analysis. The work will be developed in an international collaboration and the results obtained will be presented at CERN. In addition, the student is expected to contribute to the data-taking and detector operation activities, both at CERN and at LIP. |
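Purely as a schematic illustration of the machine-learning approach mentioned above (toy features and a scikit-learn classifier chosen here for simplicity; this is not the ATLAS analysis code), a classifier separating a hypothetical signal from background could look like:

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
# toy features: a Higgs-candidate mass-like variable and an angular separation
signal     = np.column_stack([rng.normal(125.0, 15.0, n), rng.normal(1.0, 0.3, n)])
background = np.column_stack([rng.exponential(80.0, n),  rng.normal(2.0, 0.8, n)])
X = np.vstack([signal, background])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))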
First-ever population studies of blazars in the TeV regime with Cherenkov telescope data
Domain: Astrophysics Supervisor: Michele Doro Co-Supervisor: Alessandro de Angelis Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Ground-based Imaging Cherenkov Telescopes observe the sky in the TeV regime, detecting gamma rays produced in non-thermal events such as astrophysical shocks or the relativistic jets of active galaxies. These instruments have so far detected more than 50 targets in the class of active galactic nuclei, especially blazars, those whose relativistic jet points towards us. The TeV emission probes the extreme regime where electrons populating the relativistic jet upscatter low-energy ambient photons via the inverse Compton process. It is now timely to start performing comparative studies among these targets, which has not been fully attempted so far. The candidate will make use of MAGIC telescope data at first, working on the first public catalog of MAGIC targets, and will later compare with data from other telescopes as well as with Fermi-LAT public data. The work will finally be framed in the prospects of CTA, a state-of-the-art high-energy astrophysics project. It is foreseen that the successful candidate will share the working time between Portugal (2 years) and the University of Padova (1 year) in order to achieve the Doctor Europaeus PhD title. |
Polarimetric studies of galaxies
Domains: Cosmology | Astrophysics Supervisor: Santiago Gonzalez-Gaitan Co-Supervisor: Ana Mourão Institution: Instituto Superior Técnico Host Institution: Instituto Superior Técnico
Abstract
Astronomical sources such as galaxies contain large amounts of dust, which is essential in shaping their characteristics, their life cycle and their evolution, but which also affects the observations we take of them. The polarization state of the light helps us to extract valuable information about the dust grains that is not available from traditional intensity measurements. We propose to carry out a study of multi-band linear polarimetry of nearby galaxies observed with the FORS2 instrument at the ESO-VLT, in conjunction with state-of-the-art 3D radiative transfer models, to elucidate the question of galaxy dust in an unprecedented way. The PhD student will develop skills in the observation, reduction and analysis of polarimetric data as well as in modelling continuum radiation transfer in the dusty environments of galaxies. |
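For orientation, linear polarimetry is usually characterized by the polarization degree and angle, obtained from the Stokes parameters as

P = \frac{\sqrt{Q^2 + U^2}}{I}, \qquad \chi = \frac{1}{2}\arctan\!\left(\frac{U}{Q}\right),

up to the instrumental and bias corrections applied during the data reduction.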
Measurement of top quark rare decay t->sW at ATLAS
Domain: Experimental Particle Physics Supervisor: Filipe Veloso Co-Supervisor: Ricardo Gonçalo Institution: Universidade de Coimbra Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The top quark is the heaviest elementary particle known, with a mass close to the electroweak symmetry breaking scale, and it can provide clues about the symmetry breaking and the Higgs mechanisms. It is thus an excellent object to test the Standard Model of Particle Physics (SM). There is an important effort to study the top quark properties, like its mass, production cross-sections, electric charge, spin, decay asymmetries, rare decays, etc. Deviations from the SM predictions for the production and decay properties of the top quark provide model-independent tests for physics beyond the SM. According to the SM, the top quark decays nearly 100% of the time to a W boson and a b quark. The Cabibbo-Kobayashi-Maskawa (CKM) quark mixing matrix is related to the rates of the Flavour Changing Charged Current (FCCC) decay modes. Some of its elements have not yet been directly measured and are instead determined from the unitarity conditions of the matrix. The present estimated value for the Vts element is (42.0 ± 2.7)×10^-3. A direct measurement of these elements puts strict conditions on the assumptions behind the matrix properties, such as the existence of only three families in the SM. This research program will be developed within the Portuguese ATLAS group. It aims to measure the decay rate of the top quark into a W boson plus an s-quark with LHC data collected by the ATLAS detector, using computational tools such as machine learning techniques. This result will then be used to measure the amplitude of the CKM element Vts. In addition, the student will participate in the maintenance and operation of the ATLAS detector, namely in the calibration of the TileCal hadronic calorimeter. Short periods at CERN may also be required in order to collaborate in working meetings and/or shifts. |
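Under the SM assumption of three generations, and neglecting small phase-space and mass corrections, the targeted decay rate relates to the CKM elements through

\frac{\mathcal{B}(t \to W s)}{\mathcal{B}(t \to W b)} \simeq \frac{|V_{ts}|^2}{|V_{tb}|^2},

so that a measurement of this ratio gives direct access to |V_{ts}|.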
Neutrino studies in the LUX-ZEPLIN (LZ) experiment
Domains: Experimental Particle Physics | Astroparticle Physics Supervisor: Francisco Neves Co-Supervisor: Alexandre Lindote Institution: Universidade de Coimbra Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Merging two teams with wide experience and a history of leadership in the field of direct dark matter detection (LUX and ZEPLIN), the LZ collaboration is developing a 10-tonne dual-phase liquid xenon detector, the biggest ever built using this technology. With the protection offered by the outer layers of xenon, the 5 tonnes of the inner region of LZ will be an extraordinarily low-background place, allowing this detector to improve the current best sensitivity to WIMPs (the leading dark matter candidate) by a factor of more than 100. But this extraordinarily quiet laboratory can also be used to study other important physics processes beyond the dark matter search. While neutrinos are an irreducible source of background in the context of WIMP searches, the high sensitivity of LZ will allow the study of their fundamental properties as well as of their sources. Since neutrinos interact only very rarely, and only through the weak interaction, they remain mostly unchanged since their creation and therefore provide a unique tool to probe the physics processes and conditions involved in their production. In particular, LZ is expected to be sensitive to both the elastic scattering and the coherent nuclear scattering of neutrinos from, respectively, the pp, 7Be, 13N and 8B reactions in the Sun, thus allowing the study of its core dynamics. Notably, the LZ ~1 keV energy threshold is expected to lead to an increase in sensitivity of about one order of magnitude relative to the current best upper limit of 5.4×10^-11 μB on the neutrino magnetic moment. In this project, the student will be integrated in the activities of the LZ international collaboration, supported directly by the LIP team, which has large experience in all aspects of this type of experiment. The main focus of this project will be on the fundamental properties of neutrinos, their fluxes and sources, but this work will also have a valuable impact on the searches for various dark matter candidates (e.g. WIMPs, Axions, ALPs -- Axion Like Particles). |
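For orientation (an illustrative textbook expression rather than the full LZ signal model), a neutrino magnetic moment \mu_\nu adds to the neutrino-electron elastic scattering cross section a term that grows at low recoil energy T,

\frac{\mathrm{d}\sigma_\mu}{\mathrm{d}T} = \frac{\pi \alpha^2}{m_e^2} \left(\frac{\mu_\nu}{\mu_B}\right)^2 \left(\frac{1}{T} - \frac{1}{E_\nu}\right),

which is why the ~1 keV threshold quoted above translates into a large gain in sensitivity.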
Measuring di-Higgs production at ATLAS to probe the electroweak vacuum
Domain: Experimental Particle Physics Supervisor: Ricardo Gonçalo Co-Supervisor: Filipe Veloso Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Since its discovery, the Higgs boson has become a prime tool to search for physics beyond the Standard Model (SM). At the current level of precision, LHC data indicate a Higgs boson compatible with SM expectations, but we have only just begun to scratch the surface. The wealth of new results on the Higgs boson properties since the discovery has only probed the region around the minimum of the potential, and no experimental constraint has yet been placed on the shape of the Higgs potential. This shape is intimately intertwined with the breaking of the electroweak gauge symmetry, and so with the nature of the fundamental forces we experience today. To constrain it experimentally we must measure the triple Higgs coupling parameter, which is accessible through the simultaneous production of two Higgs bosons, HH. The selected student will join the Portuguese ATLAS team, working in close collaboration with theorists and foreign experimental groups. The two related aims of this project are to contribute to ongoing ATLAS analyses in this channel and to develop sensitivity studies for future analyses. To achieve optimal results in this challenging analysis, the student will employ the latest theory developments and the most recent advances in reconstruction techniques, from boosted object identification to machine learning. As part of the work, the student will also take part in experimental developments towards the LHC upgrade, in particular the development of the Hardware Tracking Trigger co-processor, one of the most advanced parts of the ATLAS upgrade programme. The work programme will include at least one year at LIP Lisbon and one year at the University of Coimbra, where the two supervisors are based. The student will be able to participate in the operation of the ATLAS experiment during LHC Run 3, to start in 2021. Frequent trips to CERN may be required to participate in Control Room shifts and collaboration meetings. |
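To make the connection concrete (standard SM notation, shown for orientation only), after electroweak symmetry breaking the Higgs potential can be expanded around its minimum as

V(h) \supset \frac{1}{2} m_H^2 h^2 + \lambda_3\, v\, h^3 + \frac{1}{4} \lambda_4\, h^4 , \qquad \lambda_3^{\mathrm{SM}} = \lambda_4^{\mathrm{SM}} = \frac{m_H^2}{2 v^2},

so measuring HH production probes \lambda_3 and hence the shape of the potential beyond its minimum.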
Silicon Photomultiplier technology to improve science for TeV wide-field detectors
Domains: Astrophysics | Astroparticle Physics Supervisor: Pedro Assis Co-Supervisor: Michele Doro Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
In applications where a fast response to weak photon signals is needed, Silicon Photomultipliers (SiPMs) are becoming a widespread solution, often providing an optimal alternative to photomultiplier tubes (PMTs). The main advantages of SiPMs are the low operating voltage, low ageing, cost-efficiency and the possibility of novel integrated designs, counterbalanced by a small detector area of a few square millimetres. Combining SiPMs to provide a larger sensitive area - as well as using optical light guides - is possible, provided suitable electronics for precise time alignment are available. A module with 16 SiPM units was successfully designed at the University of Padova for ground-based Cherenkov Telescopes. We now propose to adapt this technology to the needs of the recently proposed LATTES detector. LATTES is a hybrid instrument composed of Water Cherenkov tanks and Resistive Plate Counters, aimed for installation in South America, to detect cosmic gamma rays from 100 GeV to 100 TeV and to study particle acceleration in astrophysical environments such as pulsars and the jets of active galaxies. The candidate will participate in designing SiPM units and configurations, simulating their response, and testing them on prototypes. The results will be applied to simulate the LATTES sensitivity to the emission from the ultrarelativistic jets of active galaxies. The activities will be shared between Lisboa and Padova (Italy) and would grant the successful candidate the title of Doctor Europaeus. |
BH&CSLab - Black Holes and Compact Stars as Laboratory to test alternative theories of gravity
Domains: General Relativity | Astrophysics Supervisor: Daniele Vernieri Co-Supervisor: Daniela Doneva Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Faculdade de Ciências - Universidade de Lisboa / University of Tübingen
Abstract
General Relativity (GR) is the most elegant attempt to describe the gravitational force. Tested to exquisite precision in the weak-field regime, it might not be sufficient for describing the final stages of gravitational collapse and, more generally, high-energy astrophysical events. Furthermore, a deeper understanding of gravitation seems to be a necessary ingredient for solving almost any other major challenge in fundamental physics. A strong breakthrough is therefore required in order to investigate new scenarios, especially at energy regimes where GR loses its predictability. An outstanding opportunity is given by the detections of gravitational waves (GWs), which have opened a new era to test the fundamental nature of gravity at its strongest and matter at its densest in some of the most extreme environments in the cosmos. The goal of the project will be to unveil the fundamental nature of gravity through phenomenological investigations, theoretical modeling and numerical analysis of alternative theories of gravity, using black holes and compact stars as laboratories in order to fully test the dynamics of gravity theories and to probe the spacetime geometry in the strong-field regime. This work will thus pave the way for extracting useful information from past and future GW detections. |
Jet Quenching Monte Carlo Event Generator
Domain: Theoretical Particle Physics Supervisor: Liliana Apolinário Co-Supervisor: Nestor Armesto Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The study of the Quark-Gluon Plasma (QGP), a state of QCD matter characterized by extreme densities and temperatures, is at the forefront of the physics program of both the Large Hadron Collider (LHC) at CERN and the Relativistic Heavy-Ion Collider (RHIC) at BNL. Due to its very short lifetime, the assessment of the QGP properties is only possible by relying on probes that are generated within the collision. Among these, jets, sprays of highly energetic and collimated particles, are the most widely used. Their production and evolution in proton-proton (pp) collisions are exceptionally well understood within the perturbative approach to Quantum Chromodynamics. In the presence of a medium, however, jet formation is known to be modified - a process generically known as jet quenching. By accurately assessing such modifications, jets have the potential to provide a unique tomographic tool of the QGP formation and evolution. The description of jet quenching is still an ongoing theoretical and phenomenological effort. Due to difficulties in consistently describing in-medium jet evolution from high to low momentum scales, several jet quenching models, each with its own analytical approximations and assumptions, have emerged. In addition, as in pp collisions, the interpretation of heavy-ion collision data relies heavily on Monte Carlo event generators. Currently, there are several jet quenching Monte Carlo implementations, but none contains all the recent theoretical features that are known to be essential to describe both LHC and RHIC measurements. In this thesis, the student will further develop an existing jet quenching Monte Carlo that is based on perturbative QCD. The selected candidate will not only accommodate the latest theoretical jet quenching results in this Monte Carlo, but will also further develop the existing jet-QGP interaction model. This work is expected to take place between LIP-Lisbon in Portugal (Liliana Apolinário) and the University of Santiago de Compostela in Spain (Nestor Armesto and Carlos Salgado). |
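As a rough orientation for the physics such a generator must capture (the actual model will be more detailed), in the BDMPS-Z picture of medium-induced radiation the average energy loss of a hard parton scales as

\langle \Delta E \rangle \propto \alpha_s\, C_R\, \hat{q}\, L^2,

where C_R is the colour charge of the parton, \hat{q} the jet quenching transport coefficient of the medium and L the in-medium path length.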
Investigation of astrophysical gamma-ray sources with a new detector concept
Domain: Astroparticle Physics Supervisor: Ruben Conceição Co-Supervisor: Giovanni La Mura Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Very-high-energy (VHE) gamma rays are messengers of violent processes in the universe. In particular, their production is closely connected with the acceleration of the very-high-energy cosmic rays detected on Earth. VHE photons are thus key to understanding the long-standing mystery of the mechanism by which cosmic rays are accelerated. Their intergalactic propagation across vast distances is also sensitive to possible modifications of the structure of space-time at the Planck scale. Moreover, the detection of VHE gamma rays may provide a clue to the nature of dark matter (DM). In fact, weakly interacting massive particles (considered the most plausible form of DM) are expected to mutually annihilate, resulting in measurable signals, among them VHE gamma rays. Dark matter particles tend to accumulate at the centers of galaxies; as such, the center of our galaxy is a preferred spot to look for these DM signals. While most VHE gamma-ray detectors currently in operation are located in the northern hemisphere, several of the next-generation detectors are planned to be installed in the southern hemisphere, to have a privileged view of the galactic center. In this context, LATTES (Large Area Telescope for Tracking Energetic Sources) is a project currently involving groups from Brazil, Italy, and Portugal (LIP), aiming to develop a next-generation gamma-ray detector to be installed in South America. The region of the Atacama Desert in northern Chile, at an altitude above 5000 meters, is one of the most promising sites. One of the biggest challenges to be addressed by LATTES is to bridge the gap between gamma-ray observations with satellites such as Fermi, sensitive up to several tens of GeV, and the present and planned ground-based gamma-ray experiments, which only become sensitive at several hundreds of GeV. By employing a hybrid detection technique and being deployed at high altitude, LATTES should be able to detect photons with energies as low as 100 GeV. The selected candidate will work on the assessment of the LATTES science capabilities, namely on its ability to observe and extract information from extreme-energy events such as binary neutron star mergers (which are associated with gamma-ray bursts and with the production of gravitational waves) or flares from the jets accelerated by massive black holes and fast-rotating pulsars. This task will involve the modeling and interpretation of observational data and the development of analysis tools to improve the sensitivity of this future experiment. |
Exploring the meson structure by the Drell-Yan process at the COMPASS Experiment at CERN
Domain: Experimental Particle Physics Supervisor: Catarina Quintans Co-Supervisor: Pedro Abreu Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The subject of this thesis is the study of the most recent pion-induced Drell-Yan data collected by the COMPASS experiment at CERN, with the aim of understanding the momentum distributions of partons inside hadrons. The Drell-Yan process consists of quark-antiquark annihilation with the detection of a pair of leptons in the final state. The COMPASS data now available are a unique data set that can finally bring new input to the parton distribution functions of the pion, the lightest hadron, which are up to now very poorly known. This situation is at variance with the high precision already reached for the proton. The relevance of the intrinsic transverse momentum of quarks and gluons inside hadrons, manifested in the measured angular dependencies of the Drell-Yan process, can be studied from the COMPASS data in a region of phase space not accessible to collider experiments. These data shall have a significant impact on our understanding of the pion and proton internal dynamics. The PhD candidate will carry out this work integrated in a team of experienced researchers. |
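For orientation, the angular dependencies referred to above are usually parameterized in the dilepton rest frame (e.g. the Collins-Soper frame) as

\frac{1}{\sigma}\frac{\mathrm{d}\sigma}{\mathrm{d}\Omega} \propto 1 + \lambda \cos^2\theta + \mu \sin 2\theta \cos\varphi + \frac{\nu}{2} \sin^2\theta \cos 2\varphi,

where the coefficients \lambda, \mu and \nu are sensitive to the transverse-momentum-dependent structure of the pion and the proton.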
Charged particle astronomy and high energy particle physics with the Pierre Auger Observatory
Domain: Astroparticle Physics Supervisor: Lorenzo Cazon Co-Supervisor: Ruben Conceição Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Ultra High Energy Cosmic Rays (UHECR) are the most energetic particles known so far. They exceed by several orders of magnitude the energies achieved at the LHC and are produced in the most violent places of the cosmos. Yet very little is known about their true nature and how they reach such extraordinary energies. The sources of the UHECR have remained a mystery for decades; the astrophysical acceleration scenarios point to the most violent phenomena in nature, like Active Galactic Nuclei or Gamma Ray Bursts. The Pierre Auger Observatory is the largest facility ever built to study cosmic rays. It uses 1600 water Cherenkov detectors covering an area of 3600 km2 to sample the shower particles when they reach the ground. Those detectors measure the energy deposited by charged particles through the Cherenkov light they release in water. In addition, an optical telescope collecting ultraviolet light, produced by fluorescence of the nitrogen molecules excited by the cascade particles, can image the longitudinal development of the shower as it crosses the atmosphere. The observatory is also deploying a series of complementary detectors that include antennas for radio detection, a scintillator on top of the Cherenkov tanks, a set of buried scintillators (AMIGA), and an engineering array of segmented RPCs beneath the Cherenkov tanks (the MARTA engineering array). The student will become a full member of the collaboration and is expected to take an active role in one or more of the several tasks of the experiment, ranging from detectors and performance (calibration, operations, and installation/deployment), to analysis foundations (data analysis, reconstruction algorithm development, computing), to nuclear mass composition (which includes photons and neutrinos), hadronic interactions, astrophysical scenarios and arrival directions (which involve data analysis, physics interpretation, and physics model building). He or she will also be expected to make field trips to the Observatory site, in the Argentinian city of Malargue, near the Andes. The student is expected to lead and co-author papers in leading journals. The Pierre Auger Observatory is among the most cited collaborations in the field, so the exposure and impact of the Ph.D. thesis results are ensured. In addition, the work of the student has a quick starting point for dissemination, as the collaboration itself is a vast audience of more than 500 scientists around the world. This field demands the confluence of scientists from interdisciplinary fields, and the observatory has thus become an excellent place to study topics ranging from atmospheric physics and electromagnetism (for instance, elves and lightning formation) to earthquakes, solar physics, and many others. |
Study of high energy hadronic cascade through muons
Domains: Theoretical Particle Physics | Experimental Particle Physics | Astroparticle Physics Supervisor: Lorenzo Cazon Co-Supervisor: Ruben Conceição Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Ultra High Energy Cosmic Rays (UHECR) are the most energetic particles known in nature, and their astrophysical sources and nature are still unknown. As they enter the Earth’s atmosphere, they collide with atoms, typically generating thousands of secondaries, which can interact again, creating a multiplicative process known as an Extensive Air Shower (EAS), which can reach up to 10^11 particles at ground level for 10^20 eV showers. The first interactions occur at center-of-mass energies up to 400 TeV, more than one order of magnitude above the most energetic human-made accelerator. This means that UHECRs are a unique opportunity to study particle physics above the LHC energy scale. However, although the EAS encodes information both about the nature of the primary (which is expected to be of hadronic nature – proton to iron) and about the characteristics of the hadronic interactions (which shape the development of the EAS), this information is degenerate. A promising tool to break this degeneracy is the study of muons. Muons come from the decays of charged mesons, which are a direct by-product of hadronic interactions. Moreover, muons can travel many kilometers from the hadronic shower almost unaffected, carrying valuable information. The understanding of the muon distributions is an essential key to breaking the degeneracy between the uncertainties in the extrapolation of the hadronic interaction models to the highest energies and the composition of the UHECR beam. The study of the air shower can be done by means of the cascade equations, assuming some simplifications, or using full Monte Carlo simulations that include many important details challenging to account for otherwise. On the other hand, Heitler models offer a simplified version of the main multiplicative process of a cascade and serve to qualitatively understand the most important features, giving approximate values for the relevant variables of the cascade. Although Monte Carlo simulations offer the most complete description of the shower, this comes at the cost of losing insight into the main underlying physics. Hence, in this thesis, we propose to investigate the muon distributions of the EAS using analytical models. This is a complex physics problem that requires a combined effort from different points of view: mathematical, statistical and analytical. This would allow not only identifying the main shower properties that drive the main characteristics of the muon distributions, but would also give a profound knowledge of their connection to the hadronic shower. The results of this work would naturally be used to extract information about high-energy hadronic interactions from experimental measurements of the muon distributions, in particular those conducted at the Pierre Auger Observatory. |
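As an example of the kind of analytic estimate in question, the Heitler-Matthews model predicts, for the muon number of a proton shower of energy E_0 and of a nucleus of mass number A (superposition principle),

N_\mu^{p} = \left(\frac{E_0}{\xi_c^{\pi}}\right)^{\beta}, \qquad N_\mu^{A} = A^{\,1-\beta}\, N_\mu^{p}, \qquad \beta \simeq 0.9,

where \xi_c^{\pi} is the critical energy at which charged pions decay rather than re-interact; relations of this type are a natural starting point for the analytical models proposed here.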
Data Science: new tools for Astroparticle Physics and its applications to the industry
Domain: Astroparticle Physics Supervisor: Lorenzo Cazon Co-Supervisor: Ruben Conceição Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Data Science is an old field with a renewed look, thanks to the advances of computing science. It consists in analyzing data sets to find correlations, causal relations and patterns; building hypotheses and assigning significances to them; assessing the efficiency of an algorithm in finding a signal and its probability of false positives; defining control samples; simulating and replicating reality according to a model; accessing, storing and retrieving moderate to extremely large data sets (Big Data); and creating automatic decision-making tools (machine learning). Data Science is key in modern society outside fundamental science. Data scientist is one of the most sought-after jobs of the moment by a large variety of companies, for instance social networks, large retail companies, pharmaceutical, consulting and telecommunication companies, among others. The goal of this Ph.D. is to develop and explore new tools in the domain of astroparticle physics, namely for the data collected by the Pierre Auger Collaboration: muons collected at the ground and their relation with high-energy hadronic interaction models; ultra-high-energy cosmic ray arrival directions and their connection with astrophysical sources. The student will also take part in a joint project with a well-known company within a research project in Data Science. He/she will apply the standard techniques of physics in general, and the tools developed within this project in particular, aiming at strengthening the synergies between fundamental research and industry. The proposed Ph.D. combines the data analysis performed in Academia with the needs of Modern Industry. |
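As a small, self-contained example of the kind of statistical tool involved (illustrative only; the numbers are made up), the expected discovery significance of a counting experiment with s expected signal events on top of b expected background events can be estimated with the so-called Asimov approximation:

import math

def asimov_significance(s, b):
    # expected discovery significance for a Poisson counting experiment
    # with s signal and b background events (Asimov approximation)
    return math.sqrt(2.0 * ((s + b) * math.log(1.0 + s / b) - s))

# made-up example: 10 signal events on top of 50 expected background events
print(asimov_significance(10.0, 50.0))  # about 1.37 sigma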
Design of a next-generation detector for gamma-ray astrophysics in South America
Domains: Experimental Particle Physics | Astroparticle Physics Supervisor: Ruben Conceição Co-Supervisor: Bernardo Tomé Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Very-high-energy (VHE) cosmic gamma rays are messengers of violent processes in the universe. In particular, their production is closely connected with the acceleration of the very-high-energy cosmic rays detected on Earth. VHE photons are thus key to understanding the long-standing mystery of the mechanism by which cosmic rays are accelerated. Also, their intergalactic propagation across vast distances is sensitive to possible modifications of the structure of space-time at the Planck scale. Moreover, the detection of VHE gamma rays may provide a clue to the nature of dark matter (DM). In fact, weakly interacting massive particles (considered the most plausible form of DM) are expected to mutually annihilate, giving rise to the creation of particles, among them VHE gamma rays. Dark matter particles tend to accumulate at the centers of galaxies; as such, the center of our galaxy is a preferred spot to look for these DM signals. While most VHE gamma-ray detectors currently in operation are located in the northern hemisphere, several of the next-generation detectors are planned to be installed in the southern hemisphere, to have a privileged view of the galactic center. LIP is currently involved in an international project, involving teams from countries such as Brazil, Italy, Germany, and the USA, to develop and optimize a new concept for a large field-of-view gamma-ray observatory to be installed in South America. The region of the Atacama Desert in northern Chile, at an altitude above 5000 meters, is one of the most promising sites. One of the biggest challenges in the design of this novel detector concept is to bridge the gap between gamma-ray observations with satellites such as Fermi, sensitive up to several tens of GeV, and the present and planned ground-based gamma-ray experiments, which only become sensitive at several hundreds of GeV. This would allow observing the southern sky with a high duty cycle in an energy region where serendipitous events from, e.g., VHE gamma-ray flares from Active Galactic Nuclei or Gamma-ray bursts are expected to occur. The observatory would thus also play an essential role in issuing alerts to other observatories, contributing in an important way to the multi-messenger observation network. The interested candidate will participate in the activities of the LIP team. The work to be performed includes: the simulation of the detector concept using the Geant4 toolkit and air-shower simulation tools such as CORSIKA; the development of data analysis tools and studies of the performance of the full-scale detector; and participation in the commissioning of the prototype detector to be installed in South America. |
Electronic and optical properties of liquid xenon for particle detection
Domain: Experimental Particle Physics Supervisor: Vitaly Chepel Institution: Universidade de Coimbra Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Quite a number of experiments in particle and astroparticle physics are using, or planning to use, noble-liquid detectors, given their outstanding electronic and optical properties. Among those experiments one can mention direct dark matter searches, searches for neutrinoless double beta decay, proton decay and flavour violation in muon decay, and neutrino physics. Detectors based on liquefied noble gases can provide high energy resolution and excellent tracking capabilities and, at the low-energy end, can reach sensitivity to a single electron created in the liquid by a traversing particle. It is the possibility of simultaneous detection of scintillation light and ionisation electrons from a particle track, but above all the possibility to extract these electrons from the liquid to the gas phase, that makes noble liquids so special. In order to push the detectors to the limit of their capabilities, one needs to understand in detail the physical processes that a particle triggers in a medium such as liquid xenon or argon. Although a great effort has been made in this direction, one can identify some gaps in our understanding that are important to fill. It is therefore proposed that the candidate carry out laboratory studies of the electronic and optical properties of liquid xenon (and eventually liquid argon). |
Enhanced event discrimination in liquid scintillators through the use of advanced pattern recognition techniques
Domains: Experimental Particle Physics | Astroparticle Physics Supervisor: Fernando Barao Co-Supervisor: Nuno Barros Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The SNO+ experiment aims to observe the rare neutrinoless double beta decay by doping a large volume of liquid scintillator with natural tellurium, reaching the largest sensitivity of the experiments currently built. By using a large volume of liquid scintillator, SNO+ is able to obtain a competitive sensitivity to this decay while displaying a rich physics program, including solar neutrinos, reactor and geo-antineutrinos, and sensitivity to the neutrino signal of a supernova. One of the main challenges of a liquid-scintillator experiment is the discrimination between different physics channels involving different particles. For instance, solar neutrino analyses could benefit from directional separation from backgrounds, which are not correlated with the Sun's direction. Conversely, solar neutrinos are a major background for the double beta decay search, and their impact can be strongly reduced using directionality reconstruction. The main signal emitted by a charged particle in SNO+ comes from scintillation light, which is emitted uniformly over all directions. Nevertheless, a smaller light component is present due to Cherenkov radiation, which is highly directional. This thesis proposes to explore the use of advanced pattern-recognition methods to enhance the discrimination capabilities of the experiment between different particle types and event topologies. The work will include the development of methods able to separate the Cherenkov and scintillation components. In addition, the identification of different particle types (electron, positron, proton, alpha) can be explored via time and signal-charge profiles. This will not only provide a better understanding of the backgrounds, but also improve the purity of the signal data sample and the physics sensitivity of the experiment. In a later phase, the implemented techniques will be applied to searches for solar neutrinos and double beta decay in the SNO+ datasets. |
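As a schematic illustration of the Cherenkov/scintillation separation described above (synthetic hit times and illustrative constants only; the real analysis would rely on the full detector simulation), a simple discriminating variable is the fraction of hits arriving in a prompt time window:

import numpy as np

rng = np.random.default_rng(0)

def toy_event(n_hits=500, f_cherenkov=0.03, tau_scint=5.0):
    """Hit times in ns: a prompt, Cherenkov-like spike plus an exponential scintillation tail."""
    n_c = rng.binomial(n_hits, f_cherenkov)
    t_c = rng.normal(0.0, 0.5, size=n_c)                  # prompt component, ~0.5 ns assumed time resolution
    t_s = rng.exponential(tau_scint, size=n_hits - n_c)   # scintillation decay, 5 ns assumed
    return np.concatenate([t_c, t_s])

def prompt_fraction(times, window=1.0):
    return float(np.mean(times < window))

print(prompt_fraction(toy_event()), prompt_fraction(toy_event(f_cherenkov=0.0)))

In practice the directionality of the Cherenkov hits relative to the reconstructed vertex adds further separation power.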
Modelling and impact of electric field distortions on the DUNE Long Baseline Neutrino program
Domain: Experimental Particle Physics Supervisor: Fernando Barao Co-Supervisor: Nuno Barros Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The DUNE experiment is a next-generation neutrino experiment currently under development, to be deployed at SURF in Lead, South Dakota. By using the broadband neutrino beam from LBNF at Fermilab, DUNE's main goals are to determine the neutrino mass hierarchy and to observe leptonic CP violation. The primary technology employed by the experiment is a liquid argon TPC, which gives the experiment not only unparalleled imaging capabilities but also a rich physics program beyond the main accelerator neutrino program, including low-energy neutrinos, nucleon decay, atmospheric neutrinos, and sensitivity to neutrinos from supernovae within this and nearby galaxies. A prototype program is currently being pursued at CERN with two 1 kt cryostats, which will be used to study events from cosmic rays and from a proton beam, allowing the production of the final states of the major neutrino interactions expected in DUNE. One of the main challenges of this technology is the proper modelling of the detector response and the various effects that influence it. The particle signal is built from argon ionisation, whose positive ions and electrons drift in opposite directions, pushed by the large applied electric field. Knowledge of this field is of paramount importance for the reconstruction accuracy, and distortions of the electric field in the cryostat can be caused by the accumulation of slow argon ions. ProtoDUNE, installed at the Earth's surface, benefits from the large number of muons crossing it, which provide straight calibration tracks. This thesis will use the cosmic-ray events taken in the first run of the experiment to understand these so-called space-charge effects. This will make it possible to calibrate the detector over its whole volume, improving its response and therefore reducing the systematic uncertainties in the event reconstruction. In a later phase, planned for 2021, a dedicated calibration system based on a powerful laser able to ionise argon will be developed and deployed. The thesis plan will include working with this dedicated calibration system and contributing to the development of the DUNE calibration strategy by comparing the electric field distortions obtained with the two complementary methods: the cosmic-muon analysis and the laser system. |
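As a minimal one-dimensional illustration of the space-charge effect discussed above (all numbers are assumed for the example and are not protoDUNE values), a uniform density of slow positive ions tilts an otherwise uniform drift field:

import numpy as np

eps0, eps_r = 8.854e-12, 1.5        # vacuum permittivity (F/m) and approximate LAr relative permittivity
d = 3.6                             # m, assumed drift length
E_nominal = 5.0e4                   # V/m (500 V/cm nominal drift field)
rho_ion = 1.0e-8                    # C/m^3, hypothetical slow-ion charge density

x = np.linspace(0.0, d, 200)
# Gauss's law in 1D, dE/dx = rho/(eps0*eps_r), keeping the volume-averaged field at its nominal value
E = E_nominal + (rho_ion / (eps0 * eps_r)) * (x - d / 2.0)
print(f"relative field distortion at the faces: {100*(E[0]-E_nominal)/E_nominal:+.1f}% / {100*(E[-1]-E_nominal)/E_nominal:+.1f}%")

The real effect is three-dimensional and entangled with the ion drift itself, which is why it is calibrated with cosmic-muon tracks and, later, with the laser system.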
Machine learning and neutron star physics
Domain: Astrophysics Supervisor: Constança Providência Co-Supervisor: Márcio Ferreira Institution: Universidade de Coimbra Host Institution: Universidade de Coimbra
Abstract
There is a growing interest in applying Machine Learning (ML) techniques to astrophysical problems. The challenging issues faced in astrophysics, and in particular in Gravitational Wave (GW) physics, demand different perspectives and approaches from computer science, and the efficient handling of complex and massive data sets demands new techniques and algorithms. One of the open questions in nuclear physics is the determination of the equation of state of nuclear matter. Neutron stars (NSs) are singular physical systems that allow one to investigate the equation of state of nuclear matter under extreme conditions, far beyond the ones attainable in laboratory experiments. The GW signals emitted during a neutron-star merger carry crucial information about neutron-star matter at high densities. The tidal deformability is one important signature carried by GWs that constrains the equation of state of nuclear matter. The Advanced LIGO and Advanced Virgo gravitational-wave detectors made their first observation of a binary NS inspiral in August 2017. The GW170817 observation exposed the potential to directly probe the physics of NSs, opening a new era in multi-messenger astronomy and nuclear physics. From the analysis of the GW170817 data, it was possible to set an upper bound on the NSs' combined dimensionless tidal deformability. Further insights into the physics of neutron stars are expected from future gravitational-wave observations. Bayesian analysis is one of the standard methods for inferring physics information from experimental observations. Bayesian inference relies on calculating the posterior probability distribution, which is the product of the likelihood function and the prior probability distribution. The prior distribution encodes our present knowledge of some physical quantity and, for a limited amount of available data, its choice becomes crucial for the inference; only for a sufficiently large data set does the inference become weakly dependent on the prior. Machine learning methods might be a reliable alternative for inference even when we are faced with a limited experimental data set. Deep learning is a branch of machine learning based on hierarchical structures of neural networks that extract high-level representations from data [1]. Due to the non-linear representations that these hierarchical structures can learn from data, they are highly effective in tackling complex non-linear systems. They have been applied very successfully to image and speech pattern recognition and to automated translation. Currently, there is an increasing interest in the application of these deep learning methods in many areas of physics. In condensed matter physics, they were applied to the identification of phase transitions [2]; in particle physics, to the processing of experimental heavy-ion collision data sets [3]. Recently, they were employed to recognise the different phases of a quantum field theory system and to predict the values of several observables [4]. The use of deep learning in neutron-star physics was first explored in [5], where a neural network was employed as an efficient procedure for mapping a finite set of mass-radius data with observational errors onto an equation of state. The present project proposes to further explore these deep learning methods in constraining the equation of state of nuclear matter.
It will enable physical inference on the equation-of-state properties from combined experimental and observational results. The generation of the data set on which the deep learning methods will be trained is a crucial step, and it must consist of physically reliable equations of state. [1] Mehta, P., et al. (2019), "A high-bias, low-variance introduction to machine learning for physicists", Physics Reports. [2] Carrasquilla, J. & Melko, R. G. (2017), "Machine learning phases of matter", Nature Physics 13(5), 431. [3] Pang, L.-G., et al. (2018), "An EoS-meter of QCD transition from deep learning", Nature Communications 9(1), 210. [4] Zhou, K., Endrődi, G. & Pang, L. G. (2018), "Regressive and generative neural networks for scalar field theory", arXiv:1810.12879. [5] Fujimoto, Y., Fukushima, K. & Murase, K. (2018), "Methodology study of machine learning for the neutron star equation of state", Physical Review D 98(2), 023019. |
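A schematic sketch of the kind of mapping explored in [5] and proposed above (synthetic data only; the network size, inputs and targets are placeholders rather than the project's actual choices):

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_stars = 10                                     # (mass, radius) points per simulated "observation"
X = rng.normal(size=(500, 2 * n_stars))          # placeholder standardised mass-radius sets
y = rng.normal(size=(500, 3))                    # placeholder EoS parameters (e.g. pressures at fixed densities)

model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000)
model.fit(X, y)                                  # in the project, X and y would come from physically reliable EoS
print(model.predict(X[:1]))                      # inferred EoS parameters for one mass-radius set

The crucial step emphasised in the abstract is the construction of the training set itself, not the regression machinery.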
2 Fast 2 Furious Universe
Domains: General Relativity | Cosmology Supervisor: Nelson Nunes Co-Supervisor: Tiago Barreiro Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Faculdade de Ciências - Universidade de Lisboa
Abstract
The realization that the Universe is accelerating motivates the idea of dark energy. This project aims to understand its nature and how it interacts with the other components: dark matter, baryons, radiation and neutrinos. The crucial starting point is the most general scalar-tensor theory that leads to viable theoretical cosmologies. The student will test the free functions of the theory against current and forthcoming observational data (ESPRESSO, Euclid, LISA). This is both a theoretical and a hands-on data project. |
Muon Tomography applied to geological structures
Domain: Astroparticle Physics Supervisor: Lorenzo Cazon Co-Supervisor: Mourad Bezzeghoud Institution: Universidade de Évora Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Muons are deeply penetrating particles which are naturally present at the Earth's surface, as a consequence of the interaction of cosmic rays with the atmosphere. By measuring the attenuation of the open-sky flux along different directions, they can be used to radiograph, or better said, muograph the interior of large volumes of material such as geological structures, volcanoes, mines or archaeological sites. Muon tomography is a direct application born from particle and astroparticle physics, and it is rapidly growing towards civil engineering applications, geotechnics, geophysics, monitoring of materials, and homeland security. This thesis project is about exploring the capabilities of muon tomography for geological applications from the interior of a well-surveyed mine. Muographies are obtained by comparing, direction by direction, the open-sky atmospheric cosmic-ray muon flux with the flux attenuated by the passage through the large structure. The student will collaborate in designing, building, operating and analysing the data of a muon telescope prototype, based on Resistive Plate Chambers (RPCs), to be installed in the Lousal mine in Alentejo. In order to validate the results of the muon tomography, the student will participate in the production of a high-definition three-dimensional geophysical model of the mine structures, built by combining georadar, seismic and gravimetric data obtained from a geophysical campaign specially prepared for this purpose. The sensitivity for finding cavities, mineral deposits, aquifers, bedrock or fragmented rock, and humid or dry soil, which are important in geotechnics and mining applications, will be tested. The results will be used to develop a solution to gain insight into the volcanic activity of the Azores islands. |
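A toy version of the opacity measurement described above (illustrative constants; a real analysis uses measured muon spectra and the detector acceptance) converts a transmission ratio into an average density times path length:

import numpy as np

a = 2.0e-3      # GeV per (g/cm^2), approximate minimum-ionising energy loss
gamma = 2.0     # assumed integral spectral index of the open-sky muon flux, I(>E) ~ E^-gamma
E0 = 1.0        # GeV, assumed open-sky threshold

def transmission(X):
    """Fraction of open-sky muons surviving an opacity X in g/cm^2 (toy forward model)."""
    E_min = max(E0, a * X)
    return (E_min / E0) ** (-gamma)

def opacity(T):
    """Invert the toy model: opacity giving transmission T."""
    return E0 * T ** (-1.0 / gamma) / a

X = opacity(0.05)                                   # e.g. 5% of the open-sky flux survives
print(f"opacity ~ {X:.0f} g/cm^2, i.e. ~{X/265:.0f} m of standard rock (2.65 g/cm^3)")

Repeating this direction by direction yields the density map ("muography") of the target volume.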
Muogravimetric joint inversion.
Domain: Astroparticle Physics Supervisor: Lorenzo Cazon Co-Supervisor: José Borges Institution: Universidade de Évora Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Muons are deeply penetrating particles which are naturally present at the Earth's surface, as a consequence of the interaction of cosmic rays with the atmosphere. By measuring the attenuation of the open-sky flux along different directions, they can be used to radiograph, or better said, muograph the interior of large volumes of material, from geological structures such as volcanoes or mines to archaeological sites. Existing techniques include gravimetric, magnetic, seismic, electric and electromagnetic methods, or simply drilling; however, each technique has limited sensitivity or spatial resolution. In general, muons will allow mapping at deeper levels. Currently, gravimetry is the general-purpose method for density mapping and also provides information on density contrasts from measurements of the vertical component of the local gravity field. Like muography (transmission muon tomography), it is linearly linked to the density of the material, but the two differ in spatial resolution and sensitivity. Muon telescopes can be placed in existing tunnels to observe the muon flux for brownfield mining applications, and the muon-tomography images correctly identify the location of mineralised rock. An enhanced 3D density algorithm combines two or more sensors, or different measurement points, into a 3D image through an optimised inversion process. The greatest advantage of muography is its high spatial resolution compared with other geophysical methods, in particular gravimetry. Like gravimetry, the inversion of muon data is affected by non-uniqueness; in fact, the number of muon trajectories may not be enough to resolve small-scale geological density models. Since both muon tomography and gravimetry are geophysical methods that provide information on the density structure of the Earth's subsurface, our approach to imaging a density distribution is to invert gravity and muon data jointly. Additionally, the resolution in deeper regions not sampled by muon tomography (due to geometric constraints or excessive depth of the targets) will be significantly improved by joining the two techniques. Three strategies will therefore be pursued: 1) imaging with muons; 2) muons as a priori input for conventional inversion of gravity data; 3) imaging with gravity and muon data jointly. The student will develop and implement the theoretical and computational methodologies for inverting the results of muon surveys and for the joint inversion of muon and gravity surveys. These will be tested with simulations of muon propagation through synthetic 3D models, before being applied to real muon data. The final performance of the methodology will be assessed by comparing the reconstructed density distributions with the pre-existing information. |
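Since both observables depend linearly on the cell densities, the joint inversion mentioned above can be sketched as a stacked least-squares problem (synthetic kernels and data; the project would use physically computed kernels and a regularised inversion):

import numpy as np

rng = np.random.default_rng(1)
n_cells = 50
G_mu = rng.random((30, n_cells))                 # placeholder muon path-length kernel
G_g = rng.random((20, n_cells))                  # placeholder gravity kernel
m_true = rng.random(n_cells)                     # synthetic density model
d_mu = G_mu @ m_true + 0.01 * rng.normal(size=30)
d_g = G_g @ m_true + 0.01 * rng.normal(size=20)

w = 1.0                                          # relative weight of the gravity data
A = np.vstack([G_mu, w * G_g])
b = np.concatenate([d_mu, w * d_g])
m_est, *_ = np.linalg.lstsq(A, b, rcond=None)
print("rms density-model error:", np.sqrt(np.mean((m_est - m_true) ** 2)))

In the real problem the system is ill-posed, so strategies 2) and 3) above amount to choosing how the muon information regularises the gravity inversion.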
Simulation and reconstruction of cosmic muons for muon tomographic applications.
Domain: Astroparticle Physics Supervisor: Lorenzo Cazon Co-Supervisor: Pedro Assis Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Muons are deeply penetrating particles which are naturally present at the Earth's surface, as a consequence of the interaction of cosmic rays with the atmosphere. By measuring the attenuation of the open-sky flux along different directions, they can be used to radiograph, or better said, muograph the interior of large volumes of material, from geological structures such as volcanoes or mines to archaeological sites. LIP is currently deploying its first prototype in a non-active mine in Lousal, to prove the capabilities and sensitivity of the technique. Muon tomography is an applied field born from particle and astroparticle physics, and it is rapidly growing towards civil engineering applications, geotechnics, geophysics, monitoring of materials, homeland security, etc. The detectors used to perform muon tomography need to reconstruct the direction of the incoming muon. LIP is a leader in timing Resistive Plate Chambers (tRPCs), high-performance detectors with a relatively low cost. Several tRPC planes, forming a tRPC muon telescope, are capable of reconstructing muon trajectories with mrad precision. The goals of this thesis are: * to create the analysis tools to interpret "transmission" and "scattering" muon-tomography data sets from real data collected by a prototype at LIP; * to create a simulation package for the passage of cosmic-ray muons through 3D structures (and subsequently apply the analysis tools to it). |
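A minimal sketch of the trajectory reconstruction mentioned above (plane spacing, hit resolution and the fit itself are illustrative placeholders, not the telescope design): a straight line x(z) = a + b z is fitted through the hits of a few tRPC planes,

import numpy as np

z_planes = np.array([0.0, 0.5, 1.0, 1.5])        # m, assumed plane positions
sigma_x = 1.0e-3                                 # m, assumed single-plane hit resolution
true_a, true_b = 0.10, 0.20                      # intercept (m) and slope of a toy muon track

rng = np.random.default_rng(2)
x_hits = true_a + true_b * z_planes + rng.normal(0.0, sigma_x, size=z_planes.size)

A = np.vstack([np.ones_like(z_planes), z_planes]).T
(a_fit, b_fit), *_ = np.linalg.lstsq(A, x_hits, rcond=None)
slope_err = sigma_x / np.sqrt(np.sum((z_planes - z_planes.mean()) ** 2))   # ~ angular resolution in rad
print(f"fitted slope {b_fit:.3f} (true {true_b}), ~{1e3*slope_err:.1f} mrad angular resolution")

and the same fit, done in two orthogonal projections, gives the muon arrival direction used in the tomographic analysis.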
Exploring the Hidden Sector of Particle Physics at the SHiP experiment
Domain: Experimental Particle Physics Supervisor: Celso Franco Co-Supervisor: Nuno Leonardo Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The discovery of the Higgs boson at the LHC in 2012 made the Standard Model of elementary particles complete. Still, several well-established observational phenomena - neutrino masses and oscillations, dark matter, and the baryon asymmetry of the Universe - cannot be explained with known particles alone and clearly indicate that New Physics should exist. The fact that no definitive evidence of new particles has been found so far suggests that they are either heavier than the reach of present-day accelerators or interact very weakly. The SHiP experiment is designed to search for extremely feebly interacting, relatively light and long-lived particles at the intensity frontier. The experiment can also probe the existence of Light Dark Matter through the observation of its scattering on electrons and nuclei in its neutrino-detector material. In the region from a few MeV/c^2 to 200 MeV/c^2, the SHiP sensitivity reaches below the limit which gives the correct relic abundance of dark matter. SHiP is being proposed as a discovery experiment, but it also includes a rich program of tau-neutrino physics and measurements of neutrino-induced charm production. The plan of work is to develop the simulation and reconstruction software of the SHiP experiment with the goal of studying the production and direct detection of Heavy Neutral Leptons (HNLs). These leptons are regarded as the right-handed partners of the Standard Model neutrinos and, if found in the phase-space region (uniquely) covered by SHiP, the HNLs can provide a natural explanation for dark matter, neutrino masses and the baryonic asymmetry of the Universe. The selected student will also perform a detailed study of possible background sources mimicking HNL decays, with a main focus on neutrino-induced backgrounds. Machine learning algorithms will be used to suppress the residual backgrounds in the part of the spectrometer dedicated to Hidden Sector physics. The student will be integrated as a member of the SHiP Collaboration and trips to CERN are expected. Some trips between Lisboa and Coimbra are also foreseen: the group in Coimbra is a candidate to build a precise timing detector whose properties need to be defined from the HNL simulations. |
Higher dimension operators for a composite Higgs
Domain: Theoretical Particle Physics Supervisor: Brigitte Hiller Co-Supervisor: Alex Blin Institution: Universidade de Coimbra Host Institution: Universidade de Coimbra
Abstract
The announcement of the discovery of a Higgs boson in 2012 at the CERN Large Hadron Collider has since been followed by numerous measurements to confirm its properties in terms of the Standard Model. To date, deviations in its couplings at the level of about ten percent are still possible, leaving open a window for extensions of the Higgs sector [1]. One of the avenues is to admit a composite Higgs in terms of quark substructures, associated with a dynamically generated top-antitop condensate at the electroweak scale in a Nambu-Jona-Lasinio (NJL)-like mechanism [2], where a scalar boson emerges with a mass twice as large as that of the top quark. Confronted with the presently known values of the top quark and Higgs masses, one must however conclude that they are not compatible with that prediction, even after considering the associated renormalization group (RG) evolution. We propose to take into account higher-dimension multiquark operators, which are known to allow a reduction in the mass of the scalar isoscalar meson sigma in the low-energy spectrum of QCD, resulting instead from chiral symmetry breaking [3]. Due to the universal character of dynamical symmetry breaking phenomena, similar effects may be expected at different scales. The inclusion of the higher-dimension operators requires a careful reformulation of the Higgs-sector dynamics and couplings, starting from a classification of the interactions relevant at the pertinent scale within an effective Lagrangian approach, and of how they impact the related Higgs production and decay rates, the gauge bosons and the rho parameter, the fermion families and the RG equations. Bibliography: [1] H. E. Logan, TASI 2013 lectures on Higgs physics within and beyond the Standard Model, arXiv:1406.1786 [hep-ph]. [2] W. A. Bardeen, C. T. Hill and M. Lindner, Phys. Rev. D41 (1990) 1647. [3] A. A. Osipov, B. Hiller, A. H. Blin and J. da Providência, Annals Phys. 322 (2007) 2021-2054. |
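For orientation (a textbook leading-order result, quoted here for context rather than taken from the proposal), the NJL-type bound-state relation ties the scalar mass to the fermion forming the condensate,

m_H^{\rm (NJL,\,LO)} \simeq 2\, m_t \approx 345\ \mathrm{GeV} \gg 125\ \mathrm{GeV},

which makes explicit why RG evolution alone does not suffice and why the higher-dimension multiquark operators proposed here are invoked to lower the prediction.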
Modified gravity: linear and non-linear cosmological probes
Domains: General Relativity | Cosmology | Astrophysics Supervisor: Noemi Frusciante Co-Supervisor: Francisco Lobo Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Faculdade de Ciências - Universidade de Lisboa
Abstract
The recent cosmic acceleration is challenging the theory of General Relativity (GR), which is at the basis of the standard cosmological model, LCDM. This has led to proposals, in place of the cosmological constant (Lambda), of new alternatives in the form of dynamical dark energy (DE) or theories of modified gravity (MG). Common properties of these proposals are that the dynamics of the graviton is modified at large scales, while at intermediate scales the modifications are regulated by screening mechanisms, which suppress them on smaller scales. Ongoing and upcoming cosmological surveys will provide high-precision data, allowing gravity to be tested with unprecedented accuracy, and the new era of multi-messenger astronomy has opened new possibilities for testing gravity. On the theoretical side, while many modified gravity models are still allowed by type Ia supernova (SNIa) and Cosmic Microwave Background (CMB) data, structure formation can help us distinguish between them and the standard scenario, thanks to their signatures on the matter power spectrum in the linear and mildly non-linear regimes. Thus, to fully exploit the data at our disposal, e.g. galaxy clustering, CMB lensing and, most importantly, weak lensing data, one has to scrutinise the properties of DE/MG models at all scales, in particular the effects of screening mechanisms on structure formation. The integration of screening and non-linear effects in numerical investigations is still an open issue. The goal of the PhD project will be to unveil the real nature of the theory of gravity. To achieve this, the student will apply theoretical modelling and numerical methods to construct new tools to test linear and non-linear effects of modified gravity. The analysis tools developed by the student are expected to be used in the upcoming ESA Euclid mission, in which the host institution has a leading role. In order to ensure a successful PhD, this project contains theoretical and numerical elements that are flexible, so that they can fit the student's skills and expertise. |
Combining 2D Elemental Distributions obtained with Ion Beam Analysis to construct 3D distributions and Applications to perovskite Solar Cells (CEDIBASC)
Domain: Experimental Particle Physics Supervisor: Teresa Peña Institution: Instituto Superior Técnico Host Institution: CTN, DECN and DF -IST
Abstract
Nuclear microprobe and IBA (Ion Beam Analysis) techniques represent a non-destructive and fast tool that can provide elemental 2D compositional maps and elemental depth profiles of samples with a precision down to the micrometric scale. Objective: The objective of this project is to combine 2D elemental distributions given by Particle Induced X-ray Emission (PIXE) with depth-resolved Elastic Backscattering Spectrometry (EBS) spectra to obtain 3D elemental distribution information on materials. Method: This will be achieved by means of artificial neural networks and the development of a user-friendly graphical interface to visualise the 3D structures. Part of the research work will concentrate on the development and training of artificial neural networks (ANNs) that promptly analyse and combine EBS and PIXE data. Application: This method will be applied to the analysis of one of the most promising materials for solar cells, the perovskites. By also implementing the sensitive Ion Beam Induced Charge (IBIC) technique, a relationship between the charge-collection efficiency of perovskite-based solar cells and the 3D elemental distributions will be established, contributing to the optimisation of cell manufacturing conditions. This relationship is crucial since the composition of the absorber layer is closely related to the cell lattice parameter, optical properties and energy band gaps, which ultimately determine the energy conversion efficiency of the cell. Innovation: The innovative methodological approach of this research project is to concentrate on a particular material, perovskite, in order to develop and properly train ANN algorithms that can calculate elemental 3D distributions of the sample using combined IBA techniques. This will also imply the analysis of samples with the Total-IBA approach, in order to have depth-resolved simulated spectra to use for the ANN training. Collaboration: The project will be carried out within a collaboration between the Department of Physics of IST and CTN (Centro de Ciências e Tecnologias Nucleares, Luis Cerqueira Alves). CTN has unique conditions in the country for applications of nuclear physics techniques. In this research work the two facilities to be used are the Van de Graaff accelerator and the nuclear microprobe, two important tools to enhance the capabilities and results of the non-destructive IBA techniques. Possible co-supervision from LIP (Celso Franco or Patrícia Gonçalves, for instance) will also be an added value for the development of software. |
Discrimination between Light Dark Matter and Neutrino interactions at the SHiP experiment using Machine Learning algorithms
Domain: Experimental Particle Physics Supervisor: Celso Franco Co-Supervisor: Nuno Leonardo Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The SHiP (Search for Hidden Particles) experiment at CERN is being proposed as a discovery experiment. It is designed to detect extremely feebly interacting, relatively light and long-lived particles, which are predicted to exist in the so-called Hidden Sector of particle physics. In parallel, however, a rich program of neutrino physics is also being prepared. SHiP, as a Beam Dump Facility, will produce an enormous number of neutrinos and photons in its primary target over a period of 5 years: above 7x10^18 and 1x10^20, respectively. The spectrometer will be equipped with an emulsion detector to detect neutrino interactions and, in combination with a muon detector, it will be possible to distinguish all six neutrino and anti-neutrino flavours. SHiP will therefore detect, for the first time, interactions of tau anti-neutrinos and will provide unique data on neutrino-induced charm production. The knowledge of the structure of the proton will also be significantly improved by probing the proton with neutrino DIS (Deep Inelastic Scattering) events. In addition, due to the micrometric precision of the emulsion detector, SHiP has the potential to separate Light Dark Matter (LDM), arising from the decays of dark photons (which couple to photons), from neutrino interactions in the emulsion detector. The work plan involves the use of machine learning algorithms with the goal of optimising the spectrometer to maximise the separation between LDM and neutrino interactions, without compromising the SHiP exploration of the Hidden Sector. These algorithms will also be used at the analysis level, both at the low (reconstruction) and high (process selection) stages, to maximise the efficiency and purity of each physics sample. The selected student will be integrated as a member of the SHiP Collaboration and trips to CERN are expected. |
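A schematic classifier sketch for the LDM-versus-neutrino separation described above (synthetic feature vectors stand in for the real kinematic and topological variables; the algorithm choice and working point are assumptions):

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 2000
X_sig = rng.normal(loc=0.5, size=(n, 5))         # placeholder LDM-scattering features
X_bkg = rng.normal(loc=0.0, size=(n, 5))         # placeholder neutrino-interaction features
X = np.vstack([X_sig, X_bkg])
y = np.concatenate([np.ones(n), np.zeros(n)])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)
score = clf.predict_proba(X_te)[:, 1]            # per-event signal probability
sel = score > 0.9                                # assumed working point
purity = y_te[sel].mean() if sel.any() else 0.0
print(f"selected {sel.sum()} events, purity {purity:.2f}")

The trade-off between efficiency and purity as a function of the working point is exactly what the proposed optimisation of the spectrometer and of the analysis levels aims to improve.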
Formation and evolution of H-alpha filaments at chromospheric heights
Domain: Astrophysics Supervisor: Nuno Peixinho Co-Supervisor: Ricardo Gafeira Institution: Universidade de Coimbra Host Institution: Universidade de Coimbra
Abstract
The solar atmosphere provides a unique laboratory for studying physical processes occurring throughout the universe which, due to the Sun's proximity, are best observed there. As a result of these physical processes, several features can appear in the solar atmosphere at different heights and are observed in different spectral ranges. One such feature, observed in the H-alpha spectral line, is called a solar filament when seen on the disk, or a prominence when seen beyond the limb. Filaments are among the oldest structures observed on the Sun, yet to this day little is known about what triggers their formation and about the physical processes that lead to it. A better understanding of these structures is also important because of their relation to coronal mass ejections (CMEs), which can hit the Earth and disturb its magnetic field, with drastic consequences for telecommunications, GPS and all technology dependent on these systems. The goal of this PhD thesis is to study the dynamics of the solar atmosphere at small and large size scales, mainly in the chromosphere, based on multi-spectral-line observations and radiative-transfer inversion codes, in order to establish their magnetic and thermodynamic context and to predict their formation and evolution. The project aims to study the temporal evolution of the chromospheric structures that lead to the formation of H-alpha filaments using multi-spectral-line observations at small and large spatial scales. There are multiple reasons for focusing on this topic. The most important are related to improving our still limited understanding of the formation of these structures, to seeking and understanding their correlation with the solar atmospheric conditions and the triggers of coronal mass ejections (CMEs), and to effectively predicting these CMEs. Since all these processes have manifestations at large scales, it is imperative that low-resolution solar observations with a large field of view are used as a complement to high-resolution satellite data. Such observations are available from several sources, like the Solar Dynamics Observatory (SDO) and ground-based instruments like the Spectroheliograph at the Observatory of Coimbra and the Swedish Solar Telescope (SST), and new techniques are bringing new light to the usage of those datasets. Brand new inversion codes that allow the computation of non-local thermodynamic equilibrium (NLTE) spectral lines are now available, opening the door to new studies. Thus, in this PhD project, datasets will be analysed using new solar-atmosphere inversion codes that allow the atmospheric parameters to be retrieved and their evolution to be studied, mainly in the chromosphere. The contributions to the spectral line formation depend significantly on the magnetic activity. Therefore, multiple-line studies with polarimetric information (Ca II K and H-alpha together with photospheric lines like Fe) are essential to constrain and derive the physical parameters of the solar atmosphere with height and, finally, to shed light on the true origin of these features. The scientific questions to be tackled address the dynamics of the solar atmosphere that leads to the appearance and subsequent evolution of H-alpha filaments. |
Cosmological tests of gravity theory beyond General Relativity
Domains: General Relativity | Astrophysics | Cosmology Supervisor: Noemi Frusciante Co-Supervisor: Francisco Lobo Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Faculdade de Ciências - Universidade de Lisboa
Abstract
An outstanding problem faced by modern cosmology concerns cosmic acceleration, i.e. the phase of accelerated expansion recently entered by the Universe, for which we still lack a satisfactory theoretical explanation. Within the context of General Relativity, an accelerated expansion can be achieved by adding an extra ingredient to the energy budget of the Universe, commonly referred to as dark energy. A different approach is to modify the law of gravity describing the Universe at large scales. A plethora of modified gravity models addressing the phenomenon of cosmic acceleration have been proposed and analysed. The astronomical community has embarked on an intense observational effort to help explore the real nature of cosmic acceleration. Upcoming missions will deliver highly accurate data, offering an unprecedented insight into gravity on cosmological scales. This observational effort is not yet balanced by an equally focused effort in theoretical modelling. The ability to constrain the various properties of cosmological models using observational data, such as the anisotropies of the cosmic microwave background, the large-scale structure of the galaxy distribution, and the expansion and acceleration rate of the universe, has become an essential part of modern cosmology. The goal of the PhD project will be to unveil the real nature of the theory of gravity. To achieve this, the student will apply theoretical modelling and numerical methods to the best data available and perform forecasts for future next-generation surveys. The development of this project is required for several reasons: 1) new theoretical models need to be built; 2) new numerical patches need to be developed to test models against cosmological observations. The analysis tools developed by the student are expected to be used in the upcoming ESA Euclid mission, in which the host institution has a leading role. In order to ensure a successful PhD, this project contains theoretical and numerical elements that are flexible, so that they can fit the student's skills and expertise. |
Migdal effect: a way to light dark matter detection
Domains: Experimental Particle Physics | Astrophysics Supervisor: Francisco Neves Co-Supervisor: Vladimir Solovov Institution: Universidade de Coimbra Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
For the past several decades, experimental efforts to directly detect dark matter (DM) interactions have focused mostly on Weakly Interacting Massive Particles (WIMPs) in the mass range above a few GeV. This limit was driven by the fact that most theoretical models predicted WIMP particles with a mass above ~2 GeV, as well as by the technical difficulties of building a detector with sensitivity to lighter WIMPs. However, newer models of sub-GeV dark matter (including freeze-out DM, asymmetric DM and freeze-in DM) have boosted interest in the direct detection of such less-massive WIMPs. Currently, the most sensitive WIMP detectors (LUX, PANDA-X, XENON-1T and the future LZ and XENON-nT) are based on liquid xenon technology. For some time it was assumed that liquid xenon DM detectors are not sensitive to this kind of light dark matter, as the low nuclear-recoil energy combined with strong quenching would result in a signal below the detection threshold (~1 keV). However, as pointed out in recent theoretical studies, the energy of the nuclear recoil can be transferred to the atomic shell, resulting in the emission of either a gamma-ray (bremsstrahlung) or an electron (Migdal effect). The energy deposited by either the gamma-ray or the electron (~keV) will not be quenched, thus enhancing the detectability of light dark matter. If confirmed experimentally, these effects will considerably extend the sensitivity range of the large liquid xenon detectors currently under construction (LZ and XENON-nT) and will make it possible to probe lighter dark matter candidates otherwise inaccessible. We propose, in the framework of this project, to perform an experimental verification of the Migdal effect in liquid xenon. The student will be integrated in and supported directly by the LIP team, which is widely experienced in all aspects of liquid xenon detectors and direct dark matter detection (as members of the ZEPLIN, LUX and LZ Collaborations). The work will comprise three main components: (1) integration of the student in the LZ Collaboration team, including on-site work at the Sanford Underground Research Facility (USA); (2) a feasibility study, using the GEANT4 simulation toolkit, of a setup for the direct measurement of the energy-transfer mechanisms from nuclear recoils into bremsstrahlung and the Migdal effect in liquid xenon; (3) experiment planning and experimental validation using the LZ detector neutron calibration facility. |
First FAIR experiments on halo nuclei at relativistic energies
Domain: Experimental Particle Physics Supervisor: Daniel Galaviz Redondo Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The Facility for Antiproton and Ion Research (FAIR) will be a major upgrade of the current accelerator complex of the GSI Helmholtzzentrum fuer Schwerionenforschung GmbH at Darmstadt, Germany. FAIR will be a unique facility worldwide, with the ability to produce antiprotons and exotic ion beams of nuclei up to uranium with unprecedented intensity and quality. Within FAIR, the high-energy branch of the NUSTAR collaboration, using the experimental setup for Reactions with Relativistic Radioactive Beams (R3B), will study reactions with exotic nuclei far from stability, focusing on nuclear structure and dynamics and performing reaction studies that will elucidate unanswered questions in nuclear physics and nuclear astrophysics. The first phase of experiments at FAIR, called Day-Zero, has recently started, with the benchmarking of the newly developed detector systems performed recently. For this Day-Zero campaign, the experimental proposal S442 (Study of multi-neutron configurations in atomic nuclei towards the neutron drip line; spokesperson: O. Sorlin, GANIL) will explore the structure of very neutron-rich carbon isotopes through reactions on a liquid hydrogen target at relativistic energies. The experiment also allows for a precise study of neutron knock-out reactions on halo nuclei like 17C or 14B. The candidate for this PhD program will participate in the preparation, execution and analysis of (p,pn) and (p,2p) reactions on the halo nuclei 17C and 14B, joining an international collaboration and being involved in one of the first experiments at a facility that will be at the forefront of rare-isotope physics in Europe in the coming decades. |
Optimisation of data reduction of adaptive-optics assisted observations
Domain: Astrophysics Supervisor: Carlos M Correia Co-Supervisor: Paulo Garcia Institution: Universidade do Porto Host Institution: Universidade do Porto
Abstract
The widespread deployment of adaptive-optics systems on existing and planned telescopes is changing the paradigm of data analysis. Adaptive optics (AO) systems compensate in real time for the blurring effects of the Earth’s turbulent atmosphere, called “seeing”, giving superior spatial resolution over space-based alternatives at a fraction of the cost. AO systems have been deployed on nearly all of the world’s largest telescopes, including the European Very Large Telescope (VLT) and its 10 m-class counterparts. The power of AO is now widely recognised and it will be built into the first-light instruments of all the next-generation giant telescopes -- the European ELT, the Giant Magellan Telescope and the Thirty Meter Telescope (Ramsay+ 2014, Matt+ 2006, Sanders+ 2014) -- with diameters up to 40 m. Despite its effectiveness and undisputable gains, AO systems are complex and produce images with point-spread functions (PSFs) that depend on many factors, such as the flavour of AO that feeds each instrument, the local characteristics of the atmosphere, the field of view, and the availability and adequacy of natural guide stars. In other words, we went from relatively simple optical systems in classical observatories, which produced low-resolution but stable and well-defined PSFs determined mainly by the local seeing and by the diameter of the telescope, to state-of-the-art AO-assisted systems that come much closer to the full potential of the telescope in terms of spatial resolution but produce PSFs that vary in time and space and are more challenging to model (Veran+ 1997, Gilles+ 2012). The development of these systems represented a huge leap forward in astronomical observations, but it has not (yet) been met with an equivalent development of data-analysis algorithms. We are therefore at a point where breakthrough science with AO-assisted observations on current and future ground-based telescopes requires new paradigms in data-analysis algorithms, in order to extract the most precise measurements of photometric brightness, astrometric position and morphology for planets, stars and galaxies. Our team is leading a large effort to bridge this important gap between the technology and the science facilitated by AO-assisted instruments. To this end, we are seeking a candidate to collaborate on one or several of the following fronts: 1. Exploring the parameter space of AO-corrected PSFs for the ELTs: the student/fellow will evaluate the accuracy to which the PSF should be known to meet the most representative science cases on ELTs. 2. Parametric PSFs for standard photometry/astrometry software packages: the student will investigate the coupling of reconstructed PSFs (from AO telemetry), and their parametric by-products, with standard data-analysis software for real and simulated observations. 3. PSF reconstruction from multi-wavefront-sensor telemetry: the student will develop and integrate efficient methods, starting from telemetry, to provide the reconstructed AO PSF across the field for a few representative AO correction systems. The candidate will be dedicated to the facilitation of the H2020-WP10 PSFR network by spending time with the ELT first-light consortia to collect science requirements and turn them into meaningful PSF metrics that can be processed by PSF-reconstruction algorithm developers within the consortium. PROFILE: Excellent candidates with astronomy, applied physics, mathematics or engineering backgrounds and with strong signal-processing and programming skills are encouraged to apply.
NOTES: The student will likely spend ~20% of his/her time in partner institutions such as the Laboratoire d'Astrophysique de Marseille, CENTRA in Lisbon, Durham University, JKU in Linz and others. |
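As a toy illustration of why PSF knowledge matters for the science cases above (synthetic scene, with a Gaussian standing in for an AO-corrected PSF and an assumed pixel scale):

import numpy as np
from scipy.signal import fftconvolve

scene = np.zeros((64, 64))
scene[32, 32] = 1.0                              # a unit-flux point source
scene[40, 20] = 0.5                              # a fainter neighbour

yy, xx = np.mgrid[-8:9, -8:9]
sigma = 1.5                                      # assumed PSF width in pixels
psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
psf /= psf.sum()

image = fftconvolve(scene, psf, mode="same")     # what the detector records: scene convolved with the PSF
flux_5x5 = image[30:35, 30:35].sum()             # aperture photometry is biased unless the PSF is known
print(f"flux recovered in a 5x5 aperture: {flux_5x5:.2f} (true 1.00)")

With spatially and temporally varying AO PSFs, the correction factor for such aperture losses is no longer constant, which is the gap the PSF-reconstruction work above addresses.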
The First Radio Galaxies in the Universe
Domain: Astrophysics Supervisor: José Afonso Co-Supervisor: Catarina Lobo Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: IA, Universidade de Lisboa / IA, Universidade do Porto
Abstract
Recent observations of the highest-redshift quasars and radio galaxies pinpoint the early growth of the supermassive black holes (SMBHs) that trigger the formation of active galactic nuclei (AGN) at redshifts greater than 7. It is anticipated that radio emission can be detected from such early AGN, although its characteristics are still quite uncertain. The importance of such a detection, however, is extremely high. It will: (a) provide us with a lighthouse revealing the physics of the first accretion episodes onto the first SMBHs in the Universe; (b) allow the direct study of the neutral gas throughout the Epoch of Reionisation itself with the next generation of radio telescopes, through the observation and study of the HI 21 cm forest against such early AGN; (c) allow us to trace the early growth of large-scale structure in the Universe. After decades of laborious work trying to understand the deepest radio observations, the conditions are now finally right to develop a project that can establish where the “first radio galaxies” are and how to find them with upcoming radio telescopes. |
Intertwinement between entanglement and the QCD flux tube
Domain: Theoretical Particle Physics Supervisor: Orlando Oliveira Co-Supervisor: David Dudal Institution: Universidade de Coimbra Host Institution: Universidade de Coimbra
Abstract
Quantum entanglement can be interpreted as the extent to which the quantum state of a subsystem A depends on that of the rest of the system, B. It can be quantified by the entanglement entropy (EE) of A relative to B, a generalisation of the usual concept of (thermal) entropy. We are interested in the QCD flux tube connecting a quark and an antiquark. This picture is realised in the non-perturbative dual-superconductor scenario of the QCD vacuum, and it naturally leads to a linear, and thus confining, potential between the static quarks. Lattice QCD has shown that the EE is sensitive to (de)confinement, on a par with predictions from holography (AdS/QCD); so far this has only been verified for the infinite-strip geometry. On the one hand, research in (holographic) superconductivity has unveiled that the EE is also sensitive to the transition between the superconducting and normal states. On the other hand, (old) lattice work in a 3D QCD toy model has given partial evidence that inside the colour-electric flux tube binding the quarks together, a normal perturbative vacuum state emerges. We want first to verify this picture with state-of-the-art 4D QCD simulations, after which we can turn to the first-ever study of how the (inside of the) flux tube is entangled with the outside. We conjecture that this can be probed by studying the lattice EE of a cylinder placed around the flux tube. From the response of the EE to the radius of the cylinder, we might even quantify the thickness of the flux tube. |
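For reference (the standard definition, not specific to this proposal), the entanglement entropy of a region A is the von Neumann entropy of its reduced density matrix,

S_A = -\mathrm{Tr}\,(\rho_A \ln \rho_A), \qquad \rho_A = \mathrm{Tr}_B\, |\Psi\rangle\langle\Psi| ,

and in the proposed study A would be the cylinder enclosing the flux tube, while B is the rest of the lattice volume.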
Anti-neutrino oscillations with SNO+
Domain: Experimental Particle Physics Supervisor: Sofia Andringa Co-Supervisor: José Maneira Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
SNO+ is a detector located 2000 m underground in Canada, which is now being filled with liquid scintillator, boosting the measurement of anti-neutrinos created by the natural radioactivity of the Earth and by nuclear reactors. It will make a first measurement of geo-neutrinos from the North-American crust, useful to constrain global Earth models. Moreover, due to the distribution of nuclear reactors around SNO+, neutrino oscillations will induce clear features in the detectable energy spectrum. For one of the squared mass differences between neutrino states, SNO+ is expected to reach a precision comparable to the only other existing measurement of this kind. Recent results from solar neutrinos prefer slightly different values of this parameter, increasing the impact of the measurement on the global precision of neutrino physics. The expected rates of anti-neutrinos are small, but they can be identified by the delayed coincidence of a positron annihilation followed by a neutron capture. The positron energy and the initial neutron direction follow the anti-neutrino kinematics. The proposed work plan will involve all aspects of the anti-neutrino data analysis. The detector response will be characterised with a neutron calibration source that mimics the delayed-coincidence signature. The time and spatial evolution of similar radioactive backgrounds will be monitored. Methods will be explored to separate positrons from the most common alpha and gamma events. This program will lead to a precise measurement of the energy spectrum of anti-neutrinos, from which the neutrino oscillation parameters will be extracted. The program is well matched to the responsibilities of the LIP group within SNO+. Participation in in-situ activities at SNOLAB will be required. |
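A schematic sketch of the delayed-coincidence tagging described above (the time and distance windows are illustrative placeholders, not SNO+ analysis values):

import numpy as np

DT_MAX = 1.0e-3     # s, assumed maximum prompt-delayed time separation
DR_MAX = 1.5        # m, assumed maximum prompt-delayed distance

def find_pairs(times, positions):
    """times: sorted event times (s); positions: (N, 3) reconstructed vertices (m)."""
    pairs = []
    for i in range(len(times) - 1):
        for j in range(i + 1, len(times)):
            if times[j] - times[i] > DT_MAX:
                break
            if np.linalg.norm(positions[j] - positions[i]) < DR_MAX:
                pairs.append((i, j))
    return pairs

t = np.array([0.0, 2.0e-4, 5.0, 9.0])
r = np.array([[0.0, 0.0, 0.0], [0.3, 0.1, 0.0], [4.0, 4.0, 4.0], [1.0, 1.0, 1.0]])
print(find_pairs(t, r))   # -> [(0, 1)]: one prompt-delayed candidate pair

The neutron calibration source mentioned above provides exactly such pairs with known origin, fixing the efficiency of these selection windows.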
The non-homogeneous and statistically isotropic Universe: Structure formation tools for Euclid studies
Domains: General Relativity | Cosmology | Astrophysics Supervisor: Antonio da Silva Co-Supervisor: José Pedro Mimoso Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Faculdade de Ciências - Universidade de Lisboa
Abstract
The cosmological principle is at the heart of modern cosmology. Combined with the Einstein field equations, it leads to the popular homogeneous and isotropic LCDM Friedmann-Lemaitre-Robertson-Walker (FLRW) models, which have become the present baseline paradigm to analyse and predict large-scale cosmological datasets. The ESA Euclid satellite mission will be able to further test this baseline paradigm over a wide range of cosmological scales with a variety of probes, and it may provide evidence that a more general class of non-homogeneous Lemaitre-Tolman-Bondi (LTB) models should be considered to accurately describe observations. To achieve its objectives, the Euclid mission will observe billions of galaxies in the visible and infrared sky for weak gravitational lensing and galaxy clustering studies. These data will be used to investigate the nature of dark energy, dark matter and gravity, as well as to test the validity of the cosmological principle on a range of distance scales using survey tomographic information. This project addresses the problem of structure formation in the context of non-homogeneous LTB models. It proposes to develop ways to characterise observational signatures of these models and to confront them with future Euclid data. The project includes the development of a set of numerical tools to generate mock simulations of cosmological volumes and projected mass maps that can be used to simulate Euclid observations. The proposed approach should recover the FLRW asymptotic behaviour on very large scales and allow the transition scale between homogeneity and non-homogeneity to be modelled and tested for different LTB model hypotheses. |
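For context (a textbook form, quoted as background rather than from the proposal), the LTB line element generalises FLRW to a spherically symmetric, radially inhomogeneous dust universe,

ds^2 = -dt^2 + \frac{R'(r,t)^2}{1+2E(r)}\, dr^2 + R(r,t)^2\, d\Omega^2 , \qquad R' \equiv \partial R/\partial r ,

and reduces to FLRW when R(r,t) = a(t)\, r and E(r) = -\tfrac{1}{2} k r^2, which is the asymptotic behaviour the mock simulations are required to recover on very large scales.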
Testing the unified dark matter-dark energy hypothesis in the Euclid era
Domain: Cosmology Supervisor: Ismael Tereno Co-Supervisor: Alberto Rozas-Fernandez Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Faculdade de Ciências - Universidade de Lisboa
Abstract
Physical cosmology relies on two unknown substances – dark matter and dark energy – in order to provide a description of the Universe that is consistent with current observations. Many dark matter and dark energy hypotheses will soon become testable with data from the Euclid space mission. This project addresses one of those hypotheses: the unification of dark matter and dark energy (UDM). The goal of the project is to study the evolution of cosmological structure in UDM scenarios, deriving their signatures in Euclid's gravitational-lensing observables and forecasting Euclid's ability to test this hypothesis. The theoretical derivations need to be done with high precision and accuracy to match the data quality; in particular, they need to address structure formation in the non-linear regime. An important task of the project is the development of N-body simulations of UDM models. Other tasks include the assessment of model-dependent systematics, the creation of lensing maps and the statistical inference of model parameters. |
Neutron identification in DUNE and proto-DUNE
Domain: Experimental Particle Physics Supervisor: Sofia Andringa Co-Supervisor: José Maneira Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The DUNE experiment is designed to measure the last unknown parameters of neutrino oscillations, namely the mass ordering of the three neutrino states and the CP-violation phase, which may be a fundamental ingredient to explain the matter/anti-matter asymmetry in the Universe. DUNE will be a long-baseline experiment, measuring the oscillations of neutrino and anti-neutrino beams 1300 km away from their production point at Fermilab, USA. Four 10 kton detectors are planned; the first one should be ready in 2025, one or two years before the beam starts. A much smaller prototype of the first DUNE detector is already the largest liquid argon detector ever built: it is a Time Projection Chamber (TPC) measuring 6.9 x 7.2 x 6.0 m3. This detector is installed at CERN and has collected data from interactions of proton and pion beams of different energies, as well as samples of cosmic-ray muon events. In contrast to the high-resolution images obtained for charged particles, the TPC is not directly sensitive to neutrons, which reduce the visible energy in the neutrino and anti-neutrino measurements in DUNE. Neutrons can only be detected through high-energy scattering or by thermal capture on argon. The capture signal is at the limit of the lowest energy considered for the DUNE neutrino program. A pulsed neutron calibration source is being developed for DUNE and will be tested in protoDUNE. This project is dedicated to the identification of neutrons in the protoDUNE data. The beam data will be used to measure the neutron multiplicity and the associated invisible energy in proton-argon interactions. The first data collected with the calibration source will provide a large sample of neutron captures, which will be used to extend the neutron-tagging signatures to lower-energy neutrons. The work will include participation in the calibration data taking at CERN and in prior tests needed for the development of the source. |
Probing cosmic defect evolution with gravitational waves
Domain: Cosmology Supervisor: Lara Sousa Co-Supervisor: Pedro Avelino Institution: Universidade do Porto Host Institution: Centro de Astrofísica da Universidade do Porto/Instituto de Astrofísica e Ciências do Espaço
Abstract
The detection of gravitational waves by the LIGO experiment precipitated us into a new era of astronomy: the era of Gravitational Wave Astronomy. Gravitational waves will allow for the detection of new astronomical sources, enabling us to hear what is currently unseen. In this doctoral project, we will develop a new generation of numerical algorithms to describe cosmic defect networks, with the objective of gaining new understanding of their evolution and of constructing more realistic semi-analytical descriptions of their dynamics. These tools will enable us to study the gravitational wave background generated by cosmic defect networks with unprecedented precision. The production of these defects in symmetry-breaking phase transitions in the early universe is predicted in several models of particle physics. This work will thus give us insight not only into the physics of the early universe, but also into fundamental physics. Stringent constraints on defect-forming scenarios will also be derived using upcoming gravitational wave data (e.g. from LIGO-Virgo, IPTA, LEAP), in order to discriminate between different models of particle physics at extremely high energies. Detailed forecasts for upcoming probes, such as LISA and SKA, will also be central to this project. |
Diffuse ionized gas and Lyman continuum photon escape in spiral galaxies
Domain: Astrophysics Supervisor: Polychronis Papaderos Co-Supervisor: Jean Michel Gomes Institution: Universidade do Porto Host Institution: Instituto de Astrofísica e Ciências do Espaço, Centro de Astrofísica da Universidade do Porto
Abstract
Diffuse ionized gas (DIG) is a ubiquitous component of the disk and halo of late-type galaxies (LTGs). The excitation mechanisms and ionization conditions of the DIG pose a long-standing enigma. The prevailing picture, though, is that the DIG originates from Lyman continuum (LyC) photons escaping from sites of ongoing star formation and being reprocessed into nebular emission on scales of ~1 kpc away from HII regions. The mechanisms facilitating the escape and transport of LyC radiation are unclear, yet it is likely that a key role is played by the injection of energy and momentum by stellar winds and SNe into a porous, multi-phase interstellar medium. Various lines of evidence indicate that the diffuse, low-surface-brightness DIG contributes ~20-50% of the total Hα emission in LTGs, a fact that may introduce a substantial observational bias in estimates of star formation rates in high-redshift galaxies, where DIG emission is barely detectable. This project aims at a detailed investigation of the physical properties of the DIG in a representative sample of nearby face-on spiral galaxies using image processing techniques, spatially resolved integral field spectroscopy and advanced spectral synthesis models. A central question to be addressed concerns the relation between the fractional contribution of the DIG to the total Hα luminosity and the star formation history and structural properties of LTGs. |
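As a rough illustration of the quantity at stake, the sketch below estimates the fractional DIG contribution to the total Hα flux of a face-on disk by separating pixels below a surface-brightness threshold. The threshold, the synthetic map and the function name are assumptions for illustration only, not the project's pipeline.

# Illustrative sketch: fraction of total H-alpha flux emitted by diffuse,
# low-surface-brightness pixels of a (synthetic) face-on disk map.
import numpy as np

def dig_fraction(halpha_map, threshold):
    """Fraction of total H-alpha flux below a surface-brightness cut."""
    valid = np.isfinite(halpha_map) & (halpha_map > 0)
    total = halpha_map[valid].sum()
    diffuse = halpha_map[valid & (halpha_map < threshold)].sum()
    return diffuse / total

# Synthetic example: bright HII-region-like peaks on a faint diffuse disk.
rng = np.random.default_rng(0)
disk = rng.exponential(1.0, size=(200, 200))                        # diffuse component
disk[rng.integers(0, 200, 50), rng.integers(0, 200, 50)] += 500.0   # "HII regions"
print(f"DIG fraction ~ {dig_fraction(disk, threshold=10.0):.2f}")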
Future Space Telescopes in the Multi-Messenger Era
Domains: Experimental Particle Physics | Astrophysics Supervisor: Rui Curado da Silva Co-Supervisor: Jorge Maia Institution: Universidade de Coimbra Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
In the Multi-Messenger Era, the simultaneous localization and characterization of gravitational waves and gamma-ray bursts require all-sky, time-domain and polarimetric observation capabilities, as well as increased sensitivity [1]. In the framework of ESA and NASA missions, the i-Astro group will develop innovative high-energy astrophysics space mission concepts for two alternative platforms: 1) All-Sky-ASTROGAM, selected for a phase-2 study in the ESA F call, is a small-class mission that will provide an order of magnitude better observational sensitivity, thanks to its innovative Si tracker and CsI calorimeter without passive elements and to its all-sky capabilities. It will provide Compton and pair-production polarimetry [2]. The P.I., De Angelis, is an LIP and i-Astro external collaborator; 2) AMEGO is a NASA and European mission proposal (AMEGO team: https://asd.gsfc.nasa.gov/amego/team.html) [3]. It will be submitted to a large-size, Probe-class NASA call. Its larger Si tracker and CsI/CZT calorimeter instrument should provide sensitivity better by two orders of magnitude and polarimetric sensitivity up to 50 MeV. The selected PhD student will participate in the design, development and experimental characterization of the main instruments of these mission concepts, based on Si trackers and CZT and CsI calorimeters. The student should perform a complete study of the spectroscopic and polarimetric sensitivity of the main instrument through mass-model Monte Carlo simulation, using the dedicated MEGAlib package [4], based on GEANT4. The student will study in particular the polarimetric sensitivity of these instruments in the pair-production regime, since this capability should open an outstanding new knowledge window in high-energy astrophysics. The student will participate in the development of prototypes for laboratory testing and, possibly, in high-altitude balloon testing at Esrange Space Center, Sweden, and/or at the NASA facilities in Palestine, Texas. A higher level of participation in 1) All-Sky-ASTROGAM or in 2) AMEGO will depend on the selection of All-Sky-ASTROGAM for launch in 2020. If All-Sky-ASTROGAM is not selected, the research will focus on the AMEGO mission. [1] De Angelis, (…), Rui Silva et al., The e-ASTROGAM mission: Exploring the extreme Universe with gamma rays in the MeV - GeV range, Exp. Astron., 2017, 44, 1, 25. [2] All-Sky-ASTROGAM, Proposal for the ESA F Mission Programme, http://coimbra.lip.pt/~rcsilva/AllSkyASTROGAMSubmitted25202018.pdf [3] A. Moiseev et al., All-Sky Medium Energy Gamma-ray Observatory (AMEGO), ICRC2017, PoS, 301, 798. [4] A. Zoglauer et al., "MEGAlib – The Medium Energy Gamma-ray Astronomy Library", New Astronomy Reviews 50 (2006) 629–632. |
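As a hedged illustration of how polarimetric sensitivity is usually quantified, the snippet below evaluates the minimum detectable polarization at 99% confidence, MDP_99 = 4.29 sqrt(R_S + R_B) / (mu_100 R_S sqrt(T)), from a modulation factor, source and background rates and an exposure. The numerical inputs are placeholders, not All-Sky-ASTROGAM or AMEGO performance figures.

# Standard polarimetry figure of merit; input values are illustrative only.
import math

def mdp_99(mu_100, rate_src, rate_bkg, exposure_s):
    """Minimum detectable polarization (99% C.L.) for given rates and exposure."""
    return 4.29 * math.sqrt(rate_src + rate_bkg) / (mu_100 * rate_src * math.sqrt(exposure_s))

print(f"MDP_99 ~ {mdp_99(mu_100=0.3, rate_src=0.05, rate_bkg=0.1, exposure_s=1e6):.2%}")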
QGP effects in Ultra-High-Energy Cosmic Rays
Domains: Theoretical Particle Physics | Astroparticle Physics Supervisor: Ruben Conceição Co-Supervisor: Liliana Apolinário Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The Quark-Gluon Plasma (QGP) is a state of matter of Quantum Chromodynamics characterized by extreme temperatures and densities. Currently produced in ultra-relativistic heavy-ion collisions, this strongly interacting medium has long been thought to form only in very dense/extended systems, such as lead-lead or gold-gold collisions. Recent results from high-multiplicity proton-proton collisions at the LHC, however, show strong collective effects whose magnitude is attributed to the presence of QGP droplets. These observations put to the test our understanding of the energy/density scale necessary to trigger QGP-like effects. Ultra-high-energy cosmic rays (UHECRs) offer a unique opportunity to shed light on this puzzle. With centre-of-mass energies that can surpass 400 TeV, and with a hadronic nature that spans elements from protons to iron, UHECRs can be used to explore the presence of QGP-like effects in a multitude of systems. The Pierre Auger Observatory, located in Argentina, is the largest experiment dedicated to the study of these extremely energetic cosmic rays. The results collected by Auger so far present several inconsistencies that are challenging to explain within standard interaction and astrophysical scenarios, which strongly motivates the investigation of QGP effects in ultra-high-energy interactions. Hence, the candidate is expected to work at the interface between accelerator data (in particular from the LHC) and cosmic-ray data (from Auger), investigating potential signatures of a strongly coupled QCD medium in the development of UHECR air showers. The work is expected to take place at LIP-Lisbon, within the Phenomenology and Auger groups. |
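A quick back-of-the-envelope check of the quoted centre-of-mass energy, assuming fixed-target kinematics for a proton primary hitting a nucleon at rest (sqrt(s) ~ sqrt(2 E_lab m_p)), is sketched below; the 1e20 eV primary energy is an illustrative choice.

# Hedged order-of-magnitude check of the UHECR centre-of-mass energy.
import math
m_p = 0.938          # proton mass in GeV
E_lab = 1.0e11       # 1e20 eV expressed in GeV (illustrative primary energy)
sqrt_s = math.sqrt(2.0 * E_lab * m_p)
print(f"sqrt(s) ~ {sqrt_s / 1e3:.0f} TeV")   # roughly 430 TeV, i.e. above 400 TeV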
Is the neutrino its own antiparticle? - Probing the nature of the neutrino with the LZ dark matter detector
Domains: Experimental Particle Physics | Astroparticle Physics Supervisor: Alexandre Lindote Co-Supervisor: Claudio Frederico Pascoal da Silva Institution: Universidade de Coimbra Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Over the last two decades, the neutrino physics field has been very active, especially after the discovery of neutrino flavour oscillations by the Super-Kamiokande and SNO experiments (2015 Nobel Prize), since confirmed by many other experiments. Because neutrinos interact only weakly with matter, many of their properties are still not well known, such as their absolute mass, their mass hierarchy, the possible existence of sterile neutrinos, and their magnetic moment. LZ is a 10-tonne dual-phase liquid xenon detector designed to look for WIMP dark matter particles, the largest ever built using this technology. With the protection offered by the outer layers of xenon, the 5 tonnes of the inner region of LZ will constitute an extraordinarily low-background environment, allowing the current best sensitivity to WIMPs to be improved by a factor of more than 100. But this extraordinarily quiet “laboratory” can also be used to study important physics processes beyond the dark matter search, such as neutrino physics. Rare nuclear processes in which two neutrinos are emitted simultaneously, namely double beta decay (2νββ), double electron capture (2ν2EC) and the mixed mode with positron emission (2νECβ+), can be used to probe some of the neutrino properties. The neutrinoless versions of these decays are forbidden, as they would violate lepton number conservation; their observation would thus be evidence of new physics beyond the Standard Model and proof that neutrinos are Majorana particles (i.e., their own antiparticles), while also providing information about the neutrino mass hierarchy and effective mass. This is one of the most interesting topics in modern physics, with several experiments around the world dedicated to searching for such forbidden decays in various elements (e.g. Te-130 in SNO+ and CUORE, Ge-76 in GERDA and MAJORANA). Xenon is particularly well suited for these studies, given that it has two isotopes that can decay through 2νββ -- Xe-136 and Xe-134 -- and two that can decay through 2ν2EC and the mixed mode -- Xe-124 and Xe-126. This, together with its sheer size and extremely low background, places LZ in an excellent position for such studies. |
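As a rough, hedged estimate of the statistics available, the snippet below counts the two-neutrino double-beta decays of Xe-136 expected per year in a ~5-tonne xenon volume, using approximate published values for the isotopic abundance and half-life; the numbers are illustrative, not LZ analysis inputs.

# Order-of-magnitude estimate of Xe-136 2nbb decays per year in ~5 t of xenon.
# Abundance and half-life are approximate published values (assumptions here).
import math

N_A = 6.022e23            # Avogadro's number [1/mol]
MOLAR_MASS_XE = 131.29    # natural xenon [g/mol]
ABUNDANCE_136 = 0.089     # Xe-136 isotopic abundance (approximate)
T_HALF_2NBB = 2.2e21      # Xe-136 2nbb half-life in years (approximate)

fiducial_mass_g = 5.0e6   # ~5 tonnes of liquid xenon
n_xe136 = fiducial_mass_g / MOLAR_MASS_XE * N_A * ABUNDANCE_136
decays_per_year = math.log(2) / T_HALF_2NBB * n_xe136
print(f"~{decays_per_year:.1e} Xe-136 2nbb decays per year")  # of order 1e5-1e6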
Investigating Structure Formation around Massive Galaxies through a Radio-Infrared Synergy
Domains: Cosmology | Astrophysics Supervisor: José Afonso Co-Supervisor: Catarina Lobo Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: IA, Universidade de Lisboa / IA, Universidade do Porto
Abstract
One of the greatest challenges facing observational cosmology is understanding the formation of large-scale structure in the Universe. Hierarchical models of structure formation developed over the last few years have achieved a high degree of predictive success, but they remain poorly constrained, in particular regarding how light (galaxies) traces the underlying (dark) matter and how this relation evolves over time. We will address this problem by performing a systematic study of the evolution of the densest regions of the Universe, as traced by the most massive galaxies and their environments, improving our understanding of how the most massive regions of the Universe form and evolve. This will only be possible by using data from a deep, wide-field mid-infrared survey, the extended Spitzer Extragalactic Representative Volume Survey (SERVS), which now includes over 2700 hours of deep mid-infrared observations and is capable of finally overcoming long-standing observational limitations. |
Jet sub-structure as a probing tool of the Quark Gluon Plasma
Domain: Theoretical Particle Physics Supervisor: José Guilherme Milhano Co-Supervisor: Liliana Apolinário Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
Ultra-relativistic collisions of heavy atomic nuclei recreate the temperature and density conditions prevalent in the early stages of the history of the Universe. The Quark Gluon Plasma (QGP) created in these collisions is the most perfect liquid ever observed. Its formation and evolution constitute one of the most relevant and fundamental problems currently under study. The experimental and theoretical heavy-ion research programme has evolved from its initial QGP-discovery phase (CERN-SPS and BNL-RHIC) to a full-fledged QCD-probing effort at the LHC. Extracting QGP properties from data requires adequate probes, which must be both under excellent theoretical control and generated within the QGP itself, since its very short lifetime (a few yoctoseconds) precludes any external probing approach. At the LHC at CERN, lead nuclei are collided at the highest centre-of-mass energy ever reached (~5 TeV per nucleon pair, a factor of 25 increase over RHIC). The large collision energy leads to the abundant production of QCD jets, which have proved to have huge potential as detailed probes of the QGP. In particular, the sub-structure properties of jets offer unique opportunities to fully characterize the QGP. The aim of this thesis is to develop the theory of jet sub-structure for jets that are created within and travel through the QGP, and, in parallel, to carry out event-generator-based phenomenological studies aimed at establishing the sensitivity of specific sub-structure observables to specific QGP properties. Our group is at the forefront of efforts in this domain. The selected candidate will develop both strong and highly transferable computational skills and solid competence in Quantum Chromodynamics. The thesis work, co-supervised by Guilherme Milhano and Liliana Apolinário, will take place within the Phenomenology group at LIP-Lisbon, in close collaboration with colleagues at CERN, Santiago de Compostela, and MIT. |
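To give a concrete flavour of the kind of sub-structure observable involved, the snippet below evaluates the momentum-sharing fraction z_g of the two leading subjets and the soft-drop condition z_g > z_cut (DeltaR/R)^beta. The grooming parameters and kinematic values are commonly used defaults taken here as assumptions, not the group's analysis configuration.

# Illustrative soft-drop/z_g check for a pair of subjets (toy values).
import math

def passes_soft_drop(pt1, pt2, delta_r, jet_radius=0.4, z_cut=0.1, beta=0.0):
    """Return z_g and whether min(pt1,pt2)/(pt1+pt2) > z_cut*(delta_r/R)^beta."""
    z_g = min(pt1, pt2) / (pt1 + pt2)
    return z_g, z_g > z_cut * (delta_r / jet_radius) ** beta

z_g, passed = passes_soft_drop(pt1=80.0, pt2=15.0, delta_r=0.2)
print(f"z_g = {z_g:.2f}, passes soft drop: {passed}")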
Dark energy from quantum entanglement: a test in the Euclid era
Domain: Cosmology Supervisor: Ismael Tereno Co-Supervisor: Alberto Rozas-Fernandez Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Faculdade de Ciências - Universidade de Lisboa
Abstract
The main goal of ESA's Euclid space mission, to be launched in 2022, is to understand the origin of the observed late-time accelerated expansion of the Universe. One theoretical possibility is to describe the acceleration in terms of the entanglement energy of the Universe. A promising and most economical cosmological model has been proposed in this context. It is based on general relativity together with quantum effects, associated with the probabilistic description of quantum physics, that become dominant at late times, thus accounting for the acceleration. The goals of the project are to study the background expansion and the evolution of cosmological structure in this quantum entanglement model, deriving its signatures in Euclid’s gravitational lensing observables and forecasting Euclid’s ability to test this hypothesis. We will focus first on the mildly non-linear regime, studying structure formation in the spherical collapse approach. These preliminary results will then inform the study of the full non-linear regime, where we will consider N-body simulations and lensing maps. |
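As a minimal sketch of the spherical collapse approach mentioned above, the snippet below integrates the standard non-linear evolution equation for the density contrast in a matter-dominated (Einstein-de Sitter) background; the entanglement dark energy model would enter through a modified expansion rate and background densities. The initial overdensity and integration range are illustrative assumptions.

# Spherical collapse of a top-hat overdensity in an EdS background (sketch).
import numpy as np
from scipy.integrate import solve_ivp

def collapse_rhs(t, y):
    delta, ddelta = y
    H = 2.0 / (3.0 * t)                      # EdS Hubble rate
    rho_term = 1.5 * H**2                    # (3/2)*Omega_m*H^2 with Omega_m = 1
    d2 = (-2.0 * H * ddelta
          + (4.0 / 3.0) * ddelta**2 / (1.0 + delta)
          + rho_term * delta * (1.0 + delta))
    return [ddelta, d2]

def hit_collapse(t, y):                      # stop once delta becomes very large
    return y[0] - 1e4
hit_collapse.terminal = True

t0, delta0 = 1.0, 0.01                       # start on the linear growing mode, delta ~ t^(2/3)
sol = solve_ivp(collapse_rhs, (t0, 4000.0), [delta0, (2.0 / 3.0) * delta0 / t0],
                rtol=1e-8, events=hit_collapse)
print(f"collapse (delta > 1e4) reached at t/t0 ~ {sol.t_events[0][0]:.0f}")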
Non-linear evolution of cosmic structure in the Euclid Universe beyond LCDM
Domains: Cosmology | Astrophysics Supervisor: Antonio da Silva Co-Supervisor: Nelson Nunes Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Faculdade de Ciências - Universidade de Lisboa
Abstract
CMB observations by the Planck satellite, combined with galaxy survey and supernova data, show that a simple inflationary LCDM model with only six parameters provides a consistent description of the main global properties of the Universe on large scales. However, this baseline model has only moderate success on smaller scales, where tensions arise when CMB data are combined with other large-scale structure (LSS) datasets. Excursions beyond the Planck baseline model, assuming extra parameters that describe alternative dark energy / modified gravity hypotheses or additional dark matter physics, have been used to try to reconcile observations, but different LSS probes (mostly sensitive to the linear evolution of cosmic structure) often favour different model excursions. The non-linear evolution of cosmic structure provides a unique way to further test the viability of these alternative conjectures. Numerical N-body simulation methods have already made it possible to explore some classes of alternative models, but these studies have been conducted in isolation, using inconsistent baseline cosmological parameters, and, more importantly, lack dedicated products that mimic existing or planned galaxy surveys. This type of systematic study is currently missing in most large-scale structure surveys and is clearly necessary for the upcoming ESA/Euclid satellite mission, which will measure the shapes of billions of galaxies and accurate redshifts of tens of millions of galaxies for weak gravitational lensing and galaxy clustering studies. The objective of this project is to perform a consistent study of the non-linear growth of cosmological structure for alternative cosmological models and to develop a suite of Euclid simulation products that can be used to study degeneracies and identify ways to discriminate between alternative models using non-linear probes of large-scale structure. |
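To illustrate how one such excursion beyond the baseline modifies the growth of structure at the linear level (the starting point that N-body simulations then carry into the non-linear regime), the sketch below integrates the linear growth factor for a hypothetical w0-wa dark energy parametrization. The cosmological parameter values are illustrative, not Planck or Euclid baseline numbers.

# Linear growth factor D(a) for an illustrative w0-wa dark energy model.
import numpy as np
from scipy.integrate import solve_ivp

OM0, W0, WA = 0.31, -0.95, 0.1   # illustrative parameters (assumptions)

def E2(a):
    de = (1.0 - OM0) * a ** (-3.0 * (1.0 + W0 + WA)) * np.exp(-3.0 * WA * (1.0 - a))
    return OM0 * a ** -3 + de

def growth_rhs(a, y):
    D, dD = y
    e2 = E2(a)
    # d(ln E)/da via a central finite difference of E^2
    dlnE_da = (E2(a * 1.0001) - E2(a * 0.9999)) / (0.0002 * a) / (2.0 * e2)
    d2D = -(3.0 / a + dlnE_da) * dD + 1.5 * OM0 / (a ** 5 * e2) * D
    return [dD, d2D]

a0 = 1e-3
sol = solve_ivp(growth_rhs, (a0, 1.0), [a0, 1.0], rtol=1e-8)  # D ~ a at early times
print(f"D(a=1)/D(a=1e-3) = {sol.y[0, -1] / a0:.1f}")          # below ~1000 (EdS value)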
Search for Double Higgs production using Machine Learning
Domain: Experimental Particle Physics Supervisor: Michele Gallinaro Co-Supervisor: Joao Varela Institution: Instituto Superior Técnico Host Institution: Laboratório de Instrumentação e Física Experimental de Partículas
Abstract
The subject of this thesis is the search for double Higgs production, decaying into taus and b-jets, at the Large Hadron Collider (LHC). The thesis is placed in the context of the Portuguese participation in the CMS experiment at the LHC, and it is linked to the Beyond the Standard Model (BSM) searches in the more general context of the searches for New Physics processes at the LHC. Over the course of the last forty years, the SM has received increasing and consistent verification through precise experimental tests of its predictions, culminating in 2012 with the discovery of a new particle, which appears to be the Higgs boson. There are, however, compelling reasons to believe the SM is not complete. In this context, the LIP/CMS group is engaged in the study of SM and BSM processes to fully exploit the opportunities offered by the unparalleled energy of the LHC collisions. A large dataset of approximately 150/fb was collected in Run 1 and Run 2 and is available to study this process. With the upcoming Run 3, an additional 300/fb may become available in the next few years, offering excellent opportunities for major discoveries in this domain. The work plan includes the study of double Higgs production, with the two Higgs bosons subsequently decaying to pairs of taus and b-jets. Advanced multivariate analysis (MVA) techniques will be used to separate signal and background events. The candidate is expected to work within a team of researchers. Searches for new physics in this channel can be significantly improved with the additional data and with improved analysis techniques. |
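As an illustration of the kind of multivariate signal/background separation mentioned above, the sketch below trains a gradient-boosted classifier on toy events with hypothetical discriminating variables; it is not the CMS analysis code and the Gaussian toy distributions are assumptions for demonstration only.

# Toy MVA example: gradient-boosted classification of "signal" vs "background".
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
n = 20000
# Toy "signal": HH-like events clustered near the Higgs mass in both systems.
sig = rng.normal(loc=[125.0, 125.0, 0.8], scale=[15.0, 20.0, 0.1], size=(n, 3))
# Toy "background": broader continuum-like distributions.
bkg = rng.normal(loc=[100.0, 90.0, 0.5], scale=[40.0, 45.0, 0.2], size=(n, 3))

X = np.vstack([sig, bkg])
y = np.concatenate([np.ones(n), np.zeros(n)])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
clf.fit(X_tr, y_tr)
print(f"ROC AUC on the toy test set: {roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]):.3f}")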
Dark matter at the LHC
Domains: Theoretical Particle Physics | Experimental Particle Physics Supervisor: Rui Santos Co-Supervisor: Antonio Onofre Institution: Faculdade de Ciências - Universidade de Lisboa Host Institution: Faculdade de Ciências - Universidade de Lisboa
Abstract
The Higgs mechanism of mass generation acts as a unifying principle in model building. However, the number of models that could fit the Large Hadron Collider (LHC) data, and that are well motivated, is very large. Even setting aside theoretical issues related to fine-tuning, such as the so-called hierarchy problem, the "new" Standard Model (SM) has to, at least, be able to settle the most striking disagreements with experimental data. Urgent modifications to the SM are needed to obtain the right amount of observed cold dark matter and to accommodate the observed predominance of matter over anti-matter in the Universe, for which the amount of CP violation in the model must be increased. This work examines how possible extensions of the scalar sector of the SM that could solve these discrepancies can be tested at the LHC and future colliders through the Higgs sector and its connection with dark matter. The work will be performed in close collaboration with ATLAS experimentalists, to devise the best strategies for gaining sensitivity to all such models. |