Project acronym GWmining
Project Gravitational-wave data mining
Researcher (PI) Davide Gerosa
Host Institution (HI) THE UNIVERSITY OF BIRMINGHAM
Country United Kingdom
Call Details Starting Grant (StG), PE9, ERC-2020-STG
Summary Gravitational-wave astronomy is entering its large-statistics regime. Catalogs with thousands of gravitational-wave events will soon be available, providing a wealth of information on the most compact objects in the Universe: black holes and neutron stars. These new datasets demand new tools if they are to be exploited effectively and their scientific impact maximized.
GWmining is an ambitious program to explore upcoming gravitational-wave catalogs with data-mining techniques. We will develop a complete framework to analyze gravitational-wave data in light of astrophysical predictions. Going beyond phenomenological models, we will train machine-learning algorithms directly on large banks of population-synthesis simulations and post-Newtonian integrations. The development of these astrophysical predictions requires new modeling strategies to accurately capture all the gravitational-wave observables, notably spins and eccentricities.
Combined with a hierarchical Bayesian analysis, our neural network will deliver the most stringent measurements to date of elusive phenomena influencing the lives of massive stars. We will constrain processes such as common-envelope evolution, supernova kicks, stellar winds, and tidal interactions.
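The hierarchical Bayesian step described above can be illustrated with a minimal sketch. Everything here is a toy stand-in, not the project's pipeline: a truncated power-law mass model replaces the banks of population-synthesis simulations, the "catalog" is simulated, and selection effects are ignored.

```python
import numpy as np

rng = np.random.default_rng(0)

def pop_pdf(m, alpha, m_min=5.0, m_max=50.0):
    """Truncated power-law population model p(m | alpha) ~ m^-alpha."""
    m = np.asarray(m, dtype=float)
    norm = (m_max**(1 - alpha) - m_min**(1 - alpha)) / (1 - alpha)
    out = np.zeros_like(m)
    inside = (m >= m_min) & (m <= m_max)
    out[inside] = m[inside]**(-alpha) / norm
    return out

def draw_masses(alpha, n, m_min=5.0, m_max=50.0):
    """Inverse-CDF sampling from the truncated power law."""
    u = rng.uniform(size=n)
    return (m_min**(1 - alpha)
            + u * (m_max**(1 - alpha) - m_min**(1 - alpha)))**(1 / (1 - alpha))

# Toy catalog: 100 events with injected slope 2.3, each represented by
# "posterior samples" (the true mass blurred with Gaussian measurement noise).
true_masses = draw_masses(2.3, 100)
event_samples = [m + rng.normal(0.0, 1.0, size=500) for m in true_masses]

def log_hyperlikelihood(alpha):
    # Hierarchical step: average the population prior over each event's
    # posterior samples (flat sampling prior assumed; selection effects
    # are deliberately ignored in this sketch).
    logL = 0.0
    for s in event_samples:
        logL += np.log(np.mean(pop_pdf(s, alpha)) + 1e-300)
    return logL

alphas = np.linspace(1.5, 3.5, 81)
logLs = np.array([log_hyperlikelihood(a) for a in alphas])
alpha_hat = alphas[np.argmax(logLs)]
print(f"recovered slope alpha = {alpha_hat:.2f} (injected 2.3)")
```

In a full analysis the analytic `pop_pdf` would be replaced by an emulator (e.g. a neural network) trained on simulation banks, and a selection-function term would be added to the likelihood.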
Besides harnessing the catalog in its entirety, our framework will put us at the forefront of analyzing outliers: golden events with favorable values of one or more parameters. We will design a complete strategy to exploit the strongest signals and infer exquisite details of the relativistic dynamics of their sources.
GWmining is a unique project strategically placed at the intersection of astronomy, data analysis, and relativity. As the large-statistics revolution of gravitational-wave astronomy unfolds, GWmining will pioneer the application of data-mining techniques in gravitational-wave population studies, setting the foundations of this booming field for decades.
Max ERC Funding
1 499 917 €
Duration
Start date: 2021-10-01, End date: 2026-09-30
Project acronym KRANK
Project KilonovaRank: gravitational wave counterparts and exotic transients with next-generation surveys
Researcher (PI) Matt NICHOLL
Host Institution (HI) THE UNIVERSITY OF BIRMINGHAM
Country United Kingdom
Call Details Starting Grant (StG), PE9, ERC-2020-STG
Summary Time-domain astronomy will soon be transformed by powerful instrumentation: the Large Synoptic Survey Telescope (LSST) and upgraded gravitational wave detectors. We will finally be able to build large samples of rare and multi-messenger transients, allowing new scientific breakthroughs. But to do so, we must overcome the substantial difficulty of identifying the important events among an expected sea of contaminants. I will solve this problem by developing novel image classification techniques before LSST begins, and then using these techniques with LSST data to answer some of the most pressing questions about stellar evolution, nucleosynthesis, and high-energy physics. I will discover hundreds of superluminous supernovae (SLSNe), key to unknown physics in massive stars, and tidal disruption events (TDEs) of stars around supermassive black holes, probing black hole accretion in usually inaccessible regimes. By folding in the sky maps from GW detections of neutron star mergers, I will rapidly find kilonova counterparts in LSST follow-up searches. With 10-100 kilonovae (compared to the single well-studied event now), we will understand the nucleosynthesis of all heavy (r-process) elements, determine the equation of state of nuclear matter, and pin down what these mergers leave behind. Moreover, I will determine the progenitor stars and power source of SLSNe, and the emission mechanisms in TDEs and their relation to black hole mass, using even larger samples of those events. We may even find whole new transient classes. This project could not be more timely, as the upgrades in GW detectors, the start of LSST, and the brief era of JWST will overlap in just a few years (and within the time of an ERC-funded project). Everything is in place for success, from guaranteed data access and follow-up resources to public/prototype codes.
The legacy datasets for rare transient classes will dwarf those from pre-LSST, and our inference framework will lead to an unprecedented understanding of these populations.
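The counterpart-ranking idea, folding a GW sky map into a survey alert stream, can be sketched as follows. The flat-sky Gaussian "sky map", the candidate list, and the fade-rate cut are all hypothetical stand-ins for the real HEALPix localization maps and classifier outputs.

```python
import numpy as np

# Toy GW "sky map": a 2D Gaussian localization on a flat patch of sky.
# Real searches use HEALPix probability maps; this keeps the sketch simple.
def skymap_prob(ra, dec, ra0=150.0, dec0=20.0, sigma=3.0):
    return np.exp(-((ra - ra0)**2 + (dec - dec0)**2) / (2 * sigma**2))

# Hypothetical transient candidates from a survey alert stream:
# (name, ra [deg], dec [deg], fade rate [mag/day] -- kilonovae fade fast).
candidates = [
    ("cand-A", 151.2, 19.5, 0.9),
    ("cand-B", 140.0, 35.0, 0.1),   # far outside the localization
    ("cand-C", 149.1, 21.3, 1.2),
    ("cand-D", 150.5, 20.2, 0.05),  # inside, but fading too slowly
]

def score(ra, dec, fade_rate, min_fade=0.3):
    """Rank by sky-map probability, vetoing slow faders."""
    return skymap_prob(ra, dec) if fade_rate >= min_fade else 0.0

ranked = sorted(candidates, key=lambda c: -score(c[1], c[2], c[3]))
for name, ra, dec, fade in ranked:
    print(f"{name}: score = {score(ra, dec, fade):.3f}")
```

A production pipeline would replace the hard fade-rate cut with a trained classifier's kilonova probability and multiply it into the sky-map weight; the ranking logic is the same.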
Max ERC Funding
1 478 343 €
Duration
Start date: 2021-10-01, End date: 2026-09-30
Project acronym LensEra
Project The statistical era of strong gravitational lensing cosmology
Researcher (PI) Thomas COLLETT
Host Institution (HI) UNIVERSITY OF PORTSMOUTH HIGHER EDUCATION CORPORATION
Country United Kingdom
Call Details Starting Grant (StG), PE9, ERC-2020-STG
Summary Strong gravitational lensing is on the cusp of a new era. Strong lensing occurs when the mass of a galaxy deforms spacetime so much that multiple images of a single background source are observed. Strong lensing is a powerful cosmological probe, but it is hamstrung by sample size: currently we only know of a few hundred systems. I am leading the work on a new telescope, the Large Synoptic Survey Telescope, which in its first year will observe more of the optical Universe than all previous telescopes combined. This revolutionary dataset will transform strong lensing into a statistical science by enabling LensEra to discover 30,000 new lenses and exploit them for cosmology.
The first objective of LensEra uses machine learning, citizen science and automated lens modelling to build a discovery engine to find 30,000 new lenses in the Large Synoptic Survey Telescope data, including 600 lenses with multiple background sources, and 300 lensed supernovae. LensEra will then confirm these using the 4MOST multi-object spectroscopic survey and the robotic Liverpool Telescope.
The second objective of LensEra develops three new tests of cosmology using rare subsets of the lens population: measuring the expansion of the Universe with samples of lensed supernovae; measuring the equation of state of dark energy with lenses with multiple background sources; and testing the validity of General Relativity with lensing combined with stellar kinematics. The new sample from the first objective will allow LensEra to launch strong lensing cosmology into a new statistical age. Combining detailed analysis of ~200 golden lenses (the best LSST lensed supernovae and brightest double source plane lenses) with an automated modelling of the full sample will enable precise and accurate cosmological inference.
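A minimal sketch of how lensed supernovae probe expansion through time delays, under invented numbers: the redshifts, the measured delay, and the Fermat-potential difference below are illustrative only, and a flat-LambdaCDM distance integral stands in for a full lens-model inference.

```python
import numpy as np

C_KM_S = 299792.458       # speed of light [km/s]
ARCSEC = np.pi / (180 * 3600)
MPC_KM = 3.0857e19        # Mpc in km

def comoving_distance(z, H0, Om=0.3):
    """Comoving distance [Mpc] in flat LambdaCDM (trapezoid integral)."""
    zs = np.linspace(0.0, z, 2000)
    integrand = 1.0 / np.sqrt(Om * (1 + zs)**3 + (1.0 - Om))
    dz = zs[1] - zs[0]
    return (C_KM_S / H0) * np.sum(0.5 * (integrand[:-1] + integrand[1:]) * dz)

def time_delay_distance(zl, zs, H0, Om=0.3):
    """D_dt = (1+zl) * D_l * D_s / D_ls, angular diameter distances, flat universe."""
    Dc_l, Dc_s = comoving_distance(zl, H0, Om), comoving_distance(zs, H0, Om)
    Dl, Ds = Dc_l / (1 + zl), Dc_s / (1 + zs)
    Dls = (Dc_s - Dc_l) / (1 + zs)   # flat-universe relation
    return (1 + zl) * Dl * Ds / Dls

# Hypothetical measurement: delay between two images of a lensed SN, plus
# the Fermat-potential difference from an (assumed) lens model.
zl, zs = 0.5, 1.5
dt_days = 30.0
dphi = 0.32               # arcsec^2, invented for the example
dt_sec = dt_days * 86400.0

# Delta t = D_dt * Delta phi / c, and D_dt scales as 1/H0,
# so a scan over trial H0 values pins down the expansion rate.
def predicted_dt(H0):
    Ddt_km = time_delay_distance(zl, zs, H0) * MPC_KM
    return Ddt_km * dphi * ARCSEC**2 / C_KM_S

H0_grid = np.linspace(50, 100, 501)
H0_hat = H0_grid[np.argmin(np.abs([predicted_dt(h) - dt_sec for h in H0_grid]))]
print(f"inferred H0 = {H0_hat:.1f} km/s/Mpc")
```

The key design point is that the time-delay distance depends on cosmology almost entirely through 1/H0, which is what makes lensed transients a direct expansion-rate probe.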
Max ERC Funding
1 794 707 €
Duration
Start date: 2021-08-01, End date: 2026-07-31
Project acronym MapItAll
Project Illuminating the darkness with precision maps of neutral hydrogen across cosmic time
Researcher (PI) Philip Bull
Host Institution (HI) QUEEN MARY UNIVERSITY OF LONDON
Country United Kingdom
Call Details Starting Grant (StG), PE9, ERC-2020-STG
Summary My proposal is to map out the 3D structure of the Universe over an unprecedentedly broad swath of cosmic time, covering 13 billion years of cosmic history. I will do this by using radio telescopes to detect the 21cm emission from neutral hydrogen. The detailed statistical properties of the maps will allow us to answer some of the most pressing questions in cosmology, such as how fast space is expanding, what the physical properties of dark energy are, and how the first stars and galaxies lit up the Universe.
All experiments currently trying to make these observations are severely limited by systematic effects, exacerbated by the extremely high dynamic range between the cosmological signal and many other sources of radio emission. Even tiny calibration errors can cause huge artefacts in the data that make it extremely difficult to pick out the target signal. While a great deal of work has gone into designing methods to analyse the data, they are not yet accurate enough – by a factor of 100 by some measures.
I will develop a statistical analysis framework called “Total Calibration” that can deliver the remaining two orders of magnitude of improvement, and apply it to the most sensitive data available. The result will be precise, systematics-free maps and the most robust statistical measurements of large-scale structure ever made in the radio. Total Calibration seeks to model all of the relevant degrees of freedom in the data simultaneously, in one large global model of the signal, contaminants, and the calibration of the telescope. This is highly complex, and has never been done before.
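A toy illustration of the joint-fit idea, not the actual Total Calibration framework: a smooth foreground, a small signal template, and a gain drift are fitted simultaneously in one linear system, rather than calibrating first and subtracting foregrounds afterwards. All shapes and amplitudes are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(1)
freqs = np.linspace(100.0, 200.0, 200)  # MHz
x = freqs / 150.0

# Simulated data: bright smooth foreground + faint 21cm-like ripple,
# multiplied by a small instrumental gain drift, plus noise.
foreground = 1e3 * x**-2.5
signal = 0.5 * np.sin(2 * np.pi * freqs / 10.0)   # stand-in for the 21cm signal
gain_true = 1.0 + 0.01 * (x - 1.0)                # small linear gain error
data = gain_true * (foreground + signal) + rng.normal(0, 0.1, freqs.size)

# "Global model": solve for the foreground amplitude, the gain-drift
# coupling, and the signal amplitude in ONE linear system.
basis = np.column_stack([
    x**-2.5,                           # foreground shape
    (x - 1.0) * x**-2.5,               # gain drift coupled to the foreground
    np.sin(2 * np.pi * freqs / 10.0),  # signal template
])
coeffs, *_ = np.linalg.lstsq(basis, data, rcond=None)
print(f"recovered signal amplitude: {coeffs[2]:.3f} (injected 0.5)")
```

The point of the joint solve is that the gain error, three orders of magnitude larger than the signal, is absorbed by its own degree of freedom instead of leaking into the signal estimate.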
By applying Total Calibration to sensitive but complex data from two cutting-edge telescopes, HERA and MeerKAT, I will obtain the most robust constraints on the 21cm signal to date, from redshifts 0–1.4 (late times) and 5–27 (reionisation/Cosmic Dawn), to constrain the physical processes that shaped the cosmic energy budget at high redshift and any possible evolution of dark energy.
Max ERC Funding
1 665 802 €
Duration
Start date: 2021-01-01, End date: 2025-12-31
Project acronym SHADE
Project Statistical Host Identification As a Test of Dark Energy
Researcher (PI) Theresa Baker
Host Institution (HI) QUEEN MARY UNIVERSITY OF LONDON
Country United Kingdom
Call Details Starting Grant (StG), PE9, ERC-2020-STG
Summary The past four years have witnessed dramatic discoveries surrounding the birth of gravitational wave astronomy. By their nature, gravitational waves are ideal probes with which to test the laws of gravity – something currently under scrutiny due to unresolved questions about the dark sector of the universe. In this proposal I lay out an ambitious campaign to determine the behaviour of gravity over cosmological distances, using the upcoming surge of gravitational wave data. I will achieve this by developing the burgeoning technique of 'Statistical Host Identification' of gravitational wave sources. This new tool will enable me to test gravity using hundreds of future detections of binary black holes at high redshifts, even without direct redshift information – thus removing a major obstacle for gravitational wave cosmology.
I will phrase my constraints in terms of model-independent parameters that quantify physically viable deviations from General Relativity, making my results applicable to virtually any dark energy or extended gravity model. In this way, I can validate or eliminate the space of theories in current literature.
To model the distribution of gravitational wave events and their host galaxies, I will construct an approximate simulation that operates with generalised, model-independent gravitational laws – the first ever simulation to do this. This tool enables me to additionally use information about gravity from non-linear scales of cosmological structure. This regime is virtually untouched by current comparable work, and is a prime target for the next generation of galaxy surveys.
My key objectives are: i) to develop the calculations and software tools needed to apply gravitational wave Statistical Host Identification in theories of gravity beyond General Relativity; ii) to use these tools to obtain powerful new constraints on extended gravity models, thereby confirming or ruling out a leading candidate explanation for the nature of dark energy.
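The statistical host identification technique can be sketched in simplified form: without a counterpart, each event's likelihood is marginalised over every candidate host galaxy in its localization volume, and the true hosts add up coherently across events. The catalog below is simulated, a low-redshift Hubble law replaces the full cosmological (or modified-gravity) model, and selection effects and galaxy weights are ignored.

```python
import numpy as np

rng = np.random.default_rng(2)
C_KM_S = 299792.458
H0_TRUE = 70.0

# Simulated "dark siren" catalog: each GW event has a measured luminosity
# distance (no EM counterpart) and candidate host redshifts from a galaxy
# survey covering its sky localization.
n_events = 50
events = []
for _ in range(n_events):
    z_host = rng.uniform(0.02, 0.15)                         # true host
    d_obs = C_KM_S * z_host / H0_TRUE * (1 + 0.1 * rng.normal())
    z_cands = np.append(rng.uniform(0.02, 0.15, 9), z_host)  # + 9 interlopers
    events.append((d_obs, z_cands))

def log_like(H0):
    # Marginalise over candidate hosts: equal prior weight per galaxy,
    # Gaussian 10% distance uncertainty, Hubble law d = c z / H0.
    logL = 0.0
    for d_obs, z_cands in events:
        d_pred = C_KM_S * z_cands / H0
        sigma = 0.1 * d_obs
        L = np.mean(np.exp(-0.5 * ((d_obs - d_pred) / sigma)**2) / sigma)
        logL += np.log(L + 1e-300)
    return logL

H0_grid = np.linspace(50, 100, 101)
logLs = np.array([log_like(h) for h in H0_grid])
H0_hat = H0_grid[np.argmax(logLs)]
print(f"recovered H0 = {H0_hat:.1f} km/s/Mpc (injected {H0_TRUE})")
```

The same marginalisation structure carries over to testing gravity: replacing the Hubble-law distance with a modified gravitational-wave distance turns the recovered parameter from H0 into a constraint on deviations from General Relativity.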
Max ERC Funding
1 497 672 €
Duration
Start date: 2021-02-01, End date: 2026-01-31