Project acronym 321
Project from Cubic To Linear complexity in computational electromagnetics
Researcher (PI) Francesco Paolo ANDRIULLI
Host Institution (HI) POLITECNICO DI TORINO
Call Details Consolidator Grant (CoG), PE7, ERC-2016-COG
Summary Computational Electromagnetics (CEM) is the scientific field at the origin of the new modeling and simulation tools required by the constantly arising design challenges of emerging and future technologies in applied electromagnetics. As in many other technological fields, however, the trend in electromagnetic engineering is towards miniaturized, higher-density and multi-scale scenarios. Computationally, this translates into a steep increase in the number of degrees of freedom. Given that the design cost (the cost of a multi-right-hand-side problem dominated by matrix inversion) can scale as badly as cubically with these degrees of freedom, this trend, as many have pointed out, will severely compromise the practical impact of CEM on future and emerging technologies.
For this reason, the CEM scientific community has for years been looking for an FFT-like paradigm shift: a dynamic fast direct solver providing a design cost that scales only linearly with the degrees of freedom. Such a fast solver is today considered a Holy Grail of the discipline.
The Grand Challenge of 321 will be to tackle this Holy Grail in Computational Electromagnetics by investigating a dynamic Fast Direct Solver for Maxwell Problems that would run with linear instead of cubic complexity for an arbitrary number and configuration of degrees of freedom.
The failure of all previous attempts will be overcome by a game-changing transformation of the classical CEM problem that leverages a recent breakthrough by the PI. Building on this, the project will investigate an entirely new algorithmic paradigm to achieve this grand challenge.
The impact of the FFT’s quadratic-to-linear paradigm shift shows how groundbreaking computational complexity reductions can be for applications. The cubic-to-linear paradigm shift at which the 321 project aims will have a similarly disruptive impact on electromagnetic science and technology.
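To make the scaling gap concrete, a back-of-the-envelope comparison (illustrative numbers, not taken from the proposal): the design cost of a dense direct solve versus the complexity 321 targets, set against the analogous gap the FFT closed for the discrete Fourier transform.

```latex
\[
\underbrace{\mathcal{O}(N^3)}_{\text{dense direct solve}}
\;\longrightarrow\;
\underbrace{\mathcal{O}(N)}_{\text{321 target}}
\qquad\text{vs.}\qquad
\underbrace{\mathcal{O}(N^2)}_{\text{naive DFT}}
\;\longrightarrow\;
\underbrace{\mathcal{O}(N \log N)}_{\text{FFT}}
\]
```

For N = 10^6 degrees of freedom, the cubic cost exceeds the linear one by twelve orders of magnitude, which is why a fast direct solver is described above as a Holy Grail.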
Max ERC Funding
2 000 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym 3D-QUEST
Project 3D-Quantum Integrated Optical Simulation
Researcher (PI) Fabio Sciarrino
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Starting Grant (StG), PE2, ERC-2012-StG_20111012
Summary "Quantum information was born from the merging of classical information and quantum physics. Its main objective consists of understanding the quantum nature of information and learning how to process it by using physical systems which operate by following quantum mechanics laws. Quantum simulation is a fundamental instrument to investigate phenomena of quantum systems dynamics, such as quantum transport, particle localizations and energy transfer, quantum-to-classical transition, and even quantum improved computation, all tasks that are hard to simulate with classical approaches. Within this framework integrated photonic circuits have a strong potential to realize quantum information processing by optical systems.
The aim of 3D-QUEST is to develop and implement quantum simulation by exploiting 3-dimensional integrated photonic circuits. 3D-QUEST is structured to demonstrate the potential of linear optics to implement a computational power beyond the one of a classical computer. Such ""hard-to-simulate"" scenario is disclosed when multiphoton-multimode platforms are realized. The 3D-QUEST research program will focus on three tasks of growing difficulty.
A-1. To simulate bosonic-fermionic dynamics with integrated optical systems acting on 2 photon entangled states.
A-2. To pave the way towards hard-to-simulate, scalable quantum linear optical circuits by investigating m-port interferometers acting on n-photon states with n>2.
A-3. To exploit 3-dimensional integrated structures for the observation of new quantum optical phenomena and for the quantum simulation of more complex scenarios.
3D-QUEST will exploit the potential of femtosecond laser writing of integrated waveguides. This technique will be adopted to realize 3-dimensional capabilities and high flexibility, thereby bringing optical quantum simulation into a new regime.
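The classical hardness invoked above can be made precise: the output amplitudes of n photons in an m-port linear interferometer are permanents of n-by-n submatrices of the circuit's unitary, and computing permanents is #P-hard. A minimal sketch of Ryser's formula (illustrative only; its 2^n cost is intrinsic, not an implementation flaw):

```python
from itertools import combinations
import numpy as np

def permanent(a: np.ndarray) -> complex:
    """Matrix permanent via Ryser's formula, O(2^n * n^2) time.

    Unlike the determinant, no polynomial-time algorithm is known, which
    underlies the hardness of simulating multiphoton linear optics.
    """
    n = a.shape[0]
    total = 0.0
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            row_sums = a[:, list(cols)].sum(axis=1)
            total += (-1) ** k * np.prod(row_sums)
    return (-1) ** n * total

# Example: a 3x3 submatrix of a random 5x5 unitary (a random 5-port circuit).
rng = np.random.default_rng(0)
u = np.linalg.qr(rng.normal(size=(5, 5)) + 1j * rng.normal(size=(5, 5)))[0]
print(permanent(u[:3, :3]))  # feasible for n = 3, hopeless for n ~ 50
```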
Max ERC Funding
1 474 800 €
Duration
Start date: 2012-08-01, End date: 2017-07-31
Project acronym 3DSPIN
Project 3-Dimensional Maps of the Spinning Nucleon
Researcher (PI) Alessandro Bacchetta
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PAVIA
Call Details Consolidator Grant (CoG), PE2, ERC-2014-CoG
Summary What does the inside of the proton look like? What generates its spin?
3DSPIN will deliver essential information to answer these questions at the frontier of subnuclear physics.
At present, we have detailed maps of the distribution of quarks and gluons in the nucleon in 1D (as a function of their momentum in a single direction). We also know that quark spins account for only about 1/3 of the spin of the nucleon.
3DSPIN will lead the way into a new stage of nucleon mapping: it will explore the distribution of quarks in full 3D momentum space and obtain unprecedented information on orbital angular momentum.
Goals
1. extract from experimental data the 3D distribution of quarks (in momentum space), as described by Transverse-Momentum Distributions (TMDs);
2. obtain from TMDs information on quark Orbital Angular Momentum (OAM).
Methodology
3DSPIN will implement state-of-the-art fitting procedures to analyze relevant experimental data and extract quark TMDs, similarly to global fits of standard parton distribution functions. Information about quark angular momentum will be obtained through assumptions based on theoretical considerations. The next five years represent an ideal time window to accomplish our goals, thanks to the wealth of expected data from deep-inelastic scattering experiments (COMPASS, Jefferson Lab), hadronic colliders (Fermilab, BNL, LHC), and electron-positron colliders (BELLE, BABAR). The PI has a strong reputation in this field. The group will operate in partnership with the Italian National Institute of Nuclear Physics and in close interaction with leading experts and experimental collaborations worldwide.
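Schematically, a "global fit" of the kind described above means minimizing a single chi-square over all datasets simultaneously with one shared set of TMD parameters. A toy sketch under strong assumptions (a Gaussian transverse-momentum ansatz, a common first approximation, and invented data):

```python
import numpy as np
from scipy.optimize import minimize

# Toy TMD ansatz: collinear factor times a Gaussian kT profile of width <kT^2>.
def tmd(x, kt, width):
    return x ** -0.5 * (1 - x) ** 3 * np.exp(-kt ** 2 / width) / (np.pi * width)

# Invented "measurements" from two experiments sharing the same true width 0.4.
rng = np.random.default_rng(1)
datasets = []
for _ in range(2):
    x, kt = rng.uniform(0.05, 0.5, 30), rng.uniform(0.1, 1.0, 30)
    y = tmd(x, kt, 0.4)
    err = 0.05 * y
    datasets.append((x, kt, y + rng.normal(0, err), err))

# One chi-square summed over all datasets: the essence of a global fit.
def chi2(params):
    (width,) = params
    return sum(np.sum(((tmd(x, kt, width) - y) / e) ** 2)
               for x, kt, y, e in datasets)

print(minimize(chi2, x0=[0.2], bounds=[(0.01, 2.0)]).x)  # recovers ~0.4
```

Real TMD extractions replace the toy ansatz with QCD-evolved parametrizations and the invented data with the COMPASS, Jefferson Lab, collider and electron-positron measurements listed above.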
Impact
Mapping the 3D structure of chemical compounds has revolutionized chemistry. Similarly, mapping the 3D structure of the nucleon will have a deep impact on our understanding of the fundamental constituents of matter. We will open new perspectives on the dynamics of quarks and gluons and sharpen our view of high-energy processes involving nucleons.
Max ERC Funding
1 509 000 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym 4DPHOTON
Project Beyond Light Imaging: High-Rate Single-Photon Detection in Four Dimensions
Researcher (PI) Massimiliano FIORINI
Host Institution (HI) ISTITUTO NAZIONALE DI FISICA NUCLEARE
Call Details Consolidator Grant (CoG), PE2, ERC-2018-COG
Summary The goal of the 4DPHOTON project is the development and construction of a photon imaging detector with unprecedented performance. The proposed device will be capable of detecting fluxes of single photons of up to one billion photons per second over areas of several square centimetres, and will measure, for each photon, position and time simultaneously with resolutions better than ten microns and a few tens of picoseconds, respectively. These figures of merit will open up many important applications, allowing significant advances in particle physics, the life sciences and other emerging fields where excellent timing and position resolutions are required simultaneously.
Our goal will be achieved through the use of an application-specific integrated circuit in 65 nm complementary metal-oxide-semiconductor (CMOS) technology that will deliver a timing resolution of a few tens of picoseconds at the pixel level over a few hundred thousand individually active pixel channels, allowing very high photon rates to be detected and the corresponding information to be digitized and transferred to a processing unit.
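Putting the quoted figures side by side (illustrative arithmetic only, assuming hits spread roughly uniformly over the stated number of channels):

```python
# Rough occupancy estimate from the figures of merit quoted above.
total_rate = 1e9        # photons per second over the full detector
n_pixels = 3e5          # "a few hundred thousand" active pixel channels
timing_res = 30e-12     # "a few tens of picoseconds"

per_pixel_rate = total_rate / n_pixels   # ~3300 hits/s per pixel
mean_gap = 1.0 / per_pixel_rate          # ~300 microseconds between hits

print(f"{per_pixel_rate:.0f} hits/s/pixel, gap/timing ~ {mean_gap / timing_res:.1e}")
# The mean per-pixel gap exceeds the timing resolution by roughly seven orders
# of magnitude, which is what makes unambiguous 4D tagging of each photon viable.
```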
As a result of the 4DPHOTON project we will remove the constraints that many light-imaging applications face due to the lack of precise single-photon information in four dimensions (4D): the three spatial coordinates and time, simultaneously. In particular, we will prove the performance of this detector in the field of particle physics, performing the reconstruction of Cherenkov photon rings with a timing resolution of ten picoseconds. With its excellent granularity, timing resolution, rate capability and compactness, this detector will represent a new paradigm for the realisation of future Ring Imaging Cherenkov detectors, capable of achieving high-efficiency particle identification in environments with very high particle multiplicities by exploiting time-association of the photon hits.
Max ERC Funding
1 975 000 €
Duration
Start date: 2019-12-01, End date: 2024-11-30
Project acronym A-BINGOS
Project Accreting binary populations in Nearby Galaxies: Observations and Simulations
Researcher (PI) Andreas Zezas
Host Institution (HI) IDRYMA TECHNOLOGIAS KAI EREVNAS
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary "High-energy observations of our Galaxy offer a good, albeit not complete, picture of the X-ray source populations, in particular the accreting binary sources. Recent ability to study accreting binaries in nearby galaxies has shown that we would be short-sighted if we restricted ourselves to our Galaxy or to a few nearby ones. I propose an ambitious project that involves a comprehensive study of all the galaxies within 10 Mpc for which we can study in detail their X-ray sources and stellar populations. The study will combine data from a unique suite of observatories (Chandra, XMM-Newton, HST, Spitzer) with state-of-the-art theoretical modelling of binary systems. I propose a novel approach that links the accreting binary populations to their parent stellar populations and surpasses any current studies of X-ray binary populations, both in scale and in scope, by: (a) combining methods and results from several different areas of astrophysics (compact objects, binary systems, stellar populations, galaxy evolution); (b) using data from almost the whole electromagnetic spectrum (infrared to X-ray bands); (c) identifying and studying the different sub-populations of accreting binaries; and (d) performing direct comparison between observations and theoretical predictions, over a broad parameter space. The project: (a) will answer the long-standing question of the formation efficiency of accreting binaries in different environments; and (b) will constrain their evolutionary paths. As by-products the project will provide eagerly awaited input to the fields of gravitational-wave sources, γ-ray bursts, and X-ray emitting galaxies at cosmological distances and it will produce a heritage multi-wavelength dataset and library of models for future studies of galaxies and accreting binaries."
Max ERC Funding
1 242 000 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym ADAPTIVES
Project Algorithmic Development and Analysis of Pioneer Techniques for Imaging with waVES
Researcher (PI) Chrysoula Tsogka
Host Institution (HI) IDRYMA TECHNOLOGIAS KAI EREVNAS
Call Details Starting Grant (StG), PE1, ERC-2009-StG
Summary The proposed work concerns the theoretical and numerical development of robust and adaptive methodologies for broadband imaging in clutter. The word clutter expresses our uncertainty about the wave speed of the propagation medium. Our results are expected to have a strong impact on a wide range of applications, including underwater acoustics, exploration geophysics and ultrasonic non-destructive testing. Our machinery is coherent interferometry (CINT), a state-of-the-art statistically stable imaging methodology, highly suitable for the development of imaging methods in clutter. We aim to extend CINT along two complementary directions: novel types of applications, and further mathematical and numerical development so as to assess and extend its range of applicability. CINT is designed for imaging with partially coherent array data recorded in richly scattering media. It uses statistical smoothing techniques to obtain results that are independent of the clutter realization. Quantifying the amount of smoothing needed is difficult, especially when there is no a priori knowledge about the propagation medium. We intend to address this question by coupling the imaging process with the estimation of the medium's large-scale features. Our algorithms rely on the residual coherence in the data. When the coherent signal is too weak, the CINT results are unsatisfactory. We propose two ways of enhancing the resolution of CINT: filtering the data prior to imaging (noise reduction) and waveform design (optimizing the source distribution). Finally, we propose to extend the applicability of our imaging-in-clutter methodologies by investigating the possibility of utilizing ambient noise sources to perform passive sensor imaging, as well as by studying the imaging problem in random waveguides.
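Schematically, and hedged as a simplified form (conventions vary across the CINT literature): where Kirchhoff migration backpropagates the Fourier-transformed array traces P̂(ω, x_r) directly, CINT migrates their local cross-correlations, restricted to nearby frequencies and nearby receivers:

```latex
\[
I^{\mathrm{CINT}}(\mathbf{y}) \;=\;
\sum_{\substack{r,\,r' \\ |\mathbf{x}_r - \mathbf{x}_{r'}| \le X_d}}
\;\iint\limits_{|\omega - \omega'| \le \Omega_d}
\hat{P}(\omega, \mathbf{x}_r)\,
\overline{\hat{P}(\omega', \mathbf{x}_{r'})}\,
e^{-i\,[\,\omega\,\tau(\mathbf{x}_r,\mathbf{y}) - \omega'\,\tau(\mathbf{x}_{r'},\mathbf{y})\,]}
\,d\omega\,d\omega'
\]
```

Here τ(x_r, y) are travel times to the search point y, and the thresholds Ω_d and X_d (the clutter's decoherence frequency and decoherence length) are precisely the "amount of smoothing" whose adaptive estimation is discussed above.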
Max ERC Funding
690 000 €
Duration
Start date: 2010-06-01, End date: 2015-11-30
Project acronym AFRICA-GHG
Project AFRICA-GHG: The role of African tropical forests on the Greenhouse Gases balance of the atmosphere
Researcher (PI) Riccardo Valentini
Host Institution (HI) FONDAZIONE CENTRO EURO-MEDITERRANEOSUI CAMBIAMENTI CLIMATICI
Call Details Advanced Grant (AdG), PE10, ERC-2009-AdG
Summary The role of the African continent in the global carbon cycle, and therefore in climate change, is increasingly recognised. Despite the acknowledged importance of Africa in the global carbon cycle and its high vulnerability to climate change, there is still a lack of studies on the carbon cycle in representative African ecosystems (in particular tropical forests) and on the effects of climate on ecosystem-atmosphere exchange. In the present proposal we focus on these specific objectives: 1. Understand the role of African tropical rainforests in the GHG balance of the atmosphere and revise their role in global methane and N2O emissions. 2. Determine the carbon source/sink strength of African tropical rainforests in the pre-industrial era versus the 20th century by temporal reconstruction of biomass growth with biogeochemical markers. 3. Understand and quantify the variability of carbon and GHG fluxes across African tropical forests (west-east equatorial belt). 4. Analyse the impact of forest degradation and deforestation on carbon and other GHG emissions.
Max ERC Funding
2 406 950 €
Duration
Start date: 2010-04-01, End date: 2014-12-31
Project acronym AGEnTh
Project Atomic Gauge and Entanglement Theories
Researcher (PI) Marcello DALMONTE
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Starting Grant (StG), PE2, ERC-2017-STG
Summary AGEnTh is an interdisciplinary proposal which aims to theoretically investigate atomic many-body systems (cold atoms and trapped ions) in close connection with concepts from quantum information, condensed matter, and high-energy physics. The main goals of this programme are to:
I) Find scalable schemes for the measurement of entanglement properties, and in particular entanglement spectra, by proposing a paradigm shift that accesses entanglement through entanglement Hamiltonians and field theories instead of probing density matrices;
II) Show how atomic gauge theories (including dynamical gauge fields) are ideal candidates for the realization of long-sought, highly-entangled states of matter, in particular topological superconductors supporting parafermion edge modes, and novel classes of quantum spin liquids emerging from clustering;
III) Develop new implementation strategies for the realization of gauge symmetries of paramount importance, such as discrete and SU(N)×SU(2)×U(1) groups, and establish a theoretical framework for the understanding of atomic physics experiments within the light-from-chaos scenario pioneered in particle physics.
These objectives are at the cutting edge of fundamental science, and represent a coherent effort aimed at reaching unprecedented regimes of strongly interacting quantum matter by addressing the basic aspects of probing, many-body physics, and implementations. The results are expected to (i) build up and establish qualitatively new synergies between the aforementioned communities, and (ii) stimulate an intense theoretical and experimental activity focused on both entanglement and atomic gauge theories.
To achieve these goals, AGEnTh builds (1) on my background working at the interface between atomic physics and quantum optics on one side and many-body theory on the other, and (2) on exploratory studies which I carried out to mitigate the conceptual risks associated with its high-risk/high-gain goals.
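A minimal numerical illustration of the objects in goal I (a generic sketch, not the project's method): for a pure state of a bipartite system, the entanglement spectrum is the set of eigenvalues of the reduced density matrix rho_A, and the entanglement Hamiltonian is H_A = -log(rho_A).

```python
import numpy as np

# Two qubits in a partially entangled pure state |psi> = a|00> + b|11>.
a, b = np.sqrt(0.8), np.sqrt(0.2)
psi = np.zeros(4)
psi[0], psi[3] = a, b                      # basis order: |00>, |01>, |10>, |11>

# Reduced density matrix of subsystem A: partial trace over the second qubit.
rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)
rho_A = np.einsum('ibjb->ij', rho)

evals = np.linalg.eigvalsh(rho_A)          # entanglement spectrum: [0.2, 0.8]
entropy = -np.sum(evals * np.log(evals))   # von Neumann entanglement entropy
h_A_spectrum = -np.log(evals)              # spectrum of H_A = -log(rho_A)
print(evals, entropy, h_A_spectrum)
```

Goal I targets schemes that access this spectrum scalably in experiments, where the brute-force state reconstruction implicit in this sketch is unavailable.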
Max ERC Funding
1 055 317 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym AIDA
Project An Illumination of the Dark Ages: modeling reionization and interpreting observations
Researcher (PI) Andrei Albert Mesinger
Host Institution (HI) SCUOLA NORMALE SUPERIORE
Call Details Starting Grant (StG), PE9, ERC-2014-STG
Summary "Understanding the dawn of the first galaxies and how their light permeated the early Universe is at the very frontier of modern astrophysical cosmology. Generous resources, including ambitions observational programs, are being devoted to studying these epochs of Cosmic Dawn (CD) and Reionization (EoR). In order to interpret these observations, we propose to build on our widely-used, semi-numeric simulation tool, 21cmFAST, and apply it to observations. Using sub-grid, semi-analytic models, we will incorporate additional physical processes governing the evolution of sources and sinks of ionizing photons. The resulting state-of-the-art simulations will be well poised to interpret topical observations of quasar spectra and the cosmic 21cm signal. They would be both physically-motivated and fast, allowing us to rapidly explore astrophysical parameter space. We will statistically quantify the resulting degeneracies and constraints, providing a robust answer to the question, ""What can we learn from EoR/CD observations?"" As an end goal, these investigations will help us understand when the first generations of galaxies formed, how they drove the EoR, and what are the associated large-scale observational signatures."
Max ERC Funding
1 468 750 €
Duration
Start date: 2015-05-01, End date: 2021-01-31
Project acronym AISENS
Project New generation of high sensitive atom interferometers
Researcher (PI) Marco Fattori
Host Institution (HI) CONSIGLIO NAZIONALE DELLE RICERCHE
Call Details Starting Grant (StG), PE2, ERC-2010-StG_20091028
Summary Interferometers are fundamental tools for the study of the laws of nature and for the precise measurement and control of the physical world. Over the last century, scientific and technological progress has proceeded in parallel with a constant improvement in interferometric performance. For this reason, the challenge of conceiving and realizing new generations of interferometers with broader ranges of operation and higher sensitivities remains open and timely.
Although the introduction of laser devices has deeply improved the way interferometric measurements with light are developed and performed, the atomic matter-wave analogue, i.e. the Bose-Einstein condensate (BEC), has not yet triggered a comparable revolution in precision interferometry. However, thanks to recent improvements in the control of the quantum properties of ultra-cold atomic gases, and to new original ideas on the creation and manipulation of entangled quantum particles, the field of atom interferometry is now mature enough to take a big step forward.
The system I want to realize is a Mach-Zehnder spatial interferometer operating with trapped BECs. Undesired decoherence sources will be suppressed by implementing BECs with tunable interactions in ultra-stable optical potentials. Entangled states will be used to improve the sensitivity of the sensor beyond the standard quantum limit, ideally reaching the ultimate Heisenberg limit set by quantum mechanics. The resulting apparatus will show unprecedented spatial resolution and will outperform state-of-the-art interferometers based on cold (non-condensed) atomic gases.
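The gain at stake can be stated compactly: for N uncorrelated atoms the phase uncertainty of an interferometer is bounded by the standard quantum limit (SQL), while suitably entangled input states allow it to approach the Heisenberg limit (HL):

```latex
\[
\Delta\phi_{\mathrm{SQL}} \sim \frac{1}{\sqrt{N}}
\qquad\longrightarrow\qquad
\Delta\phi_{\mathrm{HL}} \sim \frac{1}{N}
\]
```

For a condensate of N ~ 10^6 atoms this is, in principle, a thousand-fold improvement in phase sensitivity, which is why entangled states are central to the proposal.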
A successful completion of this project will lead to a new generation of interferometers for the immediate application to local inertial measurements with unprecedented resolution. In addition, we expect to develop experimental capabilities which might find application well beyond quantum interferometry and crucially contribute to the broader emerging field of quantum-enhanced technologies.
Max ERC Funding
1 068 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym AMDROMA
Project Algorithmic and Mechanism Design Research in Online MArkets
Researcher (PI) Stefano LEONARDI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Online markets currently form an important share of the global economy. The Internet hosts classical markets (real estate, stocks, e-commerce) as well as new markets with previously unknown features (web-based advertisement, viral marketing, digital goods, crowdsourcing, the sharing economy). Algorithms play a central role in many of the decision processes involved in online markets. For example, algorithms run electronic auctions, trade stocks, adjust prices dynamically, and harvest big data to provide economic information. Thus, it is of paramount importance to understand the algorithmic and mechanism design foundations of online markets.
The algorithmic research issues that we consider involve algorithmic mechanism design, online and approximation algorithms, modelling uncertainty in online market design, and large-scale data analysis. The aim of this research project is to combine these fields to address research questions that are central to today's Internet economy. We plan to apply these techniques to solve fundamental algorithmic problems motivated by Internet advertisement, the sharing economy, and online labour marketplaces. While my planned research is centered on foundational work with rigorous design and analysis of algorithms and mechanisms, it will also include, as an important component, empirical validation on large-scale real-life datasets.
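As a minimal example of what "mechanism design" means here (the textbook single-item Vickrey auction, not a mechanism proposed by AMDROMA): charging the winner the second-highest bid makes truthful bidding a dominant strategy.

```python
def vickrey_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Single-item second-price (Vickrey) auction.

    The highest bidder wins but pays only the second-highest bid, so no
    bidder can gain by misreporting their true value: the classic example
    of a truthful (incentive-compatible) mechanism.
    """
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

print(vickrey_auction({"alice": 10.0, "bob": 8.0, "carol": 9.5}))
# ('alice', 9.5): alice wins and pays carol's bid, not her own.
```

The research programme above concerns far richer settings (combinatorial, online, and uncertain), but the incentive-compatibility question this example isolates is the common thread.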
Max ERC Funding
1 780 150 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym ANTEGEFI
Project Analytic Techniques for Geometric and Functional Inequalities
Researcher (PI) Nicola Fusco
Host Institution (HI) UNIVERSITA DEGLI STUDI DI NAPOLI FEDERICO II
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary Isoperimetric and Sobolev inequalities are the best-known examples of geometric-functional inequalities. In recent years the PI and collaborators have obtained new and sharp quantitative versions of these and other important related inequalities. These results have been obtained by the combined use of classical symmetrization methods, new tools coming from mass transportation theory, deep geometric measure tools and ad hoc symmetrizations. The objective of this project is to further develop these techniques in order to obtain: sharp quantitative versions of the Faber-Krahn inequality, the Gaussian isoperimetric inequality, the Brunn-Minkowski inequality, and Poincaré and logarithmic Sobolev inequalities; and sharp decay rates for the quantitative Sobolev inequalities and the Pólya-Szegő inequality.
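A representative example of what a "sharp quantitative version" means here is the quantitative isoperimetric inequality (stated up to normalization conventions): the isoperimetric deficit of a set controls the square of its distance from a ball, and the exponent 2 is sharp.

```latex
\[
\lambda(E)^2 \;\le\; C(n)\, D(E),
\qquad
D(E) = \frac{P(E) - P(B_E)}{P(B_E)},
\qquad
\lambda(E) = \min_{x \in \mathbb{R}^n} \frac{|E \,\triangle\, (x + B_E)|}{|E|}
\]
```

Here B_E is a ball with |B_E| = |E|, P denotes perimeter, D(E) the isoperimetric deficit, and λ(E) the Fraenkel asymmetry; the classical isoperimetric inequality is the statement D(E) ≥ 0, with equality only for balls.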
Max ERC Funding
600 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym ARMOS
Project Advanced multifunctional Reactors for green Mobility and Solar fuels
Researcher (PI) Athanasios Konstandopoulos
Host Institution (HI) ETHNIKO KENTRO EREVNAS KAI TECHNOLOGIKIS ANAPTYXIS
Call Details Advanced Grant (AdG), PE8, ERC-2010-AdG_20100224
Summary Green Mobility requires an integrated approach to the fuel/engine/emissions chain. The present project aims at ground-breaking advances in the area of Green Mobility by (a) enabling the production of affordable, carbon-neutral, clean solar fuels using exclusively renewable/recyclable raw materials, namely solar energy, water and carbon dioxide captured from combustion power plants, and (b) developing a highly compact, multifunctional reactor able to eliminate gaseous and particulate emissions from the exhaust of engines operated on such clean fuels.
The overall research approach will be based on materials science, engineering and simulation technology developed by the PI over the past 20 years in the area of Diesel Emission Control Reactors, which will be further extended and cross-fertilized in the area of Solar Thermochemical Reactors, an emerging discipline of high importance for sustainable development, where the PI’s research group has already made significant contributions and received the 2006 European Commission Descartes Prize for the development of the first ever solar reactor holding the potential to produce pure, renewable hydrogen on a large scale from the thermochemical splitting of water, also known as the HYDROSOL technology.
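Schematically, HYDROSOL-type water splitting is a two-step metal-oxide redox cycle (generic stoichiometry shown; the specific materials and reactor design are the project's subject):

```latex
\[
\mathrm{MO_{ox}} \;\xrightarrow{\ \text{solar heat}\ }\; \mathrm{MO_{red}} + \tfrac{1}{2}\,\mathrm{O_2},
\qquad
\mathrm{MO_{red}} + \mathrm{H_2O} \;\longrightarrow\; \mathrm{MO_{ox}} + \mathrm{H_2}
\]
```

The net effect is H2O → H2 + 1/2 O2 driven by concentrated solar heat, and the same reduced oxide can in principle split captured CO2 to CO, which is the link to the solar-fuels objective stated above.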
Max ERC Funding
1 750 000 €
Duration
Start date: 2011-02-01, End date: 2017-01-31
Project acronym AROMA-CFD
Project Advanced Reduced Order Methods with Applications in Computational Fluid Dynamics
Researcher (PI) Gianluigi Rozza
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary The aim of AROMA-CFD is to create a team of scientists at SISSA for the development of Advanced Reduced Order Modelling techniques with a focus on Computational Fluid Dynamics (CFD), in order to face and overcome many current limitations of the state of the art and improve the capabilities of reduced order methodologies for more demanding applications in industrial, medical and applied sciences contexts. AROMA-CFD deals with strong methodological developments in numerical analysis, with a special emphasis on mathematical modelling and extensive exploitation of computational science and engineering. Several tasks have been identified to tackle important problems and open questions in reduced order modelling: studying bifurcations and instabilities in flows, increasing the Reynolds number and guaranteeing stability, moving towards turbulent flows, and considering complex geometrical parametrizations of shapes as computational domains in extended networks. A reduced computational and geometrical framework will be developed for nonlinear inverse problems, focusing on optimal flow control, shape optimization and uncertainty quantification. Further, all the advanced developments in reduced order modelling for CFD will be delivered for applications in multiphysics, such as fluid-structure interaction problems and general coupled phenomena involving inviscid, viscous and thermal flows, solids and porous media. The advanced framework developed within AROMA-CFD will provide attractive capabilities for several industrial and medical applications (e.g. aeronautical, mechanical, naval, off-shore, wind, sport and biomedical engineering, as well as cardiovascular surgery), combining high performance computing (in dedicated supercomputing centers) and advanced reduced order modelling (on common devices) to guarantee real-time computing and visualization. A new open source software library for AROMA-CFD will be created: ITHACA, In real Time Highly Advanced Computational Applications.
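A minimal sketch of a core ingredient behind many reduced order methods, proper orthogonal decomposition (POD) of solution snapshots via the SVD (generic illustration with synthetic data, not ITHACA's API):

```python
import numpy as np

# Snapshot matrix: each column is a full-order solution (n_dofs unknowns)
# computed at one parameter value or time instant by the full CFD solver.
rng = np.random.default_rng(2)
n_dofs, n_snapshots, true_rank = 5000, 60, 3
modes_true = rng.normal(size=(n_dofs, true_rank))
S = modes_true @ rng.normal(size=(true_rank, n_snapshots))  # synthetic data

# POD: left singular vectors give an energy-ranked basis; truncate by energy.
U, sigma, _ = np.linalg.svd(S, full_matrices=False)
energy = np.cumsum(sigma ** 2) / np.sum(sigma ** 2)
r = int(np.searchsorted(energy, 0.999)) + 1          # modes kept, r << n_dofs
basis = U[:, :r]

# Any snapshot is now represented by r coefficients instead of n_dofs values.
coeffs = basis.T @ S[:, 0]
rel_err = np.linalg.norm(S[:, 0] - basis @ coeffs) / np.linalg.norm(S[:, 0])
print(r, rel_err)
```

Reduced order methods of the kind developed in AROMA-CFD then project the governing equations onto such a basis (with further machinery for nonlinearity, stability and geometric parametrization), which is what enables the real-time computing mentioned above.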
Max ERC Funding
1 656 579 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym ARS
Project Autonomous Robotic Surgery
Researcher (PI) Paolo FIORINI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI VERONA
Call Details Advanced Grant (AdG), PE7, ERC-2016-ADG
Summary The goal of the ARS project is the derivation of a unified framework for the autonomous execution of robotic tasks in challenging environments in which accurate performance and safety are of paramount importance. We have chosen surgery as the research scenario because of its importance, its intrinsic challenges, and the presence of three factors that make this project feasible and timely. In fact, we have recently concluded the I-SUR project demonstrating the feasibility of autonomous surgical actions, we have access to the first big data set of clinical robotic surgeries made available to researchers, and we will be able to demonstrate the project results on the high-performance surgical robot “da Vinci Research Kit”. The impact of autonomous robots on the workforce is a current subject of discussion, but surgical autonomy will be welcomed by medical personnel, e.g. to carry out simple intervention steps, react faster to unexpected events, or monitor the onset of fatigue. The framework for autonomous robotic surgery will include five main research objectives. The first will address the analysis of robotic surgery data sets to extract action and knowledge models of the intervention. The second objective will focus on planning, which will consist of instantiating the intervention models to a patient-specific anatomy. The third objective will address the design of the hybrid controllers for the discrete and continuous parts of the intervention. The fourth research objective will focus on real-time reasoning to assess the intervention state and the overall surgical situation. Finally, the last research objective will address the verification, validation and benchmarking of the autonomous surgical robotic capabilities. The research results to be achieved by ARS will contribute to paving the way towards enhancing the autonomy and operational capabilities of service robots, with the ambitious goal of bridging the gap between robotic and human task execution capability.
Max ERC Funding
2 750 000 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ASSESS
Project Episodic Mass Loss in the Most Massive Stars: Key to Understanding the Explosive Early Universe
Researcher (PI) Alceste BONANOS
Host Institution (HI) NATIONAL OBSERVATORY OF ATHENS
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary Massive stars dominate their surroundings during their short lifetimes, while their explosive deaths impact the chemical evolution and spatial cohesion of their hosts. After birth, their evolution is largely dictated by their ability to remove layers of hydrogen from their envelopes. Multiple lines of evidence point to violent, episodic mass-loss events being responsible for removing a large part of the massive stellar envelope, especially in low-metallicity galaxies. Episodic mass loss, however, is not understood theoretically, nor is it accounted for in state-of-the-art models of stellar evolution, which has far-reaching consequences for many areas of astronomy. We aim to determine whether episodic mass loss is a dominant process in the evolution of the most massive stars by conducting the first extensive, multi-wavelength survey of evolved massive stars in the nearby Universe. The project hinges on the fact that mass-losing stars form dust and are bright in the mid-infrared. We plan to (i) derive physical parameters of a large sample of dusty, evolved targets and estimate the amount of ejected mass, (ii) constrain evolutionary models, and (iii) quantify the duration and frequency of episodic mass loss as a function of metallicity. The approach involves applying machine-learning algorithms to existing multi-band and time-series photometry of luminous sources in ~25 nearby galaxies. Dusty, luminous evolved massive stars will thus be automatically classified and follow-up spectroscopy will be obtained for selected targets. Atmospheric and SED modeling will yield parameters and estimates of time-dependent mass loss for ~1000 luminous stars. The emerging trend for the ubiquity of episodic mass loss, if confirmed, will be key to understanding the explosive early Universe and will have profound consequences for low-metallicity stars, reionization, and the chemical evolution of galaxies.
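A generic sketch of the automated classification step described above (scikit-learn on synthetic features; the survey's actual feature set, labels and algorithms are the project's work):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical features per source: two infrared colours and a variability index.
rng = np.random.default_rng(3)
n = 1000
X = np.column_stack([
    rng.normal(0.5, 0.3, n),    # e.g. a [3.6]-[4.5] mid-IR colour
    rng.normal(1.0, 0.5, n),    # e.g. a J-K near-IR colour
    rng.exponential(0.2, n),    # e.g. a light-curve amplitude
])
# Toy labels: call a source a "dusty evolved star" if it is red and variable.
y = ((X[:, 0] > 0.6) & (X[:, 2] > 0.2)).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())   # accuracy on the toy labels
```

In the real survey, the classifier's candidate list is what feeds the follow-up spectroscopy and SED modeling described above.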
Max ERC Funding
1 128 750 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ATMOPACS
Project Atmospheric Organic Particulate Matter, Air Quality and Climate Change Studies
Researcher (PI) Spyridon Pandis
Host Institution (HI) FOUNDATION FOR RESEARCH AND TECHNOLOGY HELLAS
Call Details Advanced Grant (AdG), PE10, ERC-2010-AdG_20100224
Summary Despite its importance for human health and climate change, organic aerosol (OA) remains one of the least understood aspects of atmospheric chemistry. We propose to develop an innovative framework for the description of OA in chemical transport and climate models that will be able to overcome the challenges posed by the chemical complexity of OA while capturing its essential features.
The objectives of ATMOPACS are: (i) the development of a new unified framework for the description of OA based on its two most important parameters, volatility and oxygen content; (ii) the development of measurement techniques for the volatility distribution and oxygen-content distribution of OA, allowing the experimental characterization of OA in this new “coordinate system”; (iii) the study of the major OA processes (partitioning, chemical aging, hygroscopicity, CCN formation, nucleation) in this new framework, combining laboratory and field measurements; (iv) the development and evaluation of the next generation of regional and global CTMs using the above framework; and (v) the quantification of the importance of the various sources and formation pathways of OA in Europe and the world, of the sensitivity of OA to emission control strategies, and of its role in the direct and indirect effects of aerosols on climate.
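To make the volatility-based description concrete, here is a minimal sketch of absorptive gas–particle partitioning in a volatility basis set, the kind of framework the project describes, solved by fixed-point iteration; the volatility bins and loadings below are illustrative values, not project data:

```python
# Minimal sketch of volatility-basis-set partitioning:
# xi_i = 1 / (1 + C*_i / C_OA), with C_OA = sum_i C_i * xi_i solved iteratively.
import numpy as np

c_star = np.array([0.01, 0.1, 1.0, 10.0, 100.0])  # saturation conc. [µg/m3]
c_tot  = np.array([0.5, 0.8, 1.2, 2.0, 4.0])      # total (gas+particle) [µg/m3]

def partition(c_star, c_tot, tol=1e-10):
    """Fixed-point iteration for the organic-aerosol mass concentration C_OA."""
    c_oa = c_tot.sum()                      # initial guess: everything condensed
    for _ in range(200):
        xi = 1.0 / (1.0 + c_star / c_oa)    # particle-phase fraction per bin
        c_new = (c_tot * xi).sum()
        if abs(c_new - c_oa) < tol:
            break
        c_oa = c_new
    return c_oa, xi

c_oa, xi = partition(c_star, c_tot)
print(f"C_OA = {c_oa:.2f} µg/m3, particle fractions = {np.round(xi, 2)}")
```

Low-volatility bins (small C*) end up almost entirely in the particle phase, while high-volatility bins stay mostly in the gas phase, which is the behaviour the framework is designed to capture.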
The proposed work involves a combination of laboratory measurements, field measurements including novel “atmospheric perturbation experiments”, OA model development, and modelling on urban, regional, and global scales; the work will thus span scales from the nanoscale to the global. The modelling tools that will be developed will be made available to all other research groups.
Max ERC Funding
2 496 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym B Massive
Project Binary massive black hole astrophysics
Researcher (PI) Alberto SESANA
Host Institution (HI) UNIVERSITA' DEGLI STUDI DI MILANO-BICOCCA
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary Massive black hole binaries (MBHBs) are the most extreme, fascinating yet elusive astrophysical objects in the Universe. Establishing their existence observationally will be a milestone for contemporary astronomy, providing a fundamental missing piece in the puzzle of galaxy formation, piercing through the (hydro)dynamical processes shaping dense galactic nuclei from parsec scales down to the event horizon, and probing gravity in extreme conditions.
We can both see and listen to MBHBs. Remarkably, besides arguably being among the brightest variable objects shining in the Cosmos, MBHBs are also the loudest gravitational wave (GW) sources in the Universe. As such, we shall take advantage of both types of messenger – photons and gravitons – that they send to us, which can now be probed by all-sky time-domain surveys and radio pulsar timing arrays (PTAs), respectively.
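For scale, here is a minimal sketch of the signal PTAs target: the characteristic strain of a circular, GW-driven MBHB population follows the standard h_c(f) = A (f / 1 yr^-1)^(-2/3) power law, and the induced timing residuals can be roughly estimated as h_c / (2πf). The amplitude used below is illustrative, not a measurement:

```python
# Minimal sketch of the nHz GW background PTAs search for; the amplitude A
# is an illustrative value, and the residual estimate is a rough scaling.
import numpy as np

YEAR = 3.156e7                      # seconds per year
A = 2e-15                           # strain amplitude at f = 1/yr (illustrative)
f = np.logspace(-9, -7, 5)          # PTA observing band [Hz]

h_c = A * (f * YEAR) ** (-2.0 / 3.0)        # characteristic strain spectrum
rms_residual = h_c / (2 * np.pi * f)        # rough induced timing residual [s]

for fi, hi, ri in zip(f, h_c, rms_residual):
    print(f"f = {fi:.1e} Hz  h_c = {hi:.2e}  residual ~ {ri * 1e9:.1f} ns")
```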
B MASSIVE leverages a unique, comprehensive approach combining theoretical astrophysics, radio and gravitational-wave astronomy, and time-domain surveys with state-of-the-art data analysis techniques to: i) observationally prove the existence of MBHBs, ii) understand and constrain their astrophysics and dynamics, and iii) enable and bring closer in time the direct detection of GWs with PTAs.
As European PTA (EPTA) executive committee member and former International PTA (IPTA) chair, I am a driving force in the development of pulsar timing science world-wide, and the project will build on the profound knowledge, broad vision and wide collaboration network that established me as a world leader in the field of MBHB and GW astrophysics. B MASSIVE is extremely timely: a pulsar timing data set of unprecedented quality is being assembled by the EPTA/IPTA, and time-domain astronomy surveys are at their dawn. In the long term, B MASSIVE will be a fundamental milestone establishing European leadership in the cutting-edge field of MBHB astrophysics in the era of LSST, SKA and LISA.
Max ERC Funding
1 532 750 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym BACKUP
Project Unveiling the relationship between brain connectivity and function by integrated photonics
Researcher (PI) Lorenzo PAVESI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TRENTO
Call Details Advanced Grant (AdG), PE7, ERC-2017-ADG
Summary I will address the fundamental question of the role that neuron activity and plasticity play in information processing and storage in the brain. Together with an interdisciplinary team, I will develop a hybrid neuromorphic computing platform. Integrated photonic circuits will be interfaced to both electronic circuits and neuronal circuits (in vitro experiments) to emulate brain functions and to develop schemes able to supplement (backup) neuronal functions. The photonic network is based on massive reconfigurable matrices of nonlinear nodes formed by microring resonators, which enter a regime of self-pulsing and chaos under positive optical feedback. These networks resemble the human brain. I will push this analogy further by interfacing the photonic network with neurons, creating a hybrid network. Using optogenetics, I will control synaptic strengthening and neuron activity. Deep learning algorithms will model the biological network functionality, initially within a separate artificial network and then in an integrated hybrid artificial-biological network.
My project aims at:
1. Developing a photonic integrated reservoir-computing network (RCN);
2. Developing dynamic memories in photonic integrated circuits using RCN;
3. Developing hybrid interfaces between a neuronal network and a photonic integrated circuit;
4. Developing a hybrid electronic, photonic and biological network that computes jointly;
5. Addressing neuronal network activity by photonic RCN to simulate in vitro memory storage and retrieval;
6. Processing the signals from the RCN and neuronal circuits in order to cope with plastic changes in pathological brain conditions such as amnesia and epilepsy.
The long-term vision is that hybrid neuromorphic photonic networks will (a) clarify the way the brain thinks, (b) compute beyond the von Neumann paradigm, and (c) control and supplement specific neuronal functions.
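The reservoir-computing paradigm of objectives 1 and 2 can be illustrated in software. Below is a minimal sketch of an echo state network, the conventional software analogue of a photonic RCN with fixed nonlinear nodes and a trained linear readout; the network size, spectral radius and toy delay-recall task are all illustrative choices, not project specifications:

```python
# Minimal echo-state-network sketch: a fixed random nonlinear reservoir,
# with only the linear readout trained (here on a toy 3-step delay task).
import numpy as np

rng = np.random.default_rng(1)
N, steps = 200, 1000
W_in = rng.uniform(-0.5, 0.5, size=(N, 1))          # input coupling
W = rng.normal(0, 1, size=(N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))     # spectral radius < 1 (echo state)

u = rng.uniform(-1, 1, size=steps)                  # random input signal
target = np.roll(u, 3)                              # toy task: recall u(t-3)

x = np.zeros(N)
states = np.empty((steps, N))
for t in range(steps):
    # The nonlinear reservoir dynamics are fixed; only the readout is trained.
    x = np.tanh(W @ x + W_in[:, 0] * u[t])
    states[t] = x

washout = 50                                        # discard initial transient
W_out, *_ = np.linalg.lstsq(states[washout:], target[washout:], rcond=None)
pred = states[washout:] @ W_out
print("readout MSE:", np.mean((pred - target[washout:]) ** 2))
```

In the photonic implementation, the tanh nodes are replaced by nonlinear microring resonators and the recurrent weights by the reconfigurable optical interconnection matrix.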
Max ERC Funding
2 499 825 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym BIC
Project Cavitation across scales: following Bubbles from Inception to Collapse
Researcher (PI) Carlo Massimo Casciola
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Advanced Grant (AdG), PE8, ERC-2013-ADG
Summary Cavitation is the formation of vapor cavities inside a liquid due to low pressure. It is a ubiquitous and destructive phenomenon common to most engineering applications that deal with flowing water. At the same time, the extreme conditions realized in cavitation are increasingly exploited in medicine, chemistry, and biology. What makes cavitation unpredictable is its multiscale nature: nucleation of vapor bubbles depends heavily on micro- and nanoscale details; mesoscale phenomena, such as bubble collapse, determine relevant macroscopic effects, e.g., cavitation damage. In addition, macroscopic flow conditions, such as turbulence, have a major impact on it.
The objective of the BIC project is to develop the missing multiscale description of cavitation by proposing new integrated numerical methods capable of performing quantitative predictions. The detailed and physically sound understanding of the multifaceted phenomena involved in cavitation (nucleation, bubble growth, transport, and collapse in turbulent flows) fostered by the BIC project will result in new methods for designing fluid machinery, as well as new therapies in ultrasound medicine and chemical reactors. The BIC project builds upon the exceptionally broad experience of the PI and his research group in numerical simulations of flows at different scales, including advanced atomistic simulations of nanoscale wetting phenomena, mesoscale models for multiphase flows, and particle-laden turbulent flows. The envisaged numerical methodologies (free-energy atomistic simulations, phase-field models, and Direct Numerical Simulation of bubble-laden flows) will be supported by targeted experimental activities designed to validate the models and characterize realistic conditions.
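As one example of the mesoscale ingredient, here is a minimal sketch integrating the classical Rayleigh–Plesset equation for a single collapsing bubble; the liquid properties and driving pressure are illustrative (water-like conditions), not project parameters:

```python
# Minimal sketch of single-bubble dynamics via the Rayleigh-Plesset equation:
# rho*(R*R'' + 1.5*R'^2) = p_v - p_inf - 2*sigma/R - 4*mu*R'/R
# (illustrative water-like parameters, sudden pressure recovery).
import numpy as np

rho, sigma, mu = 1000.0, 0.072, 1.0e-3   # density, surface tension, viscosity
p_inf, p_v = 101325.0, 2339.0            # far-field and vapour pressure [Pa]
R0 = 1.0e-4                              # initial bubble radius [m]

def rhs(R, Rdot):
    # Returns the bubble-wall acceleration R'' from the Rayleigh-Plesset balance.
    return ((p_v - p_inf - 2 * sigma / R - 4 * mu * Rdot / R) / rho
            - 1.5 * Rdot ** 2) / R

R, Rdot, dt = R0, 0.0, 1e-9
for step in range(200000):
    Rdot += rhs(R, Rdot) * dt            # explicit Euler, adequate for a sketch
    R += Rdot * dt
    if R < 0.01 * R0:                    # bubble has essentially collapsed
        print(f"collapse after ~{step * dt * 1e6:.2f} µs")
        break
```

The computed collapse time is close to the classical Rayleigh estimate t_c ≈ 0.915 R0 √(ρ/Δp) (about 9 µs for these values), which is a useful sanity check on the integration.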
Max ERC Funding
2 491 200 €
Duration
Start date: 2014-02-01, End date: 2019-01-31