Project acronym 3D-QUEST
Project 3D-Quantum Integrated Optical Simulation
Researcher (PI) Fabio Sciarrino
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Starting Grant (StG), PE2, ERC-2012-StG_20111012
Summary "Quantum information was born from the merging of classical information and quantum physics. Its main objective consists of understanding the quantum nature of information and learning how to process it by using physical systems which operate by following quantum mechanics laws. Quantum simulation is a fundamental instrument to investigate phenomena of quantum systems dynamics, such as quantum transport, particle localizations and energy transfer, quantum-to-classical transition, and even quantum improved computation, all tasks that are hard to simulate with classical approaches. Within this framework integrated photonic circuits have a strong potential to realize quantum information processing by optical systems.
The aim of 3D-QUEST is to develop and implement quantum simulation by exploiting 3-dimensional integrated photonic circuits. 3D-QUEST is structured to demonstrate the potential of linear optics to deliver computational power beyond that of a classical computer. Such a "hard-to-simulate" regime emerges when multiphoton, multimode platforms are realized. The 3D-QUEST research program will focus on three tasks of increasing difficulty.
A-1. To simulate bosonic-fermionic dynamics with integrated optical systems acting on two-photon entangled states.
A-2. To pave the way towards hard-to-simulate, scalable quantum linear optical circuits by investigating m-port interferometers acting on n-photon states with n>2.
A-3. To exploit 3-dimensional integrated structures for the observation of new quantum optical phenomena and for the quantum simulation of more complex scenarios.
3D-QUEST will exploit the potential of femtosecond laser writing of integrated waveguides. This technique will be adopted for its 3-dimensional capabilities and high flexibility, thereby bringing optical quantum simulation into a new regime.
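The "hard-to-simulate" regime invoked above has a concrete mathematical core: in the BosonSampling setting of Aaronson and Arkhipov, the transition amplitude of n photons through an m-port interferometer is the permanent of an n x n submatrix of the interferometer's unitary, and permanents are #P-hard to compute. The sketch below is our own illustrative toy example, not project code; all function names are ours.

```python
# Illustrative sketch (not project code): why multiphoton-multimode linear
# optics is "hard to simulate". The amplitude for n single photons entering
# an m-port interferometer U in modes `ins` and leaving in modes `outs` is
# the permanent of the n x n submatrix U[outs][:, ins].
import itertools
import numpy as np

def permanent(A: np.ndarray) -> complex:
    """Naive permanent via a sum over permutations, O(n! * n)."""
    n = A.shape[0]
    return sum(np.prod([A[i, p[i]] for i in range(n)])
               for p in itertools.permutations(range(n)))

def transition_amplitude(U, ins, outs):
    """Amplitude for single photons in distinct input/output modes."""
    return permanent(U[np.ix_(outs, ins)])

# Example: a random 5-mode interferometer acting on 2 photons.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 5)) + 1j * rng.normal(size=(5, 5))
U, _ = np.linalg.qr(X)                      # Haar-like random unitary
amp = transition_amplitude(U, ins=[0, 1], outs=[2, 4])
print(abs(amp) ** 2)                        # detection probability
```

The factorial scaling of exact amplitude computation is what makes multiphoton-multimode platforms interesting as a test of computational power beyond classical machines.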
Max ERC Funding
1 474 800 €
Duration
Start date: 2012-08-01, End date: 2017-07-31
Project acronym 3DSPIN
Project 3-Dimensional Maps of the Spinning Nucleon
Researcher (PI) Alessandro Bacchetta
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PAVIA
Call Details Consolidator Grant (CoG), PE2, ERC-2014-CoG
Summary What does the inside of the proton look like? What generates its spin?
3DSPIN will deliver essential information to answer these questions at the frontier of subnuclear physics.
At present, we have detailed maps of the distribution of quarks and gluons in the nucleon in 1D (as a function of their momentum in a single direction). We also know that quark spins account for only about 1/3 of the spin of the nucleon.
3DSPIN will lead the way into a new stage of nucleon mapping, explore the distribution of quarks in full 3D momentum space and obtain unprecedented information on orbital angular momentum.
Goals
1. extract from experimental data the 3D distribution of quarks (in momentum space), as described by Transverse-Momentum Distributions (TMDs);
2. obtain from TMDs information on quark Orbital Angular Momentum (OAM).
Methodology
3DSPIN will implement state-of-the-art fitting procedures to analyze relevant experimental data and extract quark TMDs, similarly to global fits of standard parton distribution functions. Information about quark angular momentum will be obtained through assumptions based on theoretical considerations. The next five years represent an ideal time window to accomplish our goals, thanks to the wealth of expected data from deep-inelastic scattering experiments (COMPASS, Jefferson Lab), hadronic colliders (Fermilab, BNL, LHC), and electron-positron colliders (BELLE, BABAR). The PI has a strong reputation in this field. The group will operate in partnership with the Italian National Institute of Nuclear Physics and in close interaction with leading experts and experimental collaborations worldwide.
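For orientation, a common baseline ansatz from the TMD phenomenology literature is stated below, in which the unpolarized TMD factorizes into a collinear PDF times a Gaussian transverse-momentum profile; this is an illustration of the kind of object to be fitted, not necessarily the parametrization 3DSPIN will adopt.

```latex
% Common baseline ansatz in TMD phenomenology (illustrative):
\begin{equation}
  f_1^q(x, k_T^2) \;=\; f_1^q(x)\,
  \frac{e^{-k_T^2/\langle k_T^2\rangle}}{\pi \langle k_T^2 \rangle},
  \qquad
  \int d^2 k_T \, f_1^q(x, k_T^2) \;=\; f_1^q(x),
\end{equation}
% where f_1^q(x) is the collinear PDF and <k_T^2> the Gaussian width,
% to be determined by the fit.
```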
Impact
Mapping the 3D structure of chemical compounds has revolutionized chemistry. Similarly, mapping the 3D structure of the nucleon will have a deep impact on our understanding of the fundamental constituents of matter. We will open new perspectives on the dynamics of quarks and gluons and sharpen our view of high-energy processes involving nucleons.
Max ERC Funding
1 509 000 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym 4DPHOTON
Project Beyond Light Imaging: High-Rate Single-Photon Detection in Four Dimensions
Researcher (PI) Massimiliano FIORINI
Host Institution (HI) ISTITUTO NAZIONALE DI FISICA NUCLEARE
Call Details Consolidator Grant (CoG), PE2, ERC-2018-COG
Summary The goal of the 4DPHOTON project is the development and construction of a photon imaging detector with unprecedented performance. The proposed device will be capable of detecting fluxes of single photons up to one billion photons per second, over areas of several square centimetres, and will measure - for each photon - position and time simultaneously with resolutions better than ten microns and a few tens of picoseconds, respectively. These figures of merit will open many important applications allowing significant advances in particle physics, life sciences and other emerging fields where excellent timing and position resolutions are simultaneously required.
Our goal will be achieved thanks to the use of an application-specific integrated circuit in 65 nm complementary metal-oxide-semiconductor (CMOS) technology, which will deliver a timing resolution of a few tens of picoseconds at the pixel level, over a few hundred thousand individually-active pixel channels, allowing very high rates of photons to be detected and the corresponding information to be digitized and transferred to a processing unit.
As a result of the 4DPHOTON project we will remove the constraints that many light imaging applications face due to the lack of precise single-photon information in four dimensions (4D): the three spatial coordinates and time. In particular, we will prove the performance of this detector in the field of particle physics, performing the reconstruction of Cherenkov photon rings with a timing resolution of ten picoseconds. With its excellent granularity, timing resolution, rate capability and compactness, this detector will represent a new paradigm for the realisation of future Ring Imaging Cherenkov detectors, capable of achieving high-efficiency particle identification in environments with very high particle multiplicities, exploiting time-association of the photon hits.
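To convey the scale of the stated requirements, here is a back-of-envelope estimate of the resulting data throughput; the hit word size and the 25 ns reference period are our own assumptions, not project specifications.

```python
# Back-of-envelope estimate (assumptions ours, not project specs): raw data
# rate if every detected photon produces one time-stamped hit word.
import math

photon_rate = 1e9          # photons per second (project target)
bits_per_hit = 48          # assumed: pixel address + fine timestamp + overhead
rate_gbps = photon_rate * bits_per_hit / 1e9
print(f"~{rate_gbps:.0f} Gb/s of hit data")    # ~48 Gb/s off-chip

# Timing budget: 10 ps resolution over an assumed 25 ns period (LHC bunch
# spacing) needs log2(25e-9 / 10e-12) ~ 11.3 bits per fine timestamp.
print(math.log2(25e-9 / 10e-12))
```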
Max ERC Funding
1 975 000 €
Duration
Start date: 2019-12-01, End date: 2024-11-30
Project acronym AFRICA-GHG
Project AFRICA-GHG: The role of African tropical forests on the Greenhouse Gases balance of the atmosphere
Researcher (PI) Riccardo Valentini
Host Institution (HI) FONDAZIONE CENTRO EURO-MEDITERRANEOSUI CAMBIAMENTI CLIMATICI
Call Details Advanced Grant (AdG), PE10, ERC-2009-AdG
Summary The role of the African continent in the global carbon cycle, and therefore in climate change, is increasingly recognised. Despite the increasingly acknowledged importance of Africa in the global carbon cycle and its high vulnerability to climate change, there is still a lack of studies on the carbon cycle in representative African ecosystems (in particular tropical forests) and on the effects of climate on ecosystem-atmosphere exchange. In the present proposal we focus on these specific objectives:
1. Understand the role of African tropical rainforests in the GHG balance of the atmosphere and revise their role in global methane and N2O emissions.
2. Determine the carbon source/sink strength of African tropical rainforests in the pre-industrial era versus the 20th century by temporal reconstruction of biomass growth with biogeochemical markers.
3. Understand and quantify the variability of carbon and GHG fluxes across African tropical forests (west-east equatorial belt).
4. Analyse the impact of forest degradation and deforestation on carbon and other GHG emissions.
Max ERC Funding
2 406 950 €
Duration
Start date: 2010-04-01, End date: 2014-12-31
Project acronym AGEnTh
Project Atomic Gauge and Entanglement Theories
Researcher (PI) Marcello DALMONTE
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Starting Grant (StG), PE2, ERC-2017-STG
Summary AGEnTh is an interdisciplinary proposal which aims at theoretically investigating atomic many-body systems (cold atoms and trapped ions) in close connection to concepts from quantum information, condensed matter, and high energy physics. The main goals of this programme are to:
I) Find scalable schemes for the measurement of entanglement properties, in particular entanglement spectra, by proposing a paradigm shift that accesses entanglement through entanglement Hamiltonians and field theories instead of probing density matrices;
II) Show how atomic gauge theories (including dynamical gauge fields) are ideal candidates for the realization of long-sought, highly-entangled states of matter, in particular topological superconductors supporting parafermion edge modes, and novel classes of quantum spin liquids emerging from clustering;
III) Develop new implementation strategies for the realization of gauge symmetries of paramount importance, such as discrete and SU(N)xSU(2)xU(1) groups, and establish a theoretical framework for the understanding of atomic physics experiments within the light-from-chaos scenario pioneered in particle physics.
These objectives are at the cutting edge of fundamental science, and represent a coherent effort aimed at underpinning unprecedented regimes of strongly interacting quantum matter by addressing the basic aspects of probing, many-body physics, and implementations. The results are expected to (i) build up and establish qualitatively new synergies between the aforementioned communities, and (ii) stimulate an intense theoretical and experimental activity focused on both entanglement and atomic gauge theories.
In order to achieve these goals, AGEnTh builds: (1) on my background working at the interface between atomic physics and quantum optics on one side and many-body theory on the other, and (2) on exploratory studies which I carried out to mitigate the conceptual risks associated with its high-risk/high-gain goals.
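As a toy illustration of the objects targeted in goal I) (our own example, not project code): the entanglement spectrum of a bipartition A|B is the eigenvalue spectrum of the entanglement Hamiltonian H_E = -log(rho_A), obtainable for a pure state from its Schmidt decomposition.

```python
# Minimal sketch (ours): entanglement spectrum of a pure state on A (x) B.
import numpy as np

def entanglement_spectrum(psi: np.ndarray, dim_A: int, dim_B: int):
    """Eigenvalues of H_E = -log(rho_A), via the Schmidt decomposition."""
    M = psi.reshape(dim_A, dim_B)            # matrix of amplitudes
    s = np.linalg.svd(M, compute_uv=False)   # Schmidt coefficients
    p = s**2                                 # eigenvalues of rho_A
    p = p[p > 1e-12]                         # drop numerical zeros
    return -np.log(p)                        # entanglement "energies"

# Example: random 4-qubit state, 2|2 bipartition (dim_A = dim_B = 4).
rng = np.random.default_rng(1)
psi = rng.normal(size=16) + 1j * rng.normal(size=16)
psi /= np.linalg.norm(psi)
print(np.sort(entanglement_spectrum(psi, 4, 4)))
```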
Max ERC Funding
1 055 317 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym AIDA
Project An Illumination of the Dark Ages: modeling reionization and interpreting observations
Researcher (PI) Andrei Albert Mesinger
Host Institution (HI) SCUOLA NORMALE SUPERIORE
Call Details Starting Grant (StG), PE9, ERC-2014-STG
Summary "Understanding the dawn of the first galaxies and how their light permeated the early Universe is at the very frontier of modern astrophysical cosmology. Generous resources, including ambitions observational programs, are being devoted to studying these epochs of Cosmic Dawn (CD) and Reionization (EoR). In order to interpret these observations, we propose to build on our widely-used, semi-numeric simulation tool, 21cmFAST, and apply it to observations. Using sub-grid, semi-analytic models, we will incorporate additional physical processes governing the evolution of sources and sinks of ionizing photons. The resulting state-of-the-art simulations will be well poised to interpret topical observations of quasar spectra and the cosmic 21cm signal. They would be both physically-motivated and fast, allowing us to rapidly explore astrophysical parameter space. We will statistically quantify the resulting degeneracies and constraints, providing a robust answer to the question, ""What can we learn from EoR/CD observations?"" As an end goal, these investigations will help us understand when the first generations of galaxies formed, how they drove the EoR, and what are the associated large-scale observational signatures."
Max ERC Funding
1 468 750 €
Duration
Start date: 2015-05-01, End date: 2021-01-31
Project acronym AISENS
Project New generation of high sensitive atom interferometers
Researcher (PI) Marco Fattori
Host Institution (HI) CONSIGLIO NAZIONALE DELLE RICERCHE
Call Details Starting Grant (StG), PE2, ERC-2010-StG_20091028
Summary Interferometers are fundamental tools for the study of the laws of nature and for the precise measurement and control of the physical world. In the last century, scientific and technological progress has proceeded in parallel with a constant improvement of interferometric performance. For this reason, the challenge of conceiving and realizing new generations of interferometers with broader ranges of operation and higher sensitivities remains open and timely.
Although the introduction of laser devices has deeply improved the way interferometric measurements with light are developed and performed, the atomic matter-wave analogue, i.e. the Bose-Einstein condensate (BEC), has not yet triggered any revolution in precision interferometry. However, thanks to recent improvements in the control of the quantum properties of ultra-cold atomic gases, and new original ideas on the creation and manipulation of quantum-entangled particles, the field of atom interferometry is now mature enough to take a big step forward.
The system I want to realize is a Mach-Zehnder spatial interferometer operating with trapped BECs. Undesired decoherence sources will be suppressed by implementing BECs with tunable interactions in ultra-stable optical potentials. Entangled states will be used to improve the sensitivity of the sensor beyond the standard quantum limit, ideally reaching the ultimate Heisenberg limit set by quantum mechanics. The resulting apparatus will show unprecedented spatial resolution and will outperform state-of-the-art interferometers based on cold (non-condensed) atomic gases.
A successful completion of this project will lead to a new generation of interferometers for the immediate application to local inertial measurements with unprecedented resolution. In addition, we expect to develop experimental capabilities which might find application well beyond quantum interferometry and crucially contribute to the broader emerging field of quantum-enhanced technologies.
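For reference, the two phase-sensitivity benchmarks invoked above take the following textbook form for N probe atoms:

```latex
% Standard quantum limit (uncorrelated atoms) vs. Heisenberg limit
% (maximally entangled input) for the phase uncertainty:
\begin{equation}
  \Delta\phi_{\mathrm{SQL}} \;=\; \frac{1}{\sqrt{N}},
  \qquad
  \Delta\phi_{\mathrm{HL}} \;=\; \frac{1}{N}.
\end{equation}
```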
Max ERC Funding
1 068 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym ALEM
Project ADDITIONAL LOSSES IN ELECTRICAL MACHINES
Researcher (PI) Matti Antero Arkkio
Host Institution (HI) AALTO KORKEAKOULUSAATIO SR
Call Details Advanced Grant (AdG), PE8, ERC-2013-ADG
Summary "Electrical motors consume about 40 % of the electrical energy produced in the European Union. About 90 % of this energy is converted to mechanical work. However, 0.5-2.5 % of it goes to so called additional load losses whose exact origins are unknown. Our ambitious aim is to reveal the origins of these losses, build up numerical tools for modeling them and optimize electrical motors to minimize the losses.
As the hypothesis of the research, we assume that the additional losses mainly result from the deterioration of the core materials during the manufacturing process of the machine. By calorimetric measurements, we have found that the core losses of electrical machines may be twice as large as comprehensive loss models predict. The electrical steel sheets are punched, welded together and shrink-fitted to the frame. This causes residual strains in the core sheets, deteriorating their magnetic characteristics. The cutting burrs make galvanic contacts between the sheets and form paths for inter-lamination currents. Another potential source of additional losses is the circulating currents between the parallel strands of random-wound armature windings. The stochastic nature of these potential sources of additional losses adds to the challenge of the research.
We shall develop a physical loss model that couples the mechanical strains and electromagnetic losses in electrical steel sheets and apply the new model to a comprehensive loss analysis of electrical machines. The stochastic variables related to the core losses and circulating-current losses will be discretized together with the temporal and spatial discretization of the electromechanical field variables. The numerical stochastic loss model will be used to search for machine constructions that are insensitive to manufacturing defects. We shall validate the new numerical loss models by electromechanical and calorimetric measurements.
Max ERC Funding
2 489 949 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ALGOCom
Project Novel Algorithmic Techniques through the Lens of Combinatorics
Researcher (PI) Parinya Chalermsook
Host Institution (HI) AALTO KORKEAKOULUSAATIO SR
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary Real-world optimization problems pose major challenges to algorithmic research. For instance, (i) many important problems are believed to be intractable (i.e. NP-hard) and (ii) with the growth of data size, modern applications often require decision making under incomplete and dynamically changing input data. After several decades of research, central problems in these domains have remained poorly understood (e.g., is there an asymptotically most efficient binary search tree?). Existing algorithmic techniques either reach their limitations or are inherently tailored to special cases.
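As an aside from us: the parenthetical question above concerns binary search trees. For static optimality, the classic cubic dynamic program below computes the minimum expected search cost for known access probabilities; whether a single self-adjusting BST can match the offline optimum on every access sequence (dynamic optimality) remains the long-open question.

```python
# Classic O(n^3) dynamic program for the statically optimal BST
# (illustrative aside, not project code).
def optimal_bst_cost(p: list[float]) -> float:
    """Minimum expected search cost over keys with access probabilities p."""
    n = len(p)
    # cost[i][j]: optimal cost of a BST over keys i..j-1; weight: their mass.
    cost = [[0.0] * (n + 1) for _ in range(n + 1)]
    weight = [[0.0] * (n + 1) for _ in range(n + 1)]
    for i in range(n):
        weight[i][i + 1] = p[i]
        cost[i][i + 1] = p[i]
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length
            weight[i][j] = weight[i][j - 1] + p[j - 1]
            # Try every key r as the root of the subtree over keys i..j-1.
            cost[i][j] = weight[i][j] + min(
                cost[i][r] + cost[r + 1][j] for r in range(i, j))
    return cost[0][n]

print(optimal_bst_cost([0.1, 0.2, 0.4, 0.3]))  # expected comparisons
```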
This project attempts to untangle this gap in the state of the art and seeks new interplay across multiple areas of algorithms, such as approximation algorithms, online algorithms, fixed-parameter tractable (FPT) algorithms, exponential time algorithms, and data structures. We propose new directions from the structural perspectives that connect the aforementioned algorithmic problems to basic questions in combinatorics.
Our approaches fall into one of the three broad schemes: (i) new structural theory, (ii) intermediate problems, and (iii) transfer of techniques. These directions partially build on the PI's successes in resolving more than ten classical problems in this context.
Resolving the proposed problems will likely revolutionize our understanding of algorithms and data structures and potentially unify techniques in multiple algorithmic regimes. Any progress is, in fact, already a significant contribution to the algorithms community. We suggest concrete intermediate goals that are of independent interest and carry lower risks, making them suitable for Ph.D. students.
Max ERC Funding
1 411 258 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym AMDROMA
Project Algorithmic and Mechanism Design Research in Online MArkets
Researcher (PI) Stefano LEONARDI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Online markets currently form an important share of the global economy. The Internet hosts classical markets (real-estate, stocks, e-commerce) as well as new markets with previously unknown features (web-based advertisement, viral marketing, digital goods, crowdsourcing, sharing economy). Algorithms play a central role in many decision processes involved in online markets. For example, algorithms run electronic auctions, trade stocks, adjust prices dynamically, and harvest big data to provide economic information. Thus, it is of paramount importance to understand the algorithmic and mechanism design foundations of online markets.
The algorithmic research issues that we consider involve algorithmic mechanism design, online and approximation algorithms, modelling uncertainty in online market design, large-scale optimization, and data mining. The aim of this research project is to combine these fields to address research questions that are central for today's Internet economy. We plan to apply these techniques to solve fundamental algorithmic problems motivated by Internet advertisement, the sharing economy, and online labour marketplaces. While my planned research is centered on foundational work with rigorous design and analysis of algorithms and mechanisms, it will also include, as an important component, empirical validation on large-scale real-life datasets.
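As a toy illustration of the mechanism-design viewpoint (our example, not project code): the sealed-bid second-price (Vickrey) auction is the textbook truthful mechanism behind many of the electronic auctions mentioned above.

```python
# Sealed-bid second-price (Vickrey) auction: the highest bidder wins and
# pays the second-highest bid, so truthful bidding is a dominant strategy.
def vickrey_auction(bids: dict[str, float]) -> tuple[str, float]:
    """Return (winner, price) for a single-item second-price auction."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner, _ = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else 0.0
    return winner, price

print(vickrey_auction({"alice": 10.0, "bob": 8.0, "carol": 6.0}))
# ('alice', 8.0) -- alice wins and pays bob's bid
```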
Max ERC Funding
1 780 150 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym AMETIST
Project Advanced III-V Materials and Processes Enabling Ultrahigh-efficiency (>50%) Photovoltaics
Researcher (PI) Mircea Dorel GUINA
Host Institution (HI) TAMPEREEN KORKEAKOULUSAATIO SR
Call Details Advanced Grant (AdG), PE8, ERC-2015-AdG
Summary Compound semiconductor solar cells provide the highest photovoltaic conversion efficiency, yet their performance lags far behind the theoretical potential. This is a position we will challenge by engineering advanced III-V optoelectronic materials and heterostructures for better utilization of the solar spectrum, enabling efficiencies approaching practical limits. The work is strongly motivated by the global need for renewable energy sources. To this end, the AMETIST framework is based on three vectors of excellence: i) material science and epitaxial processes, ii) advanced solar cells exploiting nanophotonics concepts, and iii) new device fabrication technologies.
Novel heterostructures (e.g. GaInNAsSb, GaNAsBi), providing absorption in a broad spectral range from 0.7 eV to 1.4 eV, will be synthesized and monolithically integrated in tandem cells with up to 8 junctions. Nanophotonic methods for light-trapping and for the spectral and spatial control of solar radiation will be developed to further enhance the absorption. To ensure a high long-term impact, the project will validate the use of state-of-the-art molecular-beam-epitaxy processes for the fabrication of economically viable ultra-high-efficiency solar cells. The ultimate efficiency target is to reach a level of 55%. This would enable the generation of renewable/ecological/sustainable energy at a levelized production cost below ~7 ¢/kWh, comparable to or cheaper than fossil fuels. The work will also bring a new wave of developments for more efficient space photovoltaic systems.
AMETIST will leverage the leading position of the applicant in technology areas relevant for the project (i.e. epitaxy of III-N/Bi-V alloys and key achievements concerning GaInNAsSb-based tandem solar cells). Thus it offers a unique opportunity to capitalize on the group's expertise and position Europe at the forefront of the global competition to demonstrate more efficient and economically viable photovoltaic technologies.
Max ERC Funding
2 492 719 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym ANPROB
Project Analytic-probabilistic methods for borderline singular integrals
Researcher (PI) Tuomas Pentinpoika Hytönen
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary The proposal consists of an extensive research program to advance the understanding of singular integral operators of Harmonic Analysis in various situations on the borderline of the existing theory. This is to be achieved by a creative combination of techniques from Analysis and Probability. On top of the standard arsenal of modern Harmonic Analysis, the main probabilistic tools are the martingale transform inequalities of Burkholder, and random geometric constructions in the spirit of the random dyadic cubes introduced to Nonhomogeneous Analysis by Nazarov, Treil and Volberg.
The problems to be addressed fall under the following subtitles, with many interconnections and overlap: (i) sharp weighted inequalities; (ii) nonhomogeneous singular integrals on metric spaces; (iii) local Tb theorems with borderline assumptions; (iv) functional calculus of rough differential operators; and (v) vector-valued singular integrals.
Topic (i) is a part of Classical Analysis, where new methods have led to substantial recent progress, culminating in my solution in July 2010 of a celebrated problem on the linear dependence of the weighted operator norm on the Muckenhoupt norm of the weight. The proof should be extendible to several related questions, and the aim is to also address some outstanding open problems in the area.
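For context, the celebrated problem referred to above is the A_2 conjecture; the resulting sharp bound states that the weighted operator norm of a Calderón-Zygmund operator T depends linearly on the Muckenhoupt A_2 characteristic of the weight:

```latex
% The A_2 theorem (2010), stated schematically:
\begin{equation}
  \|T\|_{L^2(w)\to L^2(w)} \;\le\; c_T\,[w]_{A_2},
  \qquad
  [w]_{A_2} := \sup_{Q}
  \Big(\frac{1}{|Q|}\int_Q w\Big)\Big(\frac{1}{|Q|}\int_Q w^{-1}\Big),
\end{equation}
% with the supremum over all cubes Q; the linear dependence on [w]_{A_2}
% is sharp.
```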
Topics (ii) and (v) deal with extensions of the theory of singular integrals to functions with more general domain and range spaces, allowing them to be abstract metric and Banach spaces, respectively. In case (ii), I have recently been able to relax the requirements on the space compared to the established theories, opening a new research direction here. Topics (iii) and (iv) are concerned with weakening the assumptions on singular integrals in the usual Euclidean space, to allow certain applications in the theory of Partial Differential Equations. The goal is to maintain a close contact and exchange of ideas between such abstract and concrete questions.
Max ERC Funding
1 100 000 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym ANTEGEFI
Project Analytic Techniques for Geometric and Functional Inequalities
Researcher (PI) Nicola Fusco
Host Institution (HI) UNIVERSITA DEGLI STUDI DI NAPOLI FEDERICO II
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary Isoperimetric and Sobolev inequalities are the best-known examples of geometric-functional inequalities. In recent years the PI and collaborators have obtained new and sharp quantitative versions of these and other important related inequalities. These results have been obtained by the combined use of classical symmetrization methods, new tools coming from mass transportation theory, deep geometric measure tools and ad hoc symmetrizations. The objective of this project is to further develop these techniques in order to obtain: sharp quantitative versions of the Faber-Krahn inequality, the Gaussian isoperimetric inequality, the Brunn-Minkowski inequality, and the Poincaré and logarithmic Sobolev inequalities; sharp decay rates for the quantitative Sobolev inequalities and the Pólya-Szegő inequality.
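A prototype of the sharp quantitative versions mentioned above is the quantitative isoperimetric inequality obtained by the PI with Maggi and Pratelli, stated here for orientation:

```latex
% Quantitative isoperimetric inequality: for Borel sets E in R^n with
% |E| = |B|, where B is a ball,
\begin{equation}
  P(E) \;\ge\; P(B)\left(1 + c(n)\,\alpha(E)^2\right),
  \qquad
  \alpha(E) := \min_{x \in \mathbb{R}^n}
  \frac{|E \,\triangle\, (B + x)|}{|B|},
\end{equation}
% where P denotes the perimeter, alpha(E) is the Fraenkel asymmetry,
% and the exponent 2 is sharp.
```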
Max ERC Funding
600 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym AROMA-CFD
Project Advanced Reduced Order Methods with Applications in Computational Fluid Dynamics
Researcher (PI) Gianluigi Rozza
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary The aim of AROMA-CFD is to create a team of scientists at SISSA for the development of Advanced Reduced Order Modelling techniques with a focus in Computational Fluid Dynamics (CFD), in order to face and overcome many current limitations of the state of the art and improve the capabilities of reduced order methodologies for more demanding applications in industrial, medical and applied sciences contexts. AROMA-CFD deals with strong methodological developments in numerical analysis, with a special emphasis on mathematical modelling and extensive exploitation of computational science and engineering. Several tasks have been identified to tackle important problems and open questions in reduced order modelling: study of bifurcations and instabilities in flows, increasing Reynolds number and guaranteeing stability, moving towards turbulent flows, considering complex geometrical parametrizations of shapes as computational domains into extended networks. A reduced computational and geometrical framework will be developed for nonlinear inverse problems, focusing on optimal flow control, shape optimization and uncertainty quantification. Further, all the advanced developments in reduced order modelling for CFD will be delivered for applications in multiphysics, such as fluid-structure interaction problems and general coupled phenomena involving inviscid, viscous and thermal flows, solids and porous media. The advanced developed framework within AROMA-CFD will provide attractive capabilities for several industrial and medical applications (e.g. aeronautical, mechanical, naval, off-shore, wind, sport, biomedical engineering, and cardiovascular surgery as well), combining high performance computing (in dedicated supercomputing centers) and advanced reduced order modelling (in common devices) to guarantee real time computing and visualization. A new open source software library for AROMA-CFD will be created: ITHACA, In real Time Highly Advanced Computational Applications.
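As a minimal sketch of the reduced-order-modelling workflow at the heart of AROMA-CFD (our own illustration, not the ITHACA library): proper orthogonal decomposition (POD) compresses a matrix of high-fidelity snapshots into a few basis modes, onto which the governing equations are subsequently Galerkin-projected.

```python
# Minimal POD sketch (ours): extract a reduced basis from flow snapshots.
import numpy as np

def pod_basis(snapshots: np.ndarray, energy: float = 0.999) -> np.ndarray:
    """Columns of `snapshots` are full-order solutions; return the POD
    modes capturing the requested fraction of the snapshot energy."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cumulative = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cumulative, energy)) + 1
    return U[:, :r]                 # N x r basis, with r << N

# Example: 10000-DOF solutions at 50 parameter values -> a few modes.
rng = np.random.default_rng(2)
low_rank = rng.normal(size=(10000, 3)) @ rng.normal(size=(3, 50))
modes = pod_basis(low_rank + 1e-6 * rng.normal(size=(10000, 50)))
print(modes.shape)                  # e.g. (10000, 3)
```

The reduced model then evolves only the r modal coefficients, which is what makes real-time computation on common devices plausible.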
Max ERC Funding
1 656 579 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym ATM-GTP
Project Atmospheric Gas-to-Particle conversion
Researcher (PI) Markku KULMALA
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary Atmospheric Gas-to-Particle conversion (ATM-GTP) is a 5-year project focusing on one of the most critical atmospheric processes relevant to global climate and air quality: the first steps of atmospheric aerosol particle formation and growth. The project will concentrate on the currently lacking environmentally-specific knowledge about the interacting, non-linear, physical and chemical atmospheric processes associated with nano-scale gas-to-particle conversion (GTP). The main scientific objective of ATM-GTP is to create a deep understanding of atmospheric GTP taking place in the sub-5 nm size range, particularly in heavily-polluted Chinese megacities like Beijing and in pristine environments like Siberia and Nordic high-latitude regions. We also aim to find out how nano-GTP is associated with air quality-climate interactions and feedbacks. We are interested in quantifying the effect of nano-GTP on the COBACC (Continental Biosphere-Aerosol-Cloud-Climate) feedback loop that is important in Arctic and boreal regions. Our approach makes it possible to identify mechanisms that can reduce secondary air pollution by a factor of 5-10 and to make reliable estimates of the global and regional aerosol loads, including the anthropogenic and biogenic contributions to these loads. We can estimate the future role of the Northern Hemispheric biosphere in reducing global radiative forcing via the quantified feedbacks. The project is carried out by a world-leading scientist in atmospheric aerosol science, who is also one of the founders of terrestrial ecosystem meteorology, together with his research team. The project uses novel infrastructures including SMEAR (Stations Measuring Ecosystem Atmospheric Relations) stations, related modelling platforms and regional data from Russia and China. The work will be carried out in synergy with several national, Nordic and EU research-innovation projects: the Finnish Center of Excellence-ATM, Nordic CoE-CRAICC and EU-FP7-BACCHUS.
Max ERC Funding
2 500 000 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym ATMNUCLE
Project Atmospheric nucleation: from molecular to global scale
Researcher (PI) Markku Tapio Kulmala
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Advanced Grant (AdG), PE10, ERC-2008-AdG
Summary Atmospheric aerosol particles and trace gases affect the quality of our life in many ways (e.g. health effects, changes in climate and the hydrological cycle). Trace gases and atmospheric aerosols are tightly connected via physical, chemical, meteorological and biological processes occurring in the atmosphere and at the atmosphere-biosphere interface. One important phenomenon is atmospheric aerosol formation, which involves the production of nanometer-size particles by nucleation and their growth to detectable sizes. The main scientific objectives of this project are 1) to quantify the mechanisms responsible for atmospheric new-particle formation and 2) to find out how important this process is for the behaviour of the global aerosol system and, ultimately, for the whole climate system. Our scientific plan is designed as a research chain that aims to advance our understanding of climate and air quality through a series of connected activities. We start from molecular simulations and laboratory measurements to understand nucleation and aerosol thermodynamic processes. We measure nanoparticles and atmospheric clusters at 15-20 sites all around the world using state-of-the-art instrumentation, and we study feedbacks and interactions between climate and the biosphere. With these atmospheric boundary-layer studies we form a link to regional-scale processes and, further, to global-scale phenomena. In order to be able to simulate global climate and air quality, the most recent progress on this chain of processes must be compiled, integrated and implemented in Climate Change and Air Quality numerical models via novel parameterizations.
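A hedged illustration of what such a nucleation parameterization can look like (textbook empirical forms, not a result of this project): the new-particle formation rate J is often tied to the sulphuric-acid concentration through an activation-type law, J = A[H2SO4], or a kinetic-type law, J = K[H2SO4]^2. The coefficients below are assumed order-of-magnitude values.

```python
def activation_nucleation(h2so4_cm3, a_coeff=2e-6):
    """Activation-type parameterization: J = A * [H2SO4] (cm^-3 s^-1).
    a_coeff (s^-1) is an assumed, site-dependent coefficient."""
    return a_coeff * h2so4_cm3

def kinetic_nucleation(h2so4_cm3, k_coeff=4e-13):
    """Kinetic-type parameterization: J = K * [H2SO4]**2 (cm^-3 s^-1).
    k_coeff (cm^3 s^-1) is an assumed coefficient."""
    return k_coeff * h2so4_cm3 ** 2

# Typical ambient sulphuric-acid range, molecules/cm^3 (assumed)
for c in (1e6, 1e7, 1e8):
    print(f"[H2SO4] = {c:.0e}: J_act = {activation_nucleation(c):.1e}, "
          f"J_kin = {kinetic_nucleation(c):.1e} cm^-3 s^-1")
```

The two forms diverge as the vapour concentration rises, which is why field data spanning clean and polluted sites are needed to discriminate between candidate mechanisms.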
Max ERC Funding
2 000 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym ATOP
Project Atomically-engineered nonlinear photonics with two-dimensional layered material superlattices
Researcher (PI) Zhipei SUN
Host Institution (HI) AALTO KORKEAKOULUSAATIO SR
Call Details Advanced Grant (AdG), PE8, ERC-2018-ADG
Summary The project aims at introducing a paradigm shift in the development of nonlinear photonics with atomically-engineered two-dimensional (2D) van der Waals superlattices (2DSs). Monolayer 2D materials have large optical nonlinear susceptibilities, a few orders of magnitude larger than typical traditional bulk materials. However, nonlinear frequency conversion efficiency of monolayer 2D materials is typically weak mainly due to their extremely short interaction length (~atomic scale) and relatively large absorption coefficient (e.g.,>5×10^7 m^-1 in the visible range for graphene and MoS2 after thickness normalization). In this context, I will construct atomically-engineered heterojunctions based 2DSs to significantly enhance the nonlinear optical responses of 2D materials by coherently increasing light-matter interaction length and efficiently creating fundamentally new physical properties (e.g., reducing optical loss and increasing nonlinear susceptibilities).
The concrete project objectives are to theoretically calculate, and to experimentally fabricate and study, the optical nonlinearities of 2DSs for next-generation nonlinear photonics at the nanoscale. More specifically, I will use 2DSs as new building blocks to develop three of the most disruptive nonlinear photonic devices: (1) on-chip optical parametric generation sources; (2) broadband terahertz sources; (3) high-purity photon-pair emitters. These devices will lead to a breakthrough technology enabling highly integrated, highly efficient and wideband lab-on-chip photonic systems with unprecedented performance in system size, power consumption, flexibility and reliability, ideally fitting numerous growing and emerging applications, e.g. metrology, portable sensing/imaging, and quantum communications. Based on my proven track record and my pioneering work on 2D-material-based photonics and optoelectronics, I believe I will accomplish this ambitious frontier research program with a strong interdisciplinary nature.
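An illustrative aside on why interaction length and absorption control the conversion efficiency (a minimal sketch in the undepleted-pump, phase-matched limit; all numbers are assumptions): the second-harmonic power grows as L^2 but is damped as e^{-αL}, so an optimal thickness L = 2/α exists, and a single atomic layer sits orders of magnitude below it.

```python
import math

def shg_efficiency(length_m, chi2=1e-10, alpha=5e7):
    """Relative second-harmonic conversion efficiency in the
    undepleted-pump, phase-matched limit (arbitrary units):
    eta ~ |chi2|**2 * L**2 * exp(-alpha * L).
    chi2 (m/V) and alpha (m^-1) are assumed illustrative values;
    alpha ~ 5e7 m^-1 echoes the visible-range absorption quoted above."""
    return chi2 ** 2 * length_m ** 2 * math.exp(-alpha * length_m)

monolayer = 0.65e-9        # ~ one MoS2 layer thickness (assumed)
optimum = 2.0 / 5e7        # d(eta)/dL = 0 at L = 2/alpha, here 40 nm
for L in (monolayer, 10 * monolayer, optimum):
    print(f"L = {L * 1e9:6.2f} nm -> eta ~ {shg_efficiency(L):.3e} (a.u.)")
```

Stacking tens of layers into a superlattice pushes the interaction length toward this optimum while, per the abstract, the engineered heterojunctions aim to simultaneously reduce the absorption that sets it.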
Max ERC Funding
2 442 448 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym B Massive
Project Binary massive black hole astrophysics
Researcher (PI) Alberto SESANA
Host Institution (HI) UNIVERSITA' DEGLI STUDI DI MILANO-BICOCCA
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary Massive black hole binaries (MBHBs) are the most extreme, fascinating, yet elusive astrophysical objects in the Universe. Establishing their existence observationally will be a milestone for contemporary astronomy, providing a fundamental missing piece in the puzzle of galaxy formation, piercing through the (hydro)dynamical physical processes that shape dense galactic nuclei from parsec scales down to the event horizon, and probing gravity in extreme conditions.
We can both see and listen to MBHBs. Remarkably, besides arguably being among the brightest variable objects shining in the Cosmos, MBHBs are also the loudest gravitational wave (GW) sources in the Universe. As such, we shall take advantage of both types of messengers – photons and gravitons – that they send us, which can now be probed by all-sky time-domain surveys and radio pulsar timing arrays (PTAs), respectively.
B MASSIVE leverages a unique, comprehensive approach combining theoretical astrophysics, radio and gravitational-wave astronomy and time-domain surveys with state-of-the-art data-analysis techniques to: i) observationally prove the existence of MBHBs, ii) understand and constrain their astrophysics and dynamics, and iii) enable and bring closer in time the direct detection of GWs with PTAs.
As a European PTA (EPTA) executive committee member and former International PTA (IPTA) chair, I am a driving force in the development of pulsar-timing science world-wide, and the project will build on the profound knowledge, broad vision and wide collaboration network that established me as a world leader in the field of MBHB and GW astrophysics. B MASSIVE is extremely timely: a pulsar-timing data set of unprecedented quality is being assembled by the EPTA/IPTA, and time-domain astronomy surveys are at their dawn. In the long term, B MASSIVE will be a fundamental milestone establishing European leadership in the cutting-edge field of MBHB astrophysics in the era of LSST, SKA and LISA.
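A hedged back-of-the-envelope aside (assumed example values, not from the proposal): the strain amplitude of a circular MBHB scales as h ≈ (4/d)(GM_c)^{5/3}(πf)^{2/3}/c^4, which for a billion-solar-mass chirp mass at nanohertz frequencies lands in the ~10^-15 band that PTAs probe.

```python
import math

G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8         # speed of light, m/s
MSUN = 1.989e30     # solar mass, kg
MPC = 3.086e22      # megaparsec, m

def mbhb_strain(chirp_mass_msun, freq_hz, dist_mpc):
    """Order-of-magnitude GW strain of a circular MBHB:
    h ~ (4/d) * (G*Mc)**(5/3) * (pi*f)**(2/3) / c**4.
    Orientation/sky-averaging factors of order unity are ignored."""
    mc = chirp_mass_msun * MSUN
    d = dist_mpc * MPC
    return 4.0 * (G * mc) ** (5.0 / 3.0) * (math.pi * freq_hz) ** (2.0 / 3.0) / (C ** 4 * d)

# Assumed example: chirp mass 1e9 Msun, 10 nHz, 100 Mpc.
print(f"h ~ {mbhb_strain(1e9, 1e-8, 100.0):.1e}")   # ~ 5e-15
```

Strains of this size are comparable to the timing precision accumulated over years of PTA observations, which is why data sets of unprecedented quality bring direct detection within reach.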
Max ERC Funding
1 532 750 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym BIC
Project Cavitation across scales: following Bubbles from Inception to Collapse
Researcher (PI) Carlo Massimo Casciola
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Advanced Grant (AdG), PE8, ERC-2013-ADG
Summary Cavitation is the formation of vapor cavities inside a liquid due to low pressure. It is a ubiquitous and destructive phenomenon common to most engineering applications that deal with flowing water. At the same time, the extreme conditions realized in cavitation are increasingly exploited in medicine, chemistry, and biology. What makes cavitation unpredictable is its multiscale nature: the nucleation of vapor bubbles depends heavily on micro- and nanoscale details; mesoscale phenomena, such as bubble collapse, determine relevant macroscopic effects, e.g., cavitation damage. In addition, macroscopic flow conditions, such as turbulence, have a major impact on it.
The objective of the BIC project is to develop the missing multiscale description of cavitation by proposing new, integrated numerical methods capable of quantitative predictions. The detailed and physically sound understanding of the multifaceted phenomena involved in cavitation (nucleation, bubble growth, transport, and collapse in turbulent flows) fostered by the BIC project will result in new methods for designing fluid machinery, as well as in therapies in ultrasound medicine and in chemical reactors. The BIC project builds upon the exceptionally broad experience of the PI and of his research group in numerical simulations of flows at different scales, which includes advanced atomistic simulations of nanoscale wetting phenomena, mesoscale models for multiphase flows, and particle-laden turbulent flows. The envisaged numerical methodologies (free-energy atomistic simulations, phase-field models, and Direct Numerical Simulation of bubble-laden flows) will be supported by targeted experimental activities, designed to validate the models and characterize realistic conditions.
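An illustrative aside (a minimal sketch of textbook single-bubble dynamics, not the project's multiscale machinery): the collapse of a spherical bubble is classically described by the Rayleigh-Plesset equation, R R̈ + (3/2) Ṙ² = [p_B(R) − p_∞ − 2σ/R − 4μṘ/R]/ρ, which already captures the violent velocity and pressure spikes that make cavitation damaging. All parameter values below are assumptions.

```python
from scipy.integrate import solve_ivp

# Water-like parameters (assumed illustrative values, SI units)
RHO, SIGMA, MU = 998.0, 0.0725, 1.0e-3   # density, surface tension, viscosity
P_INF, P0, R0 = 2.0e5, 1.0e5, 1.0e-4     # far-field pressure, initial gas pressure, initial radius
KAPPA = 1.4                               # polytropic exponent of the bubble gas

def rayleigh_plesset(t, y):
    """y = [R, dR/dt]; returns [dR/dt, d2R/dt2] from the RP equation."""
    r, rdot = y
    p_gas = (P0 + 2.0 * SIGMA / R0) * (R0 / r) ** (3.0 * KAPPA)
    rddot = ((p_gas - P_INF - 2.0 * SIGMA / r - 4.0 * MU * rdot / r) / RHO
             - 1.5 * rdot ** 2) / r
    return [rdot, rddot]

# Integrate through the first collapse and rebound (~10 microseconds).
sol = solve_ivp(rayleigh_plesset, (0.0, 2.0e-5), [R0, 0.0],
                max_step=1e-8, rtol=1e-8)
print(f"minimum radius ~ {sol.y[0].min() * 1e6:.2f} um (from R0 = {R0 * 1e6:.0f} um)")
```

The equation treats the surrounding liquid as unbounded and quiescent; coupling such bubble dynamics to nucleation statistics and turbulent transport is precisely the gap the project's multiscale methods address.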
Max ERC Funding
2 491 200 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym BIHSNAM
Project Bio-inspired Hierarchical Super Nanomaterials
Researcher (PI) Nicola Pugno
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TRENTO
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary "Nanomaterials such as carbon nanotubes or graphene sheets represent the future of material science, due to their potentially exceptional mechanical properties. One great drawback of all artificial materials, however, is the decrease of strength with increasing toughness, and viceversa. This problem is not encountered in many biological nanomaterials (e.g. spider silk, bone, nacre). Other biological materials display exceptional adhesion or damping properties, and can be self-cleaning or self-healing. The “secret” of biomaterials seems to lie in “hierarchy”: several levels can often be identified (2 in nacre, up to 7 in bone and dentine), from nano- to micro-scale.
The idea of this project is to combine Nature and Nanotechnology to design hierarchical composites with tailor-made characteristics, optimized with respect to both strength and toughness, as well as materials with strong adhesion/easy detachment, smart damping, self-healing/-cleaning properties or controlled energy dissipation. For example, one possible objective is to design the “world’s toughest composite material”. The potential impact and importance of these goals for materials science, the high-tech industry and, ultimately, the quality of human life could be considerable.
In order to tackle such a challenging design process, the PI proposes to adopt state-of-the-art theoretical nanomechanics tools corroborated by continuum or atomistic simulations, multi-scale numerical parametric simulations and Finite Element optimization procedures, starting from characterization experiments on biological or nano-materials, from the macroscale down to the nanoscale. Results from the theoretical, numerical and experimental work packages will be applied to a specific case study in an engineering field of particular interest to demonstrate importance and feasibility, e.g. an airplane wing with considerably enhanced fatigue resistance and reduced ice-layer adhesion, leading to a 10-fold reduction in wasted fuel."
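A hedged illustration of why the nanoscale is the natural starting point (textbook fracture mechanics, not the project's own tools): Griffith's criterion, σ_f = sqrt(2Eγ_s/(πa)), ties failure strength to the largest flaw size a, so shrinking the structural elements of each hierarchical level toward the nanoscale drives the strength toward the theoretical limit. The values below are assumptions.

```python
import math

def griffith_strength(youngs_modulus_pa, surface_energy_j_m2, flaw_size_m):
    """Griffith failure stress for a crack of half-length a:
    sigma_f = sqrt(2 * E * gamma_s / (pi * a))."""
    return math.sqrt(2.0 * youngs_modulus_pa * surface_energy_j_m2
                     / (math.pi * flaw_size_m))

# Assumed graphene-like numbers: E ~ 1 TPa, gamma_s ~ 1 J/m^2.
E, GAMMA = 1.0e12, 1.0
for a in (1e-6, 1e-7, 1e-8, 1e-9):  # flaw size shrinking toward the nanoscale
    print(f"a = {a:.0e} m -> sigma_f ~ {griffith_strength(E, GAMMA, a) / 1e9:5.1f} GPa")
```

Hierarchy then lets a material keep this flaw-insensitive strength at the smallest level while the larger levels add the crack-deflecting and energy-dissipating mechanisms that supply toughness.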
Max ERC Funding
1 004 400 €
Duration
Start date: 2012-01-01, End date: 2016-12-31