Project acronym 321
Project from Cubic To Linear complexity in computational electromagnetics
Researcher (PI) Francesco Paolo ANDRIULLI
Host Institution (HI) POLITECNICO DI TORINO
Call Details Consolidator Grant (CoG), PE7, ERC-2016-COG
Summary Computational Electromagnetics (CEM) is the scientific field at the origin of the new modeling and simulation tools required by the constantly arising design challenges of emerging and future technologies in applied electromagnetics. As in many other technological fields, the trend in electromagnetic engineering is towards miniaturized, higher-density and multi-scale scenarios. Computationally, this translates into a steep increase in the number of degrees of freedom. Given that the design cost (the cost of a multi-right-hand-side problem dominated by matrix inversion) can scale as badly as cubically with these degrees of freedom, this trend, as many have pointed out, will severely compromise the practical impact of CEM on future and emerging technologies.
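To make the scaling concrete, here is a minimal sketch (illustrative only, not the project's solver) that times dense LU-based solves of random systems with several right-hand sides; doubling N multiplies the runtime by roughly eight, which is the cubic cost quoted above.

```python
import time
import numpy as np

# Illustration of the cubic design cost: a dense factor-and-solve with
# multiple right-hand sides scales as O(N^3) in the matrix size N.
rng = np.random.default_rng(42)
for n in (500, 1000, 2000):
    A = rng.standard_normal((n, n))
    B = rng.standard_normal((n, 16))   # 16 right-hand sides
    t0 = time.perf_counter()
    np.linalg.solve(A, B)              # LU factorization + triangular solves
    print(f"N={n:5d}  t={time.perf_counter() - t0:.3f} s")
```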
For this reason, the CEM scientific community has for years been seeking an FFT-like paradigm shift: a dynamic fast direct solver providing a design cost that scales only linearly with the degrees of freedom. Such a fast solver is considered today a Holy Grail of the discipline.
The Grand Challenge of 321 will be to tackle this Holy Grail in Computational Electromagnetics by investigating a dynamic Fast Direct Solver for Maxwell Problems that would run in a linear-instead-of-cubic complexity for an arbitrary number and configuration of degrees of freedom.
The failure of all previous attempts will be overcome by a game-changing transformation of the classical CEM problem, leveraging a recent breakthrough by the PI. Starting from this, the project will investigate an entirely new paradigm of algorithms to achieve this grand challenge.
The quadratic-to-linear paradigm shift brought by the FFT shows how groundbreaking a reduction in computational complexity can be for applications. The cubic-to-linear paradigm shift that the 321 project aims for will have a similarly disruptive impact on electromagnetic science and technology.
Max ERC Funding
2 000 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym 3D-QUEST
Project 3D-Quantum Integrated Optical Simulation
Researcher (PI) Fabio Sciarrino
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Starting Grant (StG), PE2, ERC-2012-StG_20111012
Summary "Quantum information was born from the merging of classical information and quantum physics. Its main objective consists of understanding the quantum nature of information and learning how to process it by using physical systems which operate by following quantum mechanics laws. Quantum simulation is a fundamental instrument to investigate phenomena of quantum systems dynamics, such as quantum transport, particle localizations and energy transfer, quantum-to-classical transition, and even quantum improved computation, all tasks that are hard to simulate with classical approaches. Within this framework integrated photonic circuits have a strong potential to realize quantum information processing by optical systems.
The aim of 3D-QUEST is to develop and implement quantum simulation by exploiting 3-dimensional integrated photonic circuits. 3D-QUEST is structured to demonstrate the potential of linear optics to implement a computational power beyond the one of a classical computer. Such ""hard-to-simulate"" scenario is disclosed when multiphoton-multimode platforms are realized. The 3D-QUEST research program will focus on three tasks of growing difficulty.
A-1. To simulate bosonic-fermionic dynamics with integrated optical systems acting on 2-photon entangled states.
A-2. To pave the way towards hard-to-simulate, scalable quantum linear optical circuits by investigating m-port interferometers acting on n-photon states with n>2.
A-3. To exploit 3-dimensional integrated structures for the observation of new quantum optical phenomena and for the quantum simulation of more complex scenarios.
3D-QUEST will exploit the potential of femtosecond laser writing of integrated waveguides. This technique will be adopted to realize 3-dimensional capabilities and high flexibility, thereby bringing optical quantum simulation into a new regime.
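As a hedged aside on why the multiphoton-multimode regime is believed to be classically hard: the output amplitudes of an n-photon, m-port linear-optical network are matrix permanents, whose exact evaluation is #P-hard; even the best exact algorithm (Ryser's) is exponential. A minimal sketch (not taken from the proposal) follows, with the two-photon Hong-Ou-Mandel effect as a sanity check.

```python
import numpy as np
from itertools import combinations

def permanent(A):
    """Matrix permanent via Ryser's formula, O(2^n * n): already
    exponential, which is the root of the 'hard-to-simulate' claim."""
    n = A.shape[0]
    total = 0.0j
    for k in range(1, n + 1):
        for cols in combinations(range(n), k):
            total += (-1) ** k * np.prod(A[:, cols].sum(axis=1))
    return (-1) ** n * total

# 2-photon sanity check: for a 50:50 beamsplitter the coincidence
# amplitude is the permanent of the 2x2 unitary and vanishes, which is
# the Hong-Ou-Mandel dip.
U = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
print(abs(permanent(U)) ** 2)  # ~0: photons bunch, no coincidences
```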
Max ERC Funding
1 474 800 €
Duration
Start date: 2012-08-01, End date: 2017-07-31
Project acronym 3DSPIN
Project 3-Dimensional Maps of the Spinning Nucleon
Researcher (PI) Alessandro Bacchetta
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PAVIA
Call Details Consolidator Grant (CoG), PE2, ERC-2014-CoG
Summary What does the inside of the proton look like? What generates its spin?
3DSPIN will deliver essential information to answer these questions at the frontier of subnuclear physics.
At present, we have detailed maps of the distribution of quarks and gluons in the nucleon in 1D (as a function of their momentum in a single direction). We also know that quark spins account for only about 1/3 of the spin of the nucleon.
3DSPIN will lead the way into a new stage of nucleon mapping, exploring the distribution of quarks in full 3D momentum space and obtaining unprecedented information on their orbital angular momentum.
Goals
1. extract from experimental data the 3D distribution of quarks (in momentum space), as described by Transverse-Momentum Distributions (TMDs);
2. obtain from TMDs information on quark Orbital Angular Momentum (OAM).
Methodology
3DSPIN will implement state-of-the-art fitting procedures to analyze relevant experimental data and extract quark TMDs, similarly to global fits of standard parton distribution functions. Information about quark angular momentum will be obtained through assumptions based on theoretical considerations. The next five years represent an ideal time window to accomplish our goals, thanks to the wealth of expected data from deep-inelastic scattering experiments (COMPASS, Jefferson Lab), hadronic colliders (Fermilab, BNL, LHC), and electron-positron colliders (BELLE, BABAR). The PI has a strong reputation in this field. The group will operate in partnership with the Italian National Institute of Nuclear Physics and in close interaction with leading experts and experimental collaborations worldwide.
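As a toy illustration of the fitting methodology (assumptions: a Gaussian transverse-momentum ansatz, a common baseline in TMD phenomenology, and fabricated pseudo-data; real global fits involve QCD evolution and many data sets simultaneously):

```python
import numpy as np
from scipy.optimize import curve_fit

def tmd(kT, N, kT2_avg):
    """Toy TMD: f(kT) = N * exp(-kT^2/<kT^2>) / (pi * <kT^2>)."""
    return N * np.exp(-kT**2 / kT2_avg) / (np.pi * kT2_avg)

rng = np.random.default_rng(1)
kT = np.linspace(0.05, 1.5, 30)                      # GeV
truth = tmd(kT, 1.0, 0.35)
data = truth * (1 + 0.05 * rng.standard_normal(kT.size))  # 5% pseudo-errors
popt, pcov = curve_fit(tmd, kT, data, p0=(1.0, 0.2), sigma=0.05 * truth)
print("fitted <kT^2> =", popt[1], "+/-", np.sqrt(pcov[1, 1]))
```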
Impact
Mapping the 3D structure of chemical compounds has revolutionized chemistry. Similarly, mapping the 3D structure of the nucleon will have a deep impact on our understanding of the fundamental constituents of matter. We will open new perspectives on the dynamics of quarks and gluons and sharpen our view of high-energy processes involving nucleons.
Max ERC Funding
1 509 000 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym 4DPHOTON
Project Beyond Light Imaging: High-Rate Single-Photon Detection in Four Dimensions
Researcher (PI) Massimiliano FIORINI
Host Institution (HI) ISTITUTO NAZIONALE DI FISICA NUCLEARE
Call Details Consolidator Grant (CoG), PE2, ERC-2018-COG
Summary The goal of the 4DPHOTON project is the development and construction of a photon imaging detector with unprecedented performance. The proposed device will be capable of detecting single-photon fluxes of up to one billion photons per second over areas of several square centimetres, and will measure, for each photon, position and time simultaneously with resolutions better than ten microns and a few tens of picoseconds, respectively. These figures of merit will open up many important applications, allowing significant advances in particle physics, the life sciences and other emerging fields where excellent timing and position resolutions are required simultaneously.
Our goal will be achieved through an application-specific integrated circuit in 65 nm complementary metal-oxide-semiconductor (CMOS) technology that will deliver a timing resolution of a few tens of picoseconds at the pixel level over a few hundred thousand individually active pixel channels, allowing very high photon rates to be detected and the corresponding information to be digitized and transferred to a processing unit.
As a result of the 4DPHOTON project we will remove the constraints that many light-imaging applications face due to the lack of precise single-photon information in four dimensions (4D): the three spatial coordinates and time, simultaneously. In particular, we will prove the performance of this detector in the field of particle physics by reconstructing Cherenkov photon rings with a timing resolution of ten picoseconds. With its excellent granularity, timing resolution, rate capability and compactness, this detector will represent a new paradigm for the realisation of future Ring Imaging Cherenkov detectors, capable of high-efficiency particle identification in environments with very high particle multiplicities, exploiting the time association of photon hits.
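A minimal sketch of the time-association idea (illustrative only; the algorithm and parameters below are assumptions, not the project's actual reconstruction): select hits inside a picosecond-scale coincidence window, then fit a circle to the surviving (x, y) positions with the standard algebraic (Kasa) least-squares method.

```python
import numpy as np

def hits_in_window(x, y, t, t0, window=50e-12):
    """Keep only hits within a narrow coincidence window around t0: the
    'time association' that separates overlapping rings."""
    sel = np.abs(t - t0) < window
    return x[sel], y[sel]

def fit_circle(x, y):
    """Algebraic (Kasa) circle fit: x^2 + y^2 = 2ax + 2by + c, giving
    centre (a, b) and radius sqrt(c + a^2 + b^2)."""
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    a, b, c = np.linalg.lstsq(A, x**2 + y**2, rcond=None)[0]
    return a, b, np.sqrt(c + a**2 + b**2)

# Synthetic ring of radius 30 mm with 10 um position, 30 ps time smearing.
rng = np.random.default_rng(7)
phi = rng.uniform(0, 2 * np.pi, 200)
x = 0.030 * np.cos(phi) + rng.normal(0, 10e-6, 200)   # metres
y = 0.030 * np.sin(phi) + rng.normal(0, 10e-6, 200)
t = rng.normal(0, 30e-12, 200)                        # seconds
print(fit_circle(*hits_in_window(x, y, t, t0=0.0)))   # ~ (0, 0, 0.030)
```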
Max ERC Funding
1 975 000 €
Duration
Start date: 2019-12-01, End date: 2024-11-30
Project acronym AFDMATS
Project Anton Francesco Doni – Multimedia Archive Texts and Sources
Researcher (PI) Giovanna Rizzarelli
Host Institution (HI) SCUOLA NORMALE SUPERIORE
Call Details Starting Grant (StG), SH4, ERC-2007-StG
Summary This project aims to create a multimedia archive of the printed works of Anton Francesco Doni, who was not only an author but also a typographer, a publisher and a member of the Giolito and Marcolini editorial staffs. The analysis of Doni's work is a good entry point for investigating the practices of appropriation, text rewriting and image reuse that are typical of several 16th-century authors, as critics have clearly shown in recent decades. This project intends to bring to light the wide range of impulses from which Doni's texts are generated, with a great emphasis on the figurative aspect. The encoding of these texts will be carried out using the TEI (Text Encoding Initiative) guidelines, which will enable any single text to interact with a range of intertextual references both at a local level (inside the same text) and at a macrostructural level (references to other texts by Doni or to other authors). The elements that will emerge from the textual encoding concern:
A) The use of images. Real images: the complex relation between Doni's writing and the xylographies available in Marcolini's printing house or belonging to other collections. Mental images: the remarkable presence of verbal images, such as descriptions, ekphràseis, figurative visions, dreams and iconographic allusions not accompanied by illustrations but related to a recognizable visual repertoire or to real images that will be reproduced.
B) The use of sources. A parallel archive of the texts most used by Doni will be created. Digital anastatic reproductions of the 16th-century editions known to Doni will be provided whenever available. The various forms of intertextuality will be divided into the following typologies: allusions; citations; rewritings; plagiarisms; self-quotations. Finally, the different forms of narrative (tales, short stories, anecdotes, lyrics) and the different idiomatic expressions (proverbial forms and wellerisms) will also be encoded.
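A sketch of what such an encoding could look like (the element choices, @type values and identifiers below are hypothetical assumptions for illustration, not the project's actual TEI schema):

```python
import xml.etree.ElementTree as ET

# One plausible TEI-style encoding of an intertextual link: a <seg>
# carries the intertextuality typology, and a <ref> points into the
# parallel archive of sources. All names and targets are placeholders.
seg = ET.Element("seg", type="allusion", corresp="#doni-passage")
seg.text = "[passage from Doni's text] "
ref = ET.SubElement(seg, "ref", target="sources.xml#source-edition")
ref.text = "[entry in the source archive]"
print(ET.tostring(seg, encoding="unicode"))
```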
Max ERC Funding
559 200 €
Duration
Start date: 2008-08-01, End date: 2012-07-31
Project acronym AFRICA-GHG
Project AFRICA-GHG: The role of African tropical forests on the Greenhouse Gases balance of the atmosphere
Researcher (PI) Riccardo Valentini
Host Institution (HI) FONDAZIONE CENTRO EURO-MEDITERRANEOSUI CAMBIAMENTI CLIMATICI
Call Details Advanced Grant (AdG), PE10, ERC-2009-AdG
Summary The role of the African continent in the global carbon cycle, and therefore in climate change, is increasingly recognised. Despite the acknowledged importance of Africa in the global carbon cycle and its high vulnerability to climate change, there is still a lack of studies on the carbon cycle in representative African ecosystems (in particular tropical forests) and on the effects of climate on ecosystem-atmosphere exchange. The present proposal focuses on these specific objectives:
1. Understand the role of African tropical rainforests in the GHG balance of the atmosphere and revise their role in global methane and N2O emissions.
2. Determine the carbon source/sink strength of African tropical rainforests in the pre-industrial era versus the 20th century by temporal reconstruction of biomass growth with biogeochemical markers.
3. Understand and quantify the variability of carbon and GHG fluxes across African tropical forests (the west-east equatorial belt).
4. Analyse the impact of forest degradation and deforestation on carbon and other GHG emissions.
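As a hedged illustration of how ecosystem-atmosphere exchange is typically quantified at forest sites (the proposal does not spell out the method; the standard eddy-covariance technique is assumed here), the turbulent flux is the covariance between vertical wind speed and gas concentration over an averaging period:

```python
import numpy as np

# Toy eddy-covariance flux estimate with synthetic data.
rng = np.random.default_rng(3)
n = 10 * 60 * 20                                   # 10 min of 20 Hz data
w = rng.normal(0.0, 0.3, n)                        # vertical wind, m/s
c = 15.0 + 0.8 * w + rng.normal(0, 0.5, n)         # CO2, umol/m^3 (toy coupling)
flux = np.mean((w - w.mean()) * (c - c.mean()))    # cov(w, c)
print(f"CO2 flux ~ {flux:.3f} umol m^-2 s^-1")
```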
Max ERC Funding
2 406 950 €
Duration
Start date: 2010-04-01, End date: 2014-12-31
Project acronym AGEnTh
Project Atomic Gauge and Entanglement Theories
Researcher (PI) Marcello DALMONTE
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Starting Grant (StG), PE2, ERC-2017-STG
Summary AGEnTh is an interdisciplinary proposal that aims to theoretically investigate atomic many-body systems (cold atoms and trapped ions) in close connection with concepts from quantum information, condensed matter, and high-energy physics. The main goals of this programme are to:
I) Find scalable schemes for the measurement of entanglement properties, and in particular entanglement spectra, by proposing a paradigm shift for accessing entanglement that focuses on entanglement Hamiltonians and field theories instead of probing density matrices (a toy illustration follows this summary);
II) Show how atomic gauge theories (including dynamical gauge fields) are ideal candidates for the realization of long-sought, highly-entangled states of matter, in particular topological superconductors supporting parafermion edge modes, and novel classes of quantum spin liquids emerging from clustering;
III) Develop new implementation strategies for the realization of gauge symmetries of paramount importance, such as discrete and SU(N)xSU(2)xU(1) groups, and establish a theoretical framework for the understanding of atomic physics experiments within the light-from-chaos scenario pioneered in particle physics.
These objectives are at the cutting edge of fundamental science, and represent a coherent effort aimed at opening up unprecedented regimes of strongly interacting quantum matter by addressing the basic aspects of probing, many-body physics, and implementations. The results are expected to (i) build up and establish qualitatively new synergies between the aforementioned communities, and (ii) stimulate an intense theoretical and experimental activity focused on both entanglement and atomic gauge theories.
To achieve these goals, AGEnTh builds (1) on my background working at the interface between atomic physics and quantum optics on one side and many-body theory on the other, and (2) on exploratory studies that I carried out to mitigate the conceptual risks associated with its high-risk/high-gain goals.
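The toy illustration promised under objective I: for a pure state, the entanglement spectrum follows from the Schmidt decomposition, which a classical computation obtains directly (as below); the proposal's point is to access these levels scalably in experiments, without full state tomography.

```python
import numpy as np

def entanglement_spectrum(psi, dim_a, dim_b):
    """Entanglement 'energies' -ln(lambda_i) of a pure state on A (x) B,
    from the Schmidt (SVD) decomposition of the reshaped state vector."""
    s = np.linalg.svd(psi.reshape(dim_a, dim_b), compute_uv=False)
    p = s**2
    p = p[p > 1e-14] / p.sum()   # eigenvalues of the reduced density matrix
    return -np.log(p)

# Random 8-qubit pure state, cut into two blocks of 4 qubits.
rng = np.random.default_rng(0)
psi = rng.standard_normal(2**8) + 1j * rng.standard_normal(2**8)
psi /= np.linalg.norm(psi)
print(np.sort(entanglement_spectrum(psi, 2**4, 2**4))[:5])
```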
Max ERC Funding
1 055 317 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym AIDA
Project An Illumination of the Dark Ages: modeling reionization and interpreting observations
Researcher (PI) Andrei Albert Mesinger
Host Institution (HI) SCUOLA NORMALE SUPERIORE
Call Details Starting Grant (StG), PE9, ERC-2014-STG
Summary "Understanding the dawn of the first galaxies and how their light permeated the early Universe is at the very frontier of modern astrophysical cosmology. Generous resources, including ambitions observational programs, are being devoted to studying these epochs of Cosmic Dawn (CD) and Reionization (EoR). In order to interpret these observations, we propose to build on our widely-used, semi-numeric simulation tool, 21cmFAST, and apply it to observations. Using sub-grid, semi-analytic models, we will incorporate additional physical processes governing the evolution of sources and sinks of ionizing photons. The resulting state-of-the-art simulations will be well poised to interpret topical observations of quasar spectra and the cosmic 21cm signal. They would be both physically-motivated and fast, allowing us to rapidly explore astrophysical parameter space. We will statistically quantify the resulting degeneracies and constraints, providing a robust answer to the question, ""What can we learn from EoR/CD observations?"" As an end goal, these investigations will help us understand when the first generations of galaxies formed, how they drove the EoR, and what are the associated large-scale observational signatures."
Max ERC Funding
1 468 750 €
Duration
Start date: 2015-05-01, End date: 2021-01-31
Project acronym AISENS
Project New generation of high sensitive atom interferometers
Researcher (PI) Marco Fattori
Host Institution (HI) CONSIGLIO NAZIONALE DELLE RICERCHE
Call Details Starting Grant (StG), PE2, ERC-2010-StG_20091028
Summary Interferometers are fundamental tools for the study of the laws of nature and for the precise measurement and control of the physical world. In the last century, scientific and technological progress has proceeded in parallel with a constant improvement of interferometric performance. For this reason, the challenge of conceiving and realizing new generations of interferometers with broader ranges of operation and higher sensitivities remains open and timely.
Although the introduction of laser devices deeply improved the way interferometric measurements with light are developed and performed, the atomic matter-wave analogue, the Bose-Einstein condensate (BEC), has not yet triggered any revolution in precision interferometry. However, thanks to recent improvements in the control of the quantum properties of ultra-cold atomic gases, and to new original ideas on the creation and manipulation of quantum entangled particles, the field of atom interferometry is now mature for a big step forward.
The system I want to realize is a Mach-Zehnder spatial interferometer operating with trapped BECs. Undesired sources of decoherence will be suppressed by implementing BECs with tunable interactions in ultra-stable optical potentials. Entangled states will be used to improve the sensitivity of the sensor beyond the standard quantum limit, ideally reaching the ultimate Heisenberg limit set by quantum mechanics. The resulting apparatus will show unprecedented spatial resolution and will outperform state-of-the-art interferometers based on cold (non-condensed) atomic gases.
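The sensitivity gain at stake can be stated in one line: for N uncorrelated atoms the phase uncertainty is the standard quantum limit 1/sqrt(N), while ideal entangled strategies reach the Heisenberg limit 1/N. A minimal numeric illustration:

```python
import numpy as np

# Phase-uncertainty scaling: standard quantum limit (SQL) for N
# uncorrelated atoms vs the Heisenberg limit for ideal entangled states.
for N in (100, 10_000, 1_000_000):
    sql = 1 / np.sqrt(N)
    heisenberg = 1 / N
    print(f"N={N:>9}: SQL dphi={sql:.1e}  "
          f"Heisenberg dphi={heisenberg:.1e}  gain={sql / heisenberg:.0f}x")
```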
A successful completion of this project will lead to a new generation of interferometers for the immediate application to local inertial measurements with unprecedented resolution. In addition, we expect to develop experimental capabilities which might find application well beyond quantum interferometry and crucially contribute to the broader emerging field of quantum-enhanced technologies.
Max ERC Funding
1 068 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym ALEM
Project ADDITIONAL LOSSES IN ELECTRICAL MACHINES
Researcher (PI) Matti Antero Arkkio
Host Institution (HI) AALTO KORKEAKOULUSAATIO SR
Call Details Advanced Grant (AdG), PE8, ERC-2013-ADG
Summary "Electrical motors consume about 40 % of the electrical energy produced in the European Union. About 90 % of this energy is converted to mechanical work. However, 0.5-2.5 % of it goes to so called additional load losses whose exact origins are unknown. Our ambitious aim is to reveal the origins of these losses, build up numerical tools for modeling them and optimize electrical motors to minimize the losses.
As the hypothesis of the research, we assume that the additional losses mainly result from the deterioration of the core materials during the manufacturing process of the machine. By calorimetric measurements, we have found that the core losses of electrical machines may be twice as large as comprehensive loss models predict. The electrical steel sheets are punched, welded together and shrink fit to the frame. This causes residual strains in the core sheets deteriorating their magnetic characteristics. The cutting burrs make galvanic contacts between the sheets and form paths for inter-lamination currents. Another potential source of additional losses are the circulating currents between the parallel strands of random-wound armature windings. The stochastic nature of these potential sources of additional losses puts more challenge on the research.
We shall develop a physical loss model that couples the mechanical strains and electromagnetic losses in electrical steel sheets and apply the new model for comprehensive loss analysis of electrical machines. The stochastic variables related to the core losses and circulating-current losses will be discretized together with the temporal and spatial discretization of the electromechanical field variables. The numerical stochastic loss model will be used to search for such machine constructions that are insensitive to the manufacturing defects. We shall validate the new numerical loss models by electromechanical and calorimetric measurements."
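For concreteness, a toy core-loss model in the spirit of classical loss separation (hysteresis, classical eddy and excess terms), with an ad hoc damage factor standing in for punching and welding degradation; the coefficients and the form of the degradation are invented for illustration, whereas the project aims to derive such couplings physically:

```python
def core_loss(B, f, k_h=0.02, k_c=1e-4, k_e=5e-4, damage=1.0):
    """Specific core loss (W/kg) at peak flux density B (T) and
    frequency f (Hz): hysteresis + classical eddy + excess terms,
    scaled by a crude manufacturing-damage factor."""
    return damage * (k_h * f * B**2 + k_c * (f * B)**2 + k_e * (f * B)**1.5)

pristine = core_loss(1.5, 50)
punched = core_loss(1.5, 50, damage=1.8)   # cf. the measured ~2x excess
print(f"pristine: {pristine:.2f} W/kg, damaged: {punched:.2f} W/kg")
```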
Max ERC Funding
2 489 949 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ALGOCom
Project Novel Algorithmic Techniques through the Lens of Combinatorics
Researcher (PI) Parinya Chalermsook
Host Institution (HI) AALTO KORKEAKOULUSAATIO SR
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary Real-world optimization problems pose major challenges to algorithmic research. For instance, (i) many important problems are believed to be intractable (i.e. NP-hard), and (ii) with the growth of data size, modern applications often require decision making under incomplete and dynamically changing input data. After several decades of research, central problems in these domains remain poorly understood (e.g. is there an asymptotically most efficient binary search tree?). Existing algorithmic techniques either reach their limitations or are inherently tailored to special cases.
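For contrast with the open dynamic question, the static version of the binary-search-tree problem is classically solvable: given fixed access frequencies, the expected cost of an optimal static BST follows from a textbook dynamic program, shown below in its simple O(n^3) form (Knuth's refinement runs in O(n^2)). The open problem alluded to above, dynamic optimality, asks instead for a self-adjusting tree competitive on every access sequence.

```python
def optimal_bst_cost(freq):
    """Expected search cost of an optimal *static* BST, O(n^3) DP.
    freq[i] = access frequency of the i-th key in sorted order."""
    n = len(freq)
    cost = [[0.0] * n for _ in range(n)]    # cost[i][j]: keys i..j
    weight = [[0.0] * n for _ in range(n)]  # total frequency of keys i..j
    for i in range(n):
        cost[i][i] = weight[i][i] = freq[i]
    for length in range(2, n + 1):
        for i in range(n - length + 1):
            j = i + length - 1
            weight[i][j] = weight[i][j - 1] + freq[j]
            cost[i][j] = weight[i][j] + min(
                (cost[i][r - 1] if r > i else 0.0) +
                (cost[r + 1][j] if r < j else 0.0)
                for r in range(i, j + 1)    # r = root of the subtree
            )
    return cost[0][n - 1]

print(optimal_bst_cost([0.25, 0.2, 0.05, 0.2, 0.3]))
```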
This project attempts to close this gap in the state of the art and seeks new interplay across multiple areas of algorithms, such as approximation algorithms, online algorithms, fixed-parameter tractable (FPT) algorithms, exponential-time algorithms, and data structures. We propose new directions from the structural perspective that connect the aforementioned algorithmic problems to basic questions in combinatorics.
Our approaches fall into three broad schemes: (i) new structural theory, (ii) intermediate problems, and (iii) transfer of techniques. These directions partially build on the PI's successes in resolving more than ten classical problems in this context.
Resolving the proposed problems will likely revolutionize our understanding of algorithms and data structures and potentially unify techniques in multiple algorithmic regimes. Any progress is, in fact, already a significant contribution to the algorithms community. We suggest concrete intermediate goals that are of independent interest and carry lower risks, making them suitable for Ph.D. students.
Max ERC Funding
1 411 258 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym AMDROMA
Project Algorithmic and Mechanism Design Research in Online MArkets
Researcher (PI) Stefano LEONARDI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Online markets currently form an important share of the global economy. The Internet hosts classical markets (real estate, stocks, e-commerce) as well as new markets with previously unknown features (web-based advertisement, viral marketing, digital goods, crowdsourcing, sharing economy). Algorithms play a central role in many decision processes involved in online markets: they run electronic auctions, trade stocks, adjust prices dynamically, and harvest big data to provide economic information. Thus, it is of paramount importance to understand the algorithmic and mechanism-design foundations of online markets.
The algorithmic research issues that we consider involve algorithmic mechanism design, online and approximation algorithms, modelling uncertainty in online market design, and large-scale data analysis, optimization and mining. The aim of this research project is to combine these fields to address research questions that are central to today's Internet economy. We plan to apply these techniques to solve fundamental algorithmic problems motivated by Internet advertisement, the sharing economy, and online labour marketplaces. While my planned research is centered on foundational work, with rigorous design and analysis of algorithms and mechanisms, it will also include, as an important component, empirical validation on large-scale real-life datasets.
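A minimal example of the mechanism-design flavor of these questions (a textbook illustration, not taken from the proposal): the single-item second-price auction, in which truthful bidding is a dominant strategy, is the building block that electronic auction formats generalize.

```python
def second_price_auction(bids):
    """Vickrey auction: highest bidder wins, pays the second-highest bid.
    bids: dict mapping bidder -> bid. Returns (winner, price)."""
    ranked = sorted(bids, key=bids.get, reverse=True)
    winner = ranked[0]
    price = bids[ranked[1]] if len(ranked) > 1 else 0.0
    return winner, price

print(second_price_auction({"a": 10.0, "b": 7.5, "c": 9.0}))  # ('a', 9.0)
```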
Max ERC Funding
1 780 150 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym AMETIST
Project Advanced III-V Materials and Processes Enabling Ultrahigh-efficiency (≥50%) Photovoltaics
Researcher (PI) Mircea Dorel GUINA
Host Institution (HI) TAMPEREEN KORKEAKOULUSAATIO SR
Call Details Advanced Grant (AdG), PE8, ERC-2015-AdG
Summary Compound semiconductor solar cells provide the highest photovoltaic conversion efficiency, yet their performance lags far behind the theoretical potential. This is a position we will challenge by engineering advanced III-V optoelectronic materials and heterostructures for better utilization of the solar spectrum, enabling efficiencies approaching practical limits. The work is strongly motivated by the global need for renewable energy sources. To this end, the AMETIST framework is based on three vectors of excellence: i) material science and epitaxial processes, ii) advanced solar cells exploiting nanophotonic concepts, and iii) new device fabrication technologies.
Novel heterostructures (e.g. GaInNAsSb, GaNAsBi), providing absorption in a broad spectral range from 0.7 eV to 1.4 eV, will be synthesized and monolithically integrated in tandem cells with up to 8 junctions. Nanophotonic methods for light trapping and for the spectral and spatial control of solar radiation will be developed to further enhance the absorption. To ensure a high long-term impact, the project will validate the use of state-of-the-art molecular-beam-epitaxy processes for the fabrication of economically viable ultra-high-efficiency solar cells. The ultimate efficiency target is 55%. This would enable the generation of renewable, ecological and sustainable energy at a levelized production cost below ~7 ¢/kWh, comparable to or cheaper than fossil fuels. The work will also bring new impetus to the development of more efficient space photovoltaic systems.
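As a back-of-envelope check of the cost target, the standard levelized-cost-of-energy formula can be applied with toy inputs (every number below is an assumption for illustration, not a figure from the proposal):

```python
def lcoe(capex_per_kw, opex_per_kw_yr, cap_factor, rate, years):
    """Levelized cost of energy per kWh: annualized capital cost
    (capital recovery factor) plus O&M, divided by annual output."""
    crf = rate * (1 + rate) ** years / ((1 + rate) ** years - 1)
    annual_kwh = 8760 * cap_factor
    return (capex_per_kw * crf + opex_per_kw_yr) / annual_kwh

# Toy inputs: 1800 EUR/kW capex, 25 EUR/kW/yr O&M, 30% capacity factor,
# 5% discount rate, 25-year lifetime -> roughly 6 c/kWh.
print(f"{100 * lcoe(1800, 25, 0.30, 0.05, 25):.1f} c/kWh")
```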
AMETIST will leverage the leading position of the applicant in technology areas relevant to the project (i.e. epitaxy of III-N/Bi-V alloys and key achievements concerning GaInNAsSb-based tandem solar cells). It thus offers a unique opportunity to capitalize on the group's expertise and position Europe at the forefront of the global competition to demonstrate more efficient and economically viable photovoltaic technologies.
Max ERC Funding
2 492 719 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym ANPROB
Project Analytic-probabilistic methods for borderline singular integrals
Researcher (PI) Tuomas Pentinpoika Hytönen
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary The proposal consists of an extensive research program to advance the understanding of singular integral operators of Harmonic Analysis in various situations on the borderline of the existing theory. This is to be achieved by a creative combination of techniques from Analysis and Probability. On top of the standard arsenal of modern Harmonic Analysis, the main probabilistic tools are the martingale transform inequalities of Burkholder, and random geometric constructions in the spirit of the random dyadic cubes introduced to Nonhomogeneous Analysis by Nazarov, Treil and Volberg.
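To fix ideas about the martingale-transform tool, here is a minimal sketch (assuming dyadic-length data; illustrative only): expand samples of a function on [0,1) in the Haar system and flip the signs of the martingale differences. Burkholder's inequalities assert that such transforms are bounded on every L^p, 1 < p < infinity; at p = 2 this is immediate, as the final check shows.

```python
import numpy as np

def haar_analysis(x):
    """Orthonormal Haar decomposition of samples on [0,1) (len = 2^k):
    returns the coarsest scaling coefficient and per-level details."""
    x = np.asarray(x, dtype=float)
    details = []
    while x.size > 1:
        s = (x[0::2] + x[1::2]) / np.sqrt(2)
        d = (x[0::2] - x[1::2]) / np.sqrt(2)
        details.append(d)
        x = s
    return x, details

def haar_synthesis(s, details):
    """Inverse of haar_analysis."""
    x = s
    for d in reversed(details):
        y = np.empty(2 * x.size)
        y[0::2] = (x + d) / np.sqrt(2)
        y[1::2] = (x - d) / np.sqrt(2)
        x = y
    return x

rng = np.random.default_rng(0)
f = rng.standard_normal(64)
s, det = haar_analysis(f)
# Martingale transform: flip the sign of each scale's coefficients.
Tf = haar_synthesis(s, [rng.choice([-1.0, 1.0]) * d for d in det])
print(np.linalg.norm(Tf), np.linalg.norm(f))   # equal: T is an L^2 isometry
```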
The problems to be addressed fall under the following subtitles, with many interconnections and overlap: (i) sharp weighted inequalities; (ii) nonhomogeneous singular integrals on metric spaces; (iii) local Tb theorems with borderline assumptions; (iv) functional calculus of rough differential operators; and (v) vector-valued singular integrals.
Topic (i) is a part of Classical Analysis, where new methods have led to substantial recent progress, culminating in my solution in July 2010 of a celebrated problem on the linear dependence of the weighted operator norm on the Muckenhoupt norm of the weight. The proof should be extendible to several related questions, and the aim is to also address some outstanding open problems in the area.
Topics (ii) and (v) deal with extensions of the theory of singular integrals to functions with more general domain and range spaces, allowing them to be abstract metric and Banach spaces, respectively. In case (ii), I have recently been able to relax the requirements on the space compared to the established theories, opening a new research direction here. Topics (iii) and (iv) are concerned with weakening the assumptions on singular integrals in the usual Euclidean space, to allow certain applications in the theory of Partial Differential Equations. The goal is to maintain a close contact and exchange of ideas between such abstract and concrete questions.
Max ERC Funding
1 100 000 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym ANTEGEFI
Project Analytic Techniques for Geometric and Functional Inequalities
Researcher (PI) Nicola Fusco
Host Institution (HI) UNIVERSITA DEGLI STUDI DI NAPOLI FEDERICO II
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary Isoperimetric and Sobolev inequalities are the best-known examples of geometric-functional inequalities. In recent years the PI and collaborators have obtained new and sharp quantitative versions of these and other important related inequalities. These results have been obtained by the combined use of classical symmetrization methods, new tools from mass transportation theory, deep geometric measure tools and ad hoc symmetrizations. The objective of this project is to further develop these techniques in order to obtain sharp quantitative versions of the Faber-Krahn inequality, the Gaussian isoperimetric inequality, the Brunn-Minkowski inequality, and the Poincaré and logarithmic Sobolev inequalities, as well as sharp decay rates for the quantitative Sobolev inequalities and the Pólya-Szegő inequality.
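A numerical illustration of the kind of quantitative sharpening at stake (a Monte Carlo toy; by symmetry, the centred disk is taken as the optimal translate): for ellipses of unit-disk area, the isoperimetric deficit controls the square of the Fraenkel asymmetry, with the ratio staying bounded away from zero.

```python
import numpy as np
from scipy.special import ellipe

rng = np.random.default_rng(0)

def deficit_and_asymmetry(a, n=200_000):
    """Ellipse with semi-axes (a, 1/a): same area pi as the unit disk."""
    b = 1.0 / a
    perim = 4 * a * ellipe(1 - (b / a) ** 2)   # complete elliptic integral
    deficit = perim / (2 * np.pi) - 1          # P / (2*sqrt(pi*A)) - 1
    # Monte Carlo estimate of |E ∩ B| inside the bounding box [-a,a]x[-1,1].
    pts = rng.uniform([-a, -1], [a, 1], size=(n, 2))
    in_e = (pts[:, 0] / a) ** 2 + (pts[:, 1] / b) ** 2 <= 1
    in_b = pts[:, 0] ** 2 + pts[:, 1] ** 2 <= 1
    inter = 4 * a * np.mean(in_e & in_b)       # box area times hit fraction
    asym = (2 * np.pi - 2 * inter) / np.pi     # |E delta B| / |E|
    return deficit, asym

for a in (1.1, 1.3, 1.6):
    d, al = deficit_and_asymmetry(a)
    print(f"a={a}: deficit={d:.4f}  asymmetry^2={al**2:.4f}  ratio={d / al**2:.3f}")
```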
Max ERC Funding
600 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym aQUARiUM
Project QUAntum nanophotonics in Rolled-Up Metamaterials
Researcher (PI) Humeyra CAGLAYAN
Host Institution (HI) TAMPEREEN KORKEAKOULUSAATIO SR
Call Details Starting Grant (StG), PE7, ERC-2018-STG
Summary Novel sophisticated technologies that exploit the laws of quantum physics form a cornerstone for the future well-being, economic growth and security of Europe. Here photonic devices have gained a prominent position, because the absorption, emission, propagation or storage of a photon is a process that can be harnessed at a fundamental level, opening more practical ways to use light in such applications. However, the interaction of light with single quantum systems under ambient conditions is typically very weak and difficult to control. Furthermore, there are quantum phenomena occurring in matter at nanometer length scales that are currently not well understood. These deficiencies have a direct and severe impact on creating a bridge between quantum physics and photonic device technologies. aQUARiUM precisely addresses the issue of controlling and enhancing the interaction between a few photons and rolled-up nanostructures, with the ability to be deployed in practical applications.
With aQUARiUM, we will take epsilon (permittivity)-near-zero (ENZ) metamaterials into quantum nanophotonics. To this end, we will integrate quantum emitters with rolled-up waveguides that act as an ENZ metamaterial, to expand and redefine the range of light-matter interactions. We will explore the electromagnetic design freedom enabled by the extended modes of the ENZ medium, which “stretches” the effective wavelength inside the structure. Specifically, aQUARiUM is built around the following two objectives: (i) enhancing light-matter interactions with single emitters (Enhance), independent of emitter position; (ii) enabling collective excitations in dense emitter ensembles (Collect), coherently connecting emitters on nanophotonic devices to obtain coherent emission.
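The wavelength “stretching” in an ENZ medium can be made concrete with a textbook relation (given here for orientation, not quoted from the proposal): for a plane wave in a medium with relative permittivity \varepsilon_r and permeability \mu_r,
\[ \lambda_{\mathrm{eff}} = \frac{\lambda_0}{n} = \frac{\lambda_0}{\sqrt{\varepsilon_r \mu_r}} \longrightarrow \infty \quad \text{as } \varepsilon_r \to 0, \]
so the phase of the field is nearly uniform across the structure, which is what allows emitters at different positions to couple to one and the same extended mode.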
aQUARiUM aims at novel light sources, long-term entanglement generation and beyond. The envisioned outcome of aQUARiUM is a wholly new photonic platform applicable across a diverse range of areas.
Max ERC Funding
1 499 431 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym AROMA-CFD
Project Advanced Reduced Order Methods with Applications in Computational Fluid Dynamics
Researcher (PI) Gianluigi Rozza
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary The aim of AROMA-CFD is to create a team of scientists at SISSA for the development of Advanced Reduced Order Modelling techniques with a focus on Computational Fluid Dynamics (CFD), in order to face and overcome many current limitations of the state of the art and improve the capabilities of reduced order methodologies for more demanding applications in industrial, medical and applied sciences contexts. AROMA-CFD deals with strong methodological developments in numerical analysis, with a special emphasis on mathematical modelling and extensive exploitation of computational science and engineering. Several tasks have been identified to tackle important problems and open questions in reduced order modelling: the study of bifurcations and instabilities in flows, increasing the Reynolds number while guaranteeing stability, moving towards turbulent flows, and considering complex geometrical parametrizations of shapes as computational domains in extended networks. A reduced computational and geometrical framework will be developed for nonlinear inverse problems, focusing on optimal flow control, shape optimization and uncertainty quantification. Further, all the advanced developments in reduced order modelling for CFD will be delivered for applications in multiphysics, such as fluid-structure interaction problems and general coupled phenomena involving inviscid, viscous and thermal flows, solids and porous media. The advanced framework developed within AROMA-CFD will provide attractive capabilities for several industrial and medical applications (e.g. aeronautical, mechanical, naval, off-shore, wind, sport and biomedical engineering, as well as cardiovascular surgery), combining high performance computing (in dedicated supercomputing centers) and advanced reduced order modelling (on common devices) to guarantee real-time computing and visualization. A new open source software library for AROMA-CFD will be created: ITHACA, In real Time Highly Advanced Computational Applications.
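To fix ideas about what a reduced order method computes, a minimal sketch of the offline stage is given below: a proper orthogonal decomposition (POD) basis is extracted from full-order snapshots by truncated SVD and used to project the full-order operator. This is an illustrative Python sketch of the generic technique, not code from the ITHACA library, whose API is not described here.

import numpy as np

def pod_basis(snapshots, energy=0.999):
    # snapshots: (n_dofs, n_snapshots) array, one full-order solution per column.
    # Returns the first r left singular vectors retaining the given energy fraction.
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cumulative = np.cumsum(s**2) / np.sum(s**2)
    r = int(np.searchsorted(cumulative, energy)) + 1
    return U[:, :r]

# Hypothetical usage with random stand-ins for CFD data:
rng = np.random.default_rng(0)
S = rng.standard_normal((1000, 50))    # snapshot matrix (placeholder)
V = pod_basis(S)
A = rng.standard_normal((1000, 1000))  # full-order operator (placeholder)
A_r = V.T @ A @ V                      # small reduced system, cheap to solve online

The online stage then solves only the small reduced system, which is how real-time computing and visualization on common devices becomes feasible.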
Max ERC Funding
1 656 579 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym ARS
Project Autonomous Robotic Surgery
Researcher (PI) Paolo FIORINI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI VERONA
Call Details Advanced Grant (AdG), PE7, ERC-2016-ADG
Summary The goal of the ARS project is the derivation of a unified framework for the autonomous execution of robotic tasks in challenging environments in which accurate performance and safety are of paramount importance. We have chosen surgery as the research scenario because of its importance, its intrinsic challenges, and the presence of three factors that make this project feasible and timely: we have recently concluded the I-SUR project, demonstrating the feasibility of autonomous surgical actions; we have access to the first big data set of clinical robotic surgeries made available to researchers; and we will be able to demonstrate the project results on the high performance surgical robot “da Vinci Research Kit”. The impact of autonomous robots on the workforce is a current subject of discussion, but surgical autonomy will be welcomed by medical personnel, e.g. to carry out simple intervention steps, react faster to unexpected events, or monitor the onset of fatigue. The framework for autonomous robotic surgery will include five main research objectives. The first will address the analysis of robotic surgery data sets to extract action and knowledge models of the intervention. The second objective will focus on planning, which will consist of instantiating the intervention models to a patient-specific anatomy. The third objective will address the design of hybrid controllers for the discrete and continuous parts of the intervention. The fourth research objective will focus on real-time reasoning to assess the intervention state and the overall surgical situation. Finally, the last research objective will address the verification, validation and benchmarking of the autonomous surgical robotic capabilities. The research results to be achieved by ARS will contribute to paving the way towards enhancing the autonomy and operational capabilities of service robots, with the ambitious goal of bridging the gap between robotic and human task execution capability.
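The discrete/continuous split behind such hybrid controllers can be illustrated with a toy pattern: a finite state machine over intervention phases, each phase running its own continuous feedback law. The Python sketch below is a generic illustration with invented phase names and gains; it is not the ARS design.

from dataclasses import dataclass

@dataclass
class Phase:
    name: str         # discrete intervention phase (hypothetical)
    gain: float       # proportional gain of the phase's continuous law
    target: float     # setpoint, e.g. tool position along one axis
    tolerance: float  # |error| below this triggers a discrete transition

plan = [Phase("approach", 2.0, 10.0, 0.10),
        Phase("insert",   0.5, 12.0, 0.05),
        Phase("retract",  2.0,  0.0, 0.10)]

def hybrid_step(state, idx, dt=0.01):
    # Continuous part: one Euler step of a proportional controller.
    phase = plan[idx]
    error = phase.target - state
    state += phase.gain * error * dt
    # Discrete part: switch phase once the current setpoint is reached.
    if abs(error) < phase.tolerance and idx < len(plan) - 1:
        idx += 1
    return state, idx

state, idx = 0.0, 0
for _ in range(5000):
    state, idx = hybrid_step(state, idx)
print(plan[idx].name, round(state, 3))  # ends in the last phase, near its setpoint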
Max ERC Funding
2 750 000 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ATM-GTP
Project Atmospheric Gas-to-Particle conversion
Researcher (PI) Markku KULMALA
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary Atmospheric Gas-to-Particle conversion (ATM-GTP) is a 5-year project focusing on one of the most critical atmospheric processes relevant to global climate and air quality: the first steps of atmospheric aerosol particle formation and growth. The project will concentrate on the currently lacking environmentally-specific knowledge about the interacting, non-linear, physical and chemical atmospheric processes associated with nano-scale gas-to-particle conversion (GTP). The main scientific objective of ATM-GTP is to create a deep understanding of atmospheric GTP taking place in the sub-5 nm size range, particularly in heavily polluted Chinese megacities like Beijing and in pristine environments like Siberia and Nordic high-latitude regions. We also aim to find out how nano-GTP is associated with air quality-climate interactions and feedbacks. We are interested in quantifying the effect of nano-GTP on the COBACC (Continental Biosphere-Aerosol-Cloud-Climate) feedback loop that is important in Arctic and boreal regions. Our approach will enable us to identify mechanisms that can reduce secondary air pollution by a factor of 5-10 and to make reliable estimates of the global and regional aerosol loads, including the anthropogenic and biogenic contributions to these loads. We can estimate the future role of the Northern Hemispheric biosphere in reducing the global radiative forcing via the quantified feedbacks. The project is carried out by the world-leading scientist in atmospheric aerosol science, who is also one of the founders of terrestrial ecosystem meteorology, together with his research team. The project uses novel infrastructures, including SMEAR (Stations Measuring Ecosystem Atmospheric Relations) stations, related modelling platforms and regional data from Russia and China. The work will be carried out in synergy with several national, Nordic and EU research-innovation projects: the Finnish Center of Excellence-ATM, Nordic CoE-CRAICC and EU-FP7-BACCHUS.
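A back-of-the-envelope calculation shows why the sub-5 nm range is the critical bottleneck: at a constant condensational growth rate GR, the time a freshly nucleated particle spends at the sizes where scavenging losses are strongest follows directly. The numbers in this Python sketch are illustrative, not ATM-GTP data.

def growth_time_h(d_start_nm=1.5, d_end_nm=5.0, gr_nm_per_h=2.0):
    # Hours to grow from d_start_nm to d_end_nm at a constant growth rate.
    return (d_end_nm - d_start_nm) / gr_nm_per_h

for gr in (0.5, 2.0, 10.0):  # roughly: clean boreal air ... polluted megacity
    print(f"GR = {gr:4.1f} nm/h -> {growth_time_h(gr_nm_per_h=gr):.1f} h below 5 nm")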
Max ERC Funding
2 500 000 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym ATMNUCLE
Project Atmospheric nucleation: from molecular to global scale
Researcher (PI) Markku Tapio Kulmala
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Advanced Grant (AdG), PE10, ERC-2008-AdG
Summary Atmospheric aerosol particles and trace gases affect the quality of our life in many ways (e.g. health effects, changes in climate and the hydrological cycle). Trace gases and atmospheric aerosols are tightly connected via physical, chemical, meteorological and biological processes occurring in the atmosphere and at the atmosphere-biosphere interface. One important phenomenon is atmospheric aerosol formation, which involves the production of nanometer-size particles by nucleation and their growth to detectable sizes. The main scientific objectives of this project are 1) to quantify the mechanisms responsible for atmospheric new particle formation and 2) to find out how important this process is for the behaviour of the global aerosol system and, ultimately, for the whole climate system. Our scientific plan is designed as a research chain that aims to advance our understanding of climate and air quality through a series of connected activities. We start from molecular simulations and laboratory measurements to understand nucleation and aerosol thermodynamic processes. We measure nanoparticles and atmospheric clusters at 15-20 sites all around the world using state-of-the-art instrumentation, and we study feedbacks and interactions between the climate and the biosphere. With these atmospheric boundary layer studies we form a link to regional-scale processes and further to global-scale phenomena. To simulate global climate and air quality, the most recent progress on this chain of processes must be compiled, integrated and implemented in Climate Change and Air Quality numerical models via novel parameterizations.
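What implementing such nucleation parameterizations means in practice can be illustrated with two simple forms that are standard in this literature: activation-type nucleation (rate linear in the sulphuric acid concentration) and kinetic-type nucleation (quadratic). The coefficients in this Python sketch are placeholders, not values from the project.

# J: formation rate of freshly nucleated particles (cm^-3 s^-1);
# h2so4: sulphuric acid vapour concentration (cm^-3).
# A and K are illustrative placeholders, not fitted coefficients.
def j_activation(h2so4, A=2e-6):
    # Activation-type nucleation: J = A * [H2SO4]
    return A * h2so4

def j_kinetic(h2so4, K=5e-13):
    # Kinetic-type nucleation: J = K * [H2SO4]^2
    return K * h2so4**2

for c in (1e6, 1e7, 1e8):  # typical range of ambient [H2SO4]
    print(f"[H2SO4] = {c:.0e}: J_act = {j_activation(c):.2e}, J_kin = {j_kinetic(c):.2e}")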
Max ERC Funding
2 000 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31