Project acronym 1D-Engine
Project 1D-electrons coupled to dissipation: a novel approach for understanding and engineering superconducting materials and devices
Researcher (PI) Adrian KANTIAN
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), PE3, ERC-2017-STG
Summary Correlated electrons are at the forefront of condensed matter theory. Interacting quasi-1D electrons have seen vast progress in analytical and numerical theory, and thus in fundamental understanding and quantitative prediction. Yet, in the 1D limit, fluctuations preclude important technological use, particularly of superconductors. In contrast, high-Tc superconductors in 2D/3D are not precluded by fluctuations, but lack a fundamental theory, making prediction and engineering of their properties, a major goal in physics, very difficult. This project aims to combine the advantages of both areas by making major progress in the theory of quasi-1D electrons coupled to an electron bath, in part building on recent breakthroughs (with the PI's extensive involvement) in simulating 1D and 2D electrons with parallelized density matrix renormalization group (pDMRG) numerics. Such theory will fundamentally advance the study of open electron systems, and show how to use 1D materials as elements of new superconducting (SC) devices and materials: 1) It will enable a new state of matter: 1D electrons with true SC order. Fluctuations from an electronic liquid, such as graphene, could also make nanoscale wires appear SC at high temperatures. 2) It will offer a new approach for the deliberate engineering of a high-Tc superconductor. In 1D, how electrons pair via repulsive interactions is understood and can be predicted. Stabilization by a reservoir - formed by a parallel array of many such 1D systems - offers a superconductor for which all factors setting Tc are known and can be optimized. 3) Many existing superconductors with repulsive electron pairing, all presently not understood, can be cast as 1D electrons coupled to a bath. Developing chain-DMFT theory based on pDMRG will allow these materials' SC properties to be simulated and understood for the first time. 4) The insights gained will be translated to 2D superconductors to study how they could be enhanced by contact with electronic liquids.
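The fluctuation argument above can be made concrete with a textbook bosonization result (a standard relation, not taken from the proposal): in a strictly 1D wire, singlet pairing correlations decay algebraically with distance, governed by the charge Luttinger parameter K_c, so true superconducting long-range order never sets in.

```latex
% Algebraic decay of singlet pairing correlations in a Luttinger liquid
% (K_c is the charge Luttinger parameter; standard bosonization result)
\langle \Delta^{\dagger}(x)\,\Delta(0) \rangle \;\sim\; |x|^{-1/K_c},
\qquad
\lim_{|x|\to\infty} \langle \Delta^{\dagger}(x)\,\Delta(0) \rangle = 0
```

Coupling the wire to a higher-dimensional bath can cut off this power-law decay, which is the sense in which a reservoir can stabilize true SC order in a 1D system.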
Max ERC Funding
1 491 013 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym 2D-4-CO2
Project DESIGNING 2D NANOSHEETS FOR CO2 REDUCTION AND INTEGRATION INTO vdW HETEROSTRUCTURES FOR ARTIFICIAL PHOTOSYNTHESIS
Researcher (PI) Damien VOIRY
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2018-STG
Summary The CO2 reduction reaction (CO2RR) holds great promise for the conversion of the greenhouse gas carbon dioxide into chemical fuels. The absence of catalytic materials demonstrating high performance and high selectivity currently hampers practical demonstration. CO2RR is also limited by the low solubility of CO2 in the electrolyte solution; electrocatalytic reactions in the gas phase using gas diffusion electrodes are therefore preferable. 2D materials have recently emerged as a novel class of electrocatalytic materials thanks to their rich structures and electronic properties. The synthesis of novel 2D catalysts and their implementation into photocatalytic systems would be a major step towards the development of devices for storing solar energy in the form of chemical fuels. With 2D-4-CO2, I propose to: 1) develop a novel class of CO2RR catalysts based on conducting 2D nanosheets and 2) demonstrate photocatalytic conversion of CO2 into chemical fuels using structure-engineered gas diffusion electrodes made of 2D conducting catalysts. To reach this goal, the first objective of 2D-4-CO2 is to provide guidelines for the development of novel cutting-edge 2D catalysts for CO2 conversion into chemical fuels. This will be possible by using a multidisciplinary approach based on 2D materials engineering, advanced methods of characterization and novel designs of gas diffusion electrodes for the reduction of CO2 in the gas phase. The second objective is to develop practical photocatalytic systems using van der Waals (vdW) heterostructures for the efficient conversion of CO2 into chemical fuels. vdW heterostructures will consist of rational designs of 2D materials and 2D-like materials deposited by atomic layer deposition in order to achieve highly efficient light conversion and prolonged stability. This project will not only enable a deeper understanding of the CO2RR but will also provide practical strategies for large-scale application of CO2RR for solar fuel production.
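The selectivity of a CO2RR catalyst is conventionally quantified by the Faradaic efficiency: the fraction of the total charge passed that ends up in a given product. A minimal sketch in Python, with purely illustrative numbers (none of the values below come from the project):

```python
# Faradaic efficiency: the standard selectivity metric for CO2RR products.
# All numbers below are illustrative, not project data.
F = 96485.0  # Faraday constant, C/mol

def faradaic_efficiency(n_product_mol, z_electrons, charge_passed_C):
    """Fraction of the total passed charge that formed one given product."""
    return z_electrons * F * n_product_mol / charge_passed_C

# Example: 10 umol of CO (a 2-electron product) after passing 4.0 C of charge.
fe_co = faradaic_efficiency(10e-6, 2, 4.0)
print(f"FE(CO) = {fe_co:.1%}")
```

Summing the efficiencies over all detected products (H2, CO, formate, hydrocarbons, ...) checks the charge balance of an experiment.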
Max ERC Funding
1 499 931 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym 2F4BIODYN
Project Two-Field Nuclear Magnetic Resonance Spectroscopy for the Exploration of Biomolecular Dynamics
Researcher (PI) Fabien Ferrage
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2011-StG_20101014
Summary The paradigm of the structure-function relationship in proteins is outdated. Biological macromolecules and supramolecular assemblies are highly dynamic objects. Evidence that their motions are of utmost importance to their functions is regularly identified. The understanding of the physical chemistry of biological processes at an atomic level has to rely not only on the description of structure but also on the characterization of molecular motions.
The investigation of protein motions will be undertaken with a very innovative methodological approach in nuclear magnetic resonance relaxation. In order to widen the range of frequencies at which local motions in proteins are probed, we will first use and develop new techniques for a prototype shuttle system for the measurement of relaxation at low fields on a high-field NMR spectrometer. Second, we will develop a novel system: a set of low-field NMR spectrometers designed as accessories for high-field spectrometers. Used in conjunction with the shuttle, this system will offer (i) the sensitivity and resolution (i.e. atomic-level information) of a high-field spectrometer, (ii) access to the low fields of a relaxometer, and (iii) the ability to measure a wide variety of relaxation rates with high accuracy. This system will benefit from the latest technology in homogeneous permanent magnet development to allow control of spin systems identical to that of a high-resolution probe. This new apparatus will open the way to the use of NMR relaxation at low fields for the refinement of protein motions at an atomic scale.
Applications of this novel approach will focus on the bright side of protein dynamics: (i) the largely unexplored dynamics of intrinsically disordered proteins, and (ii) domain motions in large proteins. In both cases, we will investigate a series of diverse protein systems with implications in development, cancer and immunity.
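The case for measuring relaxation at many fields can be seen from the standard expression for longitudinal 15N relaxation (a textbook relation, not specific to this project), in which R1 samples the spectral density function J(omega) only at a few frequencies set by the static magnetic field; here d and c collect the dipolar and chemical-shift-anisotropy interaction constants:

```latex
% Longitudinal 15N relaxation rate (dipole-dipole + CSA mechanisms):
R_1 = \frac{d^2}{4}\Big[\, J(\omega_H - \omega_N) + 3\,J(\omega_N)
      + 6\,J(\omega_H + \omega_N) \,\Big] + c^2\, J(\omega_N)
```

Each static field B0 shifts the Larmor frequencies omega_H and omega_N, so shuttling the sample between fields maps J(omega), and hence the underlying motions, over a much wider frequency range than any single spectrometer can reach.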
Max ERC Funding
1 462 080 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym 2G-CSAFE
Project Combustion of Sustainable Alternative Fuels for Engines used in aeronautics and automotives
Researcher (PI) Philippe Dagaut
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE8, ERC-2011-ADG_20110209
Summary This project aims at promoting sustainable combustion technologies for transport via validation of advanced combustion kinetic models obtained using sophisticated new laboratory experiments, engines, and theoretical computations, breaking through the current frontier of knowledge. It will focus on the unexplored kinetics of ignition and combustion of 2nd-generation (2G) biofuels and their blends with conventional fuels, which should provide energy safety and sustainability to Europe. The motivation is that no accurate kinetic models are available for the ignition, oxidation and combustion of 2G biofuels, and improved ignition control is needed for new compression-ignition engines. Crucial information is missing: data from well-characterised experiments on combustion-generated pollutants and data on key intermediates for fuel ignition in new engines.
To provide that knowledge, new well-instrumented complementary experiments and kinetic modelling will be used. Measurements of key intermediates, stable species, and pollutants will be performed. New ignition control strategies will be designed, opening new technological horizons. Kinetic modelling will be used to rationalise the results. Due to the complexity of 2G biofuels and their unusual composition, innovative surrogates will be designed. Kinetic models for surrogate fuels will be generalised for extension to other compounds. The experimental results, together with ab initio and detailed modelling, will serve to characterise the kinetics of ignition, combustion, and pollutant formation of fuels including 2G biofuels, and provide relevant data and models.
This research is risky because this is (i) the 1st effort to measure radicals by reactor/CRDS coupling, (ii) the 1st effort to use a μ-channel reactor to build ignition databases for conventional fuels and biofuels, and (iii) the 1st effort to design and use controlled generation and injection of reactive species to control ignition/combustion in compression-ignition engines.
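Detailed kinetic mechanisms of the kind validated here are built from elementary reactions whose rate coefficients follow the modified Arrhenius form k(T) = A T^n exp(-Ea/RT). A minimal Python sketch with purely illustrative parameters (not taken from any mechanism in the project):

```python
import math

# Modified Arrhenius rate coefficient, the standard form used in detailed
# combustion kinetic mechanisms. Parameter values below are illustrative only.
R = 8.314  # gas constant, J/(mol*K)

def rate_coefficient(T, A, n, Ea):
    """k(T) = A * T**n * exp(-Ea / (R*T))  for temperature T in kelvin."""
    return A * T**n * math.exp(-Ea / (R * T))

# Illustrative: A = 1e13 (units set by reaction order), n = 0, Ea = 150 kJ/mol.
k_1000 = rate_coefficient(1000.0, A=1.0e13, n=0.0, Ea=150e3)
print(f"k(1000 K) = {k_1000:.3e}")
```

Validation of a mechanism amounts to checking that thousands of such k(T) expressions, assembled into coupled rate equations, reproduce measured species profiles and ignition delays.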
Max ERC Funding
2 498 450 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym 3D-BioMat
Project Deciphering biomineralization mechanisms through 3D explorations of mesoscale crystalline structure in calcareous biomaterials
Researcher (PI) VIRGINIE CHAMARD
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2016-COG
Summary The fundamental 3D-BioMat project aims at providing a biomineralization model to explain the formation of microscopic calcareous single crystals produced by living organisms. Although these crystals present a wide variety of shapes, associated with various organic materials, the observation of a nanoscale granular structure common to almost all calcareous crystallizing organisms, together with an extended crystalline coherence, points to a generic biomineralization and assembly process. A key to building realistic scenarios of biomineralization is to reveal the crystalline architecture at the mesoscale (i.e., over a few granules), which none of the existing nano-characterization tools is able to provide.
3D-BioMat builds on the PI's recognized expertise in the field of synchrotron coherent x-ray diffraction microscopy. It will extend the PI's disruptive pioneering microscopy formalism towards an innovative high-throughput approach able to give access to the 3D mesoscale image of the crystalline properties (crystalline coherence, crystal plane tilts and strains) with the required flexibility, nanoscale resolution, and non-invasiveness.
This achievement will be used to reveal, in a timely manner, the generic features of the mesoscale crystalline structure through pioneering explorations of a vast variety of crystalline biominerals produced by the famous Pinctada margaritifera oyster shell, and thereby build a realistic biomineralization scenario.
The inferred biomineralization pathways, including both physico-chemical pathways and biological controls, will ultimately be validated by comparing the mesoscale structures produced by biomimetic samples with the biogenic ones. Beyond deciphering one of the most intriguing questions of material nanosciences, 3D-BioMat may contribute to new climate models, pave the way for new routes in material synthesis and supply answers to the pearl-culture calcification problems.
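In Bragg coherent diffraction microscopy, the quantities listed above (crystalline coherence, plane tilts and strains) are encoded in the phase of the reconstructed complex density through the standard relation between phase and lattice displacement (a textbook result of the technique, not specific to this project):

```latex
% The reconstructed complex density in Bragg coherent diffraction imaging:
% its phase is the lattice displacement field u(r) projected onto the
% probed reciprocal-lattice vector Q.
\rho(\mathbf{r}) = |\rho(\mathbf{r})|\, e^{i\phi(\mathbf{r})},
\qquad
\phi(\mathbf{r}) = \mathbf{Q}\cdot\mathbf{u}(\mathbf{r})
```

Gradients of the phase along and across Q then yield local strain and plane tilt, which is how a 3D mesoscale map of crystalline properties is extracted from the diffraction data.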
Max ERC Funding
1 966 429 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym 3D-QUEST
Project 3D-Quantum Integrated Optical Simulation
Researcher (PI) Fabio Sciarrino
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Starting Grant (StG), PE2, ERC-2012-StG_20111012
Summary Quantum information was born from the merging of classical information and quantum physics. Its main objective consists of understanding the quantum nature of information and learning how to process it by using physical systems which operate according to the laws of quantum mechanics. Quantum simulation is a fundamental instrument to investigate phenomena of quantum system dynamics, such as quantum transport, particle localization and energy transfer, the quantum-to-classical transition, and even quantum-enhanced computation, all tasks that are hard to simulate with classical approaches. Within this framework, integrated photonic circuits have a strong potential to realize quantum information processing by optical systems.
The aim of 3D-QUEST is to develop and implement quantum simulation by exploiting 3-dimensional integrated photonic circuits. 3D-QUEST is structured to demonstrate the potential of linear optics to implement a computational power beyond that of a classical computer. Such a "hard-to-simulate" scenario emerges when multiphoton, multimode platforms are realized. The 3D-QUEST research program will focus on three tasks of growing difficulty.
A-1. To simulate bosonic-fermionic dynamics with integrated optical systems acting on 2 photon entangled states.
A-2. To pave the way towards hard-to-simulate, scalable quantum linear optical circuits by investigating m-port interferometers acting on n-photon states with n>2.
A-3. To exploit 3-dimensional integrated structures for the observation of new quantum optical phenomena and for the quantum simulation of more complex scenarios.
3D-QUEST will exploit the potential of femtosecond laser writing of integrated waveguides. This technique will be adopted to realize 3-dimensional capabilities and high flexibility, in this way bringing optical quantum simulation into a new regime.
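The "hard-to-simulate" character of multiphoton linear optics can be stated concretely: output probabilities of indistinguishable photons in an interferometer are given by permanents of submatrices of its unitary, and the permanent is #P-hard to compute exactly (Aaronson-Arkhipov). A minimal, brute-force Python sketch (illustrative only):

```python
import itertools

def permanent(M):
    """Permanent of a square matrix via the permutation-sum definition.

    Exponential-time on purpose: no classical polynomial algorithm is known,
    which is the root of the 'hard-to-simulate' claim for linear optics.
    """
    n = len(M)
    return sum(
        _prod(M[i][p[i]] for i in range(n))
        for p in itertools.permutations(range(n))
    )

def _prod(xs):
    out = 1
    for x in xs:
        out *= x
    return out

# Two photons entering a 50:50 beam splitter (Hong-Ou-Mandel): the amplitude
# for one photon in each output port is the permanent of the 2x2 unitary.
s = 2 ** -0.5
U = [[s, s], [s, -s]]
print(permanent(U))  # vanishes: the photons never exit in different ports
```

For n photons in m modes the relevant permanents are of n x n submatrices of the m x m unitary, and the permutation sum above grows as n!, which is why even modest multiphoton experiments outrun classical simulation.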
Max ERC Funding
1 474 800 €
Duration
Start date: 2012-08-01, End date: 2017-07-31
Project acronym 3DICE
Project 3D Interstellar Chemo-physical Evolution
Researcher (PI) Valentine Wakelam
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2013-StG
Summary At the end of their lives, stars spread their inner material into the diffuse interstellar medium. This diffuse medium becomes locally denser and forms dark clouds (also called dense or molecular clouds) whose innermost parts are shielded from the external UV field by the dust, allowing molecules to grow and become more complex. Gravitational collapse occurs inside these dense clouds, forming protostars and their surrounding disks, and eventually planetary systems like (or unlike) our solar system. The formation and evolution of molecules, minerals, ices and organics from the diffuse medium to planetary bodies, and their alteration or preservation throughout this cosmic chemical history, set the initial conditions for building planets, atmospheres and possibly the first bricks of life. The current view of interstellar chemistry is based on fragmentary studies of the key observed steps of this sequence. The objective of this proposal is to follow the fractionation of the elements between the gas phase and the interstellar grains, from the most diffuse medium to protoplanetary disks, in order to constrain the chemical composition of the material in which planets are formed. The potential outcome of this project is a consistent and more accurate description of the chemical evolution of interstellar matter. To achieve this objective, I will improve our chemical model by adding new processes on grain surfaces relevant under diffuse-medium conditions. This upgraded gas-grain model will be coupled to 3D dynamical models of the formation of dense clouds from the diffuse medium and of protoplanetary disks from dense clouds. The computed chemical composition will also be used with 3D radiative transfer codes to study the chemical tracers of the physics of protoplanetary disk formation. The robustness of the model predictions will be studied with sensitivity analyses. Finally, model results will be confronted with observations to address some of the current challenges.
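Gas-grain chemical models of the kind described evolve species abundances with coupled rate equations, including accretion (freeze-out) of gas-phase species onto grain surfaces. A deliberately toy Python sketch (two reservoirs, illustrative rates, explicit Euler integration; real codes integrate thousands of stiff coupled ODEs with implicit solvers):

```python
# Toy rate-equation model: one gas-phase reservoir losing material to grain
# ice via freeze-out. All rates are illustrative, not from any real network.

def integrate(n_gas, n_ice, k_form, k_freeze, dt, steps):
    """Explicit-Euler integration of a two-reservoir network.

    dn_gas/dt = k_form - k_freeze * n_gas   (formation minus freeze-out)
    dn_ice/dt = k_freeze * n_gas            (accretion onto grain surfaces)
    """
    for _ in range(steps):
        freeze = k_freeze * n_gas
        n_gas += (k_form - freeze) * dt
        n_ice += freeze * dt
    return n_gas, n_ice

gas, ice = integrate(n_gas=1.0, n_ice=0.0, k_form=0.0,
                     k_freeze=0.1, dt=0.01, steps=1000)
print(gas, ice)  # gas decays toward zero as material freezes onto grains
```

Coupling such a network to a 3D dynamical model means evaluating these equations along fluid trajectories whose density and temperature, and hence the rates, change with time.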
Max ERC Funding
1 166 231 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym 3DSPIN
Project 3-Dimensional Maps of the Spinning Nucleon
Researcher (PI) Alessandro Bacchetta
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PAVIA
Call Details Consolidator Grant (CoG), PE2, ERC-2014-CoG
Summary What does the inside of the proton look like? What generates its spin?
3DSPIN will deliver essential information to answer these questions at the frontier of subnuclear physics.
At present, we have detailed maps of the distribution of quarks and gluons in the nucleon in 1D (as a function of their momentum in a single direction). We also know that quark spins account for only about 1/3 of the spin of the nucleon.
3DSPIN will lead the way into a new stage of nucleon mapping, explore the distribution of quarks in full 3D momentum space and obtain unprecedented information on orbital angular momentum.
Goals
1. extract from experimental data the 3D distribution of quarks (in momentum space), as described by Transverse-Momentum Distributions (TMDs);
2. obtain from TMDs information on quark Orbital Angular Momentum (OAM).
Methodology
3DSPIN will implement state-of-the-art fitting procedures to analyze relevant experimental data and extract quark TMDs, similarly to global fits of standard parton distribution functions. Information about quark angular momentum will be obtained through assumptions based on theoretical considerations. The next five years represent an ideal time window to accomplish our goals, thanks to the wealth of expected data from deep-inelastic scattering experiments (COMPASS, Jefferson Lab), hadronic colliders (Fermilab, BNL, LHC), and electron-positron colliders (BELLE, BABAR). The PI has a strong reputation in this field. The group will operate in partnership with the Italian National Institute of Nuclear Physics and in close interaction with leading experts and experimental collaborations worldwide.
Impact
Mapping the 3D structure of chemical compounds has revolutionized chemistry. Similarly, mapping the 3D structure of the nucleon will have a deep impact on our understanding of the fundamental constituents of matter. We will open new perspectives on the dynamics of quarks and gluons and sharpen our view of high-energy processes involving nucleons.
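A common starting point in TMD phenomenology, and a useful picture of what a 3D map adds over a 1D one, is a Gaussian ansatz in which the collinear PDF is smeared in transverse momentum (one standard choice among several, not necessarily the one the project will adopt):

```latex
% Gaussian ansatz for the unpolarized TMD of quark flavor q:
f_1^{q}(x, k_T^2) \;=\; f_1^{q}(x)\,
\frac{e^{-k_T^2/\langle k_T^2\rangle}}{\pi\,\langle k_T^2\rangle},
\qquad
\int d^2\mathbf{k}_T\; f_1^{q}(x, k_T^2) \;=\; f_1^{q}(x)
```

Global fits then extract the transverse width (possibly x- and flavor-dependent) from semi-inclusive and Drell-Yan data, in close analogy to fits of collinear parton distribution functions.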
Max ERC Funding
1 509 000 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym 3DWATERWAVES
Project Mathematical aspects of three-dimensional water waves with vorticity
Researcher (PI) Erik Torsten Wahlén
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary The goal of this project is to develop a mathematical theory for steady three-dimensional water waves with vorticity. The mathematical model consists of the incompressible Euler equations with a free surface, and vorticity is important for modelling the interaction of surface waves with non-uniform currents. In the two-dimensional case, there has been a lot of progress on water waves with vorticity in the last decade. This progress has mainly been based on the stream function formulation, in which the problem is reformulated as a nonlinear elliptic free boundary problem. An analogue of this formulation is not available in three dimensions, and the theory has therefore so far been restricted to irrotational flow. In this project we seek to go beyond this restriction using two different approaches. In the first approach we will adapt methods which have been used to construct three-dimensional ideal flows with vorticity in domains with a fixed boundary to the free boundary context (for example Beltrami flows). In the second approach we will develop methods which are new even in the case of a fixed boundary, by performing a detailed study of the structure of the equations close to a given shear flow using ideas from infinite-dimensional bifurcation theory. This involves handling infinitely many resonances.
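For orientation, the two-dimensional stream-function formulation mentioned above can be written down explicitly. Signs and normalisations here follow one common convention from the literature (they are not quoted from the proposal): with u = ψ_y, v = −ψ_x and a vorticity function γ, the steady problem becomes a nonlinear elliptic free boundary problem.

```latex
\begin{aligned}
  \Delta \psi &= -\gamma(\psi) && \text{in } -d < y < \eta(x),\\
  \psi &= 0 && \text{on the free surface } y = \eta(x),\\
  \psi &= -m && \text{on the flat bed } y = -d,\\
  |\nabla \psi|^2 + 2g\,(y + d) &= Q && \text{on } y = \eta(x),
\end{aligned}
```

It is this scalar formulation, with the single unknown ψ, that has no direct three-dimensional analogue.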
Max ERC Funding
1 203 627 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym 4DPHOTON
Project Beyond Light Imaging: High-Rate Single-Photon Detection in Four Dimensions
Researcher (PI) Massimiliano FIORINI
Host Institution (HI) ISTITUTO NAZIONALE DI FISICA NUCLEARE
Call Details Consolidator Grant (CoG), PE2, ERC-2018-COG
Summary The goal of the 4DPHOTON project is the development and construction of a photon imaging detector with unprecedented performance. The proposed device will be capable of detecting single-photon fluxes of up to one billion photons per second over areas of several square centimetres, and will measure - for each photon - position and time simultaneously, with resolutions better than ten microns and a few tens of picoseconds, respectively. These figures of merit will open up many important applications, allowing significant advances in particle physics, the life sciences and other emerging fields where excellent timing and position resolutions are required simultaneously.
Our goal will be achieved through an application-specific integrated circuit in 65 nm complementary metal-oxide-semiconductor (CMOS) technology that will deliver a timing resolution of a few tens of picoseconds at the pixel level over a few hundred thousand individually active pixel channels, allowing very high photon rates to be detected and the corresponding information to be digitized and transferred to a processing unit.
As a result of the 4DPHOTON project we will remove the constraints that many light-imaging applications face due to the lack of precise single-photon information in four dimensions (4D): the three spatial coordinates and time, simultaneously. In particular, we will prove the performance of this detector in the field of particle physics by reconstructing Cherenkov photon rings with a timing resolution of ten picoseconds. With its excellent granularity, timing resolution, rate capability and compactness, this detector will represent a new paradigm for the realisation of future Ring Imaging Cherenkov detectors, capable of high-efficiency particle identification in environments with very high particle multiplicities by exploiting the time-association of photon hits.
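The time-association idea can be sketched in a few lines. This is a toy illustration only (the window size, data and function names are assumptions, not the 4DPHOTON reconstruction): hits whose timestamps fall within a coincidence window of a few tens of picoseconds are grouped as belonging to the same ring.

```python
def associate_hits(timestamps_ps, window_ps=30.0):
    """Group hit times (picoseconds) into clusters whose internal gaps
    stay below window_ps; each cluster is one candidate photon ring."""
    clusters, current = [], []
    for t in sorted(timestamps_ps):
        if current and t - current[-1] > window_ps:
            clusters.append(current)
            current = []
        current.append(t)
    if current:
        clusters.append(current)
    return clusters

# Two spatially overlapping rings, separated only in arrival time.
hits = [0.0, 8.0, 15.0, 400.0, 405.0, 412.0]
groups = associate_hits(hits)
print(len(groups))  # two time-separated groups
```

Even when rings overlap in space, picosecond timing separates them cleanly, which is why timing resolution drives the particle-identification performance at high multiplicity.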
Max ERC Funding
1 975 000 €
Duration
Start date: 2019-12-01, End date: 2024-11-30
Project acronym 4TH-NU-AVENUE
Project Search for a fourth neutrino with a PBq anti-neutrino source
Researcher (PI) Thierry Michel René Lasserre
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE2, ERC-2012-StG_20111012
Summary Several observed anomalies in neutrino oscillation data can be explained by a hypothetical fourth neutrino separated from the three standard neutrinos by a squared mass difference of a few eV2. This hypothesis can be tested with a PBq (ten-kilocurie-scale) 144Ce antineutrino beta-source deployed at the center of a large low-background liquid scintillator detector, such as Borexino, KamLAND, or SNO+. In particular, the compact size of such a source could yield an energy-dependent oscillating pattern in the event spatial distribution that would unambiguously determine neutrino mass differences and mixing angles.
The proposed program aims to perform the research and development necessary to produce and deploy an intense antineutrino source in a large liquid scintillator detector. Our program will address the definition of the production process of the neutrino source as well as its experimental characterization, the detailed physics simulation of both signal and backgrounds, the complete design and realization of the thick shielding, and the preparation of the interfaces with the antineutrino detector, including the safety and security aspects.
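The oscillating pattern being sought follows the standard short-baseline (3+1) survival probability; the textbook form below is given for orientation and is not quoted from the proposal.

```latex
P_{\bar{\nu}_e \to \bar{\nu}_e}(L, E) \;\simeq\;
 1 - \sin^2\!\big(2\theta_{\mathrm{new}}\big)\,
     \sin^2\!\left(
       \frac{1.27\,\Delta m^2_{\mathrm{new}}\,[\mathrm{eV}^2]\;
             L\,[\mathrm{m}]}{E\,[\mathrm{MeV}]}
     \right)
```

For Δm² of a few eV² the oscillation length at reactor-like energies is of order a metre, which is why a compact source inside a large detector can resolve the pattern in L/E directly.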
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-10-01, End date: 2018-09-30
Project acronym A-LIFE
Project The asymmetry of life: towards a unified view of the emergence of biological homochirality
Researcher (PI) Cornelia MEINERT
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2018-STG
Summary What is responsible for the emergence of homochirality, the almost exclusive use of one enantiomer over its mirror image? And what led to the evolution of life’s homochiral biopolymers, DNA/RNA, proteins and lipids, where all the constituent monomers exhibit the same handedness?
Based on in-situ observations and laboratory studies, we propose that this handedness occurs when chiral biomolecules are synthesized asymmetrically through interaction with circularly polarized photons in interstellar space. The ultimate goal of this project will be to demonstrate how the diverse set of heterogeneous enantioenriched molecules, available from meteoritic impact, assembles into homochiral pre-biopolymers, by simulating the evolutionary stages on early Earth. My recent research has shown that the central chiral unit of RNA, ribose, forms readily under simulated comet conditions, and this has provided valuable new insights into the accessibility of precursors of genetic material in interstellar environments. The significance of this project arises from the current lack of an experimental demonstration that amino acids, sugars and lipids can be synthesized simultaneously and asymmetrically by a universal physical selection process.
A synergistic methodology will be developed to build a unified theory for the origin of all chiral biological building blocks and their assembly into homochiral supramolecular entities. For the first time, advanced analyses of astrophysically relevant samples, asymmetric photochemistry triggered by circularly polarized synchrotron and laser sources, and chiral amplification due to polymerization processes will be combined. Intermediates and autocatalytic reaction kinetics will be monitored and supported by quantum calculations to understand the underlying processes. A unified theory on the asymmetric formation and self-assembly of life’s biopolymers is groundbreaking and will impact the whole conceptual foundation of the origin of life.
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym A2C2
Project Atmospheric flow Analogues and Climate Change
Researcher (PI) Pascal Yiou
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary "The A2C2 project treats two major challenges in climate and atmospheric research: the time-dependent response of the climate attractor to external forcings (solar, volcanic eruptions and anthropogenic), and the attribution of extreme climate events occurring in the northern extra-tropics. The main difficulties are the limited climate information, the computational cost of model simulations, and mathematical assumptions that are rarely verified and often overlooked in the literature.
A2C2 proposes a practical framework to overcome those three difficulties, linking the theory of dynamical systems and statistics. We will generalize the methodology of flow analogues to multiple databases in order to obtain probabilistic descriptions of analogue decompositions.
The project is divided into three workpackages (WP). WP1 embeds the analogue method in the theory of dynamical systems in order to provide a metric of attractor deformation in time. The important methodological step is to detect trends or persistent outliers in the dates and scores of analogues when the system is subject to time-varying forcings. This is done with idealized models and full-size climate models in which the forcings (anthropogenic and natural) are known.
A2C2 creates an open source toolkit to compute flow analogues from a wide array of databases (WP2). WP3 treats the two scientific challenges with the analogue method and multiple model ensembles, hence allowing uncertainty estimates under realistic mathematical hypotheses. The flow analogue methodology allows a systematic and quasi real-time analysis of extreme events, which is currently out of the reach of conventional climate modeling approaches.
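The analogue search at the core of the method can be sketched as follows. This is a minimal illustration under stated assumptions (a Euclidean distance on gridded fields and toy random data), not the A2C2 open-source toolkit itself.

```python
import numpy as np

def flow_analogues(target, archive, k=3):
    """Return indices of the k archive fields closest to `target`.

    target : (ny, nx) array, e.g. a sea-level-pressure map for one day
    archive: (ndates, ny, nx) array of historical maps
    """
    # Flatten each map and rank by Euclidean distance to the target field.
    dists = np.linalg.norm(archive.reshape(len(archive), -1)
                           - target.ravel(), axis=1)
    return np.argsort(dists)[:k]

rng = np.random.default_rng(1)
archive = rng.normal(size=(100, 8, 12))            # 100 toy daily maps
target = archive[42] + 0.01 * rng.normal(size=(8, 12))
best = flow_analogues(target, archive, k=3)
print(best[0])  # the closest analogue is day 42 itself
```

Trends in the dates and scores of such analogues, computed over sliding windows, are what WP1 uses to quantify attractor deformation under time-varying forcings.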
The major breakthrough of A2C2 is to bridge the gap between operational needs (the immediate analysis of climate events) and the understanding of long-term climate change. A2C2 opens new research horizons for the exploitation of ensembles of simulations and reliable estimates of uncertainty."
Max ERC Funding
1 491 457 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym AAMOT
Project Arithmetic of automorphic motives
Researcher (PI) Michael Harris
Host Institution (HI) INSTITUT DES HAUTES ETUDES SCIENTIFIQUES
Call Details Advanced Grant (AdG), PE1, ERC-2011-ADG_20110209
Summary The primary purpose of this project is to build on recent spectacular progress in the Langlands program to study the arithmetic properties of automorphic motives constructed in the cohomology of Shimura varieties. Because automorphic methods are available to study the L-functions of these motives, which include elliptic curves and certain families of Calabi-Yau varieties over totally real fields (possibly after base change), they represent the most accessible class of varieties for which one can hope to verify fundamental conjectures on special values of L-functions, including Deligne's conjecture and the Main Conjecture of Iwasawa theory. Immediate goals include the proof of irreducibility of automorphic Galois representations; the establishment of period relations for automorphic and potentially automorphic realizations of motives in the cohomology of distinct Shimura varieties; the construction of p-adic L-functions for these and related motives, notably adjoint and tensor product L-functions in p-adic families; and the geometrization of the p-adic and mod p Langlands program. All four goals, as well as the others mentioned in the body of the proposal, are interconnected; the final goal provides a bridge to related work in geometric representation theory, algebraic geometry, and mathematical physics.
Max ERC Funding
1 491 348 €
Duration
Start date: 2012-06-01, End date: 2018-05-31
Project acronym AArteMIS
Project Aneurysmal Arterial Mechanics: Into the Structure
Researcher (PI) Pierre Joseph Badel
Host Institution (HI) ASSOCIATION POUR LA RECHERCHE ET LE DEVELOPPEMENT DES METHODES ET PROCESSUS INDUSTRIELS
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary The rupture of an Aortic Aneurysm (AA), which is often lethal, is a mechanical phenomenon that occurs when the wall stress state exceeds the local strength of the tissue. Our current understanding of arterial rupture mechanisms is poor, and the physics taking place at the microscopic scale in these collagenous structures remains an open area of research. Understanding, modelling, and quantifying the micro-mechanisms which drive the mechanical response of such tissue and locally trigger rupture represents the most challenging and promising pathway towards predictive diagnosis and personalized care of AA.
The PI's group was recently able to detect, in advance, at the macro-scale, rupture-prone areas in bulging arterial tissues. The next step is to get into the details of the arterial microstructure to elucidate the underlying mechanisms.
Through the achievements of AArteMIS, the local mechanical state of the fibrous microstructure of the tissue, especially close to its rupture state, will be quantitatively analyzed from multi-photon confocal microscopy and numerically reconstructed to establish quantitative micro-scale rupture criteria. AArteMIS will also develop micro-macro models based on the collected quantitative data.
The entire project will be carried out in collaboration with medical doctors and engineers, experts in all the fields required for the success of AArteMIS.
AArteMIS is expected to open longed-for pathways for research in soft-tissue mechanobiology focused on the cell environment, and to enable essential clinical applications for the quantitative assessment of AA rupture risk. It will significantly contribute to understanding fatal vascular events and improving cardiovascular treatments. It will provide a tremendous source of data and inspiration for subsequent applications and research by answering the most fundamental questions on AA rupture behaviour, enabling ground-breaking clinical changes to take place.
Max ERC Funding
1 499 783 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym ABIOS
Project ABIOtic Synthesis of RNA: an investigation on how life started before biology existed
Researcher (PI) Guillaume STIRNEMANN
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2017-STG
Summary The emergence of life is one of the most fascinating and yet largely unsolved questions in the natural sciences, and thus a significant challenge for scientists from many disciplines. There is growing evidence that ribonucleic acid (RNA) polymers, which are capable of genetic information storage and self-catalysis, were involved in the early forms of life. But despite recent progress, RNA synthesis without biological machinery is very challenging. The current project aims at understanding how to synthesize RNA under abiotic conditions. I will solve problems associated with three critical aspects of RNA formation that I will rationalize at a molecular level: (i) accumulation of precursors, (ii) formation of a chemical bond between RNA monomers, and (iii) tolerance for alternative backbone sugars or linkages. Because I will study problems ranging from the formation of chemical bonds up to the stability of large biopolymers, I propose an original computational multi-scale approach combining techniques that range from quantum calculations to large-scale all-atom simulations, employed together with efficient enhanced-sampling algorithms, forcefield improvement, cutting-edge analysis methods and model development.
My objectives are the following:
1. To explain why the poorly-understood thermally-driven process of thermophoresis can contribute to the accumulation of dilute precursors.
2. To understand why linking RNA monomers with phosphoester bonds is so difficult, to understand the molecular mechanism of possible catalysts and to suggest key improvements.
3. To rationalize the molecular basis for RNA tolerance for alternative backbone sugars or linkages that have probably been incorporated in abiotic conditions.
This unique in-silico laboratory setup should significantly impact our comprehension of life’s origin by overcoming major obstacles to RNA abiotic formation, and in addition will reveal significant orthogonal outcomes for (bio)technological applications.
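The thermophoretic accumulation invoked in objective 1 is often described by the standard steady-state Soret relation, given here for orientation only (a textbook form, not taken from the proposal): a solute with Soret coefficient S_T in a temperature field T(x) redistributes as

```latex
\frac{c(x)}{c_0} \;=\; \exp\!\big(-S_T\,[\,T(x) - T_0\,]\big)
```

so for S_T > 0 the solute is depleted in warm regions and concentrated in cold ones; combined with convection in a narrow pore, this can yield large accumulation factors for dilute precursors.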
Max ERC Funding
1 497 031 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym ACCLIMATE
Project Elucidating the Causes and Effects of Atlantic Circulation Changes through Model-Data Integration
Researcher (PI) Claire Waelbroeck
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary Rapid changes in ocean circulation and climate have been observed in marine sediment and ice cores, notably over the last 60 thousand years (ky), highlighting the non-linear character of the climate system and underlining the possibility of rapid climate shifts in response to anthropogenic greenhouse gas forcing.
To date, these rapid changes in climate and ocean circulation are still not fully explained. Two main obstacles prevent going beyond the current state of knowledge:
- Paleoclimatic proxy data are by nature only indirect indicators of climatic variables, and thus cannot be directly compared with model outputs;
- A 4-D (latitude, longitude, water depth, time) reconstruction of Atlantic water masses over the past 40 ky is lacking: previous studies have generated isolated records with disparate timescales which do not allow the causes of circulation changes to be identified.
Overcoming these two major limitations will lead to major breakthroughs in climate research. Concretely, I will create the first database of Atlantic deep-sea records over the last 40 ky, and extract full climatic information from these records through an innovative model-data integration scheme using an isotopic proxy forward modeling approach. The novelty and exceptional potential of this scheme is twofold: (i) it avoids hypotheses on proxy interpretation and hence suppresses or strongly reduces the errors of interpretation of paleoclimatic records; (ii) it produces states of the climate system that best explain the observations over the last 40 ky, while being consistent with the model physics.
Expected results include:
• The elucidation of the mechanisms explaining rapid changes in ocean circulation and climate over the last 40 ky,
• Improved climate model physics and parameterizations,
• The first projections of future climate changes obtained with a model able to reproduce the highly non-linear behavior of the climate system observed over the last 40 ky.
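The proxy forward-modeling idea above can be illustrated in a few lines: rather than inverting a proxy record into a climatic variable, candidate model states are mapped into proxy space by a forward operator and the misfit is evaluated there. The linear operator, its coefficients, and the noise level below are purely hypothetical placeholders for a real isotopic forward model:

```python
import numpy as np

rng = np.random.default_rng(0)

def forward_proxy(temp, a=-0.22, b=3.0):
    """Hypothetical linear forward model mapping a temperature state
    (degC) to a delta-18O-like proxy value; coefficients illustrative."""
    return a * temp + b

# "Observed" proxy value generated from an unknown true state plus noise.
true_temp = 4.0
obs = forward_proxy(true_temp) + rng.normal(0, 0.05)

# Candidate model states are scored in PROXY space: no inversion of the
# data, hence no hypothesis about how to interpret the proxy is needed.
candidates = np.linspace(0.0, 10.0, 101)
misfit = (forward_proxy(candidates) - obs) ** 2
best = candidates[np.argmin(misfit)]
print(f"best-fitting state: {best:.1f} degC")
```

In the actual project the forward operator is an isotopic proxy model and the candidate states come from a physically consistent climate model, but the direction of comparison is the same.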
Max ERC Funding
3 000 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACCOPT
Project ACelerated COnvex OPTimization
Researcher (PI) Yurii NESTEROV
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE1, ERC-2017-ADG
Summary The amazing rate of progress in computer technologies and telecommunications presents many new challenges for Optimization Theory. New problems are usually very large, very special in structure, and may have distributed data support. This makes them unsolvable by standard optimization methods. In these situations, the old theoretical models, based on hidden Black-Box information, cannot work. New theoretical and algorithmic solutions are urgently needed. In this project we will concentrate on the development of fast optimization methods for problems of big and very big size. All the new methods will be endowed with provable efficiency guarantees for large classes of optimization problems arising in practical applications. Our main tool is the acceleration technique developed for the standard Black-Box methods as applied to smooth convex functions. However, we will have to adapt it to deal with different situations.
The first line of development will be based on the smoothing technique as applied to non-smooth functions. We propose to substantially extend this approach to generate approximate solutions in relative scale. The second line of research will be related to applying acceleration techniques to second-order methods minimizing functions with sparse Hessians. Finally, we aim to develop fast gradient methods for huge-scale problems. The size of these problems is so big that even the usual vector operations are extremely expensive. Thus, we propose to develop new methods with sublinear iteration costs. In our approach, the main source of improvement will be the proper use of problem structure.
Our overall aim is to be able to solve, in a routine way, many important problems which currently look unsolvable. Moreover, the theoretical development of Convex Optimization will reach a state in which there is no gap between theory and practice: the theoretically most efficient methods will definitely outperform any home-grown heuristics.
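The acceleration technique referred to above is Nesterov's accelerated gradient method for smooth convex functions: a gradient step is taken at an extrapolated point, with a momentum schedule that yields the O(1/k²) convergence rate instead of O(1/k) for plain gradient descent. A minimal sketch on a toy quadratic; the problem data are illustrative:

```python
import numpy as np

def grad(x, A, b):
    """Gradient of the smooth convex quadratic f(x) = 0.5 x'Ax - b'x."""
    return A @ x - b

def accelerated_gradient(A, b, x0, L, steps):
    """Nesterov's accelerated gradient method.  L is the Lipschitz
    constant of the gradient (here the largest eigenvalue of A)."""
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(steps):
        x_new = y - grad(y, A, b) / L                # gradient step at y
        t_new = (1 + np.sqrt(1 + 4 * t ** 2)) / 2    # momentum schedule
        y = x_new + ((t - 1) / t_new) * (x_new - x)  # extrapolation
        x, t = x_new, t_new
    return x

A = np.diag([1.0, 10.0, 100.0])        # ill-conditioned toy problem
b = np.ones(3)
f = lambda z: 0.5 * z @ A @ z - b @ z
x = accelerated_gradient(A, b, np.zeros(3), L=100.0, steps=200)
x_star = np.linalg.solve(A, b)
print(f"optimality gap after 200 steps: {f(x) - f(x_star):.2e}")
```

By the standard theorem, the gap after k steps is bounded by 2L‖x0 − x*‖²/(k+1)², which for these data is about 5×10⁻³; in practice it is far smaller.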
Max ERC Funding
2 090 038 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ACTAR TPC
Project Active Target and Time Projection Chamber
Researcher (PI) Gwen Grinyer
Host Institution (HI) GRAND ACCELERATEUR NATIONAL D'IONS LOURDS
Call Details Starting Grant (StG), PE2, ERC-2013-StG
Summary The active target and time projection chamber (ACTAR TPC) is a novel gas-filled detection system that will permit new studies into the structure and decays of the most exotic nuclei. The use of a gas volume that acts both as a sensitive detection medium and as the reaction target itself (an “active target”) offers considerable advantages over traditional nuclear physics detectors and techniques. In high-energy physics, TPC detectors have found profitable applications, but their use in nuclear physics has been limited. With the ACTAR TPC design, individual detection pad sizes of 2 mm are the smallest ever attempted in either discipline, but are a requirement for high-efficiency, high-resolution nuclear spectroscopy. The corresponding large number of electronic channels (16,000 from a surface of only 25×25 cm) requires new developments in high-density electronics and data-acquisition systems that are not yet available in the nuclear physics domain. New experiments in regions of the nuclear chart that cannot presently be contemplated will become feasible with ACTAR TPC.
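As a quick consistency check of the quoted figures: tiling a 25×25 cm pad plane with 2 mm pads gives 125×125 = 15,625 pads, i.e. roughly the 16,000 electronic channels mentioned.

```python
# Consistency check of the channel count quoted above: a 25 cm x 25 cm
# pad plane tiled with 2 mm x 2 mm pads.
pads_per_side = 250 // 2        # 25 cm = 250 mm, divided by 2 mm pads
channels = pads_per_side ** 2   # square pad plane
print(channels)                 # 15625, i.e. ~16,000 channels
```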
Max ERC Funding
1 290 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ActiveWindFarms
Project Active Wind Farms: Optimization and Control of Atmospheric Energy Extraction in Gigawatt Wind Farms
Researcher (PI) Johan Meyers
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE8, ERC-2012-StG_20111012
Summary With the recognition that wind energy will become an important contributor to the world’s energy portfolio, several wind farms with a capacity of over 1 gigawatt are in the planning phase. In the past, engineering of wind farms focused on a bottom-up approach, in which atmospheric wind availability was considered to be fixed by climate and weather. However, farms of gigawatt size slow down the Atmospheric Boundary Layer (ABL) as a whole, reducing the availability of wind at turbine hub height. In Denmark’s large off-shore farms, this leads to underperformance of turbines which can reach levels of 40%–50% compared to the same turbine in a lone-standing case. For large wind farms, the vertical structure and turbulence physics of the flow in the ABL become crucial ingredients in their design and operation. This introduces a new set of scientific challenges related to the design and control of large wind farms. The major ambition of the present research proposal is to employ optimal control techniques to control the interaction between large wind farms and the ABL, and optimize overall farm-power extraction. Individual turbines are used as flow actuators by dynamically pitching their blades on time scales ranging from 10 to 500 seconds. Applying such control to the atmospheric boundary layer has never been attempted before, and introduces flow control on an unprecedented physical scale. The PI possesses a unique combination of expertise and tools enabling these developments: efficient parallel large-eddy simulations of wind farms, multi-scale turbine modeling, and gradient-based optimization in large optimization-parameter spaces using adjoint formulations. To ensure a maximum impact on the wind-engineering field, the project aims at optimal control, experimental wind-tunnel validation, and at including multi-disciplinary aspects related to structural mechanics, power quality, and controller design.
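The adjoint idea behind such gradient-based optimization can be sketched on a toy scalar model: one forward and one backward sweep yield the gradient of the cost with respect to every control variable at once, independent of how many controls there are, which is what makes large optimization-parameter spaces tractable. The model below is an illustrative stand-in, not the project's large-eddy simulation:

```python
import numpy as np

def simulate(u, a=0.9, x0=1.0):
    """Toy scalar 'flow' model x_{k+1} = a*x_k + u_k (illustrative)."""
    x = np.empty(len(u) + 1)
    x[0] = x0
    for k in range(len(u)):
        x[k + 1] = a * x[k] + u[k]
    return x

def cost(u, r=0.1, a=0.9, x0=1.0):
    """Quadratic cost on the state trajectory plus a control penalty."""
    x = simulate(u, a, x0)
    return 0.5 * np.sum(x[1:] ** 2) + 0.5 * r * np.sum(u ** 2)

def adjoint_gradient(u, r=0.1, a=0.9, x0=1.0):
    """Full gradient dJ/du from one forward and one backward sweep:
    lam_N = x_N, lam_k = x_k + a*lam_{k+1}, dJ/du_k = r*u_k + lam_{k+1}."""
    x = simulate(u, a, x0)
    N = len(u)
    lam = np.zeros(N + 1)
    lam[N] = x[N]
    for k in range(N - 1, 0, -1):
        lam[k] = x[k] + a * lam[k + 1]
    return r * u + lam[1:]

u = np.linspace(0.5, -0.5, 20)
g = adjoint_gradient(u)
# Verify one gradient component against a central finite difference.
eps, k = 1e-6, 7
du = np.zeros_like(u)
du[k] = eps
fd = (cost(u + du) - cost(u - du)) / (2 * eps)
print(abs(fd - g[k]) < 1e-6)
```

A finite-difference gradient would instead require one extra simulation per control variable, which is hopeless when the controls are blade-pitch histories for every turbine in a gigawatt farm.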
Max ERC Funding
1 499 241 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym ACTIVIA
Project Visual Recognition of Function and Intention
Researcher (PI) Ivan Laptev
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "Computer vision is concerned with the automated interpretation of images and video streams. Today's research is (mostly) aimed at answering queries such as ""Is this a picture of a dog?"", (classification) or sometimes ""Find the dog in this photo"" (detection). While categorisation and detection are useful for many tasks, inferring correct class labels is not the final answer to visual recognition. The categories and locations of objects do not provide direct understanding of their function i.e., how things work, what they can be used for, or how they can act and react. Such an understanding, however, would be highly desirable to answer currently unsolvable queries such as ""Am I in danger?"" or ""What can happen in this scene?"". Solving such queries is the aim of this proposal.
My goal is to uncover the functional properties of objects and the purpose of actions by addressing visual recognition from a different and yet unexplored perspective. The main novelty of this proposal is to leverage observations of people, i.e., their actions and interactions to automatically learn the use, the purpose and the function of objects and scenes from visual data. The project is timely as it builds upon the two key recent technological advances: (a) the immense progress in visual recognition of objects, scenes and human actions achieved in the last ten years, as well as (b) the emergence of a massive amount of public image and video data now available to train visual models.
ACTIVIA addresses fundamental research issues in automated interpretation of dynamic visual scenes, but its results are expected to serve as a basis for ground-breaking technological advances in practical applications. The recognition of functional properties and intentions as explored in this project will directly support high-impact applications such as detection of abnormal events, which are likely to revolutionise today's approaches to crime protection, hazard prevention, elderly care, and many others.
Max ERC Funding
1 497 420 €
Duration
Start date: 2013-01-01, End date: 2018-12-31
Project acronym ADAPT
Project Theory and Algorithms for Adaptive Particle Simulation
Researcher (PI) Stephane Redon
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "During the twentieth century, the development of macroscopic engineering has been largely stimulated by progress in digital prototyping: cars, planes, boats, etc. are nowadays designed and tested on computers. Digital prototypes have progressively replaced actual ones, and effective computer-aided engineering tools have helped cut costs and reduce production cycles of these macroscopic systems.
The twenty-first century is most likely to see a similar development at the atomic scale. Indeed, recent years have seen tremendous progress in nanotechnology, in particular in the ability to control matter at the atomic scale. Similar to what has happened with macroscopic engineering, powerful and generic computational tools will be needed to engineer complex nanosystems through modeling and simulation. As a result, a major challenge is to develop efficient simulation methods and algorithms.
NANO-D, the INRIA research group I started in January 2008 in Grenoble, France, aims at developing efficient computational methods for modeling and simulating complex nanosystems, both natural and artificial. In particular, NANO-D develops SAMSON (Software for Adaptive Modeling and Simulation Of Nanosystems), a software application which gathers all the algorithms designed by the group and its collaborators.
In this project, I propose to develop a unified theory, and associated algorithms, for adaptive particle simulation. The proposed theory will avoid problems that plague current popular multi-scale or hybrid simulation approaches by simulating a single potential throughout the system, while allowing users to finely trade precision for computational speed.
I believe the full development of the adaptive particle simulation theory will have an important impact on current modeling and simulation practices, and will enable practical design of complex nanosystems on desktop computers, which should significantly boost the emergence of generic nano-engineering.
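The precision-for-speed trade-off can be caricatured in a few lines: at each step only the most dynamically active particles are moved, while all forces still derive from one shared potential. This is a loose conceptual illustration under made-up rules, not the adaptively restrained formalism the project will actually develop:

```python
import numpy as np

rng = np.random.default_rng(1)

def adaptive_step(pos, vel, forces_fn, dt, active_fraction=0.5):
    """One caricatured 'adaptive' integration step: only the fraction of
    particles with the largest kinetic energy is moved this step, while
    all forces still come from the single shared potential.  Lowering
    active_fraction trades precision for computational speed."""
    f = forces_fn(pos)
    ke = 0.5 * vel ** 2
    cutoff = np.quantile(ke, 1.0 - active_fraction)
    active = ke >= cutoff              # most-active particles this step
    vel = vel + dt * f                 # all momenta receive the force
    pos = pos + dt * vel * active      # only active positions are moved
    return pos, vel, active

def forces(x):
    """Toy harmonic chain with fixed ends: F_i = x_{i-1} - 2 x_i + x_{i+1}."""
    f = np.zeros_like(x)
    f[1:-1] = x[:-2] - 2 * x[1:-1] + x[2:]
    return f

x = np.sin(np.linspace(0, np.pi, 16))
v = rng.standard_normal(16)
x, v, active = adaptive_step(x, v, forces, dt=0.05)
print(int(active.sum()), "of", len(x), "particles moved")
```

The point of the sketch is only the control knob: a single parameter lets the user buy speed by updating fewer degrees of freedom per step.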
Max ERC Funding
1 476 882 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym ADDECCO
Project Adaptive Schemes for Deterministic and Stochastic Flow Problems
Researcher (PI) Remi Abgrall
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary The numerical simulation of complex compressible flow problems is still a challenge nowadays, even for simple models. In our opinion, the most important hard points that currently need to be tackled are how to obtain stable, scalable, very accurate schemes on complex geometries that are easy to code and maintain. The method should easily handle mesh refinement, even near the boundary, where the most interesting engineering quantities have to be evaluated. Unsteady uncertainties in the model, for example in the geometry or the boundary conditions, should be represented efficiently. The goal of this proposal is to design, develop and evaluate solutions to each of the above problems. Our work program will lead to significant breakthroughs for flow simulations. More specifically, we propose to work on three connected problems: 1) A class of very high order numerical schemes able to easily deal with the geometry of boundaries and still solve steep problems. The geometry is generally defined by CAD tools; their output is used to generate a mesh which is then used by the scheme. Hence, any mesh refinement process is disconnected from the CAD, a situation that prevents the spread of mesh adaptation techniques in industry! 2) A class of very high order numerical schemes which can use solution-dependent basis functions in order to lower the number of degrees of freedom, for example to compute boundary layers accurately at low resolution. 3) A general non-intrusive technique for handling uncertainties, able to deal with irregular probability density functions (pdfs) and with pdfs that may evolve in time, for example through an optimisation loop. The curse of dimensionality will be dealt with using Harten's multiresolution method combined with sparse-grid methods. Currently, and to our knowledge, no scheme has all of these properties. This research program will have an impact on numerical schemes and industrial applications.
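The non-intrusive treatment of uncertainty in problem 3 can be sketched with a simple collocation example: the solver is only evaluated at chosen points of the uncertain input, never modified internally, and a quadrature rule recovers output statistics far more cheaply than brute-force sampling. The model function and the Gaussian input below are illustrative assumptions, standing in for a flow solver and a measured input pdf:

```python
import numpy as np

def model(xi):
    """Placeholder for an expensive flow solver evaluated at one value
    of an uncertain input parameter xi (non-intrusive: only called)."""
    return np.sin(xi) + 0.1 * xi ** 2

# Gauss-Hermite (probabilists') quadrature for a standard-normal input:
# E[model(X)] ~ sum_i w_i * model(x_i) / sqrt(2*pi), with 10 solver runs.
nodes, weights = np.polynomial.hermite_e.hermegauss(10)
mean_quad = np.sum(weights * model(nodes)) / np.sqrt(2 * np.pi)

# Brute-force Monte Carlo reference needing many more model evaluations.
rng = np.random.default_rng(0)
mean_mc = model(rng.standard_normal(200_000)).mean()
print(f"quadrature: {mean_quad:.4f}  monte carlo: {mean_mc:.4f}")
```

With several uncertain inputs the tensor product of such rules explodes combinatorially, which is exactly the curse of dimensionality the sparse-grid and multiresolution machinery is meant to tame.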
Max ERC Funding
1 432 769 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym ADEQUATE
Project Advanced optoelectronic Devices with Enhanced QUAntum efficiency at THz frEquencies
Researcher (PI) Carlo Sirtori
Host Institution (HI) UNIVERSITE PARIS DIDEROT - PARIS 7
Call Details Advanced Grant (AdG), PE3, ERC-2009-AdG
Summary The aim of this project is the realisation of efficient mid-infrared and THz optoelectronic emitters. This work is motivated by the fact that spontaneous emission in this frequency range is characterized by an extremely long lifetime compared to non-radiative processes, giving rise to devices with very low quantum efficiency. To this end we want to develop hybrid light-matter systems, already well known in quantum optics, within optoelectronic devices that will be driven by electrical injection. With this project we want to extend the field of optoelectronics by introducing some of the concepts of quantum optics, particularly light-matter strong coupling, into semiconductor devices. More precisely, this project aims at the implementation of novel optoelectronic emitters operating in the strong-coupling regime between an intersubband excitation of a two-dimensional electron gas and a microcavity photonic mode. The quasiparticles issued from this coupling are called intersubband polaritons. The major difficulties and challenges of this project do not lie in the observation of these quantum effects, but in their exploitation for a specific function, in particular an efficient electrical-to-optical conversion. To obtain efficient quantum emitters in the THz frequency range we will follow two different approaches: - In the first case we will try to exploit the additional characteristic time of the system introduced by the light-matter interaction in the strong (or ultra-strong) coupling regime. - The second approach will exploit the fact that, under certain conditions, intersubband polaritons have a bosonic character; as a consequence they can undergo stimulated scattering, giving rise to polariton lasers, as has been shown for excitonic polaritons.
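The strong-coupling regime invoked here is commonly described by a two-level coupled-oscillator model whose eigenvalues give the upper and lower polariton branches, split at resonance by the vacuum Rabi energy. A minimal sketch; the energies and coupling strength are illustrative numbers, not the project's:

```python
import numpy as np

def polariton_branches(E_cavity, E_matter, g):
    """Upper/lower polariton energies from the standard two-level
    coupled-oscillator model, i.e. the eigenvalues of the matrix
    [[E_cavity, g], [g, E_matter]].  At resonance the two branches
    are split by the vacuum Rabi energy 2*g."""
    mean = 0.5 * (E_cavity + E_matter)
    delta = 0.5 * (E_cavity - E_matter)
    rabi = np.sqrt(delta ** 2 + g ** 2)
    return mean - rabi, mean + rabi

# Illustrative numbers in meV (THz-range intersubband transition):
lower, upper = polariton_branches(E_cavity=15.0, E_matter=15.0, g=2.0)
print(f"splitting at resonance: {upper - lower} meV")
```

Strong coupling simply means this splitting 2g exceeds the linewidths of both the cavity mode and the intersubband transition, so the mixed quasiparticles are resolvable.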
Max ERC Funding
1 761 000 €
Duration
Start date: 2010-05-01, End date: 2015-04-30
Project acronym AdOC
Project Advance Optical Clocks
Researcher (PI) Sebastien André Marcel Bize
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE2, ERC-2013-CoG
Summary "The proposed research program has three main objectives. The first and second objectives are to seek extreme precision in optical atomic spectroscopy and optical clocks, and to use this quest as a means of exploration in atomic physics. The third objective is to explore new possibilities that stem from extreme precision. These goals will be pursued via three complementary activities: #1: Search for extreme precision with an Hg optical lattice clock. #2: Explore and exploit the rich Hg system, which is essentially unexplored in the cold and ultra-cold regime. #3: Identify new applications of clocks with extreme precision to Earth science. Clocks can directly measure the gravitational potential via Einstein’s gravitational redshift, leading to the idea of “clock-based geodesy”.
The first two activities are experimental and build on an existing setup, where we demonstrated the feasibility of an Hg optical lattice clock. Hg is chosen for its potential to surpass competing systems. We will investigate the unexplored physics of the Hg clock. This includes interactions between Hg atoms, lattice-induced light shifts, and sensitivity to external fields, which are specific to the atomic species. Beyond this, we will explore the fundamental limits of the optical lattice scheme. We will exploit other remarkable features of Hg associated with its high atomic number and its diversity of stable isotopes. These features enable tests of fundamental physical laws, ultra-precise measurements of isotope shifts, measurement of collisional properties toward evaporative cooling and quantum gases of Hg, and investigation of forbidden transitions promising for measuring the nuclear anapole moment of Hg.
The third activity is theoretical and aims at initiating collaborations with experts in modelling Earth gravity. With this expertise, we will identify the most promising and realistic approaches for clocks and emerging remote comparison methods to contribute to geodesy, hydrology, oceanography, etc."
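The “clock-based geodesy” idea can be made concrete with the standard gravitational redshift relation (a textbook sketch, not taken from the proposal): for two clocks separated by a height difference Δh near the Earth's surface,

```latex
\frac{\Delta\nu}{\nu} \;=\; \frac{\Delta U}{c^{2}} \;\approx\; \frac{g\,\Delta h}{c^{2}} \;\approx\; 1.1\times 10^{-16}\ \text{per metre of elevation}
```

so a fractional frequency uncertainty at the 10^-18 level, the regime targeted by leading optical lattice clocks, corresponds to a height resolution of roughly 1 cm.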
Max ERC Funding
1 946 432 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym ADORA
Project Asymptotic approach to spatial and dynamical organizations
Researcher (PI) Benoit PERTHAME
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE1, ERC-2016-ADG
Summary The understanding of spatial, social and dynamical organization of large numbers of agents is presently a fundamental issue in modern science. ADORA focuses on problems motivated by biology because, more than anywhere else, access to precise and abundant data has opened the route to novel and complex biomathematical models. The problems we address are written in terms of nonlinear partial differential equations. The flux-limited Keller-Segel system, the integrate-and-fire Fokker-Planck equation, kinetic equations with internal state, nonlocal parabolic equations and constrained Hamilton-Jacobi equations are among the equations under investigation.
The role of mathematics is not only to understand the analytical structure of these new problems, but it is also to explain the qualitative behavior of solutions and to quantify their properties. The challenge arises here because these goals should be achieved through a hierarchy of scales. Indeed, the problems under consideration share the common feature that the large scale behavior cannot be understood precisely without access to a hierarchy of finer scales, down to the individual behavior and sometimes its molecular determinants.
Major difficulties arise because the numerous scales present in these equations have to be discovered, and singularities appear in the asymptotic process, which yields deep compactness obstructions. Our vision is that the complexity inherent to models of biology can be enlightened by mathematical analysis and a classification of the possible asymptotic regimes.
However, an enormous effort is needed to uncover these equations' intimate mathematical structures and to bring them to the level of conceptual understanding they deserve, given the applications motivating these questions, which range from medical science and neuroscience to cell biology.
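For orientation, the classical parabolic-elliptic Keller-Segel model of chemotaxis reads as follows (a reference sketch only; the flux-limited variant studied in ADORA modifies the chemotactic flux term), with ρ the cell density and c the chemoattractant concentration:

```latex
\partial_t \rho \;=\; \Delta\rho \;-\; \chi\,\nabla\cdot\!\left(\rho\,\nabla c\right), \qquad -\Delta c \;=\; \rho,
```

where χ > 0 is the chemotactic sensitivity. In two dimensions, solutions of this classical model blow up in finite time when the total mass exceeds 8π/χ, an example of the singularity formation in the asymptotic process alluded to above.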
Max ERC Funding
2 192 500 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym AdS-CFT-solvable
Project Origins of integrability in AdS/CFT correspondence
Researcher (PI) Vladimir Kazakov
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE2, ERC-2012-ADG_20120216
Summary Fundamental interactions in nature are well described by quantum gauge fields in 4 space-time dimensions (4d). When the gauge interaction is weak, Feynman perturbation techniques are very efficient for describing most of the experimentally observable consequences of the Standard Model and for studying high-energy processes in QCD.
But in the intermediate and strong coupling regime, such as at the relatively small energies relevant in QCD, perturbation theory fails, leaving us with no reliable analytic methods (except Monte Carlo simulation). The project aims at working out new analytic and computational methods for strongly coupled gauge theories in 4d. We will employ two important discoveries: 1) the gauge-string duality (AdS/CFT correspondence), relating certain strongly coupled gauge Conformal Field Theories to weakly coupled string theories on Anti-de Sitter space; 2) the solvability, or integrability, of maximally supersymmetric (N=4) 4d super Yang-Mills (SYM) theory in the multicolor limit. Integrability has made possible pioneering exact numerical and analytic results in N=4 multicolor SYM at any coupling, effectively summing up all 4d Feynman diagrams. Recently, we conjectured a system of functional equations - the AdS/CFT Y-system - for the exact spectrum of anomalous dimensions of all local operators in N=4 SYM. The conjecture has passed all available checks. My project aims at understanding the origins of this still-mysterious integrability. Deriving the AdS/CFT Y-system from first principles on both sides of the gauge-string duality should provide a long-awaited proof of the AdS/CFT correspondence itself. I plan to use the Y-system to study the systematic weak and strong coupling expansions and the so-called BFKL limit, as well as for the calculation of multi-point correlation functions of N=4 SYM. We hope for new insights into the strong coupling dynamics of less supersymmetric gauge theories and of QCD.
Max ERC Funding
1 456 140 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym AEROFLEX
Project AEROelastic instabilities and control of FLEXible Structures
Researcher (PI) Olivier Pierre MARQUET
Host Institution (HI) OFFICE NATIONAL D'ETUDES ET DE RECHERCHES AEROSPATIALES
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary Aeroelastic instabilities are at the origin of large deformations of structures and limit the capabilities of products in various industrial branches such as aeronautics, the marine industry, and wind electricity production. While suppressing aeroelastic instabilities is an ultimate goal, a paradigm shift in technological development is to take advantage of these instabilities to achieve other objectives, such as reducing the drag of these flexible structures. The ground-breaking challenges addressed in this project are to design fundamentally new theoretical methodologies for (i) describing aeroelastic instabilities mathematically, (ii) suppressing them and (iii) using them to reduce the mean drag of structures at a low energetic cost. To that aim, two types of aeroelastic phenomena will be specifically studied: flutter, which arises as a result of an unstable coupling between two stable dynamics, that of the structure and that of the flow, and vortex-induced vibrations, which appear when the fluid dynamics is unstable. An aeroelastic global stability analysis will first be developed and applied to problems of increasing complexity, starting from two-dimensional free-vibrating rigid structures and progressing towards three-dimensional free-deforming elastic structures. The control of these aeroelastic instabilities will then be addressed with two different objectives: their suppression or their use for flow control. A theoretical passive control methodology will be established for suppressing linear aeroelastic instabilities, and extended to high Reynolds number flows and experimental configurations. New perturbation methods for solving strongly nonlinear problems and adjoint-based control algorithms will make it possible to use these aeroelastic instabilities for drag reduction. This project will allow innovative control solutions to emerge, not only in flutter or vortex-induced vibration problems, but also in a much broader class of fluid-structure problems.
Max ERC Funding
1 377 290 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym AEROSOL
Project Astrochemistry of old stars:direct probing of unique chemical laboratories
Researcher (PI) Leen Katrien Els Decin
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Consolidator Grant (CoG), PE9, ERC-2014-CoG
Summary The gas and dust in the interstellar medium (ISM) drive the chemical evolution of galaxies, the formation of stars and planets, and the synthesis of complex prebiotic molecules. The prime birth places for this interstellar material are the winds of evolved (super)giant stars. These winds are unique chemical laboratories, in which a large variety of gas and dust species radially expand away from the star.
Recent progress on the observations of these winds has been impressive thanks to Herschel and ALMA. The next challenge is to unravel the wealth of chemical information contained in these data. This is an ambitious task since (1) a plethora of physical and chemical processes interact in a complex way, (2) laboratory data to interpret these interactions are lacking, and (3) theoretical tools to analyse the data do not meet current needs.
To boost the knowledge of the physics and chemistry characterizing these winds, I propose a world-leading multi-disciplinary project combining (1) high-quality data, (2) novel theoretical wind models, and (3) targeted laboratory experiments. The aim is to pinpoint the dominant chemical pathways, unravel the transition from gas-phase to dust species, elucidate the role of clumps on the overall wind structure, and study the reciprocal effect between various dynamical and chemical phenomena.
Now is the right time for this ambitious project thanks to the availability of (1) high-quality multi-wavelength data, including ALMA and Herschel data of the PI, (2) supercomputers enabling a homogeneous analysis of the data using sophisticated theoretical wind models, and (3) novel laboratory equipment to measure the gas-phase reaction rates of key species.
This project will have far-reaching impact on (1) the field of evolved stars, (2) the understanding of the chemical lifecycle of the ISM, (3) chemical studies of dynamically more complex systems, such as exoplanets, protostars, supernovae etc., and (4) it will guide new instrument development.
Max ERC Funding
2 605 897 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym AEROSPACEPHYS
Project Multiphysics models and simulations for reacting and plasma flows applied to the space exploration program
Researcher (PI) Thierry Edouard Bertrand Magin
Host Institution (HI) INSTITUT VON KARMAN DE DYNAMIQUE DES FLUIDES
Call Details Starting Grant (StG), PE8, ERC-2010-StG_20091028
Summary Space exploration is one of the boldest and most exciting endeavors that humanity has undertaken, and it holds enormous promise for the future. The next challenges of space exploration include bringing samples back to Earth by means of robotic missions and continuing the manned exploration program, which aims at sending human beings to Mars and bringing them home safely. Inaccurate prediction of the heat flux to the surface of the spacecraft heat shield can be fatal for the crew or for the success of a robotic mission. This quantity is estimated during the design phase. An accurate prediction is a particularly complex task, given the need to model the following phenomena, which are potential “mission killers:” 1) Radiation of the plasma in the shock layer, 2) Complex surface chemistry on the thermal protection material, 3) Flow transition from laminar to turbulent. Our poor understanding of the coupled mechanisms of radiation, ablation, and transition leads to difficulties in flux prediction. To avoid failure and ensure the safety of the astronauts and payload, engineers resort to “safety factors” to determine the thickness of the heat shield, at the expense of the mass of embarked payload. Thinking out of the box and basic research are thus necessary to advance the models that will better define the environment and requirements for the design and safe operation of tomorrow’s space vehicles and planetary probes for manned space exploration. The three basic ingredients for predictive science are: 1) Physico-chemical models, 2) Computational methods, 3) Experimental data. We propose to follow a complementary approach for prediction.
The proposed research aims at: “Integrating new advanced physico-chemical models and computational methods, based on a multidisciplinary approach developed together with physicists, chemists, and applied mathematicians, to create a top-notch multiphysics and multiscale numerical platform for simulations of planetary atmosphere entries, crucial to the new challenges of the manned space exploration program. Experimental data will also be used for validation, following state-of-the-art uncertainty quantification methods.”
Max ERC Funding
1 494 892 €
Duration
Start date: 2010-09-01, End date: 2015-08-31
Project acronym AFRICA-GHG
Project AFRICA-GHG: The role of African tropical forests on the Greenhouse Gases balance of the atmosphere
Researcher (PI) Riccardo Valentini
Host Institution (HI) FONDAZIONE CENTRO EURO-MEDITERRANEOSUI CAMBIAMENTI CLIMATICI
Call Details Advanced Grant (AdG), PE10, ERC-2009-AdG
Summary The role of the African continent in the global carbon cycle, and therefore in climate change, is increasingly recognised. Despite the increasingly acknowledged importance of Africa in the global carbon cycle and its high vulnerability to climate change, there is still a lack of studies on the carbon cycle in representative African ecosystems (in particular tropical forests), and on the effects of climate on ecosystem-atmosphere exchange. In the present proposal we want to focus on the following specific objectives: 1. Understand the role of African tropical rainforests in the GHG balance of the atmosphere and revise their role in global methane and N2O emissions. 2. Determine the carbon source/sink strength of African tropical rainforests in the pre-industrial era versus the XXth century by temporal reconstruction of biomass growth with biogeochemical markers. 3. Understand and quantify carbon and GHG flux variability across African tropical forests (west-east equatorial belt). 4. Analyse the impact of forest degradation and deforestation on carbon and other GHG emissions.
Max ERC Funding
2 406 950 €
Duration
Start date: 2010-04-01, End date: 2014-12-31
Project acronym AFRIVAL
Project African river basins: catchment-scale carbon fluxes and transformations
Researcher (PI) Steven Bouillon
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE10, ERC-2009-StG
Summary This proposal aims to fundamentally improve our understanding of the role of tropical freshwater ecosystems in carbon (C) cycling on the catchment scale. It uses an unprecedented combination of state-of-the-art proxies such as stable isotope, 14C and biomarker signatures to characterize organic matter, radiogenic isotope signatures to determine particle residence times, as well as field measurements of relevant biogeochemical processes. We focus on tropical systems since there is a striking lack of data on such systems, even though riverine C transport is thought to be disproportionately high in tropical areas. Furthermore, the presence of landscape-scale contrasts in vegetation (in particular, C3 vs. C4 plants) is an important asset in the use of stable isotopes as natural tracers of C cycling processes on this scale. Freshwater ecosystems are an important component of the global C cycle, and the primary link between terrestrial and marine ecosystems. Recent estimates indicate that ~2 Pg C y-1 (Pg = petagram) enter freshwater systems, i.e., about twice the estimated global terrestrial C sink. More than half of this is thought to be remineralized before it reaches the coastal zone, and for the Amazon basin this has even been suggested to be ~90% of the lateral C inputs. The question of how general these patterns are is a matter of debate, and assessing the mechanisms determining the degree of processing versus transport of organic carbon in lakes and river systems is critical to further constrain their role in the global C cycle. This proposal provides an interdisciplinary approach to describe and quantify catchment-scale C transport and cycling in tropical river basins.
Besides conceptual and methodological advances, and a significant expansion of our dataset on C processes in such systems, new data gathered in this project are likely to provide exciting and novel hypotheses on the functioning of freshwater systems and their linkage to the terrestrial C budget.
Max ERC Funding
1 745 262 €
Duration
Start date: 2009-10-01, End date: 2014-09-30
Project acronym AFRODITE
Project Advanced Fluid Research On Drag reduction In Turbulence Experiments
Researcher (PI) Jens Henrik Mikael Fransson
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE8, ERC-2010-StG_20091028
Summary A hot topic in today's debate on global warming is drag reduction in aeronautics. The most beneficial concept for drag reduction is to maintain the major portion of the airfoil laminar. Estimates show that the potential drag reduction can be as much as 15%, which would give a significant reduction of NOx and CO emissions in the atmosphere, considering that the number of aircraft take-offs in the EU alone is over 19 million per year. An important element for successful flow control, which can lead to reduced aerodynamic drag, is an enhanced physical understanding of the transition-to-turbulence process.
In previous wind tunnel measurements we have shown that roughness elements can be used to significantly delay transition to turbulence. This result is revolutionary, since the common belief has been that surface roughness causes earlier transition and in turn increases drag, and it is a proof of concept of the passive control method per se. The beauty of a passive control technique is that no external energy has to be added to the flow system in order to perform the control; instead, one uses the energy already present in the flow.
In this project proposal, AFRODITE, we will take this passive control method to the next level by making it twofold: more persistent and more robust. Transition prevention is the goal rather than transition delay, and the method will be extended to simultaneously control separation, another unwanted flow phenomenon, especially during airplane take-offs. AFRODITE will be a catalyst for innovative research, which will lead to a cleaner sky.
Max ERC Funding
1 418 399 €
Duration
Start date: 2010-11-01, End date: 2015-10-31
Project acronym AGEnTh
Project Atomic Gauge and Entanglement Theories
Researcher (PI) Marcello DALMONTE
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Starting Grant (StG), PE2, ERC-2017-STG
Summary AGEnTh is an interdisciplinary proposal which aims at theoretically investigating atomic many-body systems (cold atoms and trapped ions) in close connection to concepts from quantum information, condensed matter, and high energy physics. The main goals of this programme are to:
I) Find scalable schemes for the measurement of entanglement properties, and in particular entanglement spectra, by proposing a paradigm shift in how entanglement is accessed, focused on entanglement Hamiltonians and field theories instead of probing density matrices;
II) Show how atomic gauge theories (including dynamical gauge fields) are ideal candidates for the realization of long-sought, highly-entangled states of matter, in particular topological superconductors supporting parafermion edge modes, and novel classes of quantum spin liquids emerging from clustering;
III) Develop new implementation strategies for the realization of gauge symmetries of paramount importance, such as discrete and SU(N)xSU(2)xU(1) groups, and establish a theoretical framework for the understanding of atomic physics experiments within the light-from-chaos scenario pioneered in particle physics.
These objectives are at the cutting edge of fundamental science, and represent a coherent effort aimed at underpinning unprecedented regimes of strongly interacting quantum matter by addressing the basic aspects of probing, many-body physics, and implementations. The results are expected to (i) build up and establish qualitatively new synergies between the aforementioned communities, and (ii) stimulate an intense theoretical and experimental activity focused on both entanglement and atomic gauge theories.
To achieve these goals, AGEnTh builds (1) on my background working at the interface between atomic physics and quantum optics on one side and many-body theory on the other, and (2) on exploratory studies which I carried out to mitigate the conceptual risks associated with its high-risk/high-gain goals.
Max ERC Funding
1 055 317 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym AIDA
Project An Illumination of the Dark Ages: modeling reionization and interpreting observations
Researcher (PI) Andrei Albert Mesinger
Host Institution (HI) SCUOLA NORMALE SUPERIORE
Call Details Starting Grant (StG), PE9, ERC-2014-STG
Summary Understanding the dawn of the first galaxies and how their light permeated the early Universe is at the very frontier of modern astrophysical cosmology. Generous resources, including ambitious observational programs, are being devoted to studying these epochs of Cosmic Dawn (CD) and Reionization (EoR). In order to interpret these observations, we propose to build on our widely-used, semi-numeric simulation tool, 21cmFAST, and apply it to observations. Using sub-grid, semi-analytic models, we will incorporate additional physical processes governing the evolution of sources and sinks of ionizing photons. The resulting state-of-the-art simulations will be well poised to interpret topical observations of quasar spectra and the cosmic 21cm signal. They would be both physically-motivated and fast, allowing us to rapidly explore astrophysical parameter space. We will statistically quantify the resulting degeneracies and constraints, providing a robust answer to the question, "What can we learn from EoR/CD observations?" As an end goal, these investigations will help us understand when the first generations of galaxies formed, how they drove the EoR, and what the associated large-scale observational signatures are.
Max ERC Funding
1 468 750 €
Duration
Start date: 2015-05-01, End date: 2021-01-31
Project acronym AIRSEA
Project Air-Sea Exchanges driven by Light
Researcher (PI) Christian George
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2011-ADG_20110209
Summary The scientific motivation of this project is the significant presence of organic compounds at the surface of the ocean. They form the link between ocean biogeochemistry, through the physico-chemical processes near the water-air interface, with primary and secondary aerosol formation and evolution in the air aloft, and finally with the climate impact of marine boundary layer aerosols. However, their photochemistry and photosensitizer properties have only been suggested and discussed, never fully addressed, because they were beyond reach. This project proposes to go significantly beyond this state of affairs through a combination of innovative tools and the development of new ideas.
This project is therefore devoted to new laboratory investigations of processes occurring at the air-sea interface, to predict the emission, formation and evolution of halogenated radicals and aerosols from this vast interface between oceans and atmosphere. It progresses from fundamental laboratory measurements through marine science, surface chemistry, photochemistry and beyond, and is therefore interdisciplinary in nature.
It will lead to the development of innovative techniques for characterising chemical processing at the air-sea interface (e.g., a multiphase atmospheric simulation chamber, a time-resolved fluorescence technique for characterising chemical processing at the air-sea interface). It will allow the assessment of new emerging ideas, such as a quantitative description of the importance of photosensitized reactions in the visible at the air-sea interface as a major source of halogenated radicals and aerosols in the marine environment.
This new understanding will improve our ability to describe atmospheric chemistry in the marine environment, which has a strong impact on the urban air quality of coastal regions (which are highly populated) and also on climate change, by providing new input for global climate models.
Max ERC Funding
2 366 276 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym AISENS
Project New generation of high sensitive atom interferometers
Researcher (PI) Marco Fattori
Host Institution (HI) CONSIGLIO NAZIONALE DELLE RICERCHE
Call Details Starting Grant (StG), PE2, ERC-2010-StG_20091028
Summary Interferometers are fundamental tools for the study of the laws of nature and for the precise measurement and control of the physical world. In the last century, scientific and technological progress has proceeded in parallel with a constant improvement of interferometric performance. For this reason, the challenge of conceiving and realizing new generations of interferometers with broader ranges of operation and higher sensitivities is always open and timely.
Although the introduction of laser devices has deeply improved the way interferometric measurements with light are developed and performed, the atomic matter-wave analogue, i.e. the Bose-Einstein condensate (BEC), has not yet triggered any revolution in precision interferometry. However, thanks to recent improvements in the control of the quantum properties of ultra-cold atomic gases, and new original ideas on the creation and manipulation of quantum-entangled particles, the field of atom interferometry is now mature enough to take a big step forward.
The system I want to realize is a Mach-Zehnder spatial interferometer operating with trapped BECs. Undesired decoherence sources will be suppressed by implementing BECs with tunable interactions in ultra-stable optical potentials. Entangled states will be used to improve the sensitivity of the sensor beyond the standard quantum limit to ideally reach the ultimate, Heisenberg, limit set by quantum mechanics. The resulting apparatus will show unprecedented spatial resolution and will overcome state-of-the-art interferometers with cold (non condensed) atomic gases.
A successful completion of this project will lead to a new generation of interferometers for the immediate application to local inertial measurements with unprecedented resolution. In addition, we expect to develop experimental capabilities which might find application well beyond quantum interferometry and crucially contribute to the broader emerging field of quantum-enhanced technologies.
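For reference, the scaling invoked above by "beyond the standard quantum limit": with N uncorrelated atoms, the phase sensitivity of an interferometer is bounded by the standard quantum limit (SQL), whereas suitably entangled input states can in principle reach the Heisenberg limit. This is a textbook result restated here for context, not a claim specific to the proposal:

```latex
\Delta\phi_{\mathrm{SQL}} = \frac{1}{\sqrt{N}}
\qquad\longrightarrow\qquad
\Delta\phi_{\mathrm{H}} = \frac{1}{N}
```

i.e. a factor-of-\(\sqrt{N}\) gain in sensitivity at fixed atom number N.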
Max ERC Funding
1 068 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym AlgTateGro
Project Constructing line bundles on algebraic varieties --around conjectures of Tate and Grothendieck
Researcher (PI) François CHARLES
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary The goal of this project is to investigate two conjectures in arithmetic geometry pertaining to the geometry of projective varieties over finite and number fields. These two conjectures, formulated by Tate and Grothendieck in the 1960s, predict which cohomology classes are Chern classes of line bundles. They both form an arithmetic counterpart of a theorem of Lefschetz, proved in the 1940s, which itself is the only known case of the Hodge conjecture. These two long-standing conjectures are one aspect of a more general web of questions regarding the topology of algebraic varieties, which have been emphasized by Grothendieck and have since had a central role in modern arithmetic geometry. Special cases of these conjectures, appearing for instance in the work of Tate, Deligne, Faltings, Schneider-Lang, and Masser-Wüstholz, have all had important consequences.
My goal is to investigate different lines of attack towards these conjectures, building on recent work of myself and Jean-Benoît Bost on related problems. The two main directions of the proposal are as follows. Over finite fields, the Tate conjecture is related to finiteness results for certain cohomological objects. I want to understand how to relate these to hidden boundedness properties of algebraic varieties that have appeared in my recent geometric proof of the Tate conjecture for K3 surfaces. The existence and relevance of a theory of Donaldson invariants for moduli spaces of twisted sheaves over finite fields seems a promising and novel direction. Over number fields, I want to combine the geometric insight above with algebraization techniques developed by Bost. In a joint project, we want to investigate how these can be used first to understand geometrically major results in transcendence theory and then to attack the Grothendieck period conjecture for divisors via a number-theoretic and complex-analytic understanding of universal vector extensions of abelian schemes over curves.
Max ERC Funding
1 222 329 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym ALKAGE
Project Algebraic and Kähler geometry
Researcher (PI) Jean-Pierre, Raymond, Philippe Demailly
Host Institution (HI) UNIVERSITE GRENOBLE ALPES
Call Details Advanced Grant (AdG), PE1, ERC-2014-ADG
Summary The purpose of this project is to study basic questions in algebraic and Kähler geometry. It is well known that the structure of projective or Kähler manifolds is governed by positivity or negativity properties of the curvature tensor. However, many fundamental problems are still wide open. Since the mid-1980s, I have developed a large number of key concepts and results that have led to important progress in transcendental algebraic geometry. Let me mention the discovery of holomorphic Morse inequalities, systematic applications of L² estimates with singular Hermitian metrics, and a much improved understanding of Monge-Ampère equations and of singularities of plurisubharmonic functions. My first goal will be to investigate the Green-Griffiths-Lang conjecture asserting that an entire curve drawn in a variety of general type is algebraically degenerate. The subject is intimately related to important questions concerning Diophantine equations, especially higher-dimensional generalizations of Faltings' theorem - the so-called Vojta program. One can rely here on a breakthrough I made in 2010, showing that all such entire curves must satisfy algebraic differential equations. A second, closely related area of research of this project is the analysis of the structure of projective or compact Kähler manifolds. It can be seen as a generalization of the classification theory of surfaces by Kodaira, and of the more recent results for dimension 3 (Kawamata, Kollár, Mori, Shokurov, ...) to other dimensions. My plan is to combine powerful recent results obtained on the duality of positive cohomology cones with an analysis of the instability of the tangent bundle, i.e. of the Harder-Narasimhan filtration. On these ground-breaking questions, I intend to go much further and to enhance my national and international collaborations.
These subjects already attract many young researchers and postdocs throughout the world, and the grant could be used to create even stronger interactions.
Max ERC Funding
1 809 345 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym ALLEGRO
Project Active large-scale learning for visual recognition
Researcher (PI) Cordelia Schmid
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE6, ERC-2012-ADG_20120216
Summary A massive and ever-growing amount of digital image and video content is available today, on sites such as Flickr and YouTube, in audiovisual archives such as those of the BBC and INA, and in personal collections. In most cases, it comes with additional information, such as text, audio or other metadata, that forms a rather sparse and noisy, yet rich and diverse source of annotation, ideally suited to emerging weakly supervised and active machine learning technology. The ALLEGRO project will take visual recognition to the next level by using this largely untapped source of data to automatically learn visual models. The main research objective of our project is the development of new algorithms and computer software capable of autonomously exploring evolving data collections, selecting the relevant information, and determining the visual models most appropriate for different object, scene, and activity categories. An emphasis will be put on learning visual models from video, a particularly rich source of information, and on the representation of human activities, one of today's most challenging problems in computer vision. Although this project addresses fundamental research issues, it is expected to result in significant advances in high-impact applications that range from visual mining of the Web and automated annotation and organization of family photo and video albums to large-scale information retrieval in television archives.
Max ERC Funding
2 493 322 €
Duration
Start date: 2013-04-01, End date: 2019-03-31
Project acronym ALMA
Project Attosecond Control of Light and Matter
Researcher (PI) Anne L'huillier
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE2, ERC-2008-AdG
Summary Attosecond light pulses are generated when an intense laser interacts with a gas target. These pulses are not only short, enabling the study of electronic processes at their natural time scale, but also coherent. The vision of this proposal is to extend temporal coherent control concepts to a completely new regime of time and energy, combining (i) ultrashort pulses, (ii) broadband excitation, (iii) high photon energy, allowing scientists to reach not only valence but also inner shells in atoms and molecules, and, when needed, (iv) high spatial resolution. We want to explore how elementary electronic processes in atoms, molecules and more complex systems can be controlled by using well-designed sequences of attosecond pulses. The research project proposed is organized into four parts:
1. Attosecond control of light, leading to controlled sequences of attosecond pulses. We will develop techniques to generate sequences of attosecond pulses with a variable number of pulses and controlled carrier-envelope-phase variation between consecutive pulses.
2. Attosecond control of electronic processes in atoms and molecules. We will investigate the dynamics and coherence of phenomena induced by attosecond excitation of electron wave packets in various systems, and we will explore how they can be controlled by a controlled sequence of ultrashort pulses.
3. Intense attosecond sources to reach the nonlinear regime. We will optimize attosecond light sources in a systematic way, including amplification of the radiation by injection into a free-electron laser. This will open up the possibility to develop nonlinear measurement and control schemes.
4. Attosecond control in more complex systems, including high spatial resolution. We will develop ultrafast microscopy techniques, in order to obtain meaningful temporal information in surface and solid-state physics. Two directions will be explored: digital in-line microscopic holography and photoemission electron microscopy.
Max ERC Funding
2 250 000 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym AlmaCrypt
Project Algorithmic and Mathematical Cryptology
Researcher (PI) Antoine Joux
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE6, ERC-2014-ADG
Summary Cryptology is a foundation of information security in the digital world. Today's internet is protected by a form of cryptography based on complexity-theoretic hardness assumptions. Ideally, these assumptions should be strong enough to ensure security and versatile enough to offer a wide range of functionalities and allow efficient implementations. However, they are largely untested, and internet security could be built on sand.
The main ambition of Almacrypt is to remedy this issue by challenging the assumptions through an advanced algorithmic analysis.
In particular, this proposal questions the two pillars of public-key encryption: factoring and discrete logarithms. Recently, the PI contributed to showing that in some cases the discrete logarithm problem is considerably weaker than previously assumed. A main objective is to assess the security of the other cases of the discrete logarithm problem, including elliptic curves, and of factoring. We will study the generalization of the recent techniques and search for new algorithmic options with comparable or better efficiency.
We will also study hardness assumptions based on codes and subset-sum, two candidates for post-quantum cryptography. We will consider the applicability of recent algorithmic and mathematical techniques to the resolution of the corresponding putative hard problems, refine the analysis of the algorithms and design new algorithmic tools.
Cryptology is not limited to the above assumptions: other hard problems have been proposed to aim at post-quantum security and/or to offer extra functionalities. Should the security of these other assumptions become critical, they would be added to Almacrypt's scope. They could also serve to demonstrate other applications of our algorithmic progress.
In addition to its scientific goal, Almacrypt also aims at seeding a strengthened research community dedicated to algorithmic and mathematical cryptology.
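The hardness of the discrete logarithm problem underpinning the summary above can be made concrete with a toy sketch (illustrative only, not code from the project): the classical baby-step giant-step algorithm solves g^x ≡ h (mod p) in roughly √p operations, which is precisely why cryptographic groups must be astronomically large for the assumption to hold.

```python
import math

def discrete_log(g, h, p):
    """Baby-step giant-step: find x with pow(g, x, p) == h, assuming one exists."""
    m = math.isqrt(p - 1) + 1
    # Baby steps: table of g^j mod p for j = 0..m-1
    table = {pow(g, j, p): j for j in range(m)}
    # Giant steps: h * (g^-m)^i mod p, looking for a collision with the table
    factor = pow(g, -m, p)  # modular inverse via three-argument pow (Python 3.8+)
    gamma = h
    for i in range(m):
        if gamma in table:
            return i * m + table[gamma]
        gamma = (gamma * factor) % p
    return None

# Tiny group (Z/101)^*: solve 2^x = 37 (mod 101)
x = discrete_log(2, 37, 101)
assert pow(2, x, 101) == 37  # here x == 56
```

The O(√p) cost is trivial for a toy modulus but infeasible for the 2048-bit-plus groups used in practice; the project's algorithmic attacks aim at doing better than this generic bound in structured cases.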
Max ERC Funding
2 403 125 €
Duration
Start date: 2016-01-01, End date: 2021-12-31
Project acronym ALOGLADIS
Project From Anderson localization to Bose, Fermi and spin glasses in disordered ultracold gases
Researcher (PI) Laurent Sanchez-Palencia
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2010-StG_20091028
Summary The field of disordered quantum gases is developing rapidly. Dramatic progress has been achieved recently, and the first experimental observation of one-dimensional Anderson localization (AL) of matterwaves has been reported using Bose-Einstein condensates in controlled disorder (in our group at Institut d'Optique and at LENS; Nature, 2008). This success results from joint theoretical and experimental efforts to which we have contributed. Most importantly, it opens unprecedented routes to pursue several outstanding challenges in the multidisciplinary field of disordered systems, which, after fifty years of Anderson localization, is more active than ever.
This theoretical project aims at further developing the emerging field of disordered quantum gases towards novel challenges. Our aim is twofold. First, we will propose and analyze schemes in which experiments on ultracold atoms can address unsolved issues: AL in dimensions higher than one, effects of inter-atomic interactions on AL, strongly-correlated disordered gases, and quantum simulators for spin systems (spin glasses). Second, by taking into account specific features of ultracold atoms, beyond standard toy models, we will raise and study new questions which have not been addressed before (e.g. long-range correlations of speckle potentials, finite-size effects, controlled interactions). Both aspects would open new frontiers to disordered quantum gases and offer new possibilities to shed light on highly debated issues.
Our main concerns are thus to (i) study situations relevant to experiments, (ii) develop new approaches applicable to ultracold atoms, (iii) identify key observables, and (iv) propose new challenging experiments. In this project, we will benefit from the original situation of our theory team: it is independent but forms part of a larger group (led by A. Aspect) that is a world leader in experiments on disordered quantum gases, with which we have already developed a close collaborative relationship.
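The Anderson localization central to this project can be illustrated with a minimal numerical sketch (an assumption on my part, not code from the project): the 1D tight-binding Anderson model, whose eigenstates localize for any disorder strength, diagnosed here through the inverse participation ratio (IPR).

```python
import numpy as np

def anderson_ipr(n=200, w=3.0, seed=0):
    """Diagonalize H = nearest-neighbour hopping + uniform on-site disorder
    of width w; return the mean inverse participation ratio of the
    eigenstates. Localized states have IPR ~ 1/xi; extended states ~ 1/n."""
    rng = np.random.default_rng(seed)
    h = np.diag(rng.uniform(-w / 2, w / 2, n))                      # disorder
    h += np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)   # hopping
    _, vecs = np.linalg.eigh(h)                 # orthonormal eigenvector columns
    return float(np.mean(np.sum(vecs**4, axis=0)))

# Stronger disorder localizes states more tightly, raising the mean IPR.
assert anderson_ipr(w=5.0) > anderson_ipr(w=0.5)
```

The model sizes and parameter names here are for illustration only; the project's questions (higher dimensions, interactions, speckle correlations) go well beyond this non-interacting toy model.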
Max ERC Funding
985 200 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym ALPAM
Project Atomic-Level Physics of Advanced Materials
Researcher (PI) Börje Johansson
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE5, ERC-2008-AdG
Summary Most technological materials have been developed by very expensive and cumbersome trial-and-error methods. On the other hand, computer-based theoretical design of advanced materials is an area where rapid and extensive developments are taking place. Within my group, new theoretical tools have now been established which are extremely well suited to the study of complex materials. In this approach, basic quantum mechanical theories are used to describe fundamental properties of alloys and compounds. The utilization of such calculations to investigate possible optimizations of certain key properties represents a major departure from the traditional design philosophy. The purpose of my project is to build up a new competence in the field of computer-aided simulations of advanced materials. The main goal will be to achieve a deep understanding of the behaviour of complex metallic systems under equilibrium and non-equilibrium conditions at the atomic level by studying their electronic, magnetic and atomic structure using the most modern and advanced computational methods. This will enable us to establish a set of materials parameters and composition-structure-property relations that are needed for materials optimization.
The research will focus on fundamental technological properties related to defects in advanced metallic alloys (high-performance steels, superalloys, and refractory, energy-related and geochemical materials) and alloy phases (solid solutions, intermetallic compounds), which will be studied by means of parameter-free atomistic simulations combined with continuum modelling. As a first example, we will study the Fe-Cr system, which is of great interest to industry as well as in connection with nuclear waste. The Fe-Cr-Ni system will form another large group of materials under the aegis of this project. Special emphasis will also be placed on those Fe-alloys which exist under extreme conditions and are possible candidates for the Earth's core.
Max ERC Funding
2 000 000 €
Duration
Start date: 2009-03-01, End date: 2014-02-28
Project acronym ALUFIX
Project Friction stir processing based local damage mitigation and healing in aluminium alloys
Researcher (PI) Aude SIMAR
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), PE8, ERC-2016-STG
Summary ALUFIX proposes an original strategy for the development of aluminium-based materials involving damage mitigation and extrinsic self-healing concepts, exploiting the new opportunities of the solid-state friction stir process. Friction stir processing locally extrudes and drags material from the front to the back and around the tool pin. It involves short durations at moderate temperatures (typically 80% of the melting temperature), fast cooling rates and large plastic deformations, leading to far-from-equilibrium microstructures. The idea is that commercial aluminium alloys can be locally improved and healed in regions of stress concentration where damage is likely to occur. Self-healing in metal-based materials is still in its infancy, and existing strategies can hardly be extended to applications. Friction stir processing can enhance the damage and fatigue resistance of aluminium alloys by microstructure homogenisation and refinement. In parallel, friction stir processing can be used to integrate secondary phases in an aluminium matrix. In the ALUFIX project, healing phases will thus be integrated in aluminium in addition to refining and homogenising the microstructure. The “local stress management” strategy favours crack closure and crack deviation at the sub-millimetre scale thanks to a controlled residual stress field. The “transient liquid healing agent” strategy involves the in-situ generation of an out-of-equilibrium, compositionally graded microstructure at the aluminium/healing-agent interface capable of liquid-phase healing after a thermal treatment. Along the way, a variety of new scientific questions concerning the damage mechanisms will have to be addressed.
Max ERC Funding
1 497 447 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym aLzINK
Project Alzheimer's disease and Zinc: the missing link ?
Researcher (PI) Christelle Sandrine Florence HUREAU-SABATER
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary Alzheimer's disease (AD) is one of the most serious diseases mankind is now facing, as its social and economic impacts are increasing rapidly. AD is very complex, and the amyloid-β (Aβ) peptide as well as metallic ions (mainly copper and zinc) have been linked to its aetiology. While the deleterious impact of Cu is widely acknowledged, the involvement of Zn is certain but remains to be elucidated.
The main objective of the present proposal, which is strongly anchored in the bio-inorganic chemistry field at the interface with spectroscopy and biochemistry, is to design, synthesize and study new drug candidates (ligands L) capable of (i) targeting Cu(II) bound to Aβ within the synaptic cleft, where Zn is co-localized, ultimately to achieve Zn-driven Cu(II) removal from Aβ, and (ii) disrupting the aberrant Cu(II)-Aβ interactions involved in ROS production and Aβ aggregation, two deleterious events in AD. The drug candidates will thus have high Cu(II)-over-Zn selectivity to preserve the crucial physiological role of Zn in the neurotransmission process. Zn is always underestimated (if not completely neglected) in current therapeutic approaches targeting Cu(II), despite the known interference of Zn with Cu(II) binding.
To reach this objective, it is absolutely necessary to first understand metal-ion trafficking in the presence of Aβ alone at the molecular level (i.e. without the drug candidates). This includes: (i) determination of the Zn binding site on Aβ and its impact on Aβ aggregation and cell toxicity, and (ii) determination of the mutual influence of Zn and Cu on their coordination to Aβ and its impact on Aβ aggregation, ROS production and cell toxicity.
The methods used will span from organic synthesis to studies of neuronal model cells, with a major contribution from a wide panel of spectroscopic techniques including NMR, EPR, mass spectrometry, fluorescence, UV-Vis, circular dichroism and X-ray absorption spectroscopy.
Max ERC Funding
1 499 948 €
Duration
Start date: 2015-03-01, End date: 2020-02-29
Project acronym AMDROMA
Project Algorithmic and Mechanism Design Research in Online MArkets
Researcher (PI) Stefano LEONARDI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Online markets currently form an important share of the global economy. The Internet hosts classical markets (real estate, stocks, e-commerce) while also enabling new markets with previously unknown features (web-based advertisement, viral marketing, digital goods, crowdsourcing, the sharing economy). Algorithms play a central role in many decision processes involved in online markets. For example, algorithms run electronic auctions, trade stocks, adjust prices dynamically, and harvest big data to provide economic information. Thus, it is of paramount importance to understand the algorithmic and mechanism design foundations of online markets.
The algorithmic research issues that we consider involve algorithmic mechanism design, online and approximation algorithms, modelling uncertainty in online market design, and large-scale data analysis. The aim of this research project is to combine these fields to address research questions that are central for today's Internet economy. We plan to apply these techniques to solve fundamental algorithmic problems motivated by Internet advertisement, the sharing economy, and online labour marketplaces. While my planned research is centered on foundational work with rigorous design and analysis of algorithms and mechanisms, it will also include, as an important component, empirical validation on large-scale real-life datasets.
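The mechanism design mentioned above can be illustrated by its textbook example (a hedged sketch, not part of the proposal): the Vickrey second-price auction, in which charging the winner the second-highest bid makes truthful bidding a dominant strategy.

```python
def vickrey_auction(bids):
    """bids: dict mapping bidder -> bid. Returns (winner, price paid).
    The highest bidder wins but pays the second-highest bid, so no
    participant can gain by misreporting their true value."""
    ranked = sorted(bids, key=bids.get, reverse=True)
    winner = ranked[0]
    price = bids[ranked[1]] if len(ranked) > 1 else 0
    return winner, price

# Bidder "a" wins but pays b's bid of 7, not its own bid of 10.
assert vickrey_auction({"a": 10, "b": 7, "c": 3}) == ("a", 7)
```

Real online-market mechanisms of the kind the project studies add layers this sketch omits: repeated play, budget constraints, online arrival of bidders, and revenue objectives.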
Max ERC Funding
1 780 150 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym AMPERE
Project Accounting for Metallicity, Polarization of the Electrolyte, and Redox reactions in computational Electrochemistry
Researcher (PI) Mathieu Eric Salanne
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Consolidator Grant (CoG), PE4, ERC-2017-COG
Summary Applied electrochemistry plays a key role in many technologies, such as batteries, fuel cells, supercapacitors or solar cells. It is therefore at the core of many research programs all over the world. Yet, fundamental electrochemical investigations remain scarce. In particular, electrochemistry is among the fields for which the gap between theory and experiment is the largest. From the computational point of view, there is no molecular dynamics (MD) software devoted to the simulation of electrochemical systems, while other fields such as biochemistry (GROMACS) or materials science (LAMMPS) have dedicated tools. This is due to the difficulty of accounting for complex effects arising from (i) the degree of metallicity of the electrode (i.e. from semimetals to perfect conductors), (ii) the mutual polarization occurring at the electrode/electrolyte interface and (iii) the redox reactivity through explicit electron transfers. Current understanding therefore relies on standard theories that derive from an inaccurate molecular-scale picture. My objective is to fill this gap by introducing a whole set of new methods for simulating electrochemical systems. They will be provided to the computational electrochemistry community as cutting-edge MD software adapted to supercomputers. First applications will aim at the discovery of new electrolytes for energy storage. Here I will focus on (1) “water-in-salts”, to understand why these revolutionary liquids enable much higher voltages than conventional solutions, and (2) redox reactions inside a nanoporous electrode, to support the development of future capacitive energy storage devices. These selected applications are timely and rely on collaborations with leading experimental partners.
The results are expected to shed unprecedented light on the importance of polarization effects on the structure and the reactivity of electrode/electrolyte interfaces, establishing MD as a prominent tool for solving complex electrochemistry problems.
Max ERC Funding
1 588 769 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym ANADEL
Project Analysis of Geometrical Effects on Dispersive Equations
Researcher (PI) Danela Oana IVANOVICI
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2017-STG
Summary We are concerned with localization properties of solutions to hyperbolic PDEs, especially problems with a geometric component: how do boundaries and heterogeneous media influence spreading and concentration of solutions. While our first focus is on wave and Schrödinger equations on manifolds with boundary, strong connections exist with phase space localization for (clusters of) eigenfunctions, which are of independent interest. Motivations come from nonlinear dispersive models (in physically relevant settings), properties of eigenfunctions in quantum chaos (related to both physics of optic fiber design as well as number theoretic questions), or harmonic analysis on manifolds.
Wave propagation in real-life physics occurs in media which are neither homogeneous nor spatially infinite. The birth of radar/sonar technologies (and the rise of computed tomography) greatly motivated numerous developments in microlocal analysis and the linear theory. Only recently have toy nonlinear models been studied on a curved background, sometimes compact or rough. Understanding how to extend such tools, dealing with wave dispersion or focusing, will allow us to progress significantly in our mathematical understanding of physically relevant models. There, boundaries appear naturally, and most earlier developments related to propagation of singularities in this context have limited scope with respect to crucial dispersive effects. Despite great progress over the last decade, driven by the study of quasilinear equations, our knowledge is still very limited. Going beyond this recent activity requires new tools whose development is at the heart of this proposal, including good approximate solutions (parametrices) valid across arbitrarily large numbers of caustics, sharp pointwise bounds on Green functions, development of efficient wave-packet methods, and quantitative refinements of propagation of singularities (with direct applications in control theory), to name only a few important ones.
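As a textbook illustration of the dispersive effects at stake (the standard estimate for the free Schrödinger equation on Euclidean space, not a result specific to this proposal):

```latex
% Standard dispersive estimate for the free Schrödinger flow on R^d:
% initial mass spreads out, and the sup-norm decays in time.
\[
  \| e^{it\Delta} f \|_{L^\infty(\mathbb{R}^d)}
  \;\le\; \frac{C_d}{|t|^{d/2}} \, \| f \|_{L^1(\mathbb{R}^d)},
  \qquad t \neq 0 .
\]
```

On a manifold with boundary, reflections and caustics can degrade this decay; quantifying that degradation is exactly the kind of geometric effect on dispersion that the project targets.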
Max ERC Funding
1 293 763 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym analysisdirac
Project The analysis of the Dirac operator: the hypoelliptic Laplacian and its applications
Researcher (PI) Jean-Michel Philippe Marie-José Bismut
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Advanced Grant (AdG), PE1, ERC-2011-ADG_20110209
Summary This proposal is devoted to the applications of a new hypoelliptic Dirac operator,
whose analytic properties have been studied by Lebeau and myself. Its construction connects classical Hodge theory with the geodesic flow, and more generally any geometrically defined Hodge Laplacian with a dynamical system on the cotangent bundle. The proper description of this object can be given in analytic, index-theoretic and probabilistic terms, which explains both its many potential applications and its complexity.
Max ERC Funding
1 112 400 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ANAMORPHISM
Project Asymptotic and Numerical Analysis of MOdels of Resonant Physics Involving Structured Materials
Researcher (PI) Sebastien Roger Louis Guenneau
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary One already available method to expand the range of material properties is to adjust the composition of materials at the molecular level using chemistry. We would like to develop the alternative approach of homogenization, which broadens the definition of a material to include artificially structured media (fluids and solids) in which the effective electromagnetic, hydrodynamic or elastic responses result from a macroscopic patterning or arrangement of two or more distinct materials. This project will explore the latter avenue in order to markedly enhance control of surface water waves and elastodynamic waves propagating within artificially structured fluids and solid materials, hereafter called acoustic metamaterials.
Pendry's perfect lens, the paradigm of electromagnetic metamaterials, is a slab of negative refractive index material that takes rays of light and causes them to converge with unprecedented resolution. This flat lens is a combination of periodically arranged resonant electric and magnetic elements. We will draw systematic analogies with resonant mechanical systems in order to achieve similar control of hydrodynamic and elastic waves. This will allow us to extend the design of metamaterials to acoustics to go beyond the scope of Snell-Descartes' laws of optics and Newton's laws of mechanics.
Acoustic metamaterials allow the construction of invisibility cloaks for non-linear surface water waves (e.g. tsunamis) propagating in structured fluids, as well as seismic waves propagating in thin structured elastic plates.
Maritime and civil engineering applications lie in the protection of harbours, off-shore platforms and anti-earthquake passive systems. Acoustic cloaks for enhanced control of pressure waves in fluids will also be designed for underwater camouflaging.
The interplay of light and sound will finally be analysed in order to design controllable metamaterials, with a special emphasis on undetectable microstructured fibres (acoustic wormholes).
Max ERC Funding
1 280 391 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym ANDLICA
Project Anderson Localization of Light by Cold Atoms
Researcher (PI) Robin KAISER
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE2, ERC-2018-ADG
Summary I propose to use large clouds of cold Ytterbium atoms to observe Anderson localization of light in three dimensions, which has challenged theoreticians and experimentalists for many decades.
After the prediction by Anderson of a disorder-induced conductor-to-insulator transition for electrons, light has been proposed as an ideal non-interacting wave with which to explore coherent transport properties in the absence of interactions. Developments in experiments and theory over the past several years have shown a route towards the experimental realization of this phase transition.
Previous studies on Anderson localization of light using semiconductor powders or dielectric particles have shown that intrinsic material properties, such as absorption or inelastic scattering of light, need to be taken into account in the interpretation of experimental signatures of Anderson localization. Laser-cooled clouds of atoms avoid the problems of samples used so far to study Anderson localization of light. Ab initio theoretical models, available for cold Ytterbium atoms, have shown that the mere high spatial density of the scattering sample is not sufficient to allow for Anderson localization of photons in three dimensions, but that an additional magnetic field or additional disorder on the level shifts can induce a phase transition in three dimensions.
The role of disorder in atom-light interactions has important consequences for the next generation of high precision atomic clocks and quantum memories. By connecting the mesoscopic physics approach to quantum optics and cooperative scattering, this project will allow better control of cold atoms as building blocks of future quantum technologies. Time-resolved transport experiments will connect super- and subradiant assisted transmission with the extended and localized eigenstates of the system.
Having pioneered studies on weak localization and cooperative scattering enables me to diagnose strong localization of light by cold atoms.
Max ERC Funding
2 490 717 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym ANSR
Project Ab initio approach to nuclear structure and reactions (++)
Researcher (PI) Christian Erik Forssén
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Starting Grant (StG), PE2, ERC-2009-StG
Summary Today, much interest in several fields of physics is devoted to the study of small, open quantum systems, whose properties are profoundly affected by the environment; i.e., the continuum of decay channels. In nuclear physics, these problems were originally studied in the context of nuclear reactions but their importance has been reestablished with the advent of radioactive-beam physics and the resulting interest in exotic nuclei. In particular, strong theory initiatives in this area of research will be instrumental for the success of the experimental program at the Facility for Antiproton and Ion Research (FAIR) in Germany. In addition, many of the aspects of open quantum systems are also being explored in the rapidly evolving research on ultracold atomic gases, quantum dots, and other nanodevices. A first-principles description of open quantum systems presents a substantial theoretical and computational challenge. However, the current availability of enormous computing power has allowed theorists to make spectacular progress on problems that were previously thought intractable. The importance of computational methods to study quantum many-body systems is stressed in this proposal. Our approach is based on the ab initio no-core shell model (NCSM), which is a well-established theoretical framework aimed originally at an exact description of nuclear structure starting from realistic inter-nucleon forces. A successful completion of this project requires extensions of the NCSM mathematical framework and the development of highly advanced computer codes. The '++' in the project title indicates the interdisciplinary aspects of the present research proposal and the ambition to make a significant impact on connected fields of many-body physics.
Max ERC Funding
1 304 800 €
Duration
Start date: 2009-12-01, End date: 2014-11-30
Project acronym ANT
Project Automata in Number Theory
Researcher (PI) Boris Adamczewski
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE1, ERC-2014-CoG
Summary Finite automata are fundamental objects in Computer Science, of great importance on one hand for theoretical aspects (formal language theory, decidability, complexity) and on the other for practical applications (parsing). In number theory, finite automata are mainly used as simple devices for generating sequences of symbols over a finite set (e.g., digital representations of real numbers), and for recognizing some sets of integers or, more generally, of finitely generated abelian groups or monoids. One of the main features of these automatic structures comes from the fact that they are highly ordered without necessarily being trivial (i.e., periodic). With their rich fractal nature, they lie somewhere between order and chaos, even if, in most respects, their rigidity prevails. Over the last few years, several ground-breaking results have led to a great renewed interest in the study of automatic structures in arithmetic.
A primary objective of the ANT project is to exploit this opportunity by developing new directions and interactions between automata and number theory. In this proposal, we outline three lines of research concerning fundamental number theoretical problems that have baffled mathematicians for decades. They include the study of integer base expansions of classical constants, of arithmetical linear differential equations and their link with enumerative combinatorics, and of arithmetic in positive characteristic. At first glance, these topics may seem unrelated, but, surprisingly enough, the theory of finite automata will serve as a natural guideline. We stress that this new point of view on classical questions is a key part of our methodology: we aim at creating a powerful synergy between the different approaches we propose to develop, placing automata theory and related methods at the heart of the subject. This project provides a unique opportunity to create the first international team focusing on these different problems as a whole.
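As a concrete illustration of the automatic sequences the abstract alludes to (this sketch is ours, not part of the proposal): the Thue-Morse sequence is 2-automatic, meaning its n-th term is produced by a finite automaton reading the binary digits of n.

```python
def thue_morse(n: int) -> int:
    """n-th Thue-Morse term via a two-state deterministic automaton.

    The automaton reads the binary digits of n; reading digit d from
    state s moves to s XOR d, and the final state is the output symbol.
    """
    state = 0
    for bit in bin(n)[2:]:
        state ^= int(bit)
    return state

print([thue_morse(n) for n in range(8)])  # [0, 1, 1, 0, 1, 0, 0, 1]
```

Equivalently, the output is the parity of the number of 1s in the binary expansion of n, which makes the "ordered but not periodic" character of such sequences easy to see.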
Max ERC Funding
1 438 745 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym ANTEGEFI
Project Analytic Techniques for Geometric and Functional Inequalities
Researcher (PI) Nicola Fusco
Host Institution (HI) UNIVERSITA DEGLI STUDI DI NAPOLI FEDERICO II
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary Isoperimetric and Sobolev inequalities are the best known examples of geometric-functional inequalities. In recent years the PI and collaborators have obtained new and sharp quantitative versions of these and other important related inequalities. These results have been obtained by the combined use of classical symmetrization methods, new tools coming from mass transportation theory, deep geometric measure tools and ad hoc symmetrizations. The objective of this project is to further develop these techniques in order to get: sharp quantitative versions of the Faber-Krahn inequality, the Gaussian isoperimetric inequality, the Brunn-Minkowski inequality, and Poincaré and logarithmic Sobolev inequalities; sharp decay rates for the quantitative Sobolev inequalities and the Pólya-Szegő inequality.
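For concreteness, the prototype of the sharp quantitative inequalities mentioned above is the quantitative isoperimetric inequality proved by the PI with Maggi and Pratelli (notation below is ours):

```latex
P(E) \;\ge\; P(B_E)\,\bigl(1 + c(n)\,\alpha(E)^{2}\bigr),
\qquad
\alpha(E) \;=\; \min_{x \in \mathbb{R}^{n}}
\frac{\bigl|E \,\triangle\, (B_E + x)\bigr|}{|B_E|},
```

where $E \subset \mathbb{R}^n$ is a set of finite perimeter, $B_E$ is a ball with $|B_E| = |E|$, $P$ denotes perimeter, and $\alpha(E)$ is the Fraenkel asymmetry; the exponent 2 is sharp, and the project seeks analogues of this quadratic stability for the other inequalities listed.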
Max ERC Funding
600 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym ANTICS
Project Algorithmic Number Theory in Computer Science
Researcher (PI) Andreas Enge
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "During the past twenty years, we have witnessed profound technological changes, summarised under the terms of digital revolution or entering the information age. It is evident that these technological changes will have a deep societal impact, and questions of privacy and security are primordial to ensure the survival of a free and open society.
Cryptology is a main building block of any security solution, and at the heart of projects such as electronic identity and health cards, access control, digital content distribution or electronic voting, to mention only a few important applications. During the past decades, public-key cryptology has established itself as a research topic in computer science; tools of theoretical computer science are employed to “prove” the security of cryptographic primitives such as encryption or digital signatures and of more complex protocols. It is often forgotten, however, that all practically relevant public-key cryptosystems are rooted in pure mathematics, in particular, number theory and arithmetic geometry. In fact, the so-called security “proofs” are all conditional on the algorithmic intractability of certain number theoretic problems, such as factorisation of large integers or discrete logarithms in algebraic curves. Unfortunately, there is a large cultural gap between computer scientists using a black-box security reduction to a supposedly hard problem in algorithmic number theory and number theorists, who are often interested in solving small and easy instances of the same problem. The theoretical grounds on which current algorithmic number theory operates are actually rather shaky, and cryptologists are generally unaware of this fact.
The central goal of ANTICS is to rebuild algorithmic number theory on the firm grounds of theoretical computer science."
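To make the "supposedly hard problem" concrete (this example is ours, not from the proposal): the baby-step giant-step algorithm solves the discrete logarithm g^x ≡ h (mod p) in roughly sqrt(p) operations, which is exactly the kind of generic bound whose tightness on real instances algorithmic number theory tries to pin down.

```python
from math import isqrt

def bsgs(g, h, p):
    """Return x with pow(g, x, p) == h, or None if no solution exists."""
    m = isqrt(p - 1) + 1
    # Baby steps: store g^j mod p for j = 0..m-1.
    table = {pow(g, j, p): j for j in range(m)}
    # Giant steps: multiply h by g^(-m) repeatedly until a baby step matches.
    factor = pow(g, -m, p)  # modular inverse of g^m (Python 3.8+)
    gamma = h
    for i in range(m):
        if gamma in table:
            return i * m + table[gamma]
        gamma = gamma * factor % p
    return None

x = bsgs(5, 3, 23)
print(x)  # 16, since 5**16 % 23 == 3
```

For the toy modulus 23 this is instantaneous; for cryptographic group sizes the sqrt(p) cost is astronomically large, which is precisely the conditional hardness assumption the abstract criticises as under-examined.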
Max ERC Funding
1 453 507 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym APOGEE
Project Atomic-scale physics of single-photon sources.
Researcher (PI) GUILLAUME ARTHUR FRANCOIS SCHULL
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2017-COG
Summary Single-photon sources (SPSs) are systems capable of emitting photons one by one. These sources are of major importance for quantum-information science and applications. SPS experiments generally rely on the optical excitation of two-level systems of atomic-scale dimensions (single molecules, vacancies in diamond…). Many fundamental questions related to the nature of these sources and the impact of their environment remain to be explored:
Can SPSs be addressed with atomic-scale spatial accuracy? How do the nanometer-scale distance or the orientation between two (or more) SPSs affect their emission properties? Does coherence emerge from the proximity between the sources? Do these structures still behave as SPSs or do they lead to the emission of correlated photons? How can we then control the degree of entanglement between the sources? Can we remotely excite the emission of these sources by using molecular chains as charge-carrying wires? Can we couple SPSs embodied in one or two-dimensional arrays? How does mechanical stress or localised plasmons affect the properties of an electrically-driven SPS?
Answering these questions requires probing, manipulating and exciting SPSs with an atomic-scale precision. This is beyond what is attainable with an all-optical method. Since they can be confined to atomic-scale pathways we propose to use electrons rather than photons to excite the SPSs. This unconventional approach provides a direct access to the atomic-scale physics of SPSs and is relevant for the implementation of these sources in hybrid devices combining electronic and photonic components. To this end, a scanning probe microscope will be developed that provides simultaneous spatial, chemical, spectral, and temporal resolutions. Single-molecules and defects in monolayer transition metal dichalcogenides are SPSs that will be studied in the project, and which are respectively of interest for fundamental and more applied issues.
Max ERC Funding
1 996 848 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym APPROXNP
Project Approximation of NP-hard optimization problems
Researcher (PI) Johan Håstad
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE6, ERC-2008-AdG
Summary The proposed project aims to create a center of excellence devoted to understanding the approximability of NP-hard optimization problems. In particular, for central problems like vertex cover, coloring of graphs, and various constraint satisfaction problems we want to study upper and lower bounds on how well they can be approximated in polynomial time. Many existing strong results are based on what is known as the Unique Games Conjecture (UGC), and a significant part of the project will be devoted to studying this conjecture. We expect that a major step needed in this process is to further develop the understanding of Boolean functions on the Boolean hypercube. We anticipate that the tools needed for this will come in the form of harmonic analysis, which in its turn will rely on the corresponding results in the analysis of functions over the domain of real numbers.
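As a concrete instance of the upper bounds the abstract mentions (a sketch of the classical textbook algorithm, not of the project's own results): greedily taking both endpoints of uncovered edges yields a polynomial-time 2-approximation for minimum vertex cover, and UGC-based lower bounds rule out any constant factor below 2.

```python
def vertex_cover_2approx(edges):
    """Return a vertex cover of size at most twice the minimum.

    Greedy maximal matching: for each edge not yet covered, take both
    endpoints. Any optimal cover must contain at least one endpoint of
    each matched edge, and the matched edges are vertex-disjoint, which
    gives the factor-2 guarantee.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.add(u)
            cover.add(v)
    return cover

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
cover = vertex_cover_2approx(edges)
print(sorted(cover))  # a valid cover; optimum for this 4-cycle has size 2
```

Closing the gap between such simple algorithms and conditional lower bounds is the kind of question the proposed center would study.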
Max ERC Funding
2 376 000 €
Duration
Start date: 2009-01-01, End date: 2014-12-31
Project acronym AQUARAMAN
Project Pipet Based Scanning Probe Microscopy Tip-Enhanced Raman Spectroscopy: A Novel Approach for TERS in Liquids
Researcher (PI) Aleix Garcia Guell
Host Institution (HI) ECOLE POLYTECHNIQUE
Call Details Starting Grant (StG), PE4, ERC-2016-STG
Summary Tip-enhanced Raman spectroscopy (TERS) is often described as the most powerful tool for optical characterization of surfaces and their proximities. It combines the intrinsic spatial resolution of scanning probe techniques (AFM or STM) with the chemical information content of vibrational Raman spectroscopy. Capable of revealing surface heterogeneity at the nanoscale, TERS is currently playing a fundamental role in the understanding of interfacial physicochemical processes in key areas of science and technology such as chemistry, biology and materials science.
Unfortunately, the undeniable potential of TERS as a label-free tool for nanoscale chemical and structural characterization is nowadays limited to air and vacuum environments, as it fails to operate in a reliable and systematic manner in liquid. The reasons are more technical than fundamental: what hinders the application of TERS in water is, among other issues, the low stability and poor consistency of the probes. Fields of science and technology where the presence of water/electrolyte is unavoidable, such as biology and electrochemistry, remain unexplored with this powerful technique.
We propose a revolutionary approach for TERS in liquids founded on pipet-based scanning probe microscopy techniques (pb-SPM) as an alternative to AFM and STM. The use of recent but well-established pb-SPM brings the opportunity to develop unprecedented pipet-based TERS probes (beyond the classic and limited metallized solid probes from AFM and STM), together with the implementation of ingenious and innovative measures to enhance tip stability, sensitivity and reliability, unattainable with the current techniques.
The result will be a unique nano-spectroscopy platform capable of experiments in liquids and of following dynamic processes in situ, addressing fundamental questions and bringing insight into interfacial phenomena spanning materials science, physics, chemistry and biology.
Max ERC Funding
1 528 442 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym ARCHEIS
Project Understanding the onset and impact of Aquatic Resource Consumption in Human Evolution using novel Isotopic tracerS
Researcher (PI) Klervia Marie Madalen JAOUEN
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE10, ERC-2018-STG
Summary The onset of the systematic consumption of marine resources is thought to mark a turning point for the hominin lineage. To date, this onset cannot be traced, since classic isotope markers are not preserved beyond 50-100 ky. Aquatic food products are essential in human nutrition as the main source of polyunsaturated fatty acids in hunter-gatherer diets. The exploitation of marine resources is also thought to have reduced human mobility and enhanced social and technological complexification. Systematic aquatic food consumption could well have been a distinctive feature of the Homo sapiens species among its fellow hominins, and has been linked to the astonishing leap in human intelligence and consciousness. Yet, this hypothesis is challenged by the existence of mollusk and marine mammal bone remains at Neanderthal archeological sites. Recent work demonstrated the sensitivity of Zn isotope composition in bioapatite, the mineral part of bones and teeth, to dietary Zn. By combining classic (C and C/N isotope analyses) and innovative techniques (compound-specific C/N and bulk Zn isotope analyses), I will develop a suite of sensitive tracers for shellfish, fish and marine mammal consumption. Shellfish consumption will be investigated by comparing various South American and European prehistoric populations from the Atlantic coast associated with shell-middens and fish-mounds. Marine mammal consumption will be traced using an Inuit population of Arctic Canada and the Wairau Bar population of New Zealand. C/N/Zn isotope compositions of various aquatic products will also be assessed, as well as isotope fractionation during intestinal absorption. I will then use the fully calibrated isotope tools to detect and characterize the onset of marine food exploitation in human history, which will answer the question of its specificity to our species. Neanderthal, early modern human and possibly other hominin remains from coastal and inland sites will be compared for that purpose.
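For readers unfamiliar with isotope tracers: compositions such as the Zn signal mentioned above are conventionally reported in delta notation, the permil deviation of a sample's isotope ratio from a reference standard. A minimal sketch with hypothetical ratios (the convention is standard geochemistry, not specific to this project):

```python
# Standard isotope delta notation: delta = (R_sample/R_standard - 1) * 1000,
# in permil, where R is the heavy/light isotope ratio (e.g. 66Zn/64Zn).
def delta_permil(r_sample, r_standard):
    """Permil deviation of a sample's isotope ratio from a standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Hypothetical numbers: a sample whose ratio exceeds the standard's by
# 0.1% has delta = +1 permil.
assert abs(delta_permil(1.001, 1.000) - 1.0) < 1e-9
```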
Max ERC Funding
1 361 991 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym ARENA
Project Arrays of entangled atoms
Researcher (PI) Antoine Browaeys
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2009-StG
Summary The goal of this project is to prepare in a deterministic way, and then to characterize, various entangled states of up to 25 individual atoms held in an array of optical tweezers. Such a system provides a new arena to explore quantum entangled states of a large number of particles. Entanglement is the existence of quantum correlations between different parts of a system, and it is recognized as an essential property that distinguishes the quantum and the classical worlds. It is also a resource in various areas of physics, such as quantum information processing, quantum metrology, correlated quantum systems and quantum simulation. In the proposed design, each site is individually addressable, which enables single-atom manipulation and detection. This will provide the largest entangled state ever produced and fully characterized at the individual particle level. The experiment will be implemented by combining two crucial novel features that I was able to demonstrate very recently: first, the manipulation of quantum bits written on long-lived hyperfine ground states of single ultra-cold atoms trapped in microscopic optical tweezers; second, the generation of entanglement by using the strong long-range interactions between Rydberg states. These interactions lead to the so-called dipole blockade, and enable the preparation of various classes of entangled states, such as states carrying only one excitation (W states), and states analogous to Schrödinger's cats (GHZ states). Finally, I will also explore strategies to protect these states against decoherence, developed in the framework of fault-tolerant and topological quantum computing. This project therefore combines an experimental challenge and the exploration of entanglement in a mesoscopic system.
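The two entangled-state classes named in the summary have a simple form in the computational basis. The following toy construction (state vectors only, saying nothing about the Rydberg-blockade preparation protocol itself) shows their structure for n qubits:

```python
import numpy as np

def w_state(n):
    """W state: equal superposition of the n single-excitation basis states."""
    psi = np.zeros(2**n)
    for k in range(n):
        psi[1 << k] = 1.0   # basis state with qubit k excited
    return psi / np.sqrt(n)

def ghz_state(n):
    """GHZ ('Schroedinger cat') state: (|00...0> + |11...1>) / sqrt(2)."""
    psi = np.zeros(2**n)
    psi[0] = psi[-1] = 1.0  # all-zeros and all-ones basis states
    return psi / np.sqrt(2)

for n in (2, 3, 5):
    assert abs(np.linalg.norm(w_state(n)) - 1.0) < 1e-12
    assert abs(np.linalg.norm(ghz_state(n)) - 1.0) < 1e-12
```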
Max ERC Funding
1 449 600 €
Duration
Start date: 2009-12-01, End date: 2014-11-30
Project acronym AROMA-CFD
Project Advanced Reduced Order Methods with Applications in Computational Fluid Dynamics
Researcher (PI) Gianluigi Rozza
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary The aim of AROMA-CFD is to create a team of scientists at SISSA for the development of Advanced Reduced Order Modelling techniques with a focus on Computational Fluid Dynamics (CFD), in order to face and overcome many current limitations of the state of the art and improve the capabilities of reduced order methodologies for more demanding applications in industrial, medical and applied sciences contexts. AROMA-CFD deals with strong methodological developments in numerical analysis, with a special emphasis on mathematical modelling and extensive exploitation of computational science and engineering. Several tasks have been identified to tackle important problems and open questions in reduced order modelling: the study of bifurcations and instabilities in flows, increasing the Reynolds number and guaranteeing stability, moving towards turbulent flows, and considering complex geometrical parametrizations of shapes as computational domains in extended networks. A reduced computational and geometrical framework will be developed for nonlinear inverse problems, focusing on optimal flow control, shape optimization and uncertainty quantification. Further, all the advanced developments in reduced order modelling for CFD will be delivered for applications in multiphysics, such as fluid-structure interaction problems and general coupled phenomena involving inviscid, viscous and thermal flows, solids and porous media. The advanced framework developed within AROMA-CFD will provide attractive capabilities for several industrial and medical applications (e.g. aeronautical, mechanical, naval, off-shore, wind, sport and biomedical engineering, as well as cardiovascular surgery), combining high performance computing (in dedicated supercomputing centers) and advanced reduced order modelling (on common devices) to guarantee real-time computing and visualization.
A new open source software library for AROMA-CFD will be created: ITHACA, In real Time Highly Advanced Computational Applications.
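A minimal sketch of the core projection idea behind reduced order modelling: proper orthogonal decomposition (POD) extracts a low-dimensional basis from a matrix of solution snapshots via the SVD, and fields are then approximated in that basis. The data here are synthetic and purely illustrative, not produced by any AROMA-CFD solver:

```python
import numpy as np

rng = np.random.default_rng(0)
n_dof, n_snap, rank = 200, 30, 3

# Synthetic snapshots that secretly live in a 3-dimensional subspace,
# standing in for expensive full-order CFD solutions.
basis_true = rng.standard_normal((n_dof, rank))
snapshots = basis_true @ rng.standard_normal((rank, n_snap))

# POD: the leading left singular vectors of the snapshot matrix form an
# orthonormal reduced basis V.
U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
V = U[:, :rank]

# Each snapshot is represented by just `rank` coefficients V.T @ x.
reconstruction = V @ (V.T @ snapshots)

# Exact (to round-off) here because the data really are rank 3.
assert np.allclose(reconstruction, snapshots, atol=1e-8)
```

In a real reduced order model the governing equations are also projected onto V, so online solves cost O(rank) instead of O(n_dof).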
Max ERC Funding
1 656 579 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym ARPEMA
Project Anionic redox processes: A transformational approach for advanced energy materials
Researcher (PI) Jean-Marie Tarascon
Host Institution (HI) COLLEGE DE FRANCE
Call Details Advanced Grant (AdG), PE5, ERC-2014-ADG
Summary Redox chemistry provides the fundamental basis for numerous energy-related electrochemical devices, among which Li-ion batteries (LIB) have become the premier energy storage technology for portable electronics and vehicle electrification. Throughout its history, LIB technology has relied on cationic redox reactions as the sole source of energy storage capacity. This is no longer true. In 2013 we demonstrated that Li-driven reversible formation of (O2)n peroxo-groups in new layered oxides led to extraordinary increases in energy storage capacity. This finding, which is receiving worldwide attention, represents a transformational approach for creating advanced energy materials for not only energy storage, but also water splitting applications as both involve peroxo species. However, as is often the case with new discoveries, the fundamental science at work needs to be rationalized and understood. Specifically, what are the mechanisms for ion and electron transport in these Li-driven anionic redox reactions?
To address these seminal questions and to widen the spectrum of materials (transition metal and anion) showing anionic redox chemistry, we propose a comprehensive research program that combines experimental and computational methods. The experimental methods include structural and electrochemical analyses (both ex-situ and in-situ), and computational modeling will be based on first-principles DFT for identifying the fundamental processes that enable anionic redox activity. The knowledge gained from these studies, in combination with our expertise in inorganic synthesis, will enable us to design a new generation of Li-ion battery materials that exhibit substantial increases (20-30%) in energy storage capacity, with additional impacts on the development of Na-ion batteries and the design of water splitting catalysts, with the potential to surpass current water splitting efficiencies via novel (O2)n-based electrocatalysts.
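To put the targeted 20-30% capacity gain in perspective, a back-of-the-envelope check (standard electrochemistry, not taken from the proposal): the theoretical gravimetric capacity of an electrode material follows from Faraday's law.

```python
# Theoretical capacity from Faraday's law:
#   Q [mAh/g] = n * F / (3.6 * M)
# for n electrons exchanged per formula unit of molar mass M [g/mol].
F = 96485.0  # Faraday constant, C/mol

def capacity_mAh_per_g(n_electrons, molar_mass):
    return n_electrons * F / (3.6 * molar_mass)

# LiCoO2 (M ~ 97.87 g/mol, 1 electron) gives the familiar ~274 mAh/g
# figure; a 20-30% gain from anionic redox would mean roughly 330-360 mAh/g.
q = capacity_mAh_per_g(1, 97.87)
assert 270 < q < 280
```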
Max ERC Funding
2 249 196 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym ARTHUS
Project Advances in Research on Theories of the Dark Universe - Inhomogeneity Effects in Relativistic Cosmology
Researcher (PI) Thomas BUCHERT
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Advanced Grant (AdG), PE9, ERC-2016-ADG
Summary The project ARTHUS aims at determining the physical origin of Dark Energy: in addition to the energy sources of the standard model of cosmology, effective terms arise through spatially averaging inhomogeneous cosmological models in General Relativity. It has been demonstrated that these additional terms can play the role of Dark Energy on large scales (but they can also mimic Dark Matter on scales of mass accumulations). The underlying rationale is that fluctuations in the Universe generically couple to spatially averaged intrinsic properties of space, such as its averaged scalar curvature, thus changing the global evolution of the effective (spatially averaged) cosmological model. At present, we understand these so-called backreaction effects only qualitatively. The project ARTHUS is directed towards a conclusive quantitative evaluation of these effects by developing generic and non-perturbative relativistic models of structure formation, by statistically measuring the key variables of the models in observations and in simulation data, and by reinterpreting observational results in light of the new models. It is to be emphasized that there is no doubt about the existence of backreaction effects; the question is whether they are even capable of getting rid of the dark sources (as some models discussed in the literature suggest), or whether their impact is substantially smaller. The project thus addresses an essential issue of current cosmological research: to find pertinent answers concerning the quantitative impact of inhomogeneity effects, a necessary, worldwide recognized step toward high-precision cosmology. If the project objectives are attained, the results will have a far-reaching impact on theoretical and observational cosmology, on the interpretation of astronomical experiments such as Planck and Euclid, as well as on a wide spectrum of particle physics theories and experiments.
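For context, the averaged effective terms alluded to above take the following form in the standard (Buchert) formulation for an irrotational dust domain D with effective scale factor a_D (notation from the published literature, not quoted from the proposal):

```latex
% Averaged Raychaudhuri equation with kinematical backreaction Q_D:
3\,\frac{\ddot{a}_{\mathcal D}}{a_{\mathcal D}}
  \;=\; -4\pi G\,\langle \varrho \rangle_{\mathcal D}
  \;+\; \mathcal{Q}_{\mathcal D} \;+\; \Lambda ,
\qquad
\mathcal{Q}_{\mathcal D}
  \;=\; \tfrac{2}{3}\left( \langle \theta^{2} \rangle_{\mathcal D}
        - \langle \theta \rangle_{\mathcal D}^{2} \right)
  \;-\; 2\,\langle \sigma^{2} \rangle_{\mathcal D} .
```

Here θ is the local expansion rate and σ the shear: a sufficiently positive Q_D enters with the same sign as Λ, which is how averaged inhomogeneities can mimic Dark Energy.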
Max ERC Funding
2 091 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ARTISTIC
Project Advanced and Reusable Theory for the In Silico-optimization of composite electrode fabrication processes for rechargeable battery Technologies with Innovative Chemistries
Researcher (PI) Alejandro Antonio FRANCO
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2017-COG
Summary The aim of this project is to develop and to demonstrate a novel theoretical framework devoted to rationalizing the formulation of composite electrodes containing next-generation material chemistries for high energy density secondary batteries. The framework will be established through the combination of discrete particle and continuum mathematical models within a multiscale computational workflow integrating the individual models and mimicking the different steps along the electrode fabrication process, including slurry preparation, drying and calendering. Strongly complemented by dedicated experimental characterizations devoted to its validation, the goal of this framework is to provide insights into the impacts of material properties and fabrication process parameters on the electrode mesostructures and their correlation to the resulting electrochemical performance. It targets self-organization mechanisms of material mixtures in slurries by considering the interactions between the active and conductive materials, solvent, binders and dispersants, and the relationship between material properties such as surface chemistry and wettability. Optimal electrode formulation, fabrication process and the arising electrode mesostructure can then be achieved. Additionally, the framework will be integrated into an online, open-access infrastructure, allowing predictive direct and reverse engineering of optimized electrode designs to attain high-quality electrochemical performance. Through the demonstration of a multidisciplinary, flexible and transferable framework, this project has tremendous potential to provide insights leading to proposals of new and highly efficient industrial techniques for the fabrication of cheaper and more reliable next-generation secondary battery electrodes for a wide spectrum of applications, including Electric Transportation.
Max ERC Funding
1 976 445 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym aSCEND
Project Secure Computation on Encrypted Data
Researcher (PI) Hoe Teck Wee
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Recent trends in computing have prompted users and organizations to store an increasingly large amount of sensitive data at third-party locations in the cloud, outside of their direct control. Storing data remotely poses an acute security threat, as these data are outside our control and could potentially be accessed by untrusted parties. Indeed, the reality of these threats has been borne out by the Snowden leaks and hundreds of data breaches each year. In order to protect our data, we will need to encrypt it.
Functional encryption is a novel paradigm for public-key encryption that enables both fine-grained access control and selective computation on encrypted data, as is necessary to protect big, complex data in the cloud. Functional encryption also enables searches on encrypted travel records and surveillance video as well as medical studies on encrypted medical records in a privacy-preserving manner; we can give out restricted secret keys that reveal only the outcome of specific searches and tests. These mechanisms allow us to maintain public safety without compromising on civil liberties, and to facilitate medical breakthroughs without compromising on individual privacy.
The goals of the aSCEND project are (i) to design pairing and lattice-based functional encryption that are more efficient and ultimately viable in practice; and (ii) to obtain a richer understanding of expressive functional encryption schemes and to push the boundaries from encrypting data to encrypting software. My long-term vision is the ubiquitous use of functional encryption to secure our data and our computation, just as public-key encryption is widely used today to secure our communication. Realizing this vision requires new advances in the foundations of functional encryption, which is the target of this project.
Max ERC Funding
1 253 893 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym ASD
Project Atomistic Spin-Dynamics; Methodology and Applications
Researcher (PI) Olof Ragnar Eriksson
Host Institution (HI) Uppsala University
Call Details Advanced Grant (AdG), PE3, ERC-2009-AdG
Summary Our aim is to provide a theoretical framework for studies of dynamical aspects of magnetic materials and magnetisation reversal, which has potential applications in magnetic data storage and magnetic memory devices. The project focuses on developing and using an atomistic spin dynamics simulation method. Our goal is to identify novel materials and device geometries with improved performance. The scientific questions which will be addressed concern the understanding of the fundamental temporal limit of magnetisation switching and reversal, and the mechanisms which govern this limit. The methodological developments concern the ability to calculate, from first-principles theory, the interatomic exchange parameters of materials in general, and of correlated electron materials in particular, via the use of dynamical mean-field theory. The theoretical development also involves an atomistic spin dynamics simulation method, which, once established, will be released as a public software package. The proposed theoretical research will be intimately connected to world-leading experimental efforts, especially in Europe where a leading activity in experimental studies of magnetisation dynamics has been established. The ambition with this project is to become world-leading in the theory of simulating spin-dynamics phenomena, and to promote education and training of young researchers. To achieve our goals we will build up an open and lively environment, where the advances in the theoretical knowledge of spin-dynamics phenomena will be used to address important questions in information technology. In this environment the next generation of research leaders will be fostered and trained, thus ensuring that the society of tomorrow is equipped with the scientific competence to tackle the challenges of our future.
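As a rough illustration of what an atomistic spin dynamics simulation integrates, the following is a minimal sketch of Landau-Lifshitz-Gilbert dynamics for two exchange-coupled classical spins. The integrator, coupling, and all parameter values are illustrative assumptions, not taken from the project's software or from first-principles exchange parameters.

```python
import numpy as np

# Minimal sketch of atomistic spin dynamics: classical unit spins S_i evolve
# under the Landau-Lifshitz-Gilbert equation with a Heisenberg exchange field,
#   dS_i/dt = -gamma S_i x B_i - gamma*alpha S_i x (S_i x B_i),
# where B_i = J * sum_j S_j over neighbours j of site i.

gamma, alpha, J, dt = 1.0, 0.1, 1.0, 0.01  # illustrative units

def llg_step(S, neighbours):
    B = J * np.array([S[nb].sum(axis=0) for nb in neighbours])   # exchange field
    prec = -gamma * np.cross(S, B)                               # precession
    damp = -gamma * alpha * np.cross(S, np.cross(S, B))          # Gilbert damping
    S_new = S + dt * (prec + damp)
    return S_new / np.linalg.norm(S_new, axis=1, keepdims=True)  # keep |S_i| = 1

# two ferromagnetically coupled spins, initially slightly misaligned
S = np.array([[0.0, 0.0, 1.0],
              [0.1, 0.0, 1.0]])
S /= np.linalg.norm(S, axis=1, keepdims=True)
neighbours = [[1], [0]]

for _ in range(1000):
    S = llg_step(S, neighbours)

assert S[0] @ S[1] > 0.999   # damping drives the pair toward alignment
```

A production code of the kind the project describes would use exchange parameters computed from first-principles theory and a higher-order integrator; the sketch only shows the structure of the equation of motion.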
Max ERC Funding
2 130 000 €
Duration
Start date: 2010-01-01, End date: 2014-12-31
Project acronym ASTRODYN
Project Astrophysical Dynamos
Researcher (PI) Axel Brandenburg
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary Magnetic fields in stars, planets, accretion discs, and galaxies are believed to be the result of a dynamo process converting kinetic energy into magnetic energy. This work focuses on the solar dynamo, but dynamos in other astrophysical systems will also be addressed. In particular, direct high-resolution three-dimensional simulations are used to understand particular aspects of the solar dynamo and ultimately to simulate the solar dynamo as a whole. Phenomenological approaches will be avoided in favor of obtaining rigorous results. A major problem is catastrophic quenching, i.e. the decline of dynamo effects in inverse proportion to the magnetic Reynolds number, which is huge. Tremendous advances have been made in the last few years since the cause of catastrophic quenching in dynamos has been understood in terms of magnetic helicity evolution. The numerical tools are now in place to allow for magnetic helicity fluxes via coronal mass ejections, thus alleviating catastrophic quenching. This work employs simulations in spherical shells, augmented by Cartesian simulations in special cases. The roles of the near-surface shear layer, the tachocline, as well as pumping in the bulk of the convection zone are to be clarified. The Pencil Code will be used for most applications. The code is third order in time and sixth order in space and is used for solving the hydromagnetic equations. It is a public domain code developed by roughly 20 scientists worldwide and maintained under a central versioning system at Nordita. Automatic nightly tests of currently 30 applications ensure the integrity of the code. It is used for a wide range of applications and may include the effects of radiation, self-gravity, dust, chemistry, variable ionization, and cosmic rays, in addition to those of magnetohydrodynamics.
The code with its infrastructure offers a good opportunity for individuals within a broad group of people to develop new tools that may automatically be useful to others.
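For context on the "sixth order in space" statement, the standard sixth-order centred first-derivative stencil that high-order finite-difference codes of this kind employ can be sketched as follows; this is a generic textbook stencil, not a reproduction of the Pencil Code's actual implementation.

```python
import numpy as np

# Sixth-order centred first derivative on a uniform periodic grid:
#   f'_i ~ (-f_{i-3} + 9 f_{i-2} - 45 f_{i-1} + 45 f_{i+1} - 9 f_{i+2} + f_{i+3}) / (60 h)

def deriv6(f, h):
    """Sixth-order first derivative, periodic boundary conditions."""
    c = np.array([-1.0, 9.0, -45.0, 0.0, 45.0, -9.0, 1.0]) / (60.0 * h)
    # offset m = k - 3; f_{i+m} is obtained via np.roll(f, -m)
    return sum(ck * np.roll(f, 3 - k) for k, ck in enumerate(c))

# convergence check on f = sin(x): halving h should cut the error by ~2^6 = 64
errs = []
for n in (32, 64):
    x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    h = x[1] - x[0]
    errs.append(np.max(np.abs(deriv6(np.sin(x), h) - np.cos(x))))
assert errs[0] / errs[1] > 50   # consistent with sixth-order accuracy
```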
Max ERC Funding
2 220 000 €
Duration
Start date: 2009-02-01, End date: 2014-01-31
Project acronym ASTROGEOBIOSPHERE
Project An astronomical perspective on Earth's geological record and evolution of life
Researcher (PI) Birger Schmitz
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE10, ERC-2011-ADG_20110209
Summary This project will develop the use of relict, extraterrestrial minerals in Archean to Cenozoic slowly formed sediments as tracers of events in the solar system and cosmos, and will use them to decipher the possible relation between such events and the evolution of life and environmental change on Earth. There has been consensus that it would not be possible to reconstruct variations in the flux of different types of meteorites to Earth through the ages. Meteorite falls are rare, and meteorites weather and decay rapidly on the Earth's surface. However, in recent years we have developed the first realistic approach to circumventing these problems. Almost all meteorite types contain a small fraction of spinel minerals that survives weathering and can be recovered from large samples of condensed sediments of any age. Inside the spinels we can locate, by synchrotron-light X-ray tomography, 1-30 micron-sized inclusions of most of the other minerals that made up the original meteorite. With cutting-edge frontier microanalyses such as Ne-21 (solar wind, galactic rays), oxygen isotopes (meteorite group and type) and cosmic-ray tracks (supernova densities), we will be able to unravel from the geological record fundamental new information about the solar system at specific times through the past 3.8 Gyr. Variations in the flux and types of meteorites may reflect solar-system and galaxy gravity disturbances as well as the sequence of disruptions of the parent bodies for meteorite types known and not yet known. Cosmic-ray tracks in spinels may identify the galactic year (230 Myr) in the geological record. For the first time it will be possible to systematically relate major global biotic and tectonic events, changes in sea level, climate and asteroid and comet impacts to what happened in the larger astronomical realm. In essence, the project is a robust approach to establishing a pioneering "astrostratigraphy" for Earth's geological record, complementing existing bio-, chemo-, and magnetostratigraphies.
Max ERC Funding
1 950 000 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym ATMO
Project Atmospheres across the Universe
Researcher (PI) Pascal TREMBLIN
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE9, ERC-2017-STG
Summary Which molecules are present in the atmosphere of exoplanets? What are their mass, radius and age? Do they have clouds, convection (atmospheric turbulence), fingering convection, or a circulation induced by irradiation? These questions are fundamental in exoplanetology in order to study issues such as planet formation and exoplanet habitability.
Yet, the impact of fingering convection and of the circulation induced by irradiation remains poorly understood:
- Fingering convection (triggered by gradients of mean-molecular-weight) has already been suggested to happen in stars (accumulation of heavy elements) and in brown dwarfs and exoplanets (chemical transition e.g. CO/CH4). A large-scale efficient turbulent transport of energy through the fingering instability can reduce the temperature gradient in the atmosphere and explain many observed spectral properties of brown dwarfs and exoplanets. Nonetheless, this large-scale efficiency is not yet characterized and standard approximations (Boussinesq) cannot be used to achieve this goal.
- The interaction between atmospheric circulation and the fingering instability is an open question in the case of irradiated exoplanets. Fingering convection can change the location and magnitude of the hot spot induced by irradiation, whereas the hot deep atmosphere induced by irradiation can change the location of the chemical transitions that trigger the fingering instability.
This project will characterize the impact of fingering convection in the atmosphere of stars, brown dwarfs, and exoplanets and its interaction with the circulation in the case of irradiated planets. By developing innovative numerical models, we will characterize the reduction of the temperature gradient of the atmosphere induced by the instability and study the impact of the circulation. We will then predict and interpret the mass, radius, and chemical composition of exoplanets that will be observed with future missions such as the James Webb Space Telescope (JWST).
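The fingering regime discussed above is conventionally delimited by the linear density-ratio criterion: with R0 = (nabla - nabla_ad) / nabla_mu, the instability operates for 1 < R0 < 1/tau, where tau is the ratio of compositional to thermal diffusivity. A minimal sketch, with purely illustrative numbers that are not taken from the project:

```python
# Linear criterion for fingering (double-diffusive) instability in a
# thermally stable layer (nabla < nabla_ad) with an inverse
# mean-molecular-weight gradient (nabla_mu < 0).

def fingering_unstable(grad, grad_ad, grad_mu, tau):
    """True if the density ratio R0 lies in the fingering window 1 < R0 < 1/tau."""
    r0 = (grad - grad_ad) / grad_mu
    return 1.0 < r0 < 1.0 / tau

# slow compositional diffusion (tau << 1): the layer fingers
assert fingering_unstable(grad=0.25, grad_ad=0.28, grad_mu=-0.0015, tau=0.01)
# same stratification with tau -> 1: the window closes and the layer is stable
assert not fingering_unstable(grad=0.25, grad_ad=0.28, grad_mu=-0.0015, tau=1.0)
```

Characterizing how efficiently the resulting turbulence transports energy beyond this linear threshold, which the Boussinesq approximation cannot capture at large scale, is precisely the gap the project targets.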
Max ERC Funding
1 500 000 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym ATMOFLEX
Project Turbulent Transport in the Atmosphere: Fluctuations and Extreme Events
Researcher (PI) Jérémie Bec
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE3, ERC-2009-StG
Summary A major part of the physical and chemical processes occurring in the atmosphere involves the turbulent transport of tiny particles. Current studies and models use a formulation in terms of mean fields, where the strong variations in the dynamical and statistical properties of the particles are neglected and where the underlying fluctuations of the fluid flow velocity are oversimplified. Devising an accurate understanding of the influence of air turbulence and of the extreme fluctuations that it generates in the dispersed phase remains a challenging issue. This project aims at coordinating and integrating theoretical, numerical, experimental, and observational efforts to develop a new statistical understanding of the role of fluctuations in atmospheric transport processes. The proposed work will cover individual as well as collective behaviors and will provide a systematic and unified description of targeted specific processes involving suspended drops or particles: the dispersion of pollutants from a source, the growth by condensation and coagulation of droplets and ice crystals in clouds, the scavenging, settling and re-suspension of aerosols, and the radiative and climatic effects of particles. The proposed approach is based on the use of tools borrowed from statistical physics and field theory, and from the theory of large deviations and of random dynamical systems, in order to design new observables that will be simultaneously tractable analytically in simplified models and of relevance for the quantitative handling of such physical mechanisms. One of the outcomes will be to provide a new framework for improving and refining the methods used in meteorology and atmospheric sciences and to answer the long-standing question of the effects of suspended particles on climate.
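As a minimal sketch of the kind of particle model underlying such transport studies, consider a heavy particle that relaxes toward the local fluid velocity with a Stokes response time tau_p while settling under gravity, dv/dt = (u - v)/tau_p + g; in still air it reaches the terminal velocity v_t = tau_p * g. All parameter values are illustrative.

```python
# Heavy-particle (Stokes drag) model: dv/dt = (u - v)/tau_p - g along z.
# In still air (u = 0) the velocity converges to the terminal value -tau_p*g.

tau_p, g, dt = 0.5, 9.81, 1e-3   # illustrative Stokes time [s], gravity, step
v = 0.0
for _ in range(20000):                  # integrate for 20 s >> tau_p
    v += dt * ((0.0 - v) / tau_p - g)   # u = 0; gravity points along -z
assert abs(v + tau_p * g) < 1e-6        # v -> -tau_p * g (terminal settling)
```

In a turbulent flow u(x, t) fluctuates strongly along each trajectory, and it is the statistics of those fluctuations, rather than the mean field, that the project proposes to treat with large-deviation and random-dynamical-systems tools.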
Max ERC Funding
1 200 000 €
Duration
Start date: 2009-11-01, End date: 2014-10-31
Project acronym ATMOGAIN
Project Atmospheric Gas-Aerosol Interface: From Fundamental Theory to Global Effects
Researcher (PI) Ilona Anniina Riipinen
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Starting Grant (StG), PE10, ERC-2011-StG_20101014
Summary Atmospheric aerosol particles are a major player in the earth system: they impact the climate by scattering and absorbing solar radiation, as well as regulating the properties of clouds. On regional scales aerosol particles are among the main pollutants deteriorating air quality. Capturing the impact of aerosols is one of the main challenges in understanding the driving forces behind changing climate and air quality.
Atmospheric aerosol numbers are governed by the ultrafine (< 100 nm in diameter) particles. Most of these particles have been formed from atmospheric vapours, and their fate and impacts are governed by the mass transport processes between the gas and particulate phases. These transport processes are currently poorly understood. Correct representation of the aerosol growth/shrinkage by condensation/evaporation of atmospheric vapours is thus a prerequisite for capturing the evolution and impacts of aerosols.
I propose to start a research group that will address the major current unknowns in atmospheric ultrafine particle growth and evaporation. First, we will develop a unified theoretical framework to describe the mass accommodation processes at aerosol surfaces, aiming to resolve the current ambiguity with respect to the uptake of atmospheric vapours by aerosols. Second, we will study the condensational properties of selected organic compounds and their mixtures. Organic compounds are known to contribute significantly to atmospheric aerosol growth, but the properties that govern their condensation, such as saturation vapour pressures and activities, are largely unknown. Third, we aim to resolve the gas and particulate phase processes that govern the growth of realistic atmospheric aerosol. Fourth, we will parameterize ultrafine aerosol growth, implement the parameterizations to chemical transport models, and quantify the impact of these condensation and evaporation processes on global and regional aerosol budgets.
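The gas-to-particle mass transport at issue can be illustrated with the textbook diffusion-limited (Maxwellian) growth law, under which the squared diameter grows linearly in time, d(Dp^2)/dt = 8 D M (p_inf - p_surf) / (R T rho_p). All parameter values below are illustrative assumptions; in particular, the vapour pressure difference stands in for exactly the poorly known saturation vapour pressures and activities the project aims to determine.

```python
# Diffusion-limited condensational growth of a single particle (Dp^2 law).
R = 8.314          # gas constant, J mol^-1 K^-1
D = 1e-5           # vapour diffusivity in air, m^2 s^-1
M = 0.1            # vapour molar mass, kg mol^-1 (organic-like, illustrative)
rho_p = 1200.0     # particle density, kg m^-3
T = 280.0          # temperature, K
dp_vap = 1e-7      # p_inf - p_surf, Pa (low-volatility organic, illustrative)

def grow(dp0, t):
    """Diameter after time t under the Dp^2 growth law."""
    k = 8.0 * D * M * dp_vap / (R * T * rho_p)   # d(Dp^2)/dt, m^2 s^-1
    return (dp0**2 + k * t) ** 0.5

dp = grow(10e-9, 3600.0)     # a 10 nm particle after one hour
assert 10e-9 < dp < 100e-9   # it grows, but remains in the ultrafine range
```

This continuum expression ignores the mass accommodation coefficient at the particle surface, which is exactly the first unknown the proposed unified framework is meant to resolve.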
Max ERC Funding
1 498 099 €
Duration
Start date: 2011-09-01, End date: 2016-08-31
Project acronym ATOMAG
Project From Attosecond Magnetism towards Ultrafast Spin Photonics
Researcher (PI) Jean-Yves Bigot
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE3, ERC-2009-AdG
Summary We propose to investigate a new frontier in Physics: the study of magnetic systems using attosecond laser pulses. The main disciplines concerned are: Ultrafast laser sciences, Magnetism and Spin-Photonics, Relativistic Quantum Electrodynamics. Three issues of modern magnetism are addressed. 1. How fast can one modify and control the magnetization of a magnetic system? 2. What is the role and essence of the coherent interaction between light and spins? 3. How far can spin-photonics bring us into the real world of data acquisition and storage? - We want first to provide solid ground experiments, unravelling the mechanisms involved in the demagnetization induced by laser pulses in a variety of magnetic materials (ferromagnetic nanostructures, aggregates and molecular magnets). We will explore the ultrafast magnetization dynamics of magnets using an attosecond laser source. - Second, we want to explore how the photon field interacts with the spins. We will investigate the dynamical regime where the potential of the atoms is dressed by the Coulomb potential induced by the laser field. Strong support from relativistic Quantum Electrodynamics is necessary towards that goal. - Third, even though our general approach is fundamental, we want to provide a benchmark of what is realistically possible in ultrafast spin-photonics, breaking the conventional thought that spin-photonics is hard to implement at the application level. We will realize ultimate devices combining magneto-optical microscopy with conventional magnetic recording. This new field will raise the interest of a number of competitive laboratories at the international level. Due to the overlapping disciplines, the project also carries a large educational impact, both fundamental and applied.
Max ERC Funding
2 492 561 €
Duration
Start date: 2010-05-01, End date: 2015-04-30
Project acronym Atto-Zepto
Project Ultrasensitive Nano-Optomechanical Sensors
Researcher (PI) Olivier ARCIZET
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE2, ERC-2018-COG
Summary By enabling the conversion of forces into measurable displacements, mechanical oscillators have always played a central role in experimental physics. Recent developments in the PI's group demonstrated the possibility of realizing ultrasensitive and vectorial force-field sensing by using suspended SiC nanowires and optical readout of their transverse vibrations. Astonishing sensitivities were obtained at room and dilution temperatures, at the atto- to zeptonewton level, for which the electron-electron interaction becomes detectable at 100 µm.
The goal of the project is to push forward those ultrasensitive nano-optomechanical force sensors, to realize even more challenging explorations of novel fundamental interactions at the quantum-classical interface.
We will develop universal advanced sensing protocols to explore the vectorial structure of fundamental optical, electrostatic or magnetic interactions, and investigate Casimir force fields above nanostructured surfaces, in geometries where it was recently predicted to become repulsive. The second research axis is the one of cavity nano-optomechanics: inserting the ultrasensitive nanowire in a high finesse optical microcavity should enhance the light-nanowire interaction up to the point where a single cavity photon can displace the nanowire by more than its zero point quantum fluctuations. We will investigate this so-called ultrastrong optomechanical coupling regime, and further explore novel regimes in cavity optomechanics, where optical non-linearities at the single photon level become accessible. The last part is dedicated to the exploration of hybrid qubit-mechanical systems, in which nanowire vibrations are magnetically coupled to the spin of a single Nitrogen Vacancy defect in diamond. We will focus on the exploration of spin-dependent forces, aiming at mechanically detecting qubit excitations, opening a novel road towards the generation of non-classical states of motion, and mechanically enhanced quantum sensors.
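An order-of-magnitude sketch of why such nanowires reach attonewton-level sensitivity: the thermal-noise-limited force sensitivity of a mechanical oscillator is sqrt(4 k_B T M omega_0 / Q) per square root of bandwidth. The nanowire parameters below are illustrative guesses, not the PI's actual devices.

```python
from math import sqrt, pi

# Thermal-noise-limited force sensitivity: sqrt(S_F) = sqrt(4 kB T M w0 / Q),
# in N/sqrt(Hz), for an oscillator of effective mass M, frequency w0, quality Q.
k_B = 1.380649e-23     # Boltzmann constant, J/K
T = 300.0              # room temperature, K
M = 1e-15              # effective mass ~1 pg (illustrative nanowire)
f0 = 10e3              # fundamental frequency ~10 kHz (illustrative)
Q = 1e3                # mechanical quality factor (illustrative)

force_noise = sqrt(4 * k_B * T * M * (2 * pi * f0) / Q)  # N per sqrt(Hz)
assert 1e-18 < force_noise < 1e-15   # tens of attonewtons per sqrt(Hz)
```

Cooling from room temperature to dilution temperatures reduces this floor by sqrt(T_room / T_dilution), which is how such sensors approach the zeptonewton scale quoted above.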
Max ERC Funding
2 067 905 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym AUGURY
Project Reconstructing Earth’s mantle convection
Researcher (PI) Nicolas Coltice
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Consolidator Grant (CoG), PE10, ERC-2013-CoG
Summary Knowledge of the state of the Earth's mantle and its temporal evolution is fundamental to a variety of disciplines in Earth Sciences, from internal dynamics to its many expressions in the geological record (postglacial rebound, sea-level change, ore deposits, tectonics or geomagnetic reversals). Mantle convection theory is the centerpiece for unravelling the present and past state of the mantle. For the past 40 years considerable efforts have been made to improve the quality of numerical models of mantle convection. However, they are still sparsely used to estimate the convective history of the solid Earth, in comparison to ocean or atmospheric models for weather and climate prediction. The main shortcoming is their inability to produce Earth-like seafloor spreading and continental drift self-consistently. Recent convection models have begun to successfully predict these processes (Coltice et al., Science 336, 335-33, 2012). Such a breakthrough opens the opportunity to combine high-level data assimilation methodologies and convection models with advanced tectonic datasets to retrieve Earth's mantle history. The scope of this project is to produce a new generation of tectonic and convection reconstructions, which are key to improving our understanding and knowledge of the evolution of the solid Earth. The development of sustainable high-performance numerical models will set new standards for geodynamic data assimilation. The outcome of the AUGURY project will be a new generation of models crucial to a wide variety of disciplines.
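For orientation, a sequential assimilation step of the kind alluded to (the summary does not specify the project's scheme; this is the generic Kalman form) is:

```latex
\mathbf{x}_a = \mathbf{x}_f + \mathbf{K}\left(\mathbf{y} - \mathbf{H}\mathbf{x}_f\right),
\qquad
\mathbf{K} = \mathbf{P}_f \mathbf{H}^{\mathsf T}\left(\mathbf{H}\mathbf{P}_f\mathbf{H}^{\mathsf T} + \mathbf{R}\right)^{-1},
```

where x_f is the model forecast (here, a convection state), y the tectonic observations, H the observation operator, and P_f and R the forecast and observation error covariances.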
Max ERC Funding
1 994 000 €
Duration
Start date: 2014-03-01, End date: 2020-02-29
Project acronym AXION
Project Axions: From Heaven to Earth
Researcher (PI) Frank Wilczek
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Advanced Grant (AdG), PE2, ERC-2016-ADG
Summary Axions are hypothetical particles whose existence would solve two major problems: the strong P, T problem (a major blemish on the standard model); and the dark matter problem. It is a most important goal to either observe or rule out the existence of a cosmic axion background. It appears that decisive observations may be possible, but only after orchestrating insight from specialities ranging from quantum field theory and astrophysical modeling to ultra-low noise quantum measurement theory. Detailed predictions for the magnitude and structure of the cosmic axion background depend on cosmological and astrophysical modeling, which can be constrained by theoretical insight and numerical simulation. In parallel, we must optimize strategies for extracting accessible signals from that very weakly interacting source.
While the existence of axions as fundamental particles remains hypothetical, the equations governing how axions interact with electromagnetic fields also govern (with different parameters) how certain materials interact with electromagnetic fields. Thus those materials embody “emergent” axions. The equations have remarkable properties, which one can test in these materials, and possibly put to practical use.
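The shared equations referred to here are those of axion electrodynamics; up to sign conventions, the coupling term is:

```latex
\mathcal{L}_{a\gamma\gamma}
\;=\; -\frac{g_{a\gamma\gamma}}{4}\, a\, F_{\mu\nu}\tilde{F}^{\mu\nu}
\;=\; g_{a\gamma\gamma}\, a\, \mathbf{E}\cdot\mathbf{B},
```

so an axion field a, whether fundamental or emergent in a material, mixes with photons wherever E·B is non-zero.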
Closely related to axions, mathematically, are anyons. Anyons are particle-like excitations that elude the familiar classification into bosons and fermions. Theoretical and numerical studies indicate that they are common emergent features of highly entangled states of matter in two dimensions. Recent work suggests the existence of states of matter, both natural and engineered, in which anyon dynamics is both important and experimentally accessible. Since the equations for anyons and axions are remarkably similar, and both have common, deep roots in symmetry and topology, it will be fruitful to consider them together.
Max ERC Funding
2 324 391 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym B Massive
Project Binary massive black hole astrophysics
Researcher (PI) Alberto SESANA
Host Institution (HI) UNIVERSITA' DEGLI STUDI DI MILANO-BICOCCA
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary Massive black hole binaries (MBHBs) are the most extreme, fascinating yet elusive astrophysical objects in the Universe. Establishing their existence observationally will be a milestone for contemporary astronomy, providing a fundamental missing piece in the puzzle of galaxy formation, piercing through the (hydro)dynamical physical processes shaping dense galactic nuclei from parsec scales down to the event horizon, and probing gravity in extreme conditions.
We can both see and listen to MBHBs. Remarkably, besides arguably being among the brightest variable objects shining in the Cosmos, MBHBs are also the loudest gravitational wave (GW) sources in the Universe. As such, we shall take advantage of both types of messenger they are sending to us – photons and gravitons – which can now be probed by all-sky time-domain surveys and radio pulsar timing arrays (PTAs), respectively.
B MASSIVE leverages a unique comprehensive approach combining theoretical astrophysics, radio and gravitational-wave astronomy and time-domain surveys with state-of-the-art data analysis techniques to: i) observationally prove the existence of MBHBs, ii) understand and constrain their astrophysics and dynamics, iii) enable and bring closer in time the direct detection of GWs with PTAs.
As a European PTA (EPTA) executive committee member and former International PTA (IPTA) chair, I am a driving force in the development of pulsar timing science worldwide, and the project will build on the profound knowledge, broad vision and wide collaboration network that established me as a world leader in the field of MBHB and GW astrophysics. B MASSIVE is extremely timely: a pulsar timing data set of unprecedented quality is being assembled by the EPTA/IPTA, and time-domain astronomy surveys are at their dawn. In the long term, B MASSIVE will be a fundamental milestone establishing European leadership in the cutting-edge field of MBHB astrophysics in the era of LSST, SKA and LISA.
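For context (standard PTA phenomenology, not stated in the summary), the GW background from a population of circular, GW-driven MBHBs is usually parameterized by its characteristic strain spectrum:

```latex
h_c(f) \;=\; A_{\mathrm{yr}} \left( \frac{f}{1\,\mathrm{yr}^{-1}} \right)^{-2/3},
```

where the amplitude A_yr encodes the MBHB masses and merger rate; constraining this amplitude is how pulsar timing data inform objectives i)-iii).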
Max ERC Funding
1 532 750 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym BALLISTOP
Project Revealing 1D ballistic charge and spin currents in second order topological insulators
Researcher (PI) helene BOUCHIAT
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE3, ERC-2018-ADG
Summary One of the greatest recent achievements in condensed matter physics is the discovery of a new class of materials, topological insulators (TIs), whose bulk is insulating while the edges conduct current in a quasi-ideal way. In particular, the 1D edges of a 2DTI realize the Quantum Spin Hall state, where current is carried dissipationlessly by two counter-propagating ballistic edge states with a spin orientation locked to the propagation direction (a helical edge state). This opens many possibilities, ranging from dissipationless charge and spin transport at room temperature to new avenues for quantum computing. We propose to investigate charge and spin currents in a newly discovered class of TIs, Second Order Topological Insulators (SOTIs), i.e. 3D crystals with insulating bulk and surfaces, but perfectly conducting (topologically protected) 1D helical “hinge” states. Bismuth, despite its well-known semimetallic character, has recently been shown theoretically to belong to this class of materials, explaining our recent intriguing findings on nanowires. Our goal is to reveal, characterize and exploit the unique properties of SOTIs, in particular the high-velocity, ballistic, and dissipationless hinge currents. We will probe crystalline bismuth samples with refined new experimental tools. The superconducting proximity effect will reveal the spatial distribution of conduction paths, and test the ballisticity of the hinge modes (which may coexist with non-topological surface modes). High-frequency and tunnel spectroscopies of hybrid superconductor/Bi circuits will probe their topological nature, including the existence of Majorana modes. We will use high-sensitivity magnetometers to detect the orbital magnetism of SOTI platelets, which should be dominated by topological edge currents. Lastly, we propose to detect the predicted equilibrium spin currents in 2DTIs and SOTIs via the generated electric field, using single-electron-transistor-based electrometers.
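A standard benchmark for such ballistic transport (a textbook fact, not taken from the summary): each topologically protected helical channel contributes one conductance quantum,

```latex
G \;=\; \frac{e^2}{h} \;\approx\; \left(25.8\,\mathrm{k\Omega}\right)^{-1},
```

so a two-terminal measurement across a sample carrying two such edge or hinge channels would ideally give 2e^2/h, a signature the proposed transport probes can test against.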
Max ERC Funding
2 432 676 €
Duration
Start date: 2020-04-01, End date: 2025-03-31
Project acronym BEBOP
Project Bacterial biofilms in porous structures: from biomechanics to control
Researcher (PI) Yohan, Jean-Michel, Louis DAVIT
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2018-STG
Summary The key ideas motivating this project are that: 1) precise control of the properties of porous systems can be obtained by exploiting bacteria and their fantastic abilities; 2) conversely, porous media (large surface to volume ratios, complex structures) could be a major part of bacterial synthetic biology, as a scaffold for growing large quantities of microorganisms in controlled bioreactors.
The main scientific obstacle to precise control of such processes is the lack of understanding of biophysical mechanisms in complex porous structures, even in the case of single-strain biofilms. The central hypothesis of this project is that a better fundamental understanding of biofilm biomechanics and physical ecology will yield a novel theoretical basis for engineering and control.
The first scientific objective is thus to gain insight into how fluid flow, transport phenomena and biofilms interact within connected multiscale heterogeneous structures - a major scientific challenge with wide-ranging implications. To this end, we will combine microfluidic and 3D-printed micro-bioreactor experiments; fluorescence and X-ray imaging; and high-performance computing blending CFD, individual-based models and pore-network approaches.
The second scientific objective is to create the primary building blocks toward a control theory of bacteria in porous media and innovative designs of microbial bioreactors. Building upon the previous objective, we first aim to extract from the complexity of biological responses the most universal engineering principles applying to such systems. We will then design a novel porous micro-bioreactor to demonstrate how the permeability and solute residence times can be controlled in a dynamic, reversible and stable way - an initial step toward controlling reaction rates.
We envision that this will unlock a new generation of biotechnologies and novel bioreactor designs enabling translation from proof-of-concept synthetic microbiology to industrial processes.
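The two quantities targeted for control above can be tied together by textbook relations (our gloss, assuming single-phase Darcy flow):

```latex
\mathbf{q} \;=\; -\frac{k}{\mu}\,\nabla p,
\qquad
\tau \;=\; \frac{\phi V}{Q},
```

where k is the permeability (modified by biofilm growth), q the Darcy flux, μ the fluid viscosity, φ the porosity, V the reactor volume and Q the volumetric flow rate; biofilm-induced changes in k and φ therefore shift both the pressure drop and the solute residence time τ.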
Max ERC Funding
1 649 861 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym BIC
Project Cavitation across scales: following Bubbles from Inception to Collapse
Researcher (PI) Carlo Massimo Casciola
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Advanced Grant (AdG), PE8, ERC-2013-ADG
Summary Cavitation is the formation of vapor cavities inside a liquid due to low pressure. Cavitation is a ubiquitous and destructive phenomenon common to most engineering applications that deal with flowing water. At the same time, the extreme conditions realized in cavitation are increasingly exploited in medicine, chemistry, and biology. What makes cavitation unpredictable is its multiscale nature: nucleation of vapor bubbles heavily depends on micro- and nanoscale details; mesoscale phenomena, such as bubble collapse, determine relevant macroscopic effects, e.g., cavitation damage. In addition, macroscopic flow conditions, such as turbulence, have a major impact on it.
The objective of the BIC project is to develop the missing multiscale description of cavitation by proposing new integrated numerical methods capable of performing quantitative predictions. The detailed and physically sound understanding of the multifaceted phenomena involved in cavitation (nucleation, bubble growth, transport, and collapse in turbulent flows) fostered by the BIC project will result in new methods for designing not only fluid machinery but also therapies in ultrasound medicine and chemical reactors. The BIC project builds upon the exceptionally broad experience of the PI and of his research group in numerical simulations of flows at different scales, including advanced atomistic simulations of nanoscale wetting phenomena, mesoscale models for multiphase flows, and particle-laden turbulent flows. The envisaged numerical methodologies (free-energy atomistic simulations, phase-field models, and Direct Numerical Simulation of bubble-laden flows) will be supported by targeted experimental activities, designed to validate models and characterize realistic conditions.
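As a reference point for the mesoscale bubble dynamics (the canonical single-bubble model, not necessarily the one BIC will adopt), the Rayleigh-Plesset equation for a spherical bubble of radius R(t) in an incompressible liquid reads:

```latex
R\ddot{R} + \frac{3}{2}\dot{R}^{2}
\;=\; \frac{1}{\rho}\left[\, p_B(t) - p_\infty(t) - \frac{2\sigma}{R} - \frac{4\mu\dot{R}}{R} \,\right],
```

with ρ the liquid density, σ the surface tension, μ the dynamic viscosity, p_B the pressure inside the bubble and p_∞ the far-field pressure driving growth or collapse.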
Max ERC Funding
2 491 200 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym Big Mac
Project Microfluidic Approaches mimicking BIoGeological conditions to investigate subsurface CO2 recycling
Researcher (PI) SAMUEL CHARLES GEORGES MARRE
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2016-COG
Summary The management of anthropogenic CO2 will be one of the main challenges of this century given the dramatic impact of greenhouse gases on our living environment. A fascinating strategy to restore the advantages of stored CO2 as a raw material would be to consider a slow biological upgrading process of CO2 in deep geological formations.
Significantly, the recent development of microfluidic tools to study pore-scale phenomena under high pressure opens new avenues to investigate such strategies. Thus, the strategic objective of this project is to develop and use “Biological Geological Laboratories on a Chip - BioGLoCs” mimicking reservoir conditions in order to gain greater understanding of the mechanisms associated with the biogeological conversion of CO2 to methane in CGS environments at the pore scale.
The specific objectives are: (1) to determine the experimental conditions for the development of competent micro-organisms (methanogens) and to establish the methane production rates depending on the operating parameters, (2) to evaluate the feasibility of a H2 in situ production strategy (required to sustain the methanogenesis process), (3) to investigate the full bioconversion process in 2D and 3D, (4) to demonstrate the process scaling from pore scale to liter scale and (5) to evaluate the overall process performance.
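The bioconversion targeted in objectives (1)-(3) is hydrogenotrophic methanogenesis, whose overall stoichiometry (a standard fact, not spelled out in the summary) is:

```latex
\mathrm{CO_2} \;+\; 4\,\mathrm{H_2} \;\longrightarrow\; \mathrm{CH_4} \;+\; 2\,\mathrm{H_2O},
```

which is why objective (2), in-situ H2 production, is required to sustain the process.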
This multidisciplinary project, gathering expertise in chemical engineering and geomicrobiology, will be the first-ever use of microfluidic approaches to investigate a biogeological transformation taking into account the thermo-hydro-bio-chemical processes. It will result in the identification of efficient geomicrobiological methods and materials to accelerate the CO2-to-methane biogeoconversion process. New generic lab-scale tools will also be made available for investigating geology-related topics (enhanced oil recovery, deep geothermal energy, bioremediation of groundwater, shale gas recovery).
Max ERC Funding
1 995 354 €
Duration
Start date: 2017-11-01, End date: 2022-10-31
Project acronym BigFastData
Project Charting a New Horizon of Big and Fast Data Analysis through Integrated Algorithm Design
Researcher (PI) Yanlei DIAO
Host Institution (HI) ECOLE POLYTECHNIQUE
Call Details Consolidator Grant (CoG), PE6, ERC-2016-COG
Summary This proposal addresses a pressing need from emerging big data applications such as genomics and data center monitoring: besides the scale of processing, big data systems must also enable perpetual, low-latency processing for a broad set of analytical tasks, referred to as big and fast data analysis. Today’s technology falls severely short of such needs due to the lack of support for complex analytics with scale, low latency, and strong guarantees of user performance requirements. To bridge the gap, this proposal tackles a grand challenge: “How do we design an algorithmic foundation that enables the development of all necessary pillars of big and fast data analysis?” This proposal considers three pillars:
1) Parallelism: There is a fundamental tension between data parallelism (for scale) and pipeline parallelism (for low latency). We propose new approaches based on intelligent use of memory and workload properties to integrate both forms of parallelism.
2) Analytics: The literature lacks a large body of algorithms for critical order-related analytics to be run under data and pipeline parallelism. We propose new algorithmic frameworks to enable such analytics.
3) Optimization: To run analytics, today's big data systems are best effort only. We transform such systems into a principled optimization framework that suits the new characteristics of big data infrastructure and adapts to meet user performance requirements.
The scale and complexity of the proposed algorithm design make this project high-risk and, at the same time, high-gain: it will lay a solid foundation for big and fast data analysis, enabling a new integrated parallel processing paradigm, algorithms for critical order-related analytics, and a principled optimizer with strong performance guarantees. It will also broadly enable accelerated information discovery in emerging domains such as genomics, as well as economic benefits of early, well-informed decisions and reduced user payments.
Max ERC Funding
2 472 752 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BIHSNAM
Project Bio-inspired Hierarchical Super Nanomaterials
Researcher (PI) Nicola Pugno
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TRENTO
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary "Nanomaterials such as carbon nanotubes or graphene sheets represent the future of materials science, due to their potentially exceptional mechanical properties. One great drawback of all artificial materials, however, is the decrease of strength with increasing toughness, and vice versa. This problem is not encountered in many biological nanomaterials (e.g. spider silk, bone, nacre). Other biological materials display exceptional adhesion or damping properties, and can be self-cleaning or self-healing. The “secret” of biomaterials seems to lie in “hierarchy”: several levels can often be identified (2 in nacre, up to 7 in bone and dentine), from nano- to micro-scale.
The idea of this project is to combine Nature and Nanotechnology to design hierarchical composites with tailor made characteristics, optimized with respect to both strength and toughness, as well as materials with strong adhesion/easy detachment, smart damping, self-healing/-cleaning properties or controlled energy dissipation. For example, one possible objective is to design the “world’s toughest composite material”. The potential impact and importance of these goals on materials science, the high-tech industry and ultimately the quality of human life could be considerable.
In order to tackle such a challenging design process, the PI proposes to adopt state-of-the-art theoretical nanomechanics tools corroborated by continuum or atomistic simulations, multi-scale numerical parametric simulations and Finite Element optimization procedures, starting from characterization experiments on biological or nano-materials, from the macroscale to the nanoscale. Results from the theoretical, numerical and experimental work packages will be applied to a specific case study in an engineering field of particular interest, to demonstrate importance and feasibility, e.g. an airplane wing with considerably enhanced fatigue resistance and reduced ice-layer adhesion, leading to a 10-fold reduction in wasted fuel."
Max ERC Funding
1 004 400 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym BIOFUNCTION
Project Self assembly into biofunctional molecules, translating instructions into function
Researcher (PI) Nicolas Winssinger
Host Institution (HI) UNIVERSITE DE STRASBOURG
Call Details Starting Grant (StG), PE4, ERC-2007-StG
Summary The overall objective of the proposal is to develop enabling chemical technologies to address two important problems in biology: first, to detect gene expression or microRNA sequences in vivo in a nondestructive fashion and, secondly, to study the role of multivalency and spatial organization in carbohydrate recognition. Both of these projects exploit the programmable pre-organization of peptide nucleic acid (PNA) to induce a chemical reaction in the first case or modulate a ligand-receptor interaction in the second case. For nucleic acid detection, a DNA or RNA fragment will be utilized to bring two PNA fragments bearing reactive functionalities into close proximity, thereby promoting a reaction. Two types of reactions are proposed, the first one to release a fluorophore for imaging purposes and the second one to release a drug as an “intelligent” therapeutic. If affinities are programmed such that hybridization is reversible, the template can work catalytically, leading to large amplifications. As a proof of concept, this method will be used to measure the transcription level of genes implicated in stem cell differentiation and to detect mutations in oncogenes. For the purpose of studying multivalent carbohydrate ligand architectures, the challenge of chemical synthesis has been a limiting factor. A supramolecular approach is proposed herein where different arrangements of carbohydrates can be displayed in a well-organized fashion by hybridizing PNA-tagged carbohydrates to DNA templates. This will be used not only to control the distance between multiple ligands or to create combinatorial arrangements of hetero ligands but also to access more complex architectures such as Holliday junctions. The oligosaccharide units will be prepared using de novo organocatalytic reactions. This technology will first be applied to probe the recognition events between HIV and dendritic cells which promote HIV infection.
Max ERC Funding
1 249 980 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym BIOINOHYB
Project Smart Bioinorganic Hybrids for Nanomedicine
Researcher (PI) Cristiana Di Valentin
Host Institution (HI) UNIVERSITA' DEGLI STUDI DI MILANO-BICOCCA
Call Details Consolidator Grant (CoG), PE5, ERC-2014-CoG
Summary The use of bioinorganic nanohybrids (nanoscaled systems based on an inorganic and a biological component) has already resulted in several innovative medical breakthroughs for drug delivery, therapeutics, imaging, diagnosis and biocompatibility. However, researchers still know relatively little about the structure, function and mechanism of these nanodevices. Theoretical investigations of bioinorganic interfaces are mostly limited to force-field approaches, which cannot grasp the details of the physicochemical mechanisms. The BIOINOHYB project proposes to capitalize on recent massively parallelized codes to investigate bioinorganic nanohybrids by advanced quantum chemical methods. This approach will make it possible to master the chemical and electronic interplay between the bio and inorganic components in the first part of the project, and the interaction of the hybrid systems with light in the second part. The ultimate goal is to provide the design principles for novel, unconventional assemblies with unprecedented functionalities and strong impact potential in nanomedicine.
More specifically, in this project the traditional metallic nanoparticle will be substituted by emerging semiconducting metal oxide nanostructures with photocatalytic or magnetic properties capable of opening totally new horizons in nanomedicine (e.g. photocatalytic therapy, a new class of contrast agents, magnetically guided drug delivery). Potentially efficient linkers will be screened regarding their ability both to anchor surfaces and to bind biomolecules. Different kinds of biomolecules (from oligopeptides and oligonucleotides to small drugs) will be tethered to the activated surface according to the desired functionality. The key computational challenge, requiring the recourse to more sophisticated methods, will be the investigation of the photo-response to light of the assembled bioinorganic systems, also with specific reference to their labelling with fluorescent markers and contrast agents.
Max ERC Funding
1 748 125 €
Duration
Start date: 2016-02-01, End date: 2021-01-31
Project acronym BIOLOCHANICS
Project Localization in biomechanics and mechanobiology of aneurysms: Towards personalized medicine
Researcher (PI) Stéphane Henri Anatole Avril
Host Institution (HI) ASSOCIATION POUR LA RECHERCHE ET LE DEVELOPPEMENT DES METHODES ET PROCESSUS INDUSTRIELS
Call Details Consolidator Grant (CoG), PE8, ERC-2014-CoG
Summary Rupture of Aortic Aneurysms (AA), which kills more than 30 000 persons every year in Europe and the USA, is a complex phenomenon that occurs when the wall stress exceeds the local strength of the aorta due to degraded properties of the tissue. The state of the art in AA biomechanics and mechanobiology reveals that major scientific challenges still have to be addressed to permit patient-specific computational predictions of AA rupture and enable localized repair of the structure with targeted pharmacologic treatment. A first challenge relates to ensuring an objective prediction of the localized mechanisms preceding rupture. A second challenge relates to modelling the patient-specific evolutions of material properties leading to those localized mechanisms. Addressing these challenges is the aim of the BIOLOCHANICS proposal. We will take into account the internal length-scales controlling localization mechanisms preceding AA rupture by implementing an enriched, also named nonlocal, continuum damage theory in the computational models of AA biomechanics and mechanobiology. We will also develop very advanced experiments, based on full-field optical measurements, aimed at characterizing localization mechanisms occurring in aortic tissues and at identifying local distributions of material properties at different stages of AA progression. A first in vivo application will be performed on genetic and pharmacological mouse and rat models of AA. Eventually, a retrospective clinical study involving more than 100 patients at the Saint-Etienne University hospital will permit calibrating estimations of AA rupture risk thanks to our novel approaches and infusing them into future clinical practice. Through the achievements of BIOLOCHANICS, nonlocal mechanics may be extended to other soft tissues for applications in orthopaedics, oncology, sport biomechanics, interventional surgery, human safety, cell biology, etc.
Max ERC Funding
1 999 396 €
Duration
Start date: 2015-05-01, End date: 2020-04-30
Project acronym BIOMIM
Project Biomimetic films and membranes as advanced materials for studies on cellular processes
Researcher (PI) Catherine Cecile Picart
Host Institution (HI) INSTITUT POLYTECHNIQUE DE GRENOBLE
Call Details Starting Grant (StG), PE5, ERC-2010-StG_20091028
Summary The main objective nowadays in the field of biomaterials is to design high-performing bioinspired materials by learning from natural processes. Importantly, biochemical and physical cues are key parameters that can affect cellular processes. Controlling processes that occur at the cell/material interface is also of prime importance in guiding the cell response. The main aim of the current project is to develop novel functional bio-nanomaterials for in vitro biological studies. Our strategy is based on two related projects.
The first project deals with the rational design of smart films with foreseen applications in musculoskeletal tissue engineering. We will gain knowledge of key cellular processes by designing well-defined self-assembled thin coatings. These multi-functional surfaces with bioactivity (incorporation of growth factors), mechanical (film stiffness) and topographical properties (spatial control of the film's properties) will serve as tools to mimic the complexity of the natural materials in vivo and to present bioactive molecules in the solid phase. We will gain a better fundamental understanding of how cellular functions, including adhesion and differentiation of muscle cells, are affected by the materials' surface properties.
In the second project, we will investigate at the molecular level a crucial aspect of cell adhesion and motility, which is the intracellular linkage between the plasma membrane and the cell cytoskeleton. We aim to elucidate the role of ERM proteins, especially ezrin and moesin, in the direct linkage between the plasma membrane and actin filaments. Here again, we will use a well-defined microenvironment in vitro to simplify the complexity of the interactions that occur in cellulo. To this end, lipid membranes containing a key regulator lipid from the phosphoinositide family, PIP2, will be employed in conjunction with purified proteins to investigate actin regulation by ERM proteins in the presence of PIP2-membranes.
Max ERC Funding
1 499 996 €
Duration
Start date: 2011-06-01, End date: 2016-05-31
Project acronym BioMNP
Project Understanding the interaction between metal nanoparticles and biological membranes
Researcher (PI) Giulia Rossi
Host Institution (HI) UNIVERSITA DEGLI STUDI DI GENOVA
Call Details Starting Grant (StG), PE3, ERC-2015-STG
Summary The BioMNP objective is the molecular-level understanding of the interactions between surface functionalized metal nanoparticles and biological membranes, by means of cutting-edge computational techniques and new molecular models.
Metal nanoparticles (NPs) play increasingly important roles in pharmaceutical and medical technology as diagnostic or therapeutic devices. Metal NPs can nowadays be engineered in a multitude of shapes, sizes and compositions, and they can be decorated with an almost infinite variety of functionalities. Despite such technological advances, there is still poor understanding of the molecular processes that drive the interactions of metal NPs with cells. Cell membranes are the first barrier encountered by NPs entering living organisms. The understanding and control of the interaction of nanoparticles with biological membranes is therefore of paramount importance for understanding the molecular basis of the NPs' biological effects.
BioMNP will go beyond the state of the art by rationalizing the complex interplay of NP size, composition, functionalization and aggregation state during the interaction with model biomembranes. Membranes, in turn, will be modelled at an increasing level of complexity in terms of lipid composition and phase. BioMNP will rely on cutting-edge simulation techniques and facilities, and develop new coarse-grained models grounded on finer-level atomistic simulations, to study the NP-membrane interactions on an extremely large range of length and time scales.
BioMNP will benefit from important and complementary experimental collaborations, will propose interpretations of the available experimental data and will make predictions to guide the design of functional, non-toxic metal nanoparticles for biomedical applications. BioMNP aims at answering fundamental questions at the crossroads of physics, biology and chemistry. Its results will have an impact on nanomedicine, toxicology, nanotechnology and materials science.
Max ERC Funding
1 131 250 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym BIOMOFS
Project Bioapplications of Metal Organic Frameworks
Researcher (PI) Christian Serre
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2007-StG
Summary This project will focus on the use of nanoporous metal organic frameworks (Fe, Zn, Ti) for bioapplications. These systems are exciting porous solids, built up from inorganic clusters and polycarboxylates. This results in open-framework solids with different pore shapes and dimensions, and applications such as catalysis, separation and storage of gases. I have recently initiated the synthesis of new trivalent transition metal carboxylates. Among them, the metal carboxylates MIL-100 and MIL-101 (MIL: Materials of Institut Lavoisier) are spectacular solids with giant pores (25-34 Å), accessible metal sites and huge surface areas (3100-5900 m2.g-1). Recently, it was shown that these solids could be used for drug delivery, with a loading of 1.4 g of Ibuprofen per gram of MIL-101 solid and a total release over six days. This project will concentrate on the application of MOFs to drug release and other bioapplications. Whereas research on drug delivery is currently focused either on the use of bio-compatible polymers or on mesoporous materials, our method will combine the advantages of both routes, including a high loading and a slow release of therapeutic molecules. A second application will use solids with accessible metal sites to coordinate NO for its controlled delivery. This would provide exogenous NO for prophylactic and therapeutic processes, anti-thrombogenic medical devices, improved dressings for wounds and ulcers, and the treatment of fungal and bacterial infections. Finally, other applications will be envisaged, such as the purification of physiological fluids. The project, which will consist of a systematic study of the relation between these properties and both the composition and structure of the hybrid solids, will be assisted by a strong modelling effort including state-of-the-art computational methods (QSAR and QSPKR).
This high-impact project will be realised by assembling experienced researchers in multidisciplinary areas including materials science, biology and modelling. It will involve P. Horcajada (Institut Lavoisier), whose background in pharmaceutical science will complement my experience in inorganic chemistry, and G. Maurin (Institut Gerhardt, Montpellier), an expert in computational chemistry.
Max ERC Funding
1 250 000 €
Duration
Start date: 2008-06-01, End date: 2013-05-31
Project acronym BIORECAR
Project Direct cell reprogramming therapy in myocardial regeneration through an engineered multifunctional platform integrating biochemical instructive cues
Researcher (PI) Valeria CHIONO
Host Institution (HI) POLITECNICO DI TORINO
Call Details Consolidator Grant (CoG), PE8, ERC-2017-COG
Summary In BIORECAR I will develop a new breakthrough multifunctional biomaterial-based platform for myocardial regeneration after myocardial infarction, provided with biochemical cues able to enhance the direct reprogramming of human cardiac fibroblasts into functional cardiomyocytes.
My expertise in bioartificial materials and biomimetic scaffolds and the versatile chemistry of polyurethanes will be the key elements to achieve a significant knowledge and technological advancement in cell reprogramming therapy, opening the way to the future translation of the therapy into the clinics.
I will implement this advanced approach through the design of a novel 3D in vitro tissue-engineered model of human cardiac fibrotic tissue, as a tool for testing and validation, to maximise research efforts and reduce animal tests.
I will adapt novel nanomedicine approaches I have recently developed for drug release to design innovative cell-friendly and efficient polyurethane nanoparticles for targeted reprogramming of cardiac fibroblasts.
I will design an injectable bioartificial hydrogel based on a blend of a thermosensitive polyurethane and a natural component selected among a novel cell-secreted natural polymer mixture (“biomatrix”) recapitulating the complexity of cardiac extracellular matrix or one of its main protein constituents. Such multifunctional hydrogel will deliver in situ agents stimulating recruitment of cardiac fibroblasts together with the nanoparticles loaded with reprogramming therapeutics, and will provide biochemical signalling to stimulate efficient conversion of fibroblasts into mature cardiomyocytes.
First-in-field biomaterials-based innovations introduced by BIORECAR will enable more effective regeneration of functional myocardial tissue with respect to state-of-the-art approaches. BIORECAR's innovation is multidisciplinary in nature and will be accelerated towards future clinical translation through my clinical, scientific and industrial collaborations.
Max ERC Funding
2 000 000 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym BIOSMA
Project Mathematics for Shape Memory Technologies in Biomechanics
Researcher (PI) Ulisse Stefanelli
Host Institution (HI) CONSIGLIO NAZIONALE DELLE RICERCHE
Call Details Starting Grant (StG), PE1, ERC-2007-StG
Summary Shape Memory Alloys (SMAs) are nowadays widely exploited for the realization of innovative devices and have a great impact on the development of a variety of biomedical applications ranging from orthodontic archwires to vascular stents. The design, realization, and optimization of such devices are quite demanding tasks. Mathematics enters this process as a major tool for making the modeling more accurate, the numerical simulations more reliable, and the design more effective. Many material properties of SMAs, such as martensitic reorientation, training, and ferromagnetic behavior, are still to be properly and efficiently addressed. Therefore, new modeling ideas, along with original analytical and numerical techniques, are required. This project aims to address novel mathematical issues in order to move from experimental materials results toward the solution of real-scale biomechanical engineering problems. The research focus will be multidisciplinary, including modeling, analytic, numerical, and computational issues. Progress in the macroscopic description of SMAs, the computational simulation of real-scale SMA devices, and the optimization of the production processes will contribute to advances toward innovative applications.
Max ERC Funding
700 000 €
Duration
Start date: 2008-09-01, End date: 2013-08-31
Project acronym bioSPINspired
Project Bio-inspired Spin-Torque Computing Architectures
Researcher (PI) Julie Grollier
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2015-CoG
Summary In the bioSPINspired project, I propose to use my experience and skills in spintronics, non-linear dynamics and neuromorphic nanodevices to realize bio-inspired spin torque computing architectures. I will develop a bottom-up approach to build spintronic data processing systems that perform low power ‘cognitive’ tasks on-chip and could ultimately complement our traditional microprocessors. I will start by showing that spin torque nanodevices, which are multi-functional and tunable nonlinear dynamical nano-components, are capable of emulating both neurons and synapses. Then I will assemble these spin-torque nano-synapses and nano-neurons into modules that implement brain-inspired algorithms in hardware. The brain displays many features typical of non-linear dynamical networks, such as synchronization or chaotic behaviour. These observations have inspired a whole class of models that harness the power of complex non-linear dynamical networks for computing. Following such schemes, I will interconnect the spin torque nanodevices by electrical and magnetic interactions so that they can couple to each other, synchronize and display complex dynamics. Then I will demonstrate that when perturbed by external inputs, these spin torque networks can perform recognition tasks by converging to an attractor state, or use the separation properties at the edge of chaos to classify data. In the process, I will revisit these brain-inspired abstract models to adapt them to the constraints of hardware implementations. Finally I will investigate how the spin torque modules can be efficiently connected together with CMOS buffers to perform higher level computing tasks. The table-top prototypes, hardware-adapted computing models and large-scale simulations developed in bioSPINspired will lay the foundations of spin torque bio-inspired computing and open the path to the fabrication of fully integrated, ultra-dense and efficient CMOS/spin-torque nanodevice chips.
Max ERC Funding
1 907 767 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym BIOTENSORS
Project Biomedical Data Fusion using Tensor based Blind Source Separation
Researcher (PI) Sabine Jeanne A Van Huffel
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Summary: the quest for a general functional tensor framework for blind source separation
Our overall objective is the development of a general functional framework for solving tensor based blind source separation (BSS) problems in biomedical data fusion, using tensor decompositions (TDs) as basic core. We claim that TDs will allow the extraction of fairly complicated sources of biomedical activity from fairly complicated sets of uni- and multimodal data. The power of the new techniques will be demonstrated for three well-chosen representative biomedical applications for which extensive expertise and fully validated datasets are available in the PI’s team, namely:
• Metabolite quantification and brain tumour tissue typing using Magnetic Resonance Spectroscopic Imaging,
• Functional monitoring including seizure detection and polysomnography,
• Cognitive brain functioning and seizure zone localization using simultaneous Electroencephalography-functional MR Imaging integration.
Solving these challenging problems requires that algorithmic progress is made in several directions:
• Algorithms need to be based on multilinear extensions of numerical linear algebra.
• New grounds for separation, such as representability in a given function class, need to be explored.
• Prior knowledge needs to be exploited via appropriate health relevant constraints.
• Biomedical data fusion requires the combination of TDs, coupled via relevant constraints.
• Algorithms for TD updating are important for continuous long-term patient monitoring.
The algorithms are eventually integrated in an easy-to-use open source software platform that is general enough for use in other BSS applications.
Having been involved in biomedical signal processing over a period of 20 years, the PI has a good overview of the field and the opportunities. By working directly at the forefront in close collaboration with the clinical scientists who actually use our software, we can have a huge impact."
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym BIOTORQUE
Project Probing the angular dynamics of biological systems with the optical torque wrench
Researcher (PI) Francesco Pedaci
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE3, ERC-2012-StG_20111012
Summary "The ability to apply forces to single molecules and bio-polymers has fundamentally changed the way we can interact with and understand biological systems. Yet, for many cellular mechanisms, it is rather the torque that is the relevant physical parameter. Excitingly, novel single-molecule techniques that utilize this parameter are now poised to contribute to novel discoveries. Here, I will study the angular dynamical behavior and response to external torque of biological systems at the molecular and cellular levels using the new optical torque wrench that I recently developed.
In a first research line, I will unravel the angular dynamics of the E. coli flagellar motor, a complex and powerful rotary nano-motor that rotates the flagellum in order to propel the bacterium forwards. I will quantitatively study different aspects of torque generation of the motor, aiming to connect evolutionary, dynamical, and structural principles. In a second research line, I will develop an in-vivo manipulation technique based on the transfer of optical torque and force onto novel nano-fabricated particles. This new scanning method will allow me to map physical properties such as the local viscosity inside living cells and the spatial organization and topography of internal membranes, thereby expanding the capabilities of existing techniques towards in-vivo and ultra-low-force scanning imaging.
This project is founded on a multidisciplinary approach in which fundamental optics, novel nanoparticle fabrication, and molecular and cellular biology are integrated. It has the potential to answer biophysical questions that have challenged the field for over two decades and to impact fields ranging from single-molecule biophysics to scanning-probe microscopy and nanorheology, provided ERC funding is granted."
Max ERC Funding
1 500 000 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym BISMUTH
Project Breaking Inversion Symmetry in Magnets: Understand via THeory
Researcher (PI) Silvia Picozzi
Host Institution (HI) CONSIGLIO NAZIONALE DELLE RICERCHE
Call Details Starting Grant (StG), PE3, ERC-2007-StG
Summary Multiferroics (i.e. materials where ferroelectricity and magnetism coexist) are presently drawing enormous interest, due to their technologically relevant multifunctional character and to the astoundingly rich playground for fundamental condensed-matter physics they constitute. Here, we put forward several concepts on how to break inversion symmetry and achieve sizable ferroelectricity in collinear magnets; our approach is corroborated via first-principles calculations as tools to quantitatively estimate relevant ferroelectric and magnetic properties as well as to reveal ab-initio the main mechanisms behind the dipolar and magnetic orders. In closer detail, we focus on the interplay between ferroelectricity and electronic degrees of freedom in magnets, i.e. on those cases where spin- or orbital- or charge-ordering can be the driving force for a spontaneous polarization to develop. Antiferromagnetism will be considered as a primary mechanism for lifting inversion symmetry; however, the effects of charge disproportionation and orbital ordering will also be studied by examining a wide class of materials, including ortho-manganites with E-type spin-arrangement, non-E-type antiferromagnets, nickelates, etc. Finally, as an example of materials design accessible to our ab-initio approach, we use “chemistry” to break inversion symmetry by artificially constructing an oxide superlattice and propose a way to switch, via an electric field, from antiferromagnetism to ferrimagnetism. To our knowledge, the link between electronic degrees of freedom and ferroelectricity in collinear magnets is a field almost totally unexplored by ab-initio methods; indeed, its clear understanding and optimization would lead to a scientific breakthrough in the multiferroics area.
Technologically, it would pave the way to materials design of magnetic ferroelectrics with properties persisting above room temperature and, therefore, to a novel generation of electrically-controlled spintronic devices.
Max ERC Funding
684 000 €
Duration
Start date: 2008-05-01, End date: 2012-04-30
Project acronym BITCRUMBS
Project Towards a Reliable and Automated Analysis of Compromised Systems
Researcher (PI) Davide BALZAROTTI
Host Institution (HI) EURECOM
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary "The vast majority of research in computer security is dedicated to the design of detection, protection, and prevention solutions. While these techniques play a critical role in increasing the security and privacy of our digital infrastructure, it is enough to look at the news to understand that it is not a matter of "if" a computer system will be compromised, but only a matter of "when". It is a well-known fact that there is no 100% secure system, and that there is no practical way to prevent attackers with enough resources from breaking into sensitive targets. Therefore, it is extremely important to develop automated techniques to analyze computer security incidents and compromised systems in a timely and precise manner. Unfortunately, the area of incident response has received very little research attention, and it is still largely considered an art more than a science because of its lack of a proper theoretical and scientific background.
The objective of BITCRUMBS is to rethink the Incident Response (IR) field from its foundations by proposing a more scientific and comprehensive approach to the analysis of compromised systems. BITCRUMBS will achieve this goal in three steps: (1) by introducing a new systematic approach to precisely measure the effectiveness and accuracy of IR techniques and their resilience to evasion and forgery; (2) by designing and implementing new automated techniques to cope with advanced threats and the analysis of IoT devices; and (3) by proposing a novel forensics-by-design development methodology and a set of guidelines for the design of future systems and software.
To provide the right context for these new techniques and show the impact of the project in different fields and scenarios, BITCRUMBS plans to address its objectives using real case studies borrowed from two different domains: traditional computer software and embedded systems."
Max ERC Funding
1 991 504 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym BLACK
Project The formation and evolution of massive black holes
Researcher (PI) Marta Volonteri
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary "Massive black holes (MBHs) weighing a million solar masses and above inhabit the centers of today's galaxies, weighing about a thousandth of the host bulge mass. MBHs also powered quasars known to exist just a few hundred million years after the Big Bang. Owing to observational breakthroughs and remarkable advancements in theoretical models, we now know that MBHs are out there and evolved with their hosts, but we do not know how they got there nor how, and when, the connection between MBHs and hosts was established.
To have a full view of MBH formation and growth we have to look at the global process where galaxies form, as determined by the large-scale structure, on Mpc scales. On the other hand, the region where MBHs dominate the dynamics of gas and stars, and accretion occurs, is merely pc-scale. To study the formation of MBHs and their fuelling we must bridge from Mpc to pc scale in order to follow how galaxies influence MBHs and how in turn MBHs influence galaxies.
BLACK aims to connect the cosmic context to the nuclear region where MBHs reside, and to study MBH formation, feeding and feedback on their hosts through a multi-scale approach following the thread of MBHs from cosmological, to galactic, to nuclear scales. Analytical work guides and tests numerical simulations, allowing us to probe a wide dynamical range.
Our theoretical work will be crucial for planning and interpreting current and future observations. Today and in the near future facilities at wavelengths spanning from radio to X-ray will widen and deepen our view of the Universe, making this an ideal time for this line of research."
Max ERC Funding
1 668 385 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym BLOC
Project Mathematical study of Boundary Layers in Oceanic Motions
Researcher (PI) Anne-Laure Perrine Dalibard
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary Boundary layer theory is a large component of fluid dynamics. It is ubiquitous in Oceanography, where boundary layer currents, such as the Gulf Stream, play an important role in the global circulation. Comprehending the underlying mechanisms in the formation of boundary layers is therefore crucial for applications. However, the treatment of boundary layers in ocean dynamics remains poorly understood at a theoretical level, due to the variety and complexity of the forces at stake.
The goal of this project is to develop several tools to bridge the gap between the mathematical state of the art and the physical reality of oceanic motion. There are four points on which we will mainly focus: degeneracy issues, including the treatment of Stewartson boundary layers near the equator; rough boundaries (meaning boundaries with small-amplitude, high-frequency variations); the inclusion of the advection term in the construction of stationary boundary layers; and the linear and nonlinear stability of the boundary layers. We will address Ekman layers and western boundary layers separately, since they are ruled by equations whose mathematical behaviour is very different.
This project will allow us to have a better understanding of small scale phenomena in fluid mechanics, and in particular of the inviscid limit of incompressible fluids.
The team will be composed of the PI, two PhD students and three two-year postdocs over the whole period. We will also rely on the historical expertise of the host institution on fluid mechanics and asymptotic methods.
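As orientation for the Ekman layers mentioned above (a standard textbook result, not part of the original abstract, with constants of order one suppressed): for a fluid rotating at rate $\Omega$ with kinematic viscosity $\nu$ over a flat bottom, the geostrophic interior velocity $(U,0)$ is matched to the no-slip wall across a thin spiral layer,

```latex
% Classical Ekman spiral over a flat bottom (textbook sketch; O(1)
% constants in the layer thickness \delta_E are suppressed).
\[
  u(z) = U\left(1 - e^{-z/\delta_E}\cos\frac{z}{\delta_E}\right),
  \qquad
  v(z) = U\,e^{-z/\delta_E}\sin\frac{z}{\delta_E},
  \qquad
  \delta_E \sim \sqrt{\frac{\nu}{\Omega}}.
\]
```

The singular limit $\nu \to 0$, in which $\delta_E \to 0$, is the inviscid limit referred to in the abstract.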
Max ERC Funding
1 267 500 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym BLOWDISOL
Project "BLOW UP, DISPERSION AND SOLITONS"
Researcher (PI) Franck Merle
Host Institution (HI) UNIVERSITE DE CERGY-PONTOISE
Call Details Advanced Grant (AdG), PE1, ERC-2011-ADG_20110209
Summary "Many physical models involve nonlinear dispersive problems, like wave or laser propagation, plasmas, ferromagnetism, etc. So far, the mathematical understanding of these equations is rather poor. In particular, we know little about the detailed qualitative behavior of their solutions. Our point is that an apparent complexity hides universal properties of these models; investigating and uncovering such properties has started only recently. More than the equations themselves, these universal properties are essential for physical modelling.
By considering several standard models, such as the nonlinear Schrödinger, nonlinear wave and generalized KdV equations and related geometric problems, the goal of this proposal is to describe the generic global behavior of the solutions and the profiles which emerge either for large time or by concentration due to strong nonlinear effects, if possible through a few relevant solutions (sometimes explicit solutions, like solitons). In order to do this, we have to develop different mathematical tools depending on the context and the specificity of the problems. Particular emphasis will be placed on
- large time asymptotics for global solutions, and the decomposition of generic solutions into sums of decoupled solitons in non-integrable situations,
- the description of critical phenomena for blow up in the Hamiltonian situation, stable or generic behavior for blow up on critical dynamics, and various relevant regularisations of the problem,
- global existence for defocusing supercritical problems and blow-up dynamics in the focusing cases.
We believe that the PI and his team have the ability to tackle these problems at present. The proposal will open whole fields of investigation in Partial Differential Equations in the future, clarify and simplify our knowledge of the dynamical behavior of solutions of these problems, and provide physicists with new insight into these models."
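For orientation on the solitons mentioned above (standard material, not part of the original abstract): the generalized KdV equation, in the sign and scaling conventions commonly used in the blow-up literature, admits an explicit family of travelling-wave solutions whose stability and interactions are central objects of study here.

```latex
% Generalized KdV (gKdV); for p = 2 this is the classical integrable
% KdV equation, and p = 5 is the L^2-critical case.
\[
  \partial_t u + \partial_x\!\left(\partial_x^2 u + u^p\right) = 0,
  \qquad u(t,x) \in \mathbb{R}.
\]
% Solitons: u(t,x) = Q_c(x - ct) with c > 0, where
%   Q_c(x) = c^{1/(p-1)}\, Q(\sqrt{c}\, x)
% and the profile Q solves Q'' - Q + Q^p = 0, explicitly
\[
  Q(x) = \left(\frac{p+1}{2}\,
         \operatorname{sech}^2\!\frac{(p-1)\,x}{2}\right)^{1/(p-1)}.
\]
```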
Max ERC Funding
2 079 798 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym BoneImplant
Project Monitoring bone healing around endosseous implants: from multiscale modeling to the patient’s bed
Researcher (PI) Guillaume Loïc Haiat
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2015-CoG
Summary Implants are often employed in orthopaedic and dental surgeries. However, failures, which are difficult to anticipate, still occur and may have dramatic consequences. Failures are due to degraded bone remodeling at the bone-implant interface, a multiscale phenomenon of an interdisciplinary nature which remains poorly understood. The objective of BoneImplant is to provide a better understanding of the multiscale and multi-time mechanisms at work at the bone-implant interface. To do so, BoneImplant aims at studying the evolution of the biomechanical properties of bone tissue around an implant during the remodeling process. A methodology combining in vivo, in vitro and in silico approaches is proposed.
New modeling approaches will be developed in close synergy with the experiments. Molecular dynamics computations will be used to understand fluid flow in nanoscopic cavities, a phenomenon that governs the bone healing process. Generalized continuum theories will be necessary to model bone tissue due to the strong strain fields around implants. An isogeometric mortar formulation will make it possible to simulate the bone-implant interface in a stable and efficient manner.
In vivo experiments under standardized conditions will be carried out on the basis of feasibility studies. A multimodality, multi-physics experimental approach will be used to assess the biomechanical properties of newly formed bone tissue as a function of the implant environment. The experimental approach aims at estimating the effective adhesion energy and the potential of quantitative ultrasound imaging to assess different biomechanical properties of the interface.
Results will be used to design effective loading clinical procedures for implants and to optimize implant design, leading to the development of therapeutic and diagnostic techniques. The development of quantitative ultrasonic techniques to monitor implant stability has potential for industrial transfer.
Max ERC Funding
1 992 154 €
Duration
Start date: 2016-10-01, End date: 2021-09-30