Project acronym 2D-4-CO2
Project DESIGNING 2D NANOSHEETS FOR CO2 REDUCTION AND INTEGRATION INTO vdW HETEROSTRUCTURES FOR ARTIFICIAL PHOTOSYNTHESIS
Researcher (PI) Damien VOIRY
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2018-STG
Summary The CO2 reduction reaction (CO2RR) holds great promise for converting the greenhouse gas carbon dioxide into chemical fuels. The absence of catalytic materials demonstrating both high performance and high selectivity currently hampers practical demonstration. CO2RR is also limited by the low solubility of CO2 in the electrolyte solution, so electrocatalytic reactions in the gas phase using gas diffusion electrodes would be preferred. 2D materials have recently emerged as a novel class of electrocatalytic materials thanks to their rich structures and electronic properties. The synthesis of novel 2D catalysts and their implementation into photocatalytic systems would be a major step towards the development of devices for storing solar energy in the form of chemical fuels. With 2D-4-CO2, I propose to: 1) develop a novel class of CO2RR catalysts based on conducting 2D nanosheets and 2) demonstrate photocatalytic conversion of CO2 into chemical fuels using structure-engineered gas diffusion electrodes made of 2D conducting catalysts. To reach this goal, the first objective of 2D-4-CO2 is to provide guidelines for the development of cutting-edge 2D catalysts for CO2 conversion into chemical fuels. This will be achieved through a multidisciplinary approach based on 2D materials engineering, advanced characterization methods and novel designs of gas diffusion electrodes for the reduction of CO2 in the gas phase. The second objective is to develop practical photocatalytic systems using van der Waals (vdW) heterostructures for the efficient conversion of CO2 into chemical fuels. The vdW heterostructures will consist of rationally designed stacks of 2D materials and 2D-like materials deposited by atomic layer deposition, in order to achieve highly efficient light conversion and prolonged stability. This project will not only enable a deeper understanding of the CO2RR but will also provide practical strategies for large-scale application of CO2RR to solar fuel production.
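As a point of reference for the selectivity challenge mentioned above (standard textbook stoichiometry, not specific to this project), CO2 reduction to CO in aqueous or gas-fed cells competes directly with hydrogen evolution:

\[ \mathrm{CO_2 + 2H^+ + 2e^- \rightarrow CO + H_2O} \qquad \text{(CO2RR, CO pathway)} \]
\[ \mathrm{2H^+ + 2e^- \rightarrow H_2} \qquad \text{(competing hydrogen evolution)} \]

A selective catalyst must steer the current towards the first reaction while suppressing the second.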
Max ERC Funding
1 499 931 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym 2F4BIODYN
Project Two-Field Nuclear Magnetic Resonance Spectroscopy for the Exploration of Biomolecular Dynamics
Researcher (PI) Fabien Ferrage
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2011-StG_20101014
Summary The paradigm of the structure-function relationship in proteins is outdated. Biological macromolecules and supramolecular assemblies are highly dynamic objects. Evidence that their motions are of utmost importance to their functions is regularly identified. The understanding of the physical chemistry of biological processes at an atomic level has to rely not only on the description of structure but also on the characterization of molecular motions.
The investigation of protein motions will be undertaken with a highly innovative methodological approach to nuclear magnetic resonance relaxation. In order to widen the range of frequencies at which local motions in proteins are probed, we will first use and develop new techniques for a prototype shuttle system for the measurement of relaxation at low fields on a high-field NMR spectrometer. Second, we will develop a novel system: a set of low-field NMR spectrometers designed as accessories for high-field spectrometers. Used in conjunction with the shuttle, this system will offer (i) the sensitivity and resolution (i.e., atomic-level information) of a high-field spectrometer, (ii) access to the low fields of a relaxometer, and (iii) the ability to measure a wide variety of relaxation rates with high accuracy. This system will benefit from the latest technology in homogeneous permanent-magnet development to allow control of spin systems identical to that of a high-resolution probe. This new apparatus will open the way to the use of NMR relaxation at low fields for the refinement of protein motions at the atomic scale.
Applications of this novel approach will focus on the bright side of protein dynamics: (i) the largely unexplored dynamics of intrinsically disordered proteins, and (ii) domain motions in large proteins. In both cases, we will investigate a series of diverse protein systems with implications in development, cancer and immunity.
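As background to why widening the accessible field range matters (a standard result of NMR relaxation theory, not specific to this proposal), relaxation rates sample the spectral density of motions at frequencies set by the static field; in the common model-free form,

\[ J(\omega) = \frac{2}{5}\left[\frac{S^2\,\tau_c}{1+(\omega\tau_c)^2} + \frac{(1-S^2)\,\tau}{1+(\omega\tau)^2}\right], \qquad \frac{1}{\tau} = \frac{1}{\tau_c} + \frac{1}{\tau_e}, \]

where S² is the order parameter, τ_c the overall tumbling time and τ_e the internal correlation time. Since ω scales linearly with the magnetic field, relaxation measured at low fields probes J(ω) at frequencies, and hence motional timescales, that are inaccessible at high field.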
Max ERC Funding
1 462 080 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym 2G-CSAFE
Project Combustion of Sustainable Alternative Fuels for Engines used in aeronautics and automotives
Researcher (PI) Philippe Dagaut
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE8, ERC-2011-ADG_20110209
Summary This project aims to promote sustainable combustion technologies for transport through the validation of advanced combustion kinetic models obtained using sophisticated new laboratory experiments, engines, and theoretical computations, breaking through the current frontier of knowledge. It will focus on the unexplored kinetics of ignition and combustion of 2nd-generation (2G) biofuels and their blends with conventional fuels, which should provide energy security and sustainability to Europe. The motivation is that no accurate kinetic models are available for the ignition, oxidation and combustion of 2G biofuels, and improved ignition control is needed for new compression-ignition engines. Crucial information is missing: data from well-characterised experiments on combustion-generated pollutants and data on key intermediates for fuel ignition in new engines.
To provide that knowledge, new well-instrumented complementary experiments and kinetic modelling will be used. Measurements of key intermediates, stable species, and pollutants will be performed. New ignition-control strategies will be designed, opening new technological horizons. Kinetic modelling will be used to rationalise the results. Due to the complexity of 2G biofuels and their unusual composition, innovative surrogates will be designed. Kinetic models for surrogate fuels will be generalised for extension to other compounds. The experimental results, together with ab-initio and detailed modelling, will serve to characterise the kinetics of ignition, combustion, and pollutant formation of fuels including 2G biofuels, and provide relevant data and models.
This research is risky because it is (i) the first effort to measure radicals by reactor/CRDS coupling, (ii) the first effort to use a μ-channel reactor to build ignition databases for conventional fuels and biofuels, and (iii) the first effort to design and use controlled generation and injection of reactive species to control ignition/combustion in compression-ignition engines.
Max ERC Funding
2 498 450 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym 3D-BioMat
Project Deciphering biomineralization mechanisms through 3D explorations of mesoscale crystalline structure in calcareous biomaterials
Researcher (PI) VIRGINIE CHAMARD
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2016-COG
Summary The fundamental 3D-BioMat project aims at providing a biomineralization model to explain the formation of microscopic calcareous single crystals produced by living organisms. Although these crystals present a wide variety of shapes, associated with various organic materials, the observation of a nanoscale granular structure common to almost all calcareous crystallizing organisms, associated with an extended crystalline coherence, points to a generic biomineralization and assembly process. A key to building realistic scenarios of biomineralization is to reveal the crystalline architecture at the mesoscale (i.e., over a few granules), which none of the existing nano-characterization tools is able to provide.
3D-BioMat builds on the PI's recognized expertise in the field of synchrotron coherent x-ray diffraction microscopy. It will extend the PI's disruptive pioneering microscopy formalism towards an innovative high-throughput approach able to give access to the 3D mesoscale image of the crystalline properties (crystalline coherence, crystal-plane tilts and strains) with the required flexibility, nanoscale resolution, and non-invasiveness.
This achievement will be used to reveal, in a timely manner, the generic features of the mesoscale crystalline structure through pioneering explorations of a vast variety of crystalline biominerals produced by the famous Pinctada margaritifera oyster shell, and thereby build a realistic biomineralization scenario.
The inferred biomineralization pathways, including both physico-chemical pathways and biological controls, will ultimately be validated by comparing the mesoscale structures produced by biomimetic samples with the biogenic ones. Beyond deciphering one of the most intriguing questions of material nanosciences, 3D-BioMat may contribute to new climate models, pave the way for new routes in material synthesis and supply answers to the pearl-culture calcification problems.
Max ERC Funding
1 966 429 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym 3D-CAP
Project 3D micro-supercapacitors for embedded electronics
Researcher (PI) David Sarinn PECH
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE7, ERC-2017-COG
Summary The realization of high-performance micro-supercapacitors is currently a major challenge, but applications requiring such miniaturized energy-storage devices are inexorably emerging, from wearable electronic gadgets to wireless sensor networks. Although they store less energy than micro-batteries, micro-supercapacitors can be charged and discharged very rapidly and exhibit a quasi-unlimited lifetime. Research worldwide is consequently focused largely on improving their capacitance and energy performance. However, to date, they are still far from being able to power sensors or electronic components.
Here I propose a 3D paradigm shift in micro-supercapacitor design to ensure increased energy-storage capacities. Hydrous ruthenium dioxide (RuO2) is a pseudocapacitive supercapacitor electrode material well known for its high capacitance. A thin film of ruthenium will be deposited by atomic layer deposition (ALD), followed by an electrochemical oxidation process, onto a high-surface-area 3D current collector prepared via an ingenious dynamic template built with hydrogen bubbles. The structural features of these 3D architectures will be controllably tailored by the processing methodologies. These electrodes will be combined with an innovative solid-form electrolyte (a protic ionogel) able to operate over an extended cell voltage. In a parallel investigation, we will develop a fundamental understanding of electrochemical reactions occurring at the nanoscale with a FIB-patterned (Focused Ion Beam) RuO2 nano-supercapacitor. The resulting 3D micro-supercapacitors should display extremely high power, long lifetime and, for the first time, energy densities competing with or even exceeding those of micro-batteries. As a key achievement, prototypes will be designed using a new concept based on a self-adaptive micro-supercapacitor matrix, which arranges itself according to the global amount of energy stored.
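For orientation (textbook relations, not results of this project), the energy and maximum power of a supercapacitor cell of capacitance C, voltage window V and equivalent series resistance R_ESR scale as

\[ E = \tfrac{1}{2}\,C\,V^2, \qquad P_{\max} = \frac{V^2}{4\,R_{\mathrm{ESR}}}, \]

which is why combining a high-capacitance RuO2 electrode, an extended-voltage ionogel electrolyte and a low-resistance 3D current collector is expected to raise both energy and power densities.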
Max ERC Funding
1 673 438 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym 3D-nanoMorph
Project Label-free 3D morphological nanoscopy for studying sub-cellular dynamics in live cancer cells with high spatio-temporal resolution
Researcher (PI) Krishna AGARWAL
Host Institution (HI) UNIVERSITETET I TROMSOE - NORGES ARKTISKE UNIVERSITET
Call Details Starting Grant (StG), PE7, ERC-2018-STG
Summary Label-free optical nanoscopy, free from photobleaching and photochemical toxicity of fluorescence labels and yielding 3D morphological resolution of <50 nm, is the future of live cell imaging. 3D-nanoMorph breaks the diffraction barrier and shifts the paradigm in label-free nanoscopy, providing isotropic 3D resolution of <50 nm. To achieve this, 3D-nanoMorph performs non-linear inverse scattering for the first time in nanoscopy and decodes scattering between sub-cellular structures (organelles).
3D-nanoMorph innovatively assigns complementary roles to the light-measurement system and the computational nanoscopy algorithm. A novel illumination system and a novel light-collection system together enable measurement of only the most relevant intensity component and create a fresh perspective on label-free measurements. A new computational nanoscopy approach employs non-linear inverse scattering. Harnessing non-linear inverse scattering for resolution enhancement opens new possibilities in label-free 3D nanoscopy.
I will apply 3D-nanoMorph to study organelle degradation (autophagy) in live cancer cells over extended durations with high spatial and temporal resolution, presently limited by the lack of high-resolution label-free 3D morphological nanoscopy. Successful 3D mapping of the nanoscale biological process of autophagy will open new avenues for cancer treatment and showcase 3D-nanoMorph for wider applications.
My cross-disciplinary expertise of 14 years, spanning inverse problems, electromagnetism, optical microscopy, integrated optics and live-cell nanoscopy, paves the way for the successful implementation of 3D-nanoMorph.
Max ERC Funding
1 499 999 €
Duration
Start date: 2019-07-01, End date: 2024-06-30
Project acronym 3DICE
Project 3D Interstellar Chemo-physical Evolution
Researcher (PI) Valentine Wakelam
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2013-StG
Summary At the end of their lives, stars spread their inner material into the diffuse interstellar medium. This diffuse medium gets locally denser and forms dark clouds (also called dense or molecular clouds) whose innermost parts are shielded from the external UV field by dust, allowing molecules to grow and become more complex. Gravitational collapse occurs inside these dense clouds, forming protostars and their surrounding disks, and eventually planetary systems like (or unlike) our solar system. The formation and evolution of molecules, minerals, ices and organics from the diffuse medium to planetary bodies, and their alteration or preservation throughout this cosmic chemical history, set the initial conditions for building planets, atmospheres and possibly the first bricks of life. The current view of interstellar chemistry is based on fragmentary studies of the key steps of the sequence that can be observed. The objective of this proposal is to follow the fractionation of the elements between the gas phase and interstellar grains, from the most diffuse medium to protoplanetary disks, in order to constrain the chemical composition of the material in which planets are formed. The expected outcome of this project is a consistent and more accurate description of the chemical evolution of interstellar matter. To achieve this objective, I will improve our chemical model by adding new processes on grain surfaces relevant under diffuse-medium conditions. This upgraded gas-grain model will be coupled to 3D dynamical models of the formation of dense clouds from the diffuse medium and of protoplanetary disks from dense clouds. The computed chemical composition will also be used with 3D radiative transfer codes to study the chemical tracers of the physics of protoplanetary disk formation. The robustness of the model predictions will be assessed with sensitivity analyses. Finally, model results will be confronted with observations to address some of the current challenges.
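Schematically, the gas-grain chemical models to be upgraded here integrate coupled rate equations for the abundance n_i of each species (a generic form common to astrochemical codes, not the project's specific implementation):

\[ \frac{dn_i}{dt} = \sum_{j,k} k_{jk}\, n_j n_k \;-\; n_i \sum_{j} k_{ij}\, n_j \;+\; k_{\mathrm{des},i}\, n_i^{\mathrm{grain}} \;-\; k_{\mathrm{ads},i}\, n_i, \]

with two-body formation and destruction terms in the gas phase plus adsorption onto and desorption from grain surfaces; the new grain-surface processes enter through additional terms of the same kind.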
Max ERC Funding
1 166 231 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym 4TH-NU-AVENUE
Project Search for a fourth neutrino with a PBq anti-neutrino source
Researcher (PI) Thierry Michel René Lasserre
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE2, ERC-2012-StG_20111012
Summary Several observed anomalies in neutrino oscillation data can be explained by a hypothetical fourth neutrino separated from the three standard neutrinos by a squared mass difference of a few eV². This hypothesis can be tested with a PBq (ten-kilocurie scale) 144Ce antineutrino beta-source deployed at the center of a large low-background liquid scintillator detector, such as Borexino, KamLAND, or SNO+. In particular, the compact size of such a source could yield an energy-dependent oscillating pattern in the event spatial distribution that would unambiguously determine neutrino mass differences and mixing angles.
The proposed program aims to perform the research and development necessary to produce and deploy an intense antineutrino source in a large liquid scintillator detector. Our program will address the definition of the production process of the neutrino source as well as its experimental characterization, the detailed physics simulation of both signal and backgrounds, the complete design and realization of the thick shielding, and the preparation of the interfaces with the antineutrino detector, including the safety and security aspects.
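The expected signature follows the standard two-flavour disappearance probability (textbook expression, quoted here only for illustration):

\[ P(\bar{\nu}_e \to \bar{\nu}_e) \simeq 1 - \sin^2(2\theta_{\mathrm{new}})\,\sin^2\!\left(\frac{1.27\,\Delta m^2_{\mathrm{new}}\,[\mathrm{eV^2}]\;L\,[\mathrm{m}]}{E\,[\mathrm{MeV}]}\right), \]

so that for Δm² of a few eV² and antineutrino energies of a few MeV the oscillation length is of the order of a few metres, producing the spatial pattern across the detector that the compact source makes observable.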
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-10-01, End date: 2018-09-30
Project acronym A-LIFE
Project The asymmetry of life: towards a unified view of the emergence of biological homochirality
Researcher (PI) Cornelia MEINERT
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2018-STG
Summary What is responsible for the emergence of homochirality, the almost exclusive use of one enantiomer over its mirror image? And what led to the evolution of life’s homochiral biopolymers, DNA/RNA, proteins and lipids, where all the constituent monomers exhibit the same handedness?
Based on in-situ observations and laboratory studies, we propose that this handedness occurs when chiral biomolecules are synthesized asymmetrically through interaction with circularly polarized photons in interstellar space. The ultimate goal of this project will be to demonstrate how the diverse set of heterogeneous enantioenriched molecules, available from meteoritic impact, assembles into homochiral pre-biopolymers, by simulating the evolutionary stages on early Earth. My recent research has shown that the central chiral unit of RNA, ribose, forms readily under simulated comet conditions and this has provided valuable new insights into the accessibility of precursors of genetic material in interstellar environments. The significance of this project arises due to the current lack of experimental demonstration that amino acids, sugars and lipids can simultaneously and asymmetrically be synthesized by a universal physical selection process.
A synergistic methodology will be developed to build a unified theory for the origin of all chiral biological building blocks and their assembly into homochiral supramolecular entities. For the first time, advanced analyses of astrophysically relevant samples, asymmetric photochemistry triggered by circularly polarized synchrotron and laser sources, and chiral amplification due to polymerization processes will be combined. Intermediates and autocatalytic reaction kinetics will be monitored and supported by quantum calculations to understand the underlying processes. A unified theory of the asymmetric formation and self-assembly of life’s biopolymers is groundbreaking and will impact the whole conceptual foundation of the origin of life.
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym A2C2
Project Atmospheric flow Analogues and Climate Change
Researcher (PI) Pascal Yiou
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary "The A2C2 project treats two major challenges in climate and atmospheric research: the time dependence of the climate attractor to external forcings (solar, volcanic eruptions and anthropogenic), and the attribution of extreme climate events occurring in the northern extra-tropics. The main difficulties are the limited climate information, the computer cost of model simulations, and mathematical assumptions that are hardly verified and often overlooked in the literature.
A2C2 proposes a practical framework to overcome those three difficulties, linking the theory of dynamical systems and statistics. We will generalize the methodology of flow analogues to multiple databases in order to obtain probabilistic descriptions of analogue decompositions.
The project is divided into three workpackages (WP). WP1 embeds the analogue method in the theory of dynamical systems in order to provide a metric of attractor deformation in time. The important methodological step is to detect trends or persistent outliers in the dates and scores of analogues when the system is subject to time-varying forcings. This is done with idealized models and full-size climate models in which the forcings (anthropogenic and natural) are known.
A2C2 creates an open source toolkit to compute flow analogues from a wide array of databases (WP2). WP3 treats the two scientific challenges with the analogue method and multiple model ensembles, hence allowing uncertainty estimates under realistic mathematical hypotheses. The flow analogue methodology allows a systematic and quasi real-time analysis of extreme events, which is currently out of the reach of conventional climate modeling approaches.
The major breakthrough of A2C2 is to bridge the gap between operational needs (the immediate analysis of climate events) and the understanding of long-term climate change. A2C2 opens new research horizons for the exploitation of ensembles of simulations and reliable estimates of uncertainty.
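To make the analogue method concrete, the sketch below shows a minimal analogue search (illustrative Python under simple assumptions: daily circulation fields flattened to vectors and a Euclidean score; the WP2 toolkit will support multiple databases, scores and calendar constraints):

    import numpy as np

    def flow_analogues(fields, target_idx, k=20, exclusion=30):
        # fields: array of shape (n_days, n_gridpoints), e.g. sea-level-pressure anomalies.
        # Returns indices and distances of the k closest analogue days, excluding a
        # window of +/- `exclusion` days around the target to avoid trivial matches.
        target = fields[target_idx]
        dist = np.sqrt(((fields - target) ** 2).mean(axis=1))  # RMS distance to every day
        lo, hi = max(0, target_idx - exclusion), target_idx + exclusion + 1
        dist[lo:hi] = np.inf
        best = np.argsort(dist)[:k]
        return best, dist[best]

    # Example on synthetic data: ~40 years of daily fields on a 19 x 29 grid
    rng = np.random.default_rng(0)
    fields = rng.standard_normal((40 * 365, 19 * 29))
    idx, scores = flow_analogues(fields, target_idx=10000)

Trends in the dates and scores returned by such a search are the quantities WP1 proposes to monitor as a metric of attractor deformation under time-varying forcings.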
Max ERC Funding
1 491 457 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym AAA
Project Adaptive Actin Architectures
Researcher (PI) Laurent Blanchoin
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS3, ERC-2016-ADG
Summary Although we have extensive knowledge of many important processes in cell biology, including information on many of the molecules involved and the physical interactions among them, we still do not understand most of the dynamical features that are the essence of living systems. This is particularly true for the actin cytoskeleton, a major component of the internal architecture of eukaryotic cells. In living cells, actin networks constantly assemble and disassemble filaments while maintaining an apparent stable structure, suggesting a perfect balance between the two processes. Such behaviors are called “dynamic steady states”. They confer upon actin networks a high degree of plasticity allowing them to adapt in response to external changes and enable cells to adjust to their environments. Despite their fundamental importance in the regulation of cell physiology, the basic mechanisms that control the coordinated dynamics of co-existing actin networks are poorly understood. In the AAA project, first, we will characterize the parameters that allow the coupling among co-existing actin networks at steady state. In vitro reconstituted systems will be used to control the actin nucleation patterns, the closed volume of the reaction chamber and the physical interaction of the networks. We hope to unravel the mechanism allowing the global coherence of a dynamic actin cytoskeleton. Second, we will use our unique capacity to perform dynamic micropatterning, to add or remove actin nucleation sites in real time, in order to investigate the ability of dynamic networks to adapt to changes and the role of coupled network dynamics in this emergent property. In this part, in vitro experiments will be complemented by the analysis of actin network remodeling in living cells. In the end, our project will provide a comprehensive understanding of how the adaptive response of the cytoskeleton derives from the complex interplay between its biochemical, structural and mechanical properties.
Max ERC Funding
2 349 898 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym AArteMIS
Project Aneurysmal Arterial Mechanics: Into the Structure
Researcher (PI) Pierre Joseph Badel
Host Institution (HI) ASSOCIATION POUR LA RECHERCHE ET LE DEVELOPPEMENT DES METHODES ET PROCESSUS INDUSTRIELS
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary The rupture of an Aortic Aneurysm (AA), which is often lethal, is a mechanical phenomenon that occurs when the wall stress state exceeds the local strength of the tissue. Our current understanding of arterial rupture mechanisms is poor, and the physics taking place at the microscopic scale in these collagenous structures remains an open area of research. Understanding, modelling, and quantifying the micro-mechanisms which drive the mechanical response of such tissue and locally trigger rupture represents the most challenging and promising pathway towards predictive diagnosis and personalized care of AA.
The PI's group was recently able to detect, in advance, at the macro-scale, rupture-prone areas in bulging arterial tissues. The next step is to get into the details of the arterial microstructure to elucidate the underlying mechanisms.
Through the achievements of AArteMIS, the local mechanical state of the fibrous microstructure of the tissue, especially close to its rupture state, will be quantitatively analyzed from multi-photon confocal microscopy and numerically reconstructed to establish quantitative micro-scale rupture criteria. AArteMIS will also address the development of micro-macro models based on the collected quantitative data.
The entire project will be carried out in collaboration with medical doctors and engineers who are experts in all the fields required for the success of AArteMIS.
AArteMIS is expected to open longed-for pathways for research in soft-tissue mechanobiology focusing on the cell environment, and to enable essential clinical applications for the quantitative assessment of AA rupture risk. It will significantly contribute to understanding fatal vascular events and improving cardiovascular treatments. It will provide a tremendous source of data and inspiration for subsequent applications and research by answering the most fundamental questions on AA rupture behaviour, enabling ground-breaking clinical changes to take place.
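As a first-order illustration of the stress-versus-strength criterion stated above (a classical thin-wall Laplace estimate, not the micro-mechanical analysis developed in AArteMIS), the wall stress of an idealized spherical aneurysm of radius r and wall thickness t under blood pressure P is

\[ \sigma \approx \frac{P\,r}{2\,t}, \]

so that stress rises as the aneurysm dilates and its wall thins; rupture is expected wherever this local stress exceeds the local tissue strength, which is precisely the quantity the micro-scale rupture criteria aim to capture.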
Max ERC Funding
1 499 783 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym ABACUS
Project Ab-initio adiabatic-connection curves for density-functional analysis and construction
Researcher (PI) Trygve Ulf Helgaker
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods and the inexpensive but not systematically refinable Kohn-Sham method of density-functional theory (DFT). The accuracy of Kohn-Sham calculations is determined by the quality of the exchange-correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange-correlation functional is unknown; instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed: the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, guided directly by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab-initio adiabatic-connection curves, studying the exchange-correlation functional for a fixed density as the electronic interactions are turned on from zero to one. Pilot calculations have indicated the feasibility of this approach in simple cases; here, a comprehensive set of adiabatic-connection curves will be generated and utilized for the calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn-Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available, for example for understanding and modelling the behaviour of the exchange-correlation functional in electromagnetic fields.
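For concreteness, the adiabatic-connection curves and the Lieb variation principle referred to above can be written in their standard forms (general DFT expressions, not specific to ABACUS):

\[ E_{\mathrm{xc}}[\rho] = \int_0^1 W_\lambda[\rho]\,\mathrm{d}\lambda, \qquad W_\lambda[\rho] = \langle \Psi_\lambda[\rho]\,|\,\hat{V}_{\mathrm{ee}}\,|\,\Psi_\lambda[\rho]\rangle - U[\rho], \]

\[ F_\lambda[\rho] = \sup_{v}\left\{ E_\lambda[v] - \int v(\mathbf{r})\,\rho(\mathbf{r})\,\mathrm{d}\mathbf{r} \right\}, \]

where Ψ_λ[ρ] is the wavefunction that minimizes the expectation value of the kinetic energy plus λ times the electron-electron repulsion under the constraint of yielding the density ρ, U[ρ] is the Hartree (Coulomb) energy, and the Lieb maximization over potentials v gives the universal density functional F_λ[ρ] at coupling strength λ. Plotting W_λ against λ for accurately computed densities yields the adiabatic-connection curves that ABACUS will generate.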
Max ERC Funding
2 017 932 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym ABIOS
Project ABIOtic Synthesis of RNA: an investigation on how life started before biology existed
Researcher (PI) Guillaume STIRNEMANN
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2017-STG
Summary The emergence of life is one of the most fascinating and yet largely unsolved questions in the natural sciences, and thus a significant challenge for scientists from many disciplines. There is growing evidence that ribonucleic acid (RNA) polymers, which are capable of genetic information storage and self-catalysis, were involved in the early forms of life. But despite recent progress, RNA synthesis without biological machineries is very challenging. The current project aims at understanding how to synthesize RNA in abiotic conditions. I will solve problems associated with three critical aspects of RNA formation that I will rationalize at a molecular level: (i) accumulation of precursors, (ii) formation of a chemical bond between RNA monomers, and (iii) tolerance for alternative backbone sugars or linkages. Because I will study problems ranging from the formation of chemical bonds up to the stability of large biopolymers, I propose an original computational multi-scale approach combining techniques that range from quantum calculations to large-scale all-atom simulations, employed together with efficient enhanced-sampling algorithms, forcefield improvement, cutting-edge analysis methods and model development.
My objectives are the following:
1 • To explain why the poorly-understood thermally-driven process of thermophoresis can contribute to the accumulation of dilute precursors.
2 • To understand why linking RNA monomers with phosphoester bonds is so difficult, to understand the molecular mechanism of possible catalysts and to suggest key improvements.
3 • To rationalize the molecular basis for RNA tolerance for alternative backbone sugars or linkages that have probably been incorporated in abiotic conditions.
This unique in-silico laboratory setup should significantly impact our comprehension of life’s origin by overcoming major obstacles to RNA abiotic formation, and in addition will reveal significant orthogonal outcomes for (bio)technological applications.
Max ERC Funding
1 497 031 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym ACCLIMATE
Project Elucidating the Causes and Effects of Atlantic Circulation Changes through Model-Data Integration
Researcher (PI) Claire Waelbroeck
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary Rapid changes in ocean circulation and climate have been observed in marine sediment and ice cores, notably over the last 60 thousand years (ky), highlighting the non-linear character of the climate system and underlining the possibility of rapid climate shifts in response to anthropogenic greenhouse gas forcing.
To date, these rapid changes in climate and ocean circulation are still not fully explained. Two main obstacles prevent going beyond the current state of knowledge:
- Paleoclimatic proxy data are by their very nature only indirect indicators of the climatic variables, and thus cannot be directly compared with model outputs;
- A 4-D (latitude, longitude, water depth, time) reconstruction of Atlantic water masses over the past 40 ky is lacking: previous studies have generated isolated records with disparate timescales which do not allow the causes of circulation changes to be identified.
Overcoming these two major limitations will lead to major breakthroughs in climate research. Concretely, I will create the first database of Atlantic deep-sea records over the last 40 ky, and extract full climatic information from these records through an innovative model-data integration scheme using an isotopic proxy forward modeling approach. The novelty and exceptional potential of this scheme are twofold: (i) it avoids hypotheses on proxy interpretation and hence eliminates or strongly reduces the errors of interpretation of paleoclimatic records; (ii) it produces states of the climate system that best explain the observations over the last 40 ky, while being consistent with the model physics.
Expected results include:
• The elucidation of the mechanisms explaining rapid changes in ocean circulation and climate over the last 40 ky,
• Improved climate model physics and parameterizations,
• The first projections of future climate changes obtained with a model able to reproduce the highly non-linear behavior of the climate system observed over the last 40 ky.
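A minimal sketch of the proxy forward-modelling idea described above, assuming only a generic linear proxy relation (the function name, coefficients and noise level below are hypothetical placeholders, not the project's isotopic proxy models): the model output is mapped into proxy space and compared with the sediment-core data there, so the proxy record never has to be inverted.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical "model output": a time series of deep-water temperature (degC).
model_temperature = 4.0 + 1.5 * np.sin(np.linspace(0, 6 * np.pi, 400))

def forward_proxy(temperature, slope=-0.22, intercept=3.5, noise_sd=0.08, rng=None):
    """Map a climate variable into proxy space (placeholder linear relation + noise).
    The real project uses isotopic proxy forward models; this is only a schematic stand-in."""
    signal = intercept + slope * temperature
    if rng is not None:
        signal = signal + rng.normal(0.0, noise_sd, size=temperature.shape)
    return signal

# "Observed" sediment-core proxy values (synthesized here purely for illustration).
observed_proxy = forward_proxy(model_temperature + rng.normal(0, 0.3, 400), rng=rng)

# Model-data comparison happens in proxy space: no inversion of the proxy is needed.
simulated_proxy = forward_proxy(model_temperature)
misfit = np.sqrt(np.mean((simulated_proxy - observed_proxy) ** 2))
print(f"proxy-space RMS misfit: {misfit:.3f} (arbitrary proxy units)")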
Max ERC Funding
3 000 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym Actanthrope
Project Computational Foundations of Anthropomorphic Action
Researcher (PI) Jean Paul Laumond
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE7, ERC-2013-ADG
Summary Actanthrope intends to promote a neuro-robotics perspective to explore original models of anthropomorphic action. The project targets contributions to humanoid robot autonomy (for rescue and service robotics), to advanced human body simulation (for applications in ergonomics), and to a new theory of embodied intelligence (by promoting a motion-based semiotics of the human action).
Actions take place in the physical space while they originate in the (robot or human) sensory-motor space. Geometry is the core abstraction that makes the link between these spaces. Considering that the structure of actions inherits from that of the body, the underlying intuition is that actions can be segmented within discrete sub-spaces lying in the entire continuous posture space. Such sub-spaces are viewed as symbols bridging deliberative reasoning and reactive control. Actanthrope puts forward geometric approaches to motion segmentation and generation as promising and innovative routes to explore embodied intelligence:
- Motion segmentation: what are the sub-manifolds that define the structure of a given action?
- Motion generation: among all the solution paths within a given sub-manifold, what is the underlying law that makes the selection?
In Robotics these questions are related to the competition between abstract symbol manipulation and physical signal processing. In Computational Neuroscience the questions refer to the quest for motion invariants. The ambition of the project is to promote a dual perspective: exploring the computational foundations of human action to make better robots, while simultaneously doing better robotics to better understand human action.
A unique “Anthropomorphic Action Factory” supports the methodology. It aims at attracting to a single lab researchers with complementary know-how and solid mathematical backgrounds. All of them will benefit from unique equipment, while being stimulated by four challenges dealing with locomotion and manipulation actions.
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym ACTAR TPC
Project Active Target and Time Projection Chamber
Researcher (PI) Gwen Grinyer
Host Institution (HI) GRAND ACCELERATEUR NATIONAL D'IONS LOURDS
Call Details Starting Grant (StG), PE2, ERC-2013-StG
Summary The active target and time projection chamber (ACTAR TPC) is a novel gas-filled detection system that will permit new studies into the structure and decays of the most exotic nuclei. The use of a gas volume that acts as a sensitive detection medium and as the reaction target itself (an “active target”) offers considerable advantages over traditional nuclear physics detectors and techniques. In high-energy physics, TPC detectors have found profitable applications but their use in nuclear physics has been limited. With the ACTAR TPC design, individual detection pad sizes of 2 mm are the smallest ever attempted in either discipline but are a requirement for high-efficiency and high-resolution nuclear spectroscopy. The corresponding large number of electronic channels (16000 from a surface of only 25×25 cm) requires new developments in high-density electronics and data-acquisition systems that are not yet available in the nuclear physics domain. New experiments in regions of the nuclear chart that cannot be presently contemplated will become feasible with ACTAR TPC.
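A quick consistency check of the figures quoted above, assuming one electronic channel per detection pad (an assumption for illustration, not an official specification): tiling a 25 × 25 cm pad plane with 2 mm pads gives

# Pad-count arithmetic for a 25 x 25 cm pad plane with 2 mm square pads.
pad_size_mm = 2.0
plane_side_mm = 250.0                          # 25 cm
pads_per_side = plane_side_mm / pad_size_mm    # 125 pads along each side
total_pads = pads_per_side ** 2                # 15 625 pads, i.e. roughly the 16000 channels quoted
print(int(pads_per_side), int(total_pads))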
Max ERC Funding
1 290 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACTIVIA
Project Visual Recognition of Function and Intention
Researcher (PI) Ivan Laptev
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "Computer vision is concerned with the automated interpretation of images and video streams. Today's research is (mostly) aimed at answering queries such as ""Is this a picture of a dog?"", (classification) or sometimes ""Find the dog in this photo"" (detection). While categorisation and detection are useful for many tasks, inferring correct class labels is not the final answer to visual recognition. The categories and locations of objects do not provide direct understanding of their function i.e., how things work, what they can be used for, or how they can act and react. Such an understanding, however, would be highly desirable to answer currently unsolvable queries such as ""Am I in danger?"" or ""What can happen in this scene?"". Solving such queries is the aim of this proposal.
My goal is to uncover the functional properties of objects and the purpose of actions by addressing visual recognition from a different and yet unexplored perspective. The main novelty of this proposal is to leverage observations of people, i.e., their actions and interactions to automatically learn the use, the purpose and the function of objects and scenes from visual data. The project is timely as it builds upon the two key recent technological advances: (a) the immense progress in visual recognition of objects, scenes and human actions achieved in the last ten years, as well as (b) the emergence of a massive amount of public image and video data now available to train visual models.
ACTIVIA addresses fundamental research issues in automated interpretation of dynamic visual scenes, but its results are expected to serve as a basis for ground-breaking technological advances in practical applications. The recognition of functional properties and intentions as explored in this project will directly support high-impact applications such as detection of abnormal events, which are likely to revolutionise today's approaches to crime protection, hazard prevention, elderly care, and many others.
Max ERC Funding
1 497 420 €
Duration
Start date: 2013-01-01, End date: 2018-12-31
Project acronym ADAPT
Project Theory and Algorithms for Adaptive Particle Simulation
Researcher (PI) Stephane Redon
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "During the twentieth century, the development of macroscopic engineering has been largely stimulated by progress in digital prototyping: cars, planes, boats, etc. are nowadays designed and tested on computers. Digital prototypes have progressively replaced actual ones, and effective computer-aided engineering tools have helped cut costs and reduce production cycles of these macroscopic systems.
The twenty-first century is most likely to see a similar development at the atomic scale. Indeed, the recent years have seen tremendous progress in nanotechnology - in particular in the ability to control matter at the atomic scale. Similar to what has happened with macroscopic engineering, powerful and generic computational tools will be needed to engineer complex nanosystems, through modeling and simulation. As a result, a major challenge is to develop efficient simulation methods and algorithms.
NANO-D, the INRIA research group I started in January 2008 in Grenoble, France, aims at developing efficient computational methods for modeling and simulating complex nanosystems, both natural and artificial. In particular, NANO-D develops SAMSON, a software application which gathers all algorithms designed by the group and its collaborators (SAMSON: Software for Adaptive Modeling and Simulation Of Nanosystems).
In this project, I propose to develop a unified theory, and associated algorithms, for adaptive particle simulation. The proposed theory will avoid problems that plague current popular multi-scale or hybrid simulation approaches by simulating a single potential throughout the system, while allowing users to finely trade precision for computational speed.
I believe the full development of the adaptive particle simulation theory will have an important impact on current modeling and simulation practices, and will enable practical design of complex nanosystems on desktop computers, which should significantly boost the emergence of generic nano-engineering.
Max ERC Funding
1 476 882 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym ADEQUATE
Project Advanced optoelectronic Devices with Enhanced QUAntum efficiency at THz frEquencies
Researcher (PI) Carlo Sirtori
Host Institution (HI) UNIVERSITE PARIS DIDEROT - PARIS 7
Call Details Advanced Grant (AdG), PE3, ERC-2009-AdG
Summary The aim of this project is the realisation of efficient mid-infrared and THz optoelectronic emitters. This work is motivated by the fact that the spontaneous emission in this frequency range is characterized by an extremely long lifetime when compared to non-radiative processes, giving rise to devices with very low quantum efficiency. To this end we want to develop hybrid light-matter systems, already well known in quantum optics, within optoelectronic devices that will be driven by electrical injection. With this project we want to extend the field of optoelectronics by introducing some of the concepts of quantum optics, particularly light-matter strong coupling, into semiconductor devices. More precisely, this project aims at the implementation of novel optoelectronic emitters operating in the strong coupling regime between an intersubband excitation of a two-dimensional electron gas and a microcavity photonic mode. The quasiparticles arising from this coupling are called intersubband polaritons. The major difficulties and challenges of this project do not lie in the observation of these quantum effects, but in their exploitation for a specific function, in particular efficient electrical-to-optical conversion. To obtain efficient quantum emitters in the THz frequency range we will follow two different approaches: (i) in the first case we will try to exploit the additional characteristic time introduced into the system by the light-matter interaction in the strong (or ultra-strong) coupling regime; (ii) the second approach will exploit the fact that, under certain conditions, intersubband polaritons have a bosonic character; as a consequence they can undergo stimulated scattering, giving rise to polariton lasers, as has been shown for excitonic polaritons.
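For orientation, the strong-coupling regime mentioned above is usually introduced with a textbook two-coupled-oscillator picture (a simplified sketch, not the project's full model): a cavity mode of energy E_c and an intersubband transition of energy E_21, coupled with strength \hbar\Omega_R, hybridize into two polariton branches

E_\pm \;=\; \frac{E_c + E_{21}}{2} \;\pm\; \sqrt{(\hbar\Omega_R)^2 + \left(\frac{E_c - E_{21}}{2}\right)^2},

so that at resonance (E_c = E_{21}) the two intersubband-polariton branches are split by 2\hbar\Omega_R. Strong coupling requires this splitting to exceed the cavity and transition linewidths, and the ultra-strong regime corresponds to \hbar\Omega_R becoming a sizeable fraction of E_{21}.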
Max ERC Funding
1 761 000 €
Duration
Start date: 2010-05-01, End date: 2015-04-30
Project acronym AdOC
Project Advance Optical Clocks
Researcher (PI) Sebastien André Marcel Bize
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE2, ERC-2013-CoG
Summary "The proposed research program has three main objectives. The first and second objectives are to seek extreme precisions in optical atomic spectroscopy and optical clocks, and to use this quest as a mean of exploration in atomic physics. The third objective is to explore new possibilities that stem from extreme precision. These goals will be pursued via three complementary activities: #1: Search for extreme precisions with an Hg optical lattice clock. #2: Explore and exploit the rich Hg system, which is essentially unexplored in the cold and ultra-cold regime. #3: Identify new applications of clocks with extreme precision to Earth science. Clocks can measure directly the gravitational potential via Einstein’s gravitational redshift, leading to the idea of “clock-based geodesy”.
The first two activities are experimental and build on an existing setup, where we demonstrated the feasibility of an Hg optical lattice clock. Hg is chosen for its potential to surpass competing systems. We will investigate the unexplored physics of the Hg clock. This includes interactions between Hg atoms, lattice-induced light shifts, and sensitivity to external fields, which are specific to the atomic species. Beyond this, we will explore the fundamental limits of the optical lattice scheme. We will exploit other remarkable features of Hg associated with its high atomic number and the diversity of its stable isotopes. These features enable tests of fundamental physical laws, ultra-precise measurements of isotope shifts, measurements of collisional properties toward evaporative cooling and quantum gases of Hg, and investigation of forbidden transitions promising for measuring the nuclear anapole moment of Hg.
The third activity is theoretical and is aimed at initiating collaborations with experts in modelling Earth gravity. With this expertise, we will identify the most promising and realistic approaches for clocks and emerging remote comparison methods to contribute to geodesy, hydrology, oceanography, etc.
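The “clock-based geodesy” idea mentioned above rests on the standard gravitational redshift relation (a well-known result, quoted here only for orientation): two clocks separated by a height \Delta h in the Earth's gravitational field differ in fractional frequency by

\frac{\Delta\nu}{\nu} \;=\; \frac{g\,\Delta h}{c^{2}} \;\approx\; 1.1\times10^{-18} \ \text{per centimetre near the Earth's surface},

so a clock with fractional frequency uncertainty at the 10^{-18} level is, in effect, a centimetre-level probe of the local gravitational potential.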
Max ERC Funding
1 946 432 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym AdS-CFT-solvable
Project Origins of integrability in AdS/CFT correspondence
Researcher (PI) Vladimir Kazakov
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE2, ERC-2012-ADG_20120216
Summary Fundamental interactions in nature are well described by quantum gauge fields in 4 space-time dimensions (4d). When the strength of the gauge interaction is weak, Feynman perturbation techniques are very efficient for describing most of the experimentally observable consequences of the Standard Model and for studying high-energy processes in QCD.
But in the intermediate and strong coupling regime, such as the relatively small energies in QCD, perturbation theory fails, leaving us with no reliable analytic methods (except Monte-Carlo simulation). The project aims at working out new analytic and computational methods for strongly coupled gauge theories in 4d. We will employ for that two important discoveries: 1) the gauge-string duality (AdS/CFT correspondence) relating certain strongly coupled gauge conformal field theories to weakly coupled string theories on anti-de Sitter space; 2) the solvability, or integrability, of maximally supersymmetric (N=4) 4d super Yang-Mills (SYM) theory in the multicolor limit. Integrability made possible pioneering exact numerical and analytic results in the N=4 multicolor SYM at any coupling, effectively summing up all 4d Feynman diagrams. Recently, we conjectured a system of functional equations - the AdS/CFT Y-system - for the exact spectrum of anomalous dimensions of all local operators in N=4 SYM. The conjecture has passed all available checks. My project aims at understanding the origins of this still mysterious integrability. Deriving the AdS/CFT Y-system from first principles on both sides of the gauge-string duality should provide a long-awaited proof of the AdS/CFT correspondence itself. I plan to use the Y-system to study systematic weak and strong coupling expansions and the so-called BFKL limit, as well as for the calculation of multi-point correlation functions of N=4 SYM. We hope for new insights into the strong-coupling dynamics of less supersymmetric gauge theories and of QCD.
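For readers outside the field, the functional equations mentioned above have the universal Y-system form familiar from the integrability literature (written here schematically; the AdS/CFT version is defined on a "T-hook" domain with additional analyticity conditions, which are exactly what the project seeks to derive from first principles):

Y_{a,s}\!\left(u+\tfrac{i}{2}\right)\,Y_{a,s}\!\left(u-\tfrac{i}{2}\right) \;=\; \frac{\bigl(1+Y_{a,s+1}(u)\bigr)\bigl(1+Y_{a,s-1}(u)\bigr)}{\bigl(1+1/Y_{a+1,s}(u)\bigr)\bigl(1+1/Y_{a-1,s}(u)\bigr)}.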
Max ERC Funding
1 456 140 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym AEROFLEX
Project AEROelastic instabilities and control of FLEXible Structures
Researcher (PI) Olivier Pierre MARQUET
Host Institution (HI) OFFICE NATIONAL D'ETUDES ET DE RECHERCHES AEROSPATIALES
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary Aeroelastic instabilities are at the origin of large deformations of structures and limit the capacities of products in various industrial branches such as aeronautics, the marine industry, or wind electricity production. While suppressing aeroelastic instabilities is an ultimate goal, a paradigm shift in technological development is to take advantage of these instabilities to achieve other objectives, such as reducing the drag of these flexible structures. The ground-breaking challenges addressed in this project are to design fundamentally new theoretical methodologies for (i) describing aeroelastic instabilities mathematically, (ii) suppressing them and (iii) using them to reduce the mean drag of structures at a low energetic cost. To that aim, two types of aeroelastic phenomena will be specifically studied: flutter, which arises from an unstable coupling between two stable dynamics, that of the structure and that of the flow, and vortex-induced vibrations, which appear when the fluid dynamics is unstable. An aeroelastic global stability analysis will first be developed and applied to problems of increasing complexity, starting from two-dimensional free-vibrating rigid structures and progressing towards three-dimensional free-deforming elastic structures. The control of these aeroelastic instabilities will then be addressed with two different objectives: their suppression or their use for flow control. A theoretical passive control methodology will be established for suppressing linear aeroelastic instabilities, and extended to high Reynolds number flows and experimental configurations. New perturbation methods for solving strongly nonlinear problems and adjoint-based control algorithms will make it possible to use these aeroelastic instabilities for drag reduction. This project will allow innovative control solutions to emerge, not only in flutter or vortex-induced vibration problems, but also in a much broader class of fluid-structure problems.
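As background, flutter is classically introduced with the two-degree-of-freedom "typical section" model (a textbook sketch, not the three-dimensional deforming configurations targeted by the project): a wing section with plunge h and pitch \alpha obeys

m\,\ddot{h} + S_\alpha\,\ddot{\alpha} + K_h\,h = -L(t), \qquad S_\alpha\,\ddot{h} + I_\alpha\,\ddot{\alpha} + K_\alpha\,\alpha = M(t),

where S_\alpha is the static unbalance coupling the two degrees of freedom and L and M are the aerodynamic lift and pitching moment (signs depend on the chosen conventions). Flutter sets in at the flow speed where an eigenvalue of the coupled fluid-structure system crosses into the unstable half-plane even though each isolated dynamics is stable, which is exactly the coupled-mode instability described above.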
Max ERC Funding
1 377 290 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym AGENSI
Project A Genetic View into Past Sea Ice Variability in the Arctic
Researcher (PI) Stijn DE SCHEPPER
Host Institution (HI) NORCE NORWEGIAN RESEARCH CENTRE AS
Call Details Consolidator Grant (CoG), PE10, ERC-2018-COG
Summary Arctic sea ice decline is the most visible expression of the rapidly transforming Arctic climate. The ensuing local and global implications can be understood by studying past climate transitions, yet few methods are available to examine past Arctic sea ice cover, severely restricting our understanding of sea ice in the climate system. The decline in Arctic sea ice cover is a ‘canary in the coalmine’ for the state of our climate, and if greenhouse gas emissions remain unchecked, summer sea ice loss may pass a critical threshold that could drastically transform the Arctic. Because historical observations are limited, it is crucial to have reliable proxies for assessing natural sea ice variability, its stability and sensitivity to climate forcing on different time scales. Current proxies address aspects of sea ice variability, but are limited due to a selective fossil record, preservation effects, regional applicability, or being semi-quantitative. With such constraints on our knowledge about natural variations and drivers, major uncertainties about the future remain.
I propose to develop and apply a novel sea ice proxy that exploits genetic information stored in marine sediments, sedimentary ancient DNA (sedaDNA). This innovation uses the genetic signature of phytoplankton communities from surface waters and sea ice as it gets stored in sediments. This wealth of information has not been explored before for reconstructing sea ice conditions. Preliminary results from my cross-disciplinary team indicate that our unconventional approach can provide a detailed, qualitative account of past sea ice ecosystems and quantitative estimates of sea ice parameters. I will address fundamental questions about past Arctic sea ice variability on different timescales, information essential to provide a framework upon which to assess the ecological and socio-economic consequences of a changing Arctic. This new proxy is not limited to sea ice research and can transform the field of paleoceanography.
Max ERC Funding
2 615 858 €
Duration
Start date: 2019-08-01, End date: 2024-07-31
Project acronym AIRSEA
Project Air-Sea Exchanges driven by Light
Researcher (PI) Christian George
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2011-ADG_20110209
Summary The scientific motivation of this project is the significant presence of organic compounds at the surface of the ocean. They form the link between ocean biogeochemistry, through the physico-chemical processes near the water-air interface, and primary and secondary aerosol formation and evolution in the air aloft, and finally the climate impact of marine boundary layer aerosols. However, their photochemistry and photosensitizer properties have only been suggested and discussed but never fully addressed because they were beyond reach. This project proposes to go significantly beyond this state of affairs through a combination of innovative tools and the development of new ideas.
This project is therefore devoted to new laboratory investigations of processes occurring at the air-sea interface to predict emission, formation and evolution of halogenated radicals and aerosols from this vast interface between the oceans and the atmosphere. It draws on fundamental laboratory measurements, marine science, surface chemistry, photochemistry … and is thus interdisciplinary in nature.
It will lead to the development of innovative techniques for characterising chemical processing at the air sea interface (e.g., a multiphase atmospheric simulation chamber, a time-resolved fluorescence technique for characterising chemical processing at the air-sea interface). It will allow the assessment of new emerging ideas such as a quantitative description of the importance of photosensitized reactions in the visible at the air/sea interface as a major source of halogenated radicals and aerosols in the marine environment.
This new understanding will improve our ability to describe atmospheric chemistry in the marine environment, which has a strong impact not only on urban air quality in coastal regions (which are highly populated) but also on climate change, by providing new input for global climate models.
Max ERC Funding
2 366 276 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym ALLEGRO
Project Active large-scale learning for visual recognition
Researcher (PI) Cordelia Schmid
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE6, ERC-2012-ADG_20120216
Summary A massive and ever-growing amount of digital image and video content is available today on sites such as Flickr and YouTube, in audiovisual archives such as those of BBC and INA, and in personal collections. In most cases, it comes with additional information, such as text, audio or other metadata, that forms a rather sparse and noisy, yet rich and diverse source of annotation, ideally suited to emerging weakly supervised and active machine learning technology. The ALLEGRO project will take visual recognition to the next level by using this largely untapped source of data to automatically learn visual models. The main research objective of our project is the development of new algorithms and computer software capable of autonomously exploring evolving data collections, selecting the relevant information, and determining the visual models most appropriate for different object, scene, and activity categories. An emphasis will be put on learning visual models from video, a particularly rich source of information, and on the representation of human activities, one of today's most challenging problems in computer vision. Although this project addresses fundamental research issues, it is expected to result in significant advances in high-impact applications that range from visual mining of the Web and automated annotation and organization of family photo and video albums to large-scale information retrieval in television archives.
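To make the weakly supervised setting concrete, here is a minimal, self-contained sketch (synthetic data and hypothetical feature dimensions, not the project's algorithms): each "video" is a bag of frame descriptors carrying only a bag-level tag, and a linear scorer trained with max pooling learns to localise the tagged concept at the frame level.

import numpy as np

rng = np.random.default_rng(1)

# Synthetic "videos": each bag holds 10 frame descriptors (2-D for simplicity).
# A positive tag only says that *some* frame contains the concept, not which one.
def make_bag(positive):
    frames = rng.normal(0.0, 1.0, size=(10, 2))
    if positive:
        frames[rng.integers(10)] += np.array([3.0, 3.0])  # one relevant frame
    return frames

bags = [make_bag(i % 2 == 0) for i in range(200)]
labels = np.array([1.0 if i % 2 == 0 else 0.0 for i in range(200)])

w, b, lr = np.zeros(2), 0.0, 0.1
for epoch in range(50):
    for frames, y in zip(bags, labels):
        scores = frames @ w + b
        k = int(np.argmax(scores))              # frame the current model finds most relevant
        p = 1.0 / (1.0 + np.exp(-scores[k]))    # bag-level probability via max pooling
        grad = p - y                            # d(log-loss)/d(score) at the pooled frame
        w -= lr * grad * frames[k]
        b -= lr * grad

# The frame-level scores (frames @ w + b) now localise the concept inside positive bags,
# even though training only ever saw bag-level tags.

On real data, visual features would replace the synthetic descriptors and the noisy metadata mentioned above would play the role of the bag-level tags.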
Max ERC Funding
2 493 322 €
Duration
Start date: 2013-04-01, End date: 2019-03-31
Project acronym AlmaCrypt
Project Algorithmic and Mathematical Cryptology
Researcher (PI) Antoine Joux
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE6, ERC-2014-ADG
Summary Cryptology is a foundation of information security in the digital world. Today's internet is protected by a form of cryptography based on complexity theoretic hardness assumptions. Ideally, they should be strong to ensure security and versatile to offer a wide range of functionalities and allow efficient implementations. However, these assumptions are largely untested and internet security could be built on sand.
The main ambition of Almacrypt is to remedy this issue by challenging the assumptions through an advanced algorithmic analysis.
In particular, this proposal questions the two pillars of public-key encryption: factoring and discrete logarithms. Recently, the PI contributed to showing that in some cases the discrete logarithm problem is considerably weaker than previously assumed. A main objective is to assess the security of other cases of the discrete logarithm problem, including elliptic curves, and of factoring. We will study the generalization of the recent techniques and search for new algorithmic options with comparable or better efficiency.
We will also study hardness assumptions based on codes and subset-sum, two candidates for post-quantum cryptography. We will consider the applicability of recent algorithmic and mathematical techniques to the resolution of the corresponding putative hard problems, refine the analysis of the algorithms and design new algorithmic tools.
Cryptology is not limited to the above assumptions: other hard problems have been proposed to aim at post-quantum security and/or to offer extra functionalities. Should the security of these other assumptions become critical, they would be added to Almacrypt's scope. They could also serve to demonstrate other applications of our algorithmic progress.
In addition to its scientific goal, Almacrypt also aims at seeding a strengthened research community dedicated to algorithmic and mathematical cryptology.
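To make concrete what "computing discrete logarithms" means, here is the classic generic baby-step giant-step algorithm (a textbook method shown only for orientation; the algorithmic advances pursued in the project are far more sophisticated and target much larger, structured groups):

from math import isqrt

def bsgs(g, h, p):
    """Baby-step giant-step: find x with g**x = h (mod p), for prime modulus p.
    Runs in O(sqrt(p)) time and memory, which is why the discrete logarithm is
    only considered hard when the group order is large."""
    m = isqrt(p - 1) + 1
    baby = {pow(g, j, p): j for j in range(m)}   # baby steps: g^j for j in [0, m)
    factor = pow(g, -m, p)                       # g^{-m} mod p (Python >= 3.8)
    gamma = h
    for i in range(m):                           # giant steps: h * g^{-i*m}
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = (gamma * factor) % p
    return None

# Toy example: 2 generates the multiplicative group mod 101.
assert pow(2, bsgs(2, 37, 101), 101) == 37

Its O(sqrt(p)) cost is the generic benchmark that any group used for public-key cryptography must comfortably exceed; the proposal's point is precisely that for some groups much faster, specialised algorithms exist or may yet be found.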
Max ERC Funding
2 403 125 €
Duration
Start date: 2016-01-01, End date: 2021-12-31
Project acronym ALOGLADIS
Project From Anderson localization to Bose, Fermi and spin glasses in disordered ultracold gases
Researcher (PI) Laurent Sanchez-Palencia
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2010-StG_20091028
Summary The field of disordered quantum gases is developing rapidly. Dramatic progress has been achieved recently and the first experimental observation of one-dimensional Anderson localization (AL) of matter waves has been reported using Bose-Einstein condensates in controlled disorder (in our group at Institut d'Optique and at LENS; Nature, 2008). This dramatic success results from joint theoretical and experimental efforts to which we have contributed. Most importantly, it opens unprecedented routes to pursue several outstanding challenges in the multidisciplinary field of disordered systems, which, after fifty years of Anderson localization, is more active than ever.
This theoretical project aims at further developing the emerging field of disordered quantum gases towards novel challenges. Our aim is twofold. First, we will propose and analyze schemes where experiments on ultracold atoms can address unsolved issues: AL in dimensions higher than one, effects of inter-atomic interactions on AL, strongly-correlated disordered gases and quantum simulators for spin systems (spin glasses). Second, by taking into account specific features of ultracold atoms, beyond standard toy models, we will raise and study new questions which have not been addressed before (e.g. long-range correlations of speckle potentials, finite-size effects, controlled interactions). Both aspects would open new frontiers for disordered quantum gases and offer new possibilities for shedding light on highly debated issues.
Our main concerns are thus to (i) study situations relevant to experiments, (ii) develop new approaches applicable to ultracold atoms, (iii) identify key observables, and (iv) propose new challenging experiments. In this project, we will benefit from the original situation of our theory team: it is independent but forms part of a larger group (led by A. Aspect), which is a world leader in experiments on disordered quantum gases and with which we have already developed a close collaborative relationship.
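As a toy numerical illustration of the Anderson localization discussed above (a minimal sketch of the standard 1-D tight-binding model with uncorrelated on-site disorder, not the interacting, speckle-potential systems targeted by the project):

import numpy as np

# 1-D Anderson tight-binding model: hopping -t between neighbours plus random
# on-site energies eps_n drawn uniformly from [-W/2, W/2]. In 1-D every eigenstate
# is localized for any W > 0; the inverse participation ratio (IPR) makes this visible.
rng = np.random.default_rng(0)
N, t, W = 400, 1.0, 2.0

eps = rng.uniform(-W / 2, W / 2, size=N)
H = np.diag(eps) - t * (np.eye(N, k=1) + np.eye(N, k=-1))
energies, states = np.linalg.eigh(H)

ipr = np.sum(np.abs(states) ** 4, axis=0)   # ~1/N for extended states, O(1) if localized
print(f"median participation number: {np.median(1.0 / ipr):.1f} sites out of {N}")

Increasing the disorder strength W shrinks the participation number 1/IPR, i.e. the eigenstates become localized on ever fewer lattice sites.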
Max ERC Funding
985 200 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym aLzINK
Project Alzheimer's disease and Zinc: the missing link ?
Researcher (PI) Christelle Sandrine Florence HUREAU-SABATER
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary Alzheimer's disease (AD) is one of the most serious diseases mankind is now facing, as its social and economic impacts are increasing rapidly. AD is very complex, and the amyloid-β (Aβ) peptide as well as metal ions (mainly copper and zinc) have been linked to its aetiology. While the deleterious impact of Cu is widely acknowledged, the involvement of Zn is certain but remains to be elucidated.
The main objective of the present proposal, which is strongly anchored in the bio-inorganic chemistry field at the interface with spectroscopy and biochemistry, is to design, synthesize and study new drug candidates (ligands L) capable of (i) targeting Cu(II) bound to Aβ within the synaptic cleft, where Zn is co-localized, ultimately to achieve Zn-driven Cu(II) removal from Aβ, and (ii) disrupting the aberrant Cu(II)-Aβ interactions involved in ROS production and Aβ aggregation, two deleterious events in AD. The drug candidates will thus have high Cu(II) over Zn selectivity to preserve the crucial physiological role of Zn in the neurotransmission process. Zn is always underestimated (if not completely neglected) in current therapeutic approaches targeting Cu(II), despite the known interference of Zn with Cu(II) binding.
To reach this objective, it is absolutely necessary to first understand metal-ion trafficking in the presence of Aβ alone at a molecular level (i.e. without the drug candidates). This includes: (i) determination of the Zn binding site on Aβ and its impact on Aβ aggregation and cell toxicity, and (ii) determination of the mutual influence of Zn and Cu on their coordination to Aβ and its impact on Aβ aggregation, ROS production and cell toxicity.
Methods used will span from organic synthesis to studies of neuronal model cells, with a major contribution from a wide panel of spectroscopic techniques including NMR, EPR, mass spectrometry, fluorescence, UV-Vis, circular dichroism and X-ray absorption spectroscopy.
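One way to phrase the design constraint on the ligands L described above (a hedged schematic in terms of conditional binding constants, not the proposal's own formulation) is a metal-swap equilibrium that must be displaced to the right while Zn binding to L stays comparatively weak:

\mathrm{Cu(II)}\text{-}\mathrm{A\beta} + \mathrm{L} \;\rightleftharpoons\; \mathrm{Cu(II)}\text{-}\mathrm{L} + \mathrm{A\beta},
\qquad
\frac{K_{\mathrm{Cu(II)L}}}{K_{\mathrm{Cu(II)A\beta}}} \gg 1,
\qquad
\frac{K_{\mathrm{Cu(II)L}}}{K_{\mathrm{ZnL}}} \gg 1 .

The first ratio favours Cu(II) removal from Aβ; the second expresses the Cu(II)-over-Zn selectivity needed to leave synaptic Zn available for neurotransmission.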
Max ERC Funding
1 499 948 €
Duration
Start date: 2015-03-01, End date: 2020-02-29
Project acronym AMPERE
Project Accounting for Metallicity, Polarization of the Electrolyte, and Redox reactions in computational Electrochemistry
Researcher (PI) Mathieu Eric Salanne
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Consolidator Grant (CoG), PE4, ERC-2017-COG
Summary Applied electrochemistry plays a key role in many technologies, such as batteries, fuel cells, supercapacitors or solar cells. It is therefore at the core of many research programs all over the world. Yet, fundamental electrochemical investigations remain scarce. In particular, electrochemistry is among the fields for which the gap between theory and experiment is the largest. From the computational point of view, there is no molecular dynamics (MD) software devoted to the simulation of electrochemical systems, while other fields such as biochemistry (GROMACS) or materials science (LAMMPS) have dedicated tools. This is due to the difficulty of accounting for complex effects arising from (i) the degree of metallicity of the electrode (i.e. from semimetals to perfect conductors), (ii) the mutual polarization occurring at the electrode/electrolyte interface and (iii) the redox reactivity through explicit electron transfers. Current understanding therefore relies on standard theories that derive from an inaccurate molecular-scale picture. My objective is to fill this gap by introducing a whole set of new methods for simulating electrochemical systems. They will be provided to the computational electrochemistry community as cutting-edge MD software adapted to supercomputers. First applications will aim at the discovery of new electrolytes for energy storage. Here I will focus on (1) "water-in-salts" to understand why these revolutionary liquids enable much higher voltages than conventional solutions, and (2) redox reactions inside a nanoporous electrode to support the development of future capacitive energy storage devices. These selected applications are timely and rely on collaborations with leading experimental partners. The results are expected to shed unprecedented light on the importance of polarization effects on the structure and the reactivity of electrode/electrolyte interfaces, establishing MD as a prominent tool for solving complex electrochemistry problems.
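As a minimal sketch of the mutual-polarization issue mentioned in point (ii) (an illustrative toy under the constant-potential picture, with arbitrary numbers; it is not the software proposed here), electrode charges can be obtained by imposing a common applied potential on every electrode site, which reduces to a linear solve once the electrode-electrode coupling matrix and the potential created by the electrolyte are known.

import numpy as np

def constant_potential_charges(coupling, electrolyte_potential, applied_potential):
    # Impose V_applied on every electrode site: coupling @ q + electrolyte_potential = V_applied
    return np.linalg.solve(coupling, applied_potential - electrolyte_potential)

# Toy 3-site electrode with a symmetric positive-definite coupling matrix (arbitrary units)
A = np.array([[2.0, 0.3, 0.1],
              [0.3, 2.0, 0.3],
              [0.1, 0.3, 2.0]])
q = constant_potential_charges(A,
                               electrolyte_potential=np.array([0.10, -0.20, 0.05]),
                               applied_potential=np.full(3, 1.0))
print(q)  # electrode charges readjust as the electrolyte configuration changes

In a full simulation an equivalent solve (or minimization) has to be repeated at every time step for thousands of electrode atoms, which is one reason a dedicated, supercomputer-ready implementation is needed.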
Max ERC Funding
1 588 769 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym ANAMORPHISM
Project Asymptotic and Numerical Analysis of MOdels of Resonant Physics Involving Structured Materials
Researcher (PI) Sebastien Roger Louis Guenneau
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary One already available method to expand the range of material properties is to adjust the composition of materials at the molecular level using chemistry. We would like to develop the alternative approach of homogenization, which broadens the definition of a material to include artificially structured media (fluids and solids) in which the effective electromagnetic, hydrodynamic or elastic responses result from a macroscopic patterning or arrangement of two or more distinct materials. This project will explore the latter avenue in order to markedly enhance control of surface water waves and elastodynamic waves propagating within artificially structured fluids and solid materials, hereafter called acoustic metamaterials.
Pendry's perfect lens, the paradigm of electromagnetic metamaterials, is a slab of negative refractive index material that takes rays of light and causes them to converge with unprecedented resolution. This flat lens is a combination of periodically arranged resonant electric and magnetic elements. We will draw systematic analogies with resonant mechanical systems in order to achieve similar control of hydrodynamic and elastic waves. This will allow us to extend the design of metamaterials to acoustics to go beyond the scope of Snell-Descartes' laws of optics and Newton's laws of mechanics.
Acoustic metamaterials allow the construction of invisibility cloaks for non-linear surface water waves (e.g. tsunamis) propagating in structured fluids, as well as for seismic waves propagating in thin structured elastic plates.
Maritime and civil engineering applications include the protection of harbours, off-shore platforms and anti-earthquake passive systems. Acoustic cloaks for enhanced control of pressure waves in fluids will also be designed for underwater camouflaging.
The interplay of light and sound will finally be analysed in order to design controllable metamaterials, with a special emphasis on undetectable microstructured fibres (acoustic wormholes).
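For context on going beyond Snell-Descartes' laws (a standard textbook relation, quoted here for illustration rather than as a result of this project), the flat-lens behaviour of Pendry's slab follows from applying the usual refraction law with a negative index:

n_{1}\sin\theta_{1} = n_{2}\sin\theta_{2}, \qquad n_{2} < 0 \;\Rightarrow\; \theta_{2} < 0,

so the refracted ray emerges on the same side of the normal as the incident ray and a parallel-faced slab refocuses a point source; the project seeks analogous effective-parameter control for water and elastic waves.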
Max ERC Funding
1 280 391 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym ANDLICA
Project Anderson Localization of Light by Cold Atoms
Researcher (PI) Robin KAISER
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE2, ERC-2018-ADG
Summary I propose to use large clouds of cold Ytterbium atoms to observe Anderson localization of light in three dimensions, which has challenged theoreticians and experimentalists for many decades.
After the prediction by Anderson of a disorder-induced conductor-to-insulator transition for electrons, light has been proposed as an ideal non-interacting wave to explore coherent transport properties in the absence of interactions. Developments in experiment and theory over the past several years have shown a route towards the experimental realization of this phase transition.
Previous studies on Anderson localization of light using semiconductor powders or dielectric particles have shown that intrinsic material properties, such as absorption or inelastic scattering of light, need to be taken into account in the interpretation of experimental signatures of Anderson localization. Laser-cooled clouds of atoms avoid the problems of the samples used so far to study Anderson localization of light. Ab initio theoretical models, available for cold Ytterbium atoms, have shown that a high spatial density of the scattering sample is not by itself sufficient to allow for Anderson localization of photons in three dimensions, but that an additional magnetic field or additional disorder on the level shifts can induce a phase transition in three dimensions.
The role of disorder in atom-light interactions has important consequences for the next generation of high precision atomic clocks and quantum memories. By connecting the mesoscopic physics approach to quantum optics and cooperative scattering, this project will allow better control of cold atoms as building blocks of future quantum technologies. Time-resolved transport experiments will connect super- and subradiant assisted transmission with the extended and localized eigenstates of the system.
Having pioneered studies on weak localization and cooperative scattering enables me to diagnose strong localization of light by cold atoms.
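As an illustrative sketch of the kind of ab initio coupled-dipole analysis mentioned above (a scalar-wave convention with arbitrary parameters, assumed here for illustration; it is not the project's code), the collective modes of N point scatterers follow from the spectrum of a random Green's matrix:

import numpy as np

def greens_matrix_spectrum(n_atoms=200, density=0.1, k0=1.0, seed=1):
    # N scatterers at random positions in a cubic box of the requested number density
    rng = np.random.default_rng(seed)
    box = (n_atoms / density) ** (1.0 / 3.0)
    pos = rng.uniform(0.0, box, size=(n_atoms, 3))
    r = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)
    np.fill_diagonal(r, 1.0)                  # placeholder, overwritten just below
    g = np.exp(1j * k0 * r) / (k0 * r)        # scalar-wave propagator between scatterers
    np.fill_diagonal(g, 0.0)                  # no self-interaction term in this convention
    return np.linalg.eigvals(g)

modes = greens_matrix_spectrum()
print(modes.imag.min(), modes.imag.max())     # spread of imaginary parts maps onto sub-/superradiant decay rates

Long-lived, spatially confined collective modes in such spectra are the theoretical counterpart of the localized eigenstates that the time-resolved transport experiments aim to probe.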
Max ERC Funding
2 490 717 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym ANISOTROPIC UNIVERSE
Project The anisotropic universe -- a reality or fluke?
Researcher (PI) Hans Kristian Kamfjord Eriksen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE9, ERC-2010-StG_20091028
Summary "During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few ""blemishes"" have turned up, apparently not consistent with the standard model: While the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies considering nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints towards unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of the reports can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is currently about to change, as the next generation of new and far more powerful experiments are coming online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys, and in the future Euclid, a proposed galaxy survey satellite also funded by ESA. By combining the world s best data from both CMB and LSS measurements, I will in the proposed project attempt to settle this question: Is our universe really anisotropic? Or are these recent claims only the results of systematic errors or statistical flukes? If the claims turn out to hold against this tide of new and high-quality data, then cosmology as a whole may need to be re-written."
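As one concrete example of what a "preferred direction" can mean statistically (a parameterization commonly used in the CMB anomaly literature, quoted for illustration and not claimed to be this project's specific estimator), the temperature field can be tested against a dipole-modulated sky,

\Delta T(\hat{n}) = \bigl[1 + A\,\hat{p}\cdot\hat{n}\bigr]\,\Delta T_{\mathrm{iso}}(\hat{n}),

where the amplitude A and direction \hat{p} are fitted to the data and A = 0 recovers statistical isotropy.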
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym ANTICS
Project Algorithmic Number Theory in Computer Science
Researcher (PI) Andreas Enge
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "During the past twenty years, we have witnessed profound technological changes, summarised under the terms of digital revolution or entering the information age. It is evident that these technological changes will have a deep societal impact, and questions of privacy and security are primordial to ensure the survival of a free and open society.
Cryptology is a main building block of any security solution, and at the heart of projects such as electronic identity and health cards, access control, digital content distribution or electronic voting, to mention only a few important applications. During the past decades, public-key cryptology has established itself as a research topic in computer science; tools of theoretical computer science are employed to “prove” the security of cryptographic primitives such as encryption or digital signatures and of more complex protocols. It is often forgotten, however, that all practically relevant public-key cryptosystems are rooted in pure mathematics, in particular, number theory and arithmetic geometry. In fact, the socalled security “proofs” are all conditional to the algorithmic untractability of certain number theoretic problems, such as factorisation of large integers or discrete logarithms in algebraic curves. Unfortunately, there is a large cultural gap between computer scientists using a black-box security reduction to a supposedly hard problem in algorithmic number theory and number theorists, who are often interested in solving small and easy instances of the same problem. The theoretical grounds on which current algorithmic number theory operates are actually rather shaky, and cryptologists are generally unaware of this fact.
The central goal of ANTICS is to rebuild algorithmic number theory on the firm grounds of theoretical computer science."
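As a toy illustration of the kind of number-theoretic hardness assumption referred to above (hypothetical values and brute-force search only), recovering a discrete logarithm is trivial for tiny moduli, while no efficient classical algorithm is known at cryptographic sizes:

def discrete_log(base, target, p):
    # Smallest x with base**x % p == target, by exhaustive search (toy sizes only).
    value = 1
    for x in range(p - 1):
        if value == target:
            return x
        value = (value * base) % p
    raise ValueError("no solution")

p, g = 1019, 2            # toy prime and base; real systems use moduli of hundreds of digits
x_secret = 357
h = pow(g, x_secret, p)   # easy direction: modular exponentiation
assert discrete_log(g, h, p) == x_secret   # hard direction, feasible here only because p is tiny

The asymmetry between the easy forward direction and the (conjecturally) hard inverse direction is exactly the unproven assumption on which the security "proofs" rest.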
Max ERC Funding
1 453 507 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym APOGEE
Project Atomic-scale physics of single-photon sources.
Researcher (PI) GUILLAUME ARTHUR FRANCOIS SCHULL
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2017-COG
Summary Single-photon sources (SPSs) are systems capable of emitting photons one by one. These sources are of major importance for quantum-information science and applications. SPS experiments generally rely on the optical excitation of two-level systems of atomic-scale dimensions (single molecules, vacancies in diamond…). Many fundamental questions related to the nature of these sources and the impact of their environment remain to be explored:
Can SPSs be addressed with atomic-scale spatial accuracy? How do the nanometer-scale distance or the orientation between two (or more) SPSs affect their emission properties? Does coherence emerge from the proximity between the sources? Do these structures still behave as SPSs or do they lead to the emission of correlated photons? How can we then control the degree of entanglement between the sources? Can we remotely excite the emission of these sources by using molecular chains as charge-carrying wires? Can we couple SPSs embedded in one- or two-dimensional arrays? How does mechanical stress or localised plasmons affect the properties of an electrically-driven SPS?
Answering these questions requires probing, manipulating and exciting SPSs with atomic-scale precision. This is beyond what is attainable with an all-optical method. Since they can be confined to atomic-scale pathways, we propose to use electrons rather than photons to excite the SPSs. This unconventional approach provides direct access to the atomic-scale physics of SPSs and is relevant for the implementation of these sources in hybrid devices combining electronic and photonic components. To this end, a scanning probe microscope will be developed that provides simultaneous spatial, chemical, spectral, and temporal resolution. Single molecules and defects in monolayer transition metal dichalcogenides are the SPSs that will be studied in the project, and they are respectively of interest for fundamental and more applied issues.
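For reference, a standard figure of merit used to certify single-photon emission (quoted as general background from the field, not as a deliverable of this proposal) is the second-order intensity correlation function,

g^{(2)}(\tau) = \frac{\langle I(t)\,I(t+\tau)\rangle}{\langle I(t)\rangle^{2}}, \qquad g^{(2)}(0) < \tfrac{1}{2}\ \text{indicating a dominant single emitter},

which is why time-resolved detection is built into the instrument alongside the spatial, chemical and spectral channels.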
Max ERC Funding
1 996 848 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym APPL
Project Anionic PhosPhoLipids in plant receptor kinase signaling
Researcher (PI) Yvon Jaillais
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS3, ERC-2013-StG
Summary "In plants, receptor kinases form the largest family of plasma membrane (PM) receptors and they are involved in virtually all aspects of the plant life, including development, immunity and reproduction. In animals, key molecules that orchestrate the recruitment of signaling proteins to membranes are anionic phospholipids (e.g. phosphatidylinositol phosphate or PIPs). Besides, recent reports in animal and yeast cells suggest the existence of PM nanodomains that are independent of cholesterol and lipid phase and rely on anionic phospholipids as well as electrostatic protein/lipid interactions. Strikingly, we know very little on the role of anionic phospholipids in plant signaling. However, our preliminary data suggest that BKI1, an inhibitory protein of the steroid receptor kinase BRI1, interacts with various PIPs in vitro and is likely targeted to the PM by electrostatic interactions with these anionic lipids. These results open the possibility that BRI1, but also other receptor kinases, might be regulated by anionic phospholipids in plants. Here, we propose to analyze the function of anionic phospholipids in BRI1 signaling, using the root epidermis as a model system. First, we will ask what are the lipids that control membrane surface charge in this tissue and recruit BR-signaling component to the PM. Second, we will probe the presence of PIP-enriched nanodomains at the plant PM using super-resolution microscopy techniques and investigate the roles of these domains in BRI1 signaling. Finally, we will analyze the function of the BKI1-related plant-specific family of anionic phospholipid effectors in plant development. In summary, using a transversal approach ranging from in vitro studies to in vivo validation and whole organism physiology, this work will unravel the interplay between anionic phospholipids and receptor signaling in plants."
Max ERC Funding
1 797 840 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym AQUARAMAN
Project Pipet Based Scanning Probe Microscopy Tip-Enhanced Raman Spectroscopy: A Novel Approach for TERS in Liquids
Researcher (PI) Aleix Garcia Guell
Host Institution (HI) ECOLE POLYTECHNIQUE
Call Details Starting Grant (StG), PE4, ERC-2016-STG
Summary Tip-enhanced Raman spectroscopy (TERS) is often described as the most powerful tool for optical characterization of surfaces and their proximities. It combines the intrinsic spatial resolution of scanning probe techniques (AFM or STM) with the chemical information content of vibrational Raman spectroscopy. Capable of revealing surface heterogeneity at the nanoscale, TERS is currently playing a fundamental role in the understanding of interfacial physicochemical processes in key areas of science and technology such as chemistry, biology and materials science.
Unfortunately, the undeniable potential of TERS as a label-free tool for nanoscale chemical and structural characterization is, nowadays, limited to air and vacuum environments, and it fails to operate in a reliable and systematic manner in liquid. The reasons are more technical than fundamental: what hinders the application of TERS in water is, among other issues, the low stability of the probes and their lack of consistency. Fields of science and technology where the presence of water/electrolyte is unavoidable, such as biology and electrochemistry, remain unexplored with this powerful technique.
We propose a revolutionary approach for TERS in liquids based on the use of pipet-based scanning probe microscopy techniques (pb-SPM) as an alternative to AFM and STM. The use of recent but well-established pb-SPM brings the opportunity to develop unprecedented pipet-based TERS probes (beyond the classic and limited metallized solid probes from AFM and STM), together with the implementation of ingenious and innovative measures to enhance tip stability, sensitivity and reliability, unattainable with the current techniques.
We will be in possession of a unique nano-spectroscopy platform capable of experiments in liquids, able to follow dynamic processes in situ, addressing fundamental questions and bringing insight into interfacial phenomena spanning materials science, physics, chemistry and biology.
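For context (a commonly used approximation in the tip-enhanced Raman literature, not a claim of this proposal), the Raman signal enhancement at the tip apex is often estimated from the local field enhancement as

\mathrm{EF} \;\approx\; \left|\frac{E(\omega_{\mathrm{inc}})}{E_{0}}\right|^{2}\left|\frac{E(\omega_{\mathrm{scat}})}{E_{0}}\right|^{2} \;\approx\; \left|\frac{E}{E_{0}}\right|^{4},

which is why preserving a stable, well-defined plasmonic tip in liquid is so critical to probe sensitivity and reliability.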
Max ERC Funding
1 528 442 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym ARCHEIS
Project Understanding the onset and impact of Aquatic Resource Consumption in Human Evolution using novel Isotopic tracerS
Researcher (PI) Klervia Marie Madalen JAOUEN
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE10, ERC-2018-STG
Summary The onset of the systematic consumption of marine resources is thought to mark a turning point for the hominin lineage. To date, this onset cannot be traced, since classic isotope markers are not preserved beyond 50-100 ky. Aquatic food products are essential in human nutrition as the main source of polyunsaturated fatty acids in hunter-gatherer diets. The exploitation of marine resources is also thought to have reduced human mobility and enhanced social and technological complexification. Systematic aquatic food consumption could well have been a distinctive feature of the Homo sapiens species among its fellow hominins, and has been linked to the astonishing leap in human intelligence and consciousness. Yet, this hypothesis is challenged by the existence of mollusk and marine mammal bone remains at Neanderthal archeological sites. Recent work demonstrated the sensitivity of Zn isotope composition in bioapatite, the mineral part of bones and teeth, to dietary Zn. By combining classic (C and C/N isotope analyses) and innovative techniques (compound-specific C/N and bulk Zn isotope analyses), I will develop a suite of sensitive tracers for shellfish, fish and marine mammal consumption. Shellfish consumption will be investigated by comparing various South American and European prehistoric populations from the Atlantic coast associated with shell middens and fish mounds. Marine mammal consumption will be traced using an Inuit population of Arctic Canada and the Wairau Bar population of New Zealand. C/N/Zn isotope compositions of various aquatic products will also be assessed, as well as isotope fractionation during intestinal absorption. I will then use the fully calibrated isotope tools to detect and characterize the onset of marine food exploitation in human history, which will answer the question of its specificity to our species. Neanderthal, early modern human and possibly other hominin remains from coastal and inland sites will be compared for that purpose.
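For readers unfamiliar with the notation (standard isotope conventions, plus a deliberately simplified two-endmember mixing sketch that ignores trophic and metabolic offsets), isotope compositions are reported in delta units and a marine dietary fraction f can be approximated, to first order, by linear mixing:

\delta^{66}\mathrm{Zn} = \left(\frac{({}^{66}\mathrm{Zn}/{}^{64}\mathrm{Zn})_{\mathrm{sample}}}{({}^{66}\mathrm{Zn}/{}^{64}\mathrm{Zn})_{\mathrm{standard}}} - 1\right)\times 1000,
\qquad
\delta_{\mathrm{diet}} \approx f\,\delta_{\mathrm{marine}} + (1-f)\,\delta_{\mathrm{terrestrial}} .

Calibrating the offsets between diet and bioapatite, and the endmember values for aquatic products, is precisely what the reference populations and fractionation experiments are for.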
Max ERC Funding
1 361 991 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym ARENA
Project Arrays of entangled atoms
Researcher (PI) Antoine Browaeys
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2009-StG
Summary The goal of this project is to prepare in a deterministic way, and then to characterize, various entangled states of up to 25 individual atoms held in an array of optical tweezers. Such a system provides a new arena to explore quantum entangled states of a large number of particles. Entanglement is the existence of quantum correlations between different parts of a system, and it is recognized as an essential property that distinguishes the quantum and the classical worlds. It is also a resource in various areas of physics, such as quantum information processing, quantum metrology, correlated quantum systems and quantum simulation. In the proposed design, each site is individually addressable, which enables single atom manipulation and detection. This will provide the largest entangled state ever produced and fully characterized at the individual particle level. The experiment will be implemented by combining two crucial novel features that I was able to demonstrate very recently: first, the manipulation of quantum bits written on long-lived hyperfine ground states of single ultra-cold atoms trapped in microscopic optical tweezers; second, the generation of entanglement by using the strong long-range interactions between Rydberg states. These interactions lead to the so-called 'dipole blockade', and enable the preparation of various classes of entangled states, such as states carrying only one excitation (W states), and states analogous to Schrödinger's cats (GHZ states). Finally, I will also explore strategies to protect these states against decoherence, developed in the framework of fault-tolerant and topological quantum computing. This project therefore combines an experimental challenge and the exploration of entanglement in a mesoscopic system.
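For reference, the two families of entangled states named above have the standard forms

|W_N\rangle = \frac{1}{\sqrt{N}}\bigl(|10\cdots 0\rangle + |01\cdots 0\rangle + \cdots + |00\cdots 1\rangle\bigr),
\qquad
|\mathrm{GHZ}_N\rangle = \frac{1}{\sqrt{2}}\bigl(|00\cdots 0\rangle + |11\cdots 1\rangle\bigr),

i.e. a single excitation shared symmetrically among N atoms versus an equal superposition of two maximally distinct configurations; the dipole blockade, by forbidding more than one Rydberg excitation within the blockade volume, provides a natural route to the former.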
Max ERC Funding
1 449 600 €
Duration
Start date: 2009-12-01, End date: 2014-11-30
Project acronym ARFMEMBRANESENSORS
Project Membrane sensors in the Arf orbit
Researcher (PI) Bruno Antonny
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS3, ERC-2010-AdG_20100317
Summary Cellular organelles are continuously remodelled by numerous cytosolic proteins that associate transiently with their lipid membrane. Some distort the bilayer, others change its composition, extract lipids or bridge membranes at a distance. Previous work from my laboratory has underlined the importance of membrane sensors, i.e. elements within proteins that help to organize membrane-remodelling events by sensing the physical and chemical state of the underlying membrane. A membrane sensor is not necessarily a well-folded domain that interacts with a specific lipid polar head: some intrinsically unfolded motifs harboring deceptively simple sequences can display remarkable membrane adhesive properties. Among these are some amphipathic helices: the ALPS motif with a polar face made mostly of small uncharged polar residues, the Spo20 helix with several histidines in its polar face and, like a mirror image of the ALPS motif, the alpha-synuclein helix with very small hydrophobic residues. Using biochemistry and molecular dynamics, we will compare the membrane binding properties of these sequences (effect of curvature, charge, lipid unsaturation); using bioinformatics we will look for new motifs; using cell biology we will assess the adaptation of these motifs to the physical and chemical features of organelle membranes. Concurrently, we will use reconstitution approaches on artificial membranes to dissect how membrane sensors contribute to the organization of vesicle tethering by golgins and sterol transport by ORP proteins. We surmise that the combination of a molecular 'switch', a small G protein of the Arf family, and membrane sensors makes it possible to organize these complex reactions in time and in space.
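One standard quantity used to characterize such amphipathic helices (the Eisenberg hydrophobic moment, given here as general background rather than as part of the proposal) is

\mu_{H} = \Bigl|\sum_{n=1}^{N} H_{n}\, e^{\,i\,n\,\delta}\Bigr|, \qquad \delta \approx 100^{\circ}\ \text{per residue},

where H_n is the hydrophobicity of residue n; a large hydrophobic moment combined with an unusual polar-face composition (small uncharged residues for ALPS, histidines for Spo20) is what gives these motifs their distinct membrane-sensing behaviour.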
Max ERC Funding
1 997 321 €
Duration
Start date: 2011-05-01, End date: 2015-04-30
Project acronym ARPEMA
Project Anionic redox processes: A transformational approach for advanced energy materials
Researcher (PI) Jean-Marie Tarascon
Host Institution (HI) COLLEGE DE FRANCE
Call Details Advanced Grant (AdG), PE5, ERC-2014-ADG
Summary Redox chemistry provides the fundamental basis for numerous energy-related electrochemical devices, among which Li-ion batteries (LIB) have become the premier energy storage technology for portable electronics and vehicle electrification. Throughout its history, LIB technology has relied on cationic redox reactions as the sole source of energy storage capacity. This is no longer true. In 2013 we demonstrated that Li-driven reversible formation of (O2)n peroxo-groups in new layered oxides led to extraordinary increases in energy storage capacity. This finding, which is receiving worldwide attention, represents a transformational approach for creating advanced energy materials not only for energy storage but also for water splitting applications, as both involve peroxo species. However, as is often the case with new discoveries, the fundamental science at work needs to be rationalized and understood. Specifically, what are the mechanisms for ion and electron transport in these Li-driven anionic redox reactions?
To address these seminal questions and to widen the spectrum of materials (transition metal and anion) showing anionic redox chemistry, we propose a comprehensive research program that combines experimental and computational methods. The experimental methods include structural and electrochemical analyses (both ex-situ and in-situ), and computational modeling will be based on first-principles DFT for identifying the fundamental processes that enable anionic redox activity. The knowledge gained from these studies, in combination with our expertise in inorganic synthesis, will enable us to design a new generation of Li-ion battery materials that exhibit substantial increases (20-30%) in energy storage capacity, with additional impacts on the development of Na-ion batteries and the design of water splitting catalysts, with the potential to surpass current water splitting efficiencies via novel (O2)n-based electrocatalysts.
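Schematically (a textbook-style half-reaction given for orientation, not the project's mechanistic claim), the extra capacity of anionic redox comes from reversibly oxidizing the oxygen sublattice itself,

2\,\mathrm{O}^{2-} \;\rightleftharpoons\; (\mathrm{O}_{2})^{2-} + 2\,e^{-},

in addition to the conventional transition-metal (cationic) couples, which is why reversible peroxo-group formation translates directly into higher storage capacity.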
Max ERC Funding
2 249 196 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym ARTHUS
Project Advances in Research on Theories of the Dark Universe - Inhomogeneity Effects in Relativistic Cosmology
Researcher (PI) Thomas BUCHERT
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Advanced Grant (AdG), PE9, ERC-2016-ADG
Summary The project ARTHUS aims at determining the physical origin of Dark Energy: in addition to the energy sources of the standard model of cosmology, effective terms arise through spatially averaging inhomogeneous cosmological models in General Relativity. It has been demonstrated that these additional terms can play the role of Dark Energy on large scales (but they can also mimic Dark Matter on scales of mass accumulations). The underlying rationale is that fluctuations in the Universe generically couple to spatially averaged intrinsic properties of space, such as its averaged scalar curvature, thus changing the global evolution of the effective (spatially averaged) cosmological model. At present, we understand these so-called backreaction effects only qualitatively. The project ARTHUS is directed towards a conclusive quantitative evaluation of these effects by developing generic and non-perturbative relativistic models of structure formation, by statistically measuring the key variables of the models in observations and in simulation data, and by reinterpreting observational results in light of the new models. It is to be emphasized that there is no doubt about the existence of backreaction effects; the question is whether they are even capable of getting rid of the dark sources (as some models discussed in the literature suggest), or whether their impact is substantially smaller. The project thus addresses an essential issue of current cosmological research: to find pertinent answers concerning the quantitative impact of inhomogeneity effects, a necessary and widely recognized step toward high-precision cosmology. If the project objectives are attained, the results will have a far-reaching impact on theoretical and observational cosmology, on the interpretation of astronomical experiments such as Planck and Euclid, as well as on a wide spectrum of particle physics theories and experiments.
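For concreteness (quoting the standard form from the spatial-averaging literature for irrotational dust, not a new result of this project), the averaged expansion obeys

3\,\frac{\ddot{a}_{\mathcal{D}}}{a_{\mathcal{D}}} = -4\pi G\,\langle \varrho \rangle_{\mathcal{D}} + \mathcal{Q}_{\mathcal{D}} + \Lambda,
\qquad
\mathcal{Q}_{\mathcal{D}} = \frac{2}{3}\Bigl(\langle \theta^{2} \rangle_{\mathcal{D}} - \langle \theta \rangle_{\mathcal{D}}^{2}\Bigr) - 2\,\langle \sigma^{2} \rangle_{\mathcal{D}},

where θ is the local expansion rate and σ² the shear scalar; the kinematical backreaction term Q_D is the "effective term" whose sign and magnitude determine whether averaged inhomogeneities can mimic Dark Energy.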
Max ERC Funding
2 091 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ARTISTIC
Project Advanced and Reusable Theory for the In Silico-optimization of composite electrode fabrication processes for rechargeable battery Technologies with Innovative Chemistries
Researcher (PI) Alejandro Antonio FRANCO
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2017-COG
Summary The aim of this project is to develop and demonstrate a novel theoretical framework devoted to rationalizing the formulation of composite electrodes containing next-generation material chemistries for high-energy-density secondary batteries. The framework will be established through the combination of discrete-particle and continuum mathematical models within a multiscale computational workflow integrating the individual models and mimicking the different steps of the electrode fabrication process, including slurry preparation, drying and calendering. Strongly complemented by dedicated experimental characterization devoted to its validation, the framework will provide insights into the impact of material properties and fabrication process parameters on the electrode mesostructure and its correlation with the resulting electrochemical performance. It targets the self-organization mechanisms of material mixtures in slurries by considering the interactions between the active and conductive materials, solvent, binders and dispersants, and their relationship to material properties such as surface chemistry and wettability. Optimal electrode formulations, fabrication processes and the resulting electrode mesostructures can then be identified. Additionally, the framework will be integrated into an online, open-access infrastructure, allowing predictive direct and reverse engineering of optimized electrode designs to attain high-quality electrochemical performance. Through the demonstration of a multidisciplinary, flexible and transferable framework, this project has tremendous potential to provide insights leading to new and highly efficient industrial techniques for the fabrication of cheaper and more reliable next-generation secondary battery electrodes for a wide spectrum of applications, including electric transportation.
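Purely as an illustration of what such a chained, multiscale workflow can look like in code (the stage names, parameters and scalar toy models below are hypothetical placeholders, not the project's actual discrete-particle or continuum models), each fabrication step consumes the output of the previous one and the final mesostructure feeds a performance estimate.

```python
# Hypothetical sketch of a fabrication-process workflow: slurry preparation -> drying
# -> calendering -> electrochemical figure of merit. All models are toy placeholders.
from dataclasses import dataclass

@dataclass
class Mesostructure:
    porosity: float            # void volume fraction (illustrative)
    conductive_contact: float  # illustrative particle-contact metric

def simulate_slurry(active_fraction: float, solvent_fraction: float) -> Mesostructure:
    # Placeholder for a discrete-particle slurry model.
    return Mesostructure(porosity=0.6 * solvent_fraction, conductive_contact=active_fraction)

def simulate_drying(m: Mesostructure, drying_rate: float) -> Mesostructure:
    # Placeholder: drying reduces porosity in a rate-dependent way.
    return Mesostructure(m.porosity * (1.0 - 0.3 * drying_rate), m.conductive_contact)

def simulate_calendering(m: Mesostructure, pressure: float) -> Mesostructure:
    # Placeholder: calendering compresses the electrode and improves particle contacts.
    return Mesostructure(m.porosity * (1.0 - 0.2 * pressure),
                         m.conductive_contact * (1.0 + 0.1 * pressure))

def predicted_performance(m: Mesostructure) -> float:
    # Placeholder continuum-level figure of merit derived from the mesostructure.
    return m.conductive_contact * (1.0 - abs(m.porosity - 0.35))

electrode = simulate_calendering(
    simulate_drying(simulate_slurry(active_fraction=0.9, solvent_fraction=0.5), drying_rate=0.8),
    pressure=0.5,
)
print(predicted_performance(electrode))
```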
Max ERC Funding
1 976 445 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym aSCEND
Project Secure Computation on Encrypted Data
Researcher (PI) Hoe Teck Wee
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Recent trends in computing have prompted users and organizations to store an increasingly large amount of sensitive data at third-party locations in the cloud, outside of their direct control. Storing data remotely poses an acute security threat, as these data could potentially be accessed by untrusted parties. Indeed, the reality of these threats has been borne out by the Snowden leaks and by hundreds of data breaches each year. In order to protect our data, we need to encrypt it.
Functional encryption is a novel paradigm for public-key encryption that enables both fine-grained access control and selective computation on encrypted data, as is necessary to protect big, complex data in the cloud. Functional encryption also enables searches on encrypted travel records and surveillance video, as well as medical studies on encrypted medical records, in a privacy-preserving manner: we can give out restricted secret keys that reveal only the outcome of specific searches and tests. These mechanisms allow us to maintain public safety without compromising civil liberties, and to facilitate medical breakthroughs without compromising individual privacy.
The goals of the aSCEND project are (i) to design pairing-based and lattice-based functional encryption schemes that are more efficient and ultimately viable in practice; and (ii) to obtain a richer understanding of expressive functional encryption schemes and to push the boundaries from encrypting data to encrypting software. My long-term vision is the ubiquitous use of functional encryption to secure our data and our computation, just as public-key encryption is widely used today to secure our communication. Realizing this vision requires new advances in the foundations of functional encryption, which is the target of this project.
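To illustrate the interface of functional encryption described above (and only the interface: the toy code below provides no security whatsoever, and its class and function names are invented for illustration), a functional secret key sk_f bound to a function f decrypts an encryption of x to f(x) and nothing else.

```python
# Toy illustration of the functional-encryption interface: Setup, KeyGen, Enc, Dec,
# with Dec(sk_f, Enc(x)) = f(x). This mock keeps the plaintext in the clear and is
# NOT secure; it only shows what the API of a real scheme provides.
from typing import Any, Callable

class ToyFE:
    def setup(self):
        mpk, msk = object(), object()      # placeholders for real public/master keys
        self._msk = msk
        return mpk, msk

    def keygen(self, msk, f: Callable[[Any], Any]):
        assert msk is self._msk            # only the master-key holder issues keys
        return ("sk", f)                   # a "functional" secret key bound to f

    def encrypt(self, mpk, x):
        return ("ct", x)                   # a real scheme would hide x here

    def decrypt(self, sk_f, ct):
        _, f = sk_f
        _, x = ct
        return f(x)                        # reveals only f(x), not x itself

fe = ToyFE()
mpk, msk = fe.setup()
sk_count = fe.keygen(msk, lambda records: records.count("flagged"))  # restricted key: a count only
ct = fe.encrypt(mpk, ["ok", "flagged", "ok"])
print(fe.decrypt(sk_count, ct))  # -> 1
```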
Max ERC Funding
1 253 893 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym ATMO
Project Atmospheres across the Universe
Researcher (PI) Pascal TREMBLIN
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE9, ERC-2017-STG
Summary Which molecules are present in the atmosphere of exoplanets? What are their mass, radius and age? Do they have clouds, convection (atmospheric turbulence), fingering convection, or a circulation induced by irradiation? These questions are fundamental in exoplanetology in order to study issues such as planet formation and exoplanet habitability.
Yet, the impact of fingering convection and of the circulation induced by irradiation remains poorly understood:
- Fingering convection (triggered by gradients of mean molecular weight) has already been suggested to occur in stars (accumulation of heavy elements) and in brown dwarfs and exoplanets (chemical transitions, e.g. CO/CH4). Large-scale, efficient turbulent transport of energy through the fingering instability can reduce the temperature gradient in the atmosphere and explain many observed spectral properties of brown dwarfs and exoplanets. Nonetheless, this large-scale efficiency is not yet characterized, and standard approximations (Boussinesq) cannot be used to achieve this goal.
- The interaction between atmospheric circulation and the fingering instability is an open question in the case of irradiated exoplanets. Fingering convection can change the location and magnitude of the hot spot induced by irradiation, whereas the hot deep atmosphere induced by irradiation can change the location of the chemical transitions that trigger the fingering instability.
This project will characterize the impact of fingering convection in the atmosphere of stars, brown dwarfs, and exoplanets and its interaction with the circulation in the case of irradiated planets. By developing innovative numerical models, we will characterize the reduction of the temperature gradient of the atmosphere induced by the instability and study the impact of the circulation. We will then predict and interpret the mass, radius, and chemical composition of exoplanets that will be observed with future missions such as the James Webb Space Telescope (JWST).
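For orientation, one common way of writing the local criterion for the fingering instability in the double-diffusive convection literature is sketched below; conventions and notation vary between the oceanographic and astrophysical communities, and this is background rather than a statement of the project's own models.

```latex
% Fingering (thermohaline) instability: the stratification is thermally stable but
% destabilized by the mean-molecular-weight gradient; tau is the ratio of
% compositional to thermal diffusivities.
\begin{equation}
1 \;\lesssim\; R_0 \equiv \frac{\nabla - \nabla_{\mathrm{ad}}}{\nabla_{\!\mu}} \;\lesssim\; \tau^{-1},
\qquad \tau \equiv \frac{\kappa_\mu}{\kappa_T}.
\end{equation}
```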
Max ERC Funding
1 500 000 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym ATMOFLEX
Project Turbulent Transport in the Atmosphere: Fluctuations and Extreme Events
Researcher (PI) Jérémie Bec
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE3, ERC-2009-StG
Summary A major part of the physical and chemical processes occurring in the atmosphere involves the turbulent transport of tiny particles. Current studies and models use a formulation in terms of mean fields, where the strong variations in the dynamical and statistical properties of the particles are neglected and where the underlying fluctuations of the fluid flow velocity are oversimplified. Devising an accurate understanding of the influence of air turbulence and of the extreme fluctuations that it generates in the dispersed phase remains a challenging issue. This project aims at coordinating and integrating theoretical, numerical, experimental, and observational efforts to develop a new statistical understanding of the role of fluctuations in atmospheric transport processes. The proposed work will cover individual as well as collective behaviors and will provide a systematic and unified description of targeted processes involving suspended drops or particles: the dispersion of pollutants from a source, the growth by condensation and coagulation of droplets and ice crystals in clouds, the scavenging, settling and re-suspension of aerosols, and the radiative and climatic effects of particles. The proposed approach is based on tools borrowed from statistical physics and field theory, and from the theory of large deviations and of random dynamical systems, in order to design new observables that will be simultaneously tractable analytically in simplified models and relevant to the quantitative handling of such physical mechanisms. One of the outcomes will be a new framework for improving and refining the methods used in meteorology and atmospheric sciences and for answering the long-standing question of the effect of suspended particles on climate.
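As background, the minimal point-particle model that commonly underlies studies of small heavy particles in turbulent flows is a simplified form of the Maxey-Riley equations; the project itself may of course rely on richer descriptions.

```latex
% Small heavy particle advected by a turbulent velocity field u(x,t): Stokes drag with
% response time tau_p plus gravity. The Stokes number St = tau_p / tau_eta (tau_eta:
% Kolmogorov time) measures the importance of particle inertia.
\begin{equation}
\frac{\mathrm{d}\mathbf{x}}{\mathrm{d}t} = \mathbf{v}, \qquad
\frac{\mathrm{d}\mathbf{v}}{\mathrm{d}t} = \frac{\mathbf{u}(\mathbf{x},t) - \mathbf{v}}{\tau_p} + \mathbf{g},
\qquad \mathrm{St} = \frac{\tau_p}{\tau_\eta}.
\end{equation}
```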
Max ERC Funding
1 200 000 €
Duration
Start date: 2009-11-01, End date: 2014-10-31
Project acronym ATOMAG
Project From Attosecond Magnetism towards Ultrafast Spin Photonics
Researcher (PI) Jean-Yves Bigot
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE3, ERC-2009-AdG
Summary We propose to investigate a new frontier in physics: the study of magnetic systems using attosecond laser pulses. The main disciplines concerned are ultrafast laser science, magnetism and spin photonics, and relativistic quantum electrodynamics. Three issues of modern magnetism are addressed. 1. How fast can one modify and control the magnetization of a magnetic system? 2. What are the role and essence of the coherent interaction between light and spins? 3. How far can spin photonics take us toward real-world data acquisition and storage? - First, we want to provide solid, well-grounded experiments unravelling the mechanisms involved in the demagnetization induced by laser pulses in a variety of magnetic materials (ferromagnetic nanostructures, aggregates and molecular magnets). We will explore the ultrafast magnetization dynamics of magnets using an attosecond laser source. - Second, we want to explore how the photon field interacts with the spins. We will investigate the dynamical regime in which the potential of the atoms is dressed by the Coulomb potential induced by the laser field. Strong support from relativistic quantum electrodynamics is necessary to reach that goal. - Third, even though our general approach is fundamental, we want to provide a benchmark of what is realistically possible in ultrafast spin photonics, challenging the conventional view that spin photonics is hard to implement at the application level. We will realize ultimate devices combining magneto-optical microscopy with conventional magnetic recording. This new field will raise the interest of a number of competitive laboratories at the international level. Owing to its overlapping disciplines, the project also carries substantial educational impact, both fundamental and applied.
Max ERC Funding
2 492 561 €
Duration
Start date: 2010-05-01, End date: 2015-04-30
Project acronym Atto-Zepto
Project Ultrasensitive Nano-Optomechanical Sensors
Researcher (PI) Olivier ARCIZET
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE2, ERC-2018-COG
Summary By enabling the conversion of forces into measurable displacements, mechanical oscillators have always played a central role in experimental physics. Recent developments in the PI's group demonstrated the possibility of ultrasensitive, vectorial force-field sensing using suspended SiC nanowires and optical readout of their transverse vibrations. Astonishing sensitivities, at the attonewton to zeptonewton level, were obtained at room and dilution temperatures; at this level the electron-electron interaction becomes detectable at a distance of 100 µm.
The goal of the project is to push forward those ultrasensitive nano-optomechanical force sensors, to realize even more challenging explorations of novel fundamental interactions at the quantum-classical interface.
We will develop universal advanced sensing protocols to explore the vectorial structure of fundamental optical, electrostatic or magnetic interactions, and investigate Casimir force fields above nanostructured surfaces, in geometries where the Casimir force was recently predicted to become repulsive. The second research axis is cavity nano-optomechanics: inserting the ultrasensitive nanowire in a high-finesse optical microcavity should enhance the light-nanowire interaction up to the point where a single cavity photon can displace the nanowire by more than its zero-point quantum fluctuations. We will investigate this so-called ultrastrong optomechanical coupling regime and further explore novel regimes in cavity optomechanics where optical non-linearities at the single-photon level become accessible. The last part is dedicated to the exploration of hybrid qubit-mechanical systems, in which nanowire vibrations are magnetically coupled to the spin of a single nitrogen-vacancy defect in diamond. We will focus on the exploration of spin-dependent forces, aiming at mechanically detecting qubit excitations, opening a novel route towards the generation of non-classical states of motion and mechanically enhanced quantum sensors.
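To relate the quoted sensitivity scale to concrete numbers, here is a back-of-the-envelope sketch of the thermally limited force noise of a damped mechanical oscillator, sqrt(4·kB·T·m_eff·ω0/Q); the nanowire parameters used below are illustrative guesses, not the PI's measured values.

```python
# Thermally limited force noise of a damped mechanical oscillator:
# sqrt(S_F) = sqrt(4 * kB * T * m_eff * omega0 / Q)  [N / sqrt(Hz)].
# All parameters below are rough, illustrative values for a long, thin nanowire.
import math

kB = 1.380649e-23  # Boltzmann constant, J/K

def force_noise(T_kelvin: float, m_eff_kg: float, f0_hz: float, Q: float) -> float:
    omega0 = 2.0 * math.pi * f0_hz
    return math.sqrt(4.0 * kB * T_kelvin * m_eff_kg * omega0 / Q)

# Room temperature, ~picogram effective mass, ~10 kHz mode, Q ~ 3000 (all illustrative):
print(f"{force_noise(300.0, 1e-15, 1e4, 3e3) * 1e18:.0f} aN/sqrt(Hz)")
# Dilution temperature (~30 mK) with a higher Q (also illustrative): zeptonewton range.
print(f"{force_noise(0.03, 1e-15, 1e4, 1e5) * 1e21:.0f} zN/sqrt(Hz)")
```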
Max ERC Funding
2 067 905 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym AUGURY
Project Reconstructing Earth’s mantle convection
Researcher (PI) Nicolas Coltice
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Consolidator Grant (CoG), PE10, ERC-2013-CoG
Summary Knowledge of the state of the Earth's mantle and its temporal evolution is fundamental to a variety of disciplines in Earth Sciences, from internal dynamics to its many expressions in the geological record (postglacial rebound, sea level change, ore deposits, tectonics or geomagnetic reversals). Mantle convection theory is the centerpiece for unravelling the present and past state of the mantle. For the past 40 years considerable efforts have been made to improve the quality of numerical models of mantle convection. However, they are still sparsely used to estimate the convective history of the solid Earth, in comparison to ocean or atmospheric models for weather and climate prediction. The main shortcoming is their inability to produce Earth-like seafloor spreading and continental drift self-consistently. Recent convection models have begun to successfully predict these processes (Coltice et al., Science 336, 335-33, 2012). Such a breakthrough opens the opportunity to combine high-level data assimilation methodologies and convection models, together with advanced tectonic datasets, to retrieve Earth's mantle history. The scope of this project is to produce a new generation of tectonic and convection reconstructions, which are key to improving our understanding and knowledge of the evolution of the solid Earth. The development of sustainable, high-performance numerical models will set new standards for geodynamic data assimilation. The outcome of the AUGURY project will be a new generation of models crucial to a wide variety of disciplines.
Max ERC Funding
1 994 000 €
Duration
Start date: 2014-03-01, End date: 2020-02-29
Project acronym BACEMO
Project Bacterial Cell Morphogenesis
Researcher (PI) Rut Carballido Lopez
Host Institution (HI) INSTITUT NATIONAL DE LA RECHERCHE AGRONOMIQUE
Call Details Starting Grant (StG), LS3, ERC-2012-StG_20111109
Summary In bacteria, the tough external cell wall and the intracellular actin-like (MreB) cytoskeleton are major determinants of cell shape. The biosynthetic pathways and chemical composition of the cell wall, a three-dimensional polymer network that is one of the most prominent targets for antibiotics, are well understood. However, despite decades of study, little is known about the complex cell wall ultrastructure and the molecular mechanisms that control cell wall morphogenesis in time and space. In rod-shaped bacteria, MreB homologues assemble into dynamic structures thought to control shape by serving as organizers for the movement and assembly of macromolecular machineries that effect sidewall elongation. However, the mechanistic details used by the MreB cytoskeleton to fulfil this role remain to be elucidated. Furthermore, the development of high-resolution microscopy techniques has led to new breakthroughs this year, published by our lab and others, which are shaking the model developed over the last decade and calling the MreB “actin cytoskeleton” designation into question.
The aim of this project is to combine powerful genetic, biochemical, genomic and systems biology approaches available in the model bacterium Bacillus subtilis with modern high-resolution light microscopy techniques to study the dynamics and mechanistic details of the MreB cytoskeleton and of cell wall assembly. Parameters measured by the different approaches will be combined to quantitatively describe the features of bacterial cell morphogenesis.
Max ERC Funding
1 650 050 €
Duration
Start date: 2013-02-01, End date: 2019-01-31