Project acronym 2D-4-CO2
Project DESIGNING 2D NANOSHEETS FOR CO2 REDUCTION AND INTEGRATION INTO vdW HETEROSTRUCTURES FOR ARTIFICIAL PHOTOSYNTHESIS
Researcher (PI) Damien VOIRY
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2018-STG
Summary CO2 reduction reaction (CO2RR) holds great promise for the conversion of the greenhouse gas carbon dioxide into chemical fuels. The absence of catalytic materials demonstrating both high performance and high selectivity currently hampers practical demonstration. CO2RR is also limited by the low solubility of CO2 in the electrolyte solution; electrocatalytic reactions in the gas phase using gas diffusion electrodes are therefore preferable. 2D materials have recently emerged as a novel class of electrocatalytic materials thanks to their rich structures and electronic properties. The synthesis of novel 2D catalysts and their implementation into photocatalytic systems would be a major step towards the development of devices for storing solar energy in the form of chemical fuels. With 2D-4-CO2, I propose to: 1) develop a novel class of CO2RR catalysts based on conducting 2D nanosheets and 2) demonstrate photocatalytic conversion of CO2 into chemical fuels using structure-engineered gas diffusion electrodes made of 2D conducting catalysts. To reach this goal, the first objective of 2D-4-CO2 is to provide guidelines for the development of novel cutting-edge 2D catalysts for CO2 conversion into chemical fuels. This will be possible by using a multidisciplinary approach based on 2D materials engineering, advanced characterization methods and novel designs of gas diffusion electrodes for the reduction of CO2 in the gas phase. The second objective is to develop practical photocatalytic systems using van der Waals (vdW) heterostructures for the efficient conversion of CO2 into chemical fuels. vdW heterostructures will consist of rational designs of 2D materials and 2D-like materials deposited by atomic layer deposition in order to achieve highly efficient light conversion and prolonged stability. This project will not only enable a deeper understanding of the CO2RR but will also provide practical strategies for the large-scale application of CO2RR for solar fuel production.
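For reference, the selectivity and performance targeted here are usually quantified through the faradaic efficiency of each CO2RR product; the definition below and the two-electron CO pathway are standard electrochemistry quoted for context, not notation or results taken from this project.

\mathrm{FE}_P = \frac{z_P\, n_P\, F}{Q_{\mathrm{tot}}}, \qquad \text{e.g.}\quad \mathrm{CO_2 + 2\,H^+ + 2\,e^- \rightarrow CO + H_2O} \quad (z_P = 2)

Here n_P is the amount (mol) of product P, z_P the number of electrons transferred per molecule, F the Faraday constant and Q_tot the total charge passed; a gas diffusion electrode helps by lifting the CO2-solubility limit on the current that can be sustained at a given FE_P.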
Max ERC Funding
1 499 931 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym 2F4BIODYN
Project Two-Field Nuclear Magnetic Resonance Spectroscopy for the Exploration of Biomolecular Dynamics
Researcher (PI) Fabien Ferrage
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2011-StG_20101014
Summary The paradigm of the structure-function relationship in proteins is outdated. Biological macromolecules and supramolecular assemblies are highly dynamic objects, and evidence that their motions are of utmost importance to their functions is regularly uncovered. The understanding of the physical chemistry of biological processes at an atomic level therefore has to rely not only on the description of structure but also on the characterization of molecular motions.
The investigation of protein motions will be undertaken with an innovative methodological approach to nuclear magnetic resonance relaxation. To widen the range of frequencies at which local motions in proteins are probed, we will first use and develop new techniques for a prototype shuttle system for the measurement of relaxation at low fields on a high-field NMR spectrometer. Second, we will develop a novel system: a set of low-field NMR spectrometers designed as accessories for high-field spectrometers. Used in conjunction with the shuttle, this system will offer (i) the sensitivity and resolution (i.e., atomic-level information) of a high-field spectrometer, (ii) access to the low fields of a relaxometer, and (iii) the ability to measure a wide variety of relaxation rates with high accuracy. This system will benefit from the latest technology in homogeneous permanent-magnet development to allow control of spin systems identical to that of a high-resolution probe. This new apparatus will open the way to the use of NMR relaxation at low fields for the refinement of protein motions at an atomic scale.
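To make explicit why measuring relaxation at several magnetic fields widens the window on local motions, recall the standard Redfield expression for the longitudinal relaxation rate of a backbone 15N-1H pair (a textbook formula quoted as background, not a formulation specific to this project):

R_1 = \frac{d^2}{4}\Big[ J(\omega_H - \omega_N) + 3J(\omega_N) + 6J(\omega_H + \omega_N) \Big] + c^2 J(\omega_N), \qquad d = \frac{\mu_0 \hbar \gamma_H \gamma_N}{4\pi r_{NH}^3}, \quad c = \frac{\Delta\sigma\, \omega_N}{\sqrt{3}}.

Because the sampled frequencies ω_N and ω_H are proportional to the magnetic field, every additional field at which R_1 (and related rates) is measured adds new sampling points on the spectral density function J(ω); this is precisely what the shuttle and the low-field accessory spectrometers are designed to exploit.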
Applications of this novel approach will focus on the bright side of protein dynamics: (i) the largely unexplored dynamics of intrinsically disordered proteins, and (ii) domain motions in large proteins. In both cases, we will investigate a series of diverse protein systems with implications in development, cancer and immunity.
Max ERC Funding
1 462 080 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym 2G-CSAFE
Project Combustion of Sustainable Alternative Fuels for Engines used in aeronautics and automotives
Researcher (PI) Philippe Dagaut
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE8, ERC-2011-ADG_20110209
Summary This project aims to promote sustainable combustion technologies for transport via the validation of advanced combustion kinetic models obtained using sophisticated new laboratory experiments, engines, and theoretical computations, breaking through the current frontier of knowledge. It will focus on the unexplored kinetics of ignition and combustion of 2nd-generation (2G) biofuels and their blends with conventional fuels, which should provide energy security and sustainability to Europe. The motivation is that no accurate kinetic models are available for the ignition, oxidation and combustion of 2G biofuels, and improved ignition control is needed for new compression-ignition engines. Crucial information is missing: data from well-characterised experiments on combustion-generated pollutants and data on key intermediates for fuel ignition in new engines.
To provide that knowledge, new well-instrumented complementary experiments and kinetic modelling will be used. Measurements of key intermediates, stable species, and pollutants will be performed. New ignition control strategies will be designed, opening new technological horizons. Kinetic modelling will be used to rationalise the results. Due to the complexity of 2G biofuels and their unusual composition, innovative surrogates will be designed. Kinetic models for surrogate fuels will be generalised for extension to other compounds. The experimental results, together with ab initio and detailed modelling, will serve to characterise the kinetics of ignition, combustion, and pollutant formation of fuels including 2G biofuels, and provide relevant data and models.
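As background on what such kinetic models contain (standard combustion-kinetics forms quoted for context, not parameters from this proposal), each elementary reaction is assigned a modified Arrhenius rate coefficient, and the assembled mechanisms are typically validated against measured ignition delay times, often summarised by empirical correlations:

k(T) = A\, T^{n} \exp\!\left(-\frac{E_a}{R T}\right), \qquad \tau_{\mathrm{ign}} \approx A'\, \exp\!\left(\frac{E'_a}{R T}\right) [\mathrm{fuel}]^{a}\, [\mathrm{O_2}]^{b}.

The pre-exponential factors, temperature exponents and activation energies are fitted per reaction, and the correlation exponents are fuel-dependent; these are exactly the quantities that the new measurements of key intermediates, stable species and pollutants are meant to constrain.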
This research is risky because it is (i) the first effort to measure radicals by reactor/CRDS coupling, (ii) the first effort to use a μ-channel reactor to build ignition databases for conventional and bio-fuels, and (iii) the first effort to design and use controlled generation and injection of reactive species to control ignition/combustion in compression-ignition engines.
Max ERC Funding
2 498 450 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym 3D-BioMat
Project Deciphering biomineralization mechanisms through 3D explorations of mesoscale crystalline structure in calcareous biomaterials
Researcher (PI) VIRGINIE CHAMARD
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2016-COG
Summary The fundamental 3D-BioMat project aims to provide a biomineralization model explaining the formation of the microscopic calcareous single crystals produced by living organisms. Although these crystals present a wide variety of shapes and are associated with various organic materials, the observation of a nanoscale granular structure common to almost all calcareous crystallizing organisms, together with an extended crystalline coherence, points to a generic biomineralization and assembly process. A key to building realistic scenarios of biomineralization is to reveal the crystalline architecture at the mesoscale (i.e., over a few granules), which none of the existing nano-characterization tools is able to provide.
3D-BioMat builds on the PI’s recognized expertise in the field of synchrotron coherent x-ray diffraction microscopy. It will extend the PI’s disruptive, pioneering microscopy formalism into an innovative high-throughput approach giving access to the 3D mesoscale image of the crystalline properties (crystalline coherence, crystal-plane tilts and strains) with the required flexibility, nanoscale resolution, and non-invasiveness.
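For context, in Bragg coherent x-ray diffraction microscopy (the family of techniques referred to here), the intensity measured around a Bragg reflection Q is, in the usual kinematic approximation, the squared modulus of the Fourier transform of an effective complex density whose phase encodes the lattice displacement field; this is a generic, simplified description rather than the project's specific formalism:

I(\mathbf{q}) \propto \left| \int \rho(\mathbf{r})\, e^{\,i\,\mathbf{Q}\cdot\mathbf{u}(\mathbf{r})}\, e^{-i\,\mathbf{q}\cdot\mathbf{r}}\, \mathrm{d}^3 r \right|^2, \qquad \phi(\mathbf{r}) = \mathbf{Q}\cdot\mathbf{u}(\mathbf{r}).

Phase retrieval on such patterns therefore yields three-dimensional maps of crystal-plane tilts and strains, i.e. the mesoscale crystalline properties targeted above.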
This achievement will be used to reveal, in a timely manner, the generic features of the mesoscale crystalline structure through pioneering explorations of a vast variety of crystalline biominerals produced by the famous Pinctada margaritifera oyster shell, and thereby build a realistic biomineralization scenario.
The inferred biomineralization pathways, including both physico-chemical pathways and biological controls, will ultimately be validated by comparing the mesoscale structures produced by biomimetic samples with the biogenic ones. Beyond deciphering one of the most intriguing questions of material nanosciences, 3D-BioMat may contribute to new climate models, pave the way for new routes in material synthesis and supply answers to the pearl-culture calcification problems.
Max ERC Funding
1 966 429 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym 3D-CAP
Project 3D micro-supercapacitors for embedded electronics
Researcher (PI) David Sarinn PECH
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE7, ERC-2017-COG
Summary The realization of high-performance micro-supercapacitors is currently a major challenge, but applications requiring such miniaturized energy-storage devices are inexorably emerging, from wearable electronic gadgets to wireless sensor networks. Although they store less energy than micro-batteries, micro-supercapacitors can be charged and discharged very rapidly and exhibit a quasi-unlimited lifetime. Research worldwide is consequently focused largely on improving their capacitance and energy performance. However, to date, they are still far from being able to power sensors or electronic components.
Here I propose a 3D paradigm shift in micro-supercapacitor design to ensure increased energy-storage capacities. Hydrous ruthenium dioxide (RuO2) is a pseudocapacitive supercapacitor electrode material well known for its high capacitance. A thin film of ruthenium will be deposited by atomic layer deposition (ALD), followed by an electrochemical oxidation process, onto a high-surface-area 3D current collector prepared via an ingenious dynamic template built with hydrogen bubbles. The structural features of these 3D architectures will be controllably tailored by the processing methodologies. These electrodes will be combined with an innovative solid-state electrolyte (a protic ionogel) able to operate over an extended cell voltage window. In a parallel investigation, we will develop a fundamental understanding of the electrochemical reactions occurring at the nanoscale with a FIB-patterned (Focused Ion Beam) RuO2 nano-supercapacitor. The resulting 3D micro-supercapacitors should display extremely high power, long lifetime and – for the first time – energy densities competing with or even exceeding those of micro-batteries. As a key achievement, prototypes will be designed using a new concept based on a self-adaptive micro-supercapacitor matrix, which arranges itself according to the global amount of energy stored.
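For orientation, the figures of merit at stake follow from the standard supercapacitor relations (the numerical values below are purely illustrative assumptions, not targets taken from the proposal):

E = \tfrac{1}{2}\, C V^2, \qquad P_{\max} = \frac{V^2}{4\, R_{\mathrm{ESR}}}.

For example, an areal capacitance of 100 mF cm−2 over a 1 V window would store E = 50 mJ cm−2 ≈ 14 µWh cm−2, which illustrates why both the capacitance gain from the 3D architecture and the extended voltage window of the protic ionogel matter for approaching micro-battery energy densities.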
Max ERC Funding
1 673 438 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym 3D-nanoMorph
Project Label-free 3D morphological nanoscopy for studying sub-cellular dynamics in live cancer cells with high spatio-temporal resolution
Researcher (PI) Krishna AGARWAL
Host Institution (HI) UNIVERSITETET I TROMSOE - NORGES ARKTISKE UNIVERSITET
Call Details Starting Grant (StG), PE7, ERC-2018-STG
Summary Label-free optical nanoscopy, free from photobleaching and photochemical toxicity of fluorescence labels and yielding 3D morphological resolution of <50 nm, is the future of live cell imaging. 3D-nanoMorph breaks the diffraction barrier and shifts the paradigm in label-free nanoscopy, providing isotropic 3D resolution of <50 nm. To achieve this, 3D-nanoMorph performs non-linear inverse scattering for the first time in nanoscopy and decodes scattering between sub-cellular structures (organelles).
3D-nanoMorph innovatively devises complementary roles for the light-measurement system and the computational nanoscopy algorithm. A novel illumination system and a novel light-collection system together enable measurement of only the most relevant intensity component and create a fresh perspective on label-free measurements. A new computational nanoscopy approach employs non-linear inverse scattering. Harnessing non-linear inverse scattering for resolution enhancement in nanoscopy opens new possibilities in label-free 3D nanoscopy.
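As a reminder of why label-free inverse scattering is inherently non-linear (a generic scalar-wave sketch, not the project's specific formulation), the total field obeys the Lippmann–Schwinger equation

E(\mathbf{r}) = E_{\mathrm{inc}}(\mathbf{r}) + \int G(\mathbf{r},\mathbf{r}')\, V(\mathbf{r}')\, E(\mathbf{r}')\, \mathrm{d}^3 r', \qquad V(\mathbf{r}) = k_0^2\,\big(n^2(\mathbf{r}) - 1\big).

Because the unknown field E appears inside the integral, the measured scattering depends non-linearly on the refractive-index contrast V; the linear (Born) approximation drops this dependence and, with it, the scattering between sub-cellular structures that 3D-nanoMorph aims to decode.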
I will apply 3D-nanoMorph to study organelle degradation (autophagy) in live cancer cells over extended durations with high spatial and temporal resolution, presently limited by the lack of high-resolution label-free 3D morphological nanoscopy. Successful 3D mapping of the nanoscale biological process of autophagy will open new avenues for cancer treatment and showcase 3D-nanoMorph for wider applications.
My 14 years of cross-disciplinary expertise spanning inverse problems, electromagnetism, optical microscopy, integrated optics and live-cell nanoscopy pave the way for the successful implementation of 3D-nanoMorph.
Max ERC Funding
1 499 999 €
Duration
Start date: 2019-07-01, End date: 2024-06-30
Project acronym 3DICE
Project 3D Interstellar Chemo-physical Evolution
Researcher (PI) Valentine Wakelam
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2013-StG
Summary At the end of their lives, stars spread their inner material into the diffuse interstellar medium. This diffuse medium gets locally denser and forms dark clouds (also called dense or molecular clouds) whose innermost parts are shielded from the external UV field by dust, allowing molecules to grow and become more complex. Gravitational collapse occurs inside these dense clouds, forming protostars and their surrounding disks, and eventually planetary systems like (or unlike) our solar system. The formation and evolution of molecules, minerals, ices and organics from the diffuse medium to planetary bodies, and their alteration or preservation throughout this cosmic chemical history, set the initial conditions for building planets, atmospheres and possibly the first bricks of life. The current view of interstellar chemistry is based on fragmentary studies of the key steps of the sequence that can be observed. The objective of this proposal is to follow the fractionation of the elements between the gas phase and the interstellar grains, from the most diffuse medium to protoplanetary disks, in order to constrain the chemical composition of the material in which planets are formed. The potential outcome of this project is a consistent and more accurate description of the chemical evolution of interstellar matter. To achieve this objective, I will improve our chemical model by adding new processes on grain surfaces relevant under diffuse-medium conditions. This upgraded gas-grain model will be coupled to 3D dynamical models of the formation of dense clouds from the diffuse medium and of protoplanetary disks from dense clouds. The computed chemical composition will also be used with 3D radiative transfer codes to study the chemical tracers of the physics of protoplanetary disk formation. The robustness of the model predictions will be studied with sensitivity analyses. Finally, model results will be confronted with observations to address some of the current challenges.
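For context, gas-grain astrochemical codes of the kind to be upgraded here typically integrate rate equations with the following generic structure (a standard sketch of the method, not the specific equations of this project):

\frac{\mathrm{d}n_i}{\mathrm{d}t} = \sum_{j,k} k_{jk}\, n_j n_k \;-\; n_i \sum_{l} k_{il}\, n_l \;-\; k_{\mathrm{acc},i}\, n_i \;+\; k_{\mathrm{des},i}\, n_i^{\mathrm{ice}},

where the first two sums describe gas-phase formation and destruction of species i, k_acc,i its accretion onto grain surfaces, and k_des,i collects thermal, cosmic-ray-induced and photo-desorption back into the gas, with analogous equations for the ice-phase abundances n_i^ice. Adding new grain-surface processes relevant to diffuse-medium conditions amounts to adding terms of this kind.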
Max ERC Funding
1 166 231 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym 4TH-NU-AVENUE
Project Search for a fourth neutrino with a PBq anti-neutrino source
Researcher (PI) Thierry Michel René Lasserre
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE2, ERC-2012-StG_20111012
Summary Several observed anomalies in neutrino oscillation data can be explained by a hypothetical fourth neutrino separated from the three standard neutrinos by a squared mass difference of a few eV². This hypothesis can be tested with a PBq (ten-kilocurie scale) 144Ce antineutrino beta-source deployed at the center of a large low-background liquid scintillator detector, such as Borexino, KamLAND, or SNO+. In particular, the compact size of such a source could yield an energy-dependent oscillating pattern in the spatial distribution of events that would unambiguously determine neutrino mass differences and mixing angles.
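For reference, the signature searched for follows from the standard two-flavour short-baseline survival probability (a textbook formula quoted for context, not a result of this proposal):

P_{\bar\nu_e \to \bar\nu_e}(L,E) \simeq 1 - \sin^2(2\theta_{\mathrm{new}})\, \sin^2\!\left( 1.27\, \frac{\Delta m^2_{\mathrm{new}}\,[\mathrm{eV}^2]\; L\,[\mathrm{m}]}{E\,[\mathrm{MeV}]} \right).

For Δm² of order 1 eV² and antineutrino energies of a few MeV, the oscillation length is only a few metres, which is why a compact 144Ce source could imprint a visible oscillation pattern directly in the spatial distribution of events inside a detector the size of Borexino, KamLAND or SNO+.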
The proposed program aims to perform the research and development necessary to produce and deploy an intense antineutrino source in a large liquid scintillator detector. Our program will address the definition of the production process of the neutrino source as well as its experimental characterization, the detailed physics simulation of both signal and backgrounds, the complete design and realization of the thick shielding, and the preparation of the interfaces with the antineutrino detector, including the safety and security aspects.
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-10-01, End date: 2018-09-30
Project acronym A-LIFE
Project The asymmetry of life: towards a unified view of the emergence of biological homochirality
Researcher (PI) Cornelia MEINERT
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2018-STG
Summary What is responsible for the emergence of homochirality, the almost exclusive use of one enantiomer over its mirror image? And what led to the evolution of life’s homochiral biopolymers, DNA/RNA, proteins and lipids, where all the constituent monomers exhibit the same handedness?
Based on in-situ observations and laboratory studies, we propose that this handedness arises when chiral biomolecules are synthesized asymmetrically through interaction with circularly polarized photons in interstellar space. The ultimate goal of this project will be to demonstrate how the diverse set of heterogeneous enantioenriched molecules available from meteoritic impacts assembles into homochiral pre-biopolymers, by simulating the evolutionary stages on the early Earth. My recent research has shown that the central chiral unit of RNA, ribose, forms readily under simulated comet conditions, providing valuable new insights into the accessibility of precursors of genetic material in interstellar environments. The significance of this project stems from the current lack of an experimental demonstration that amino acids, sugars and lipids can be synthesized simultaneously and asymmetrically by a universal physical selection process.
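As background on how circularly polarized light can bias handedness (a minimal, generic kinetic sketch assuming simple first-order photolysis; it is an illustration, not the project's model), define the enantiomeric excess ee = ([R] − [S])/([R] + [S]). If the two enantiomers of an initially racemic mixture are photolysed with slightly different rate constants k_R and k_S, a difference governed by the anisotropy factor g = Δε/ε of the relevant transition, then

\mathrm{ee}(t) = \tanh\!\left(\frac{(k_S - k_R)\,t}{2}\right), \qquad \frac{k_S - k_R}{k_S + k_R} \approx \frac{g}{2},

so even a small anisotropy can yield a measurable excess at high extents of photolysis. This is one reason asymmetric photochemistry with circularly polarized synchrotron and laser sources is central to the methodology described below.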
A synergistic methodology will be developed to build a unified theory for the origin of all chiral biological building blocks and their assembly into homochiral supramolecular entities. For the first time, advanced analyses of astrophysically relevant samples, asymmetric photochemistry triggered by circularly polarized synchrotron and laser sources, and chiral amplification due to polymerization processes will be combined. Intermediates and autocatalytic reaction kinetics will be monitored and supported by quantum calculations to understand the underlying processes. A unified theory of the asymmetric formation and self-assembly of life’s biopolymers is groundbreaking and will impact the whole conceptual foundation of the origin of life.
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym A2C2
Project Atmospheric flow Analogues and Climate Change
Researcher (PI) Pascal Yiou
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary The A2C2 project treats two major challenges in climate and atmospheric research: the time dependence of the climate attractor under external forcings (solar, volcanic eruptions and anthropogenic), and the attribution of extreme climate events occurring in the northern extra-tropics. The main difficulties are the limited climate information, the computational cost of model simulations, and mathematical assumptions that are hardly verified and often overlooked in the literature.
A2C2 proposes a practical framework to overcome those three difficulties, linking the theory of dynamical systems and statistics. We will generalize the methodology of flow analogues to multiple databases in order to obtain probabilistic descriptions of analogue decompositions.
The project is divided into three workpackages (WP). WP1 embeds the analogue method in the theory of dynamical systems in order to provide a metric of attractor deformation in time. The important methodological step is to detect trends or persistent outliers in the dates and scores of analogues when the system is subject to time-varying forcings. This is done with idealized models and full-scale climate models in which the forcings (anthropogenic and natural) are known.
A2C2 creates an open source toolkit to compute flow analogues from a wide array of databases (WP2). WP3 treats the two scientific challenges with the analogue method and multiple model ensembles, hence allowing uncertainty estimates under realistic mathematical hypotheses. The flow analogue methodology allows a systematic and quasi real-time analysis of extreme events, which is currently out of the reach of conventional climate modeling approaches.
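To make the core operation of such a toolkit concrete, here is a minimal NumPy sketch of a flow-analogue search returning the dates and scores referred to in WP1; the function name, the choice of Euclidean distance plus pattern correlation as scores, and the synthetic data are illustrative assumptions, not the actual A2C2 implementation.

# Minimal flow-analogue search: for a target daily field, find the k most
# similar fields in a reference library and return their dates and scores.
import numpy as np

def flow_analogues(target, library, dates, k=20):
    # target: (ny, nx) field, e.g. a sea-level-pressure anomaly map
    # library: (ntime, ny, nx) candidate fields; dates: (ntime,) array
    flat = library.reshape(library.shape[0], -1)
    tflat = target.ravel()
    dist = np.linalg.norm(flat - tflat, axis=1)   # Euclidean distance score
    best = np.argsort(dist)[:k]                   # indices of the k closest analogues
    corr = np.array([np.corrcoef(tflat, flat[i])[0, 1] for i in best])
    return dates[best], dist[best], corr          # analogue dates, distances, pattern correlations

# Usage with synthetic data (10 years of daily 30x40 fields); the target is taken
# from the library, so its best analogue is itself with a distance of zero.
rng = np.random.default_rng(0)
library = rng.standard_normal((3650, 30, 40))
dates = np.arange(3650)
best_dates, scores, corrs = flow_analogues(library[123], library, dates, k=5)

Trends or persisting outliers in such dates and scores, computed across multiple reanalysis or model databases, are what WP1 uses to quantify the deformation of the attractor under time-varying forcings.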
The major breakthrough of A2C2 is to bridge the gap between operational needs (the immediate analysis of climate events) and the understanding of long-term climate change. A2C2 opens new research horizons for the exploitation of ensembles of simulations and reliable estimates of uncertainty.
Max ERC Funding
1 491 457 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym AAA
Project Adaptive Actin Architectures
Researcher (PI) Laurent Blanchoin
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS3, ERC-2016-ADG
Summary Although we have extensive knowledge of many important processes in cell biology, including information on many of the molecules involved and the physical interactions among them, we still do not understand most of the dynamical features that are the essence of living systems. This is particularly true for the actin cytoskeleton, a major component of the internal architecture of eukaryotic cells. In living cells, actin networks constantly assemble and disassemble filaments while maintaining an apparent stable structure, suggesting a perfect balance between the two processes. Such behaviors are called “dynamic steady states”. They confer upon actin networks a high degree of plasticity allowing them to adapt in response to external changes and enable cells to adjust to their environments. Despite their fundamental importance in the regulation of cell physiology, the basic mechanisms that control the coordinated dynamics of co-existing actin networks are poorly understood. In the AAA project, first, we will characterize the parameters that allow the coupling among co-existing actin networks at steady state. In vitro reconstituted systems will be used to control the actin nucleation patterns, the closed volume of the reaction chamber and the physical interaction of the networks. We hope to unravel the mechanism allowing the global coherence of a dynamic actin cytoskeleton. Second, we will use our unique capacity to perform dynamic micropatterning, to add or remove actin nucleation sites in real time, in order to investigate the ability of dynamic networks to adapt to changes and the role of coupled network dynamics in this emergent property. In this part, in vitro experiments will be complemented by the analysis of actin network remodeling in living cells. In the end, our project will provide a comprehensive understanding of how the adaptive response of the cytoskeleton derives from the complex interplay between its biochemical, structural and mechanical properties.
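As a reminder of the single-filament picture behind these network-level dynamic steady states (a textbook simplification quoted for context, not the project's model), each filament end elongates or shrinks according to

\frac{\mathrm{d}\ell_{\pm}}{\mathrm{d}t} = k_{\mathrm{on}}^{\pm}\, c - k_{\mathrm{off}}^{\pm}, \qquad c_c^{\pm} = \frac{k_{\mathrm{off}}^{\pm}}{k_{\mathrm{on}}^{\pm}},

and when the monomer concentration c lies between the critical concentrations of the barbed (+) and pointed (−) ends, a filament treadmills: it grows at one end as fast as it shrinks at the other, keeping its length constant while continuously turning over its subunits. The open question addressed by AAA is how several such networks, co-existing in a closed volume and drawing on the same monomer pool, remain coupled and globally coherent.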
Max ERC Funding
2 349 898 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym AAMOT
Project Arithmetic of automorphic motives
Researcher (PI) Michael Harris
Host Institution (HI) INSTITUT DES HAUTES ETUDES SCIENTIFIQUES
Call Details Advanced Grant (AdG), PE1, ERC-2011-ADG_20110209
Summary The primary purpose of this project is to build on recent spectacular progress in the Langlands program to study the arithmetic properties of automorphic motives constructed in the cohomology of Shimura varieties. Because automorphic methods are available to study the L-functions of these motives, which include elliptic curves and certain families of Calabi-Yau varieties over totally real fields (possibly after base change), they represent the most accessible class of varieties for which one can hope to verify fundamental conjectures on special values of L-functions, including Deligne's conjecture and the Main Conjecture of Iwasawa theory. Immediate goals include the proof of irreducibility of automorphic Galois representations; the establishment of period relations for automorphic and potentially automorphic realizations of motives in the cohomology of distinct Shimura varieties; the construction of p-adic L-functions for these and related motives, notably adjoint and tensor product L-functions in p-adic families; and the geometrization of the p-adic and mod p Langlands program. All four goals, as well as the others mentioned in the body of the proposal, are interconnected; the final goal provides a bridge to related work in geometric representation theory, algebraic geometry, and mathematical physics.
Max ERC Funding
1 491 348 €
Duration
Start date: 2012-06-01, End date: 2018-05-31
Project acronym AArteMIS
Project Aneurysmal Arterial Mechanics: Into the Structure
Researcher (PI) Pierre Joseph Badel
Host Institution (HI) ASSOCIATION POUR LA RECHERCHE ET LE DEVELOPPEMENT DES METHODES ET PROCESSUS INDUSTRIELS
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary The rupture of an Aortic Aneurysm (AA), which is often lethal, is a mechanical phenomenon that occurs when the wall stress state exceeds the local strength of the tissue. Our current understanding of arterial rupture mechanisms is poor, and the physics taking place at the microscopic scale in these collagenous structures remains an open area of research. Understanding, modelling, and quantifying the micro-mechanisms which drive the mechanical response of such tissue and locally trigger rupture represents the most challenging and promising pathway towards predictive diagnosis and personalized care of AA.
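To make this macro-scale rupture criterion concrete (a deliberately idealized thin-walled estimate with purely illustrative numbers, not values taken from the proposal), the circumferential wall stress of a roughly spherical aneurysm of radius r, wall thickness t and internal pressure P is approximately

\sigma_{\theta} \approx \frac{P\, r}{2\, t},

so that, for example, P ≈ 16 kPa (≈ 120 mmHg), r ≈ 27 mm and t ≈ 1.5 mm give σ_θ ≈ 0.14 MPa; rupture is expected wherever such a local stress exceeds the local wall strength. This homogeneous estimate hides precisely the fibrous micro-scale heterogeneity that AArteMIS sets out to resolve.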
The PI's group was recently able to detect, in advance, at the macro-scale, rupture-prone areas in bulging arterial tissues. The next step is to get into the details of the arterial microstructure to elucidate the underlying mechanisms.
Through the achievements of AArteMIS, the local mechanical state of the fibrous microstructure of the tissue, especially close to its rupture state, will be quantitatively analyzed by multi-photon confocal microscopy and numerically reconstructed to establish quantitative micro-scale rupture criteria. AArteMIS will also develop micro-macro models based on the collected quantitative data.
The entire project will be carried out in collaboration with medical doctors and engineers who are experts in all of the fields required for the success of AArteMIS.
AArteMIS is expected to open longed-for pathways for research in soft-tissue mechanobiology focused on the cell environment, and to enable essential clinical applications for the quantitative assessment of AA rupture risk. It will significantly contribute to understanding fatal vascular events and improving cardiovascular treatments. It will provide a tremendous source of data and inspiration for subsequent applications and research by answering the most fundamental questions on AA rupture behaviour, enabling ground-breaking clinical changes to take place.
Max ERC Funding
1 499 783 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym ABACUS
Project Ab-initio adiabatic-connection curves for density-functional analysis and construction
Researcher (PI) Trygve Ulf Helgaker
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods and the inexpensive but not systematically refinable Kohn–Sham method of density-functional theory (DFT). The accuracy of Kohn–Sham calculations is determined by the quality of the exchange–correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange–correlation functional is unknown; instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed: the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, being directly guided by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab-initio adiabatic-connection curves, studying the exchange–correlation functional for a fixed density as the electronic interactions are turned on from zero to one. Pilot calculations have indicated the feasibility of this approach in simple cases. Here, a comprehensive set of adiabatic-connection curves will be generated and utilized for calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn–Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available, for example for understanding and modelling the behaviour of the exchange–correlation functional in electromagnetic fields.
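In standard density-functional notation (background added here for context; the notation is generic, not taken from the proposal), the adiabatic-connection curves and the Lieb variation principle referred to above read

E_{xc}[\rho] = \int_0^1 \mathcal{W}_{\lambda}[\rho]\, \mathrm{d}\lambda, \qquad \mathcal{W}_{\lambda}[\rho] = \left\langle \Psi_{\lambda}[\rho] \right| \hat{W} \left| \Psi_{\lambda}[\rho] \right\rangle - J[\rho], \qquad F_{\lambda}[\rho] = \sup_{v}\left\{ E_{\lambda}[v] - \int v(\mathbf{r})\, \rho(\mathbf{r})\, \mathrm{d}\mathbf{r} \right\},

where Ψ_λ[ρ] minimizes the expectation value of T̂ + λŴ among wave functions yielding the density ρ, Ŵ is the electron–electron repulsion, J[ρ] the Hartree energy, and the Lieb functional F_λ[ρ] is obtained by maximizing over potentials v at interaction strength λ. Computing W_λ accurately from ab-initio wave functions for λ between 0 and 1 yields exactly the curves to be used for calibrating, constructing and analysing density functionals.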
Max ERC Funding
2 017 932 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym ABIOS
Project ABIOtic Synthesis of RNA: an investigation on how life started before biology existed
Researcher (PI) Guillaume STIRNEMANN
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2017-STG
Summary The emergence of life is one of the most fascinating and yet largely unsolved questions in the natural sciences, and thus a significant challenge for scientists from many disciplines. There is growing evidence that ribonucleic acid (RNA) polymers, which are capable of genetic information storage and self-catalysis, were involved in the early forms of life. But despite recent progress, RNA synthesis without biological machinery is very challenging. The current project aims at understanding how to synthesize RNA in abiotic conditions. I will solve problems associated with three critical aspects of RNA formation that I will rationalize at a molecular level: (i) accumulation of precursors, (ii) formation of a chemical bond between RNA monomers, and (iii) tolerance for alternative backbone sugars or linkages. Because I will study problems ranging from the formation of chemical bonds up to the stability of large biopolymers, I propose an original computational multi-scale approach combining techniques that range from quantum calculations to large-scale all-atom simulations, employed together with efficient enhanced-sampling algorithms, force-field improvement, cutting-edge analysis methods and model development.
My objectives are the following:
1 • To explain why the poorly-understood thermally-driven process of thermophoresis can contribute to the accumulation of dilute precursors (a minimal numerical illustration is sketched after this summary).
2 • To understand why linking RNA monomers with phosphoester bonds is so difficult, to understand the molecular mechanism of possible catalysts and to suggest key improvements.
3 • To rationalize the molecular basis for RNA tolerance for alternative backbone sugars or linkages that have probably been incorporated in abiotic conditions.
This unique in-silico laboratory setup should significantly impact our comprehension of life’s origin by overcoming major obstacles to RNA abiotic formation, and in addition will reveal significant orthogonal outcomes for (bio)technological applications.
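As a minimal numerical illustration of objective 1 above, the sketch below evaluates the standard steady-state Soret relation for thermophoretic accumulation of a dilute solute in a temperature gradient. The Soret coefficient and temperature drops are hypothetical values chosen only to show the order of magnitude of the effect; this is not the project's model.

```python
# Minimal illustration (not the project's model): steady-state thermophoretic
# accumulation of a dilute solute in a temperature gradient, using the standard
# Soret relation  c(x)/c0 = exp(-S_T * (T(x) - T0)), so that the cold side is
# enriched by exp(S_T * delta_T) for a temperature drop delta_T.
# The Soret coefficient and temperature drops below are hypothetical values.
import numpy as np

S_T = 0.05                              # Soret coefficient in 1/K (illustrative)
delta_T = np.linspace(0.0, 40.0, 5)     # temperature drop across the trap, in K

accumulation = np.exp(S_T * delta_T)    # enrichment at the cold side relative to c0

for dT, acc in zip(delta_T, accumulation):
    print(f"dT = {dT:5.1f} K  ->  c_cold / c0 = {acc:6.2f}")
```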
Max ERC Funding
1 497 031 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym ACCLIMATE
Project Elucidating the Causes and Effects of Atlantic Circulation Changes through Model-Data Integration
Researcher (PI) Claire Waelbroeck
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary Rapid changes in ocean circulation and climate have been observed in marine sediment and ice cores, notably over the last 60 thousand years (ky), highlighting the non-linear character of the climate system and underlining the possibility of rapid climate shifts in response to anthropogenic greenhouse gas forcing.
To date, these rapid changes in climate and ocean circulation are still not fully explained. Two main obstacles prevent going beyond the current state of knowledge:
- Paleoclimatic proxy data are in essence only indirect indicators of the climatic variables, and thus cannot be directly compared with model outputs;
- A 4-D (latitude, longitude, water depth, time) reconstruction of Atlantic water masses over the past 40 ky is lacking: previous studies have generated isolated records with disparate timescales which do not allow the causes of circulation changes to be identified.
Overcoming these two major limitations will lead to major breakthroughs in climate research. Concretely, I will create the first database of Atlantic deep-sea records over the last 40 ky, and extract full climatic information from these records through an innovative model-data integration scheme using an isotopic proxy forward modeling approach. The novelty and exceptional potential of this scheme is twofold: (i) it avoids hypotheses on proxy interpretation and hence suppresses or strongly reduces the errors of interpretation of paleoclimatic records; (ii) it produces states of the climate system that best explain the observations over the last 40 ky, while being consistent with the model physics.
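To make the forward-modelling idea concrete, here is a deliberately simplified sketch: a simulated model state is mapped forward into proxy space and compared with an observed record, rather than inverting the proxy. The function, sensitivities and numerical values are hypothetical placeholders, not the isotopic forward models used in the project.

```python
# Cartoon of proxy forward modelling (illustrative only; the coefficients and
# numbers below are hypothetical and not those used in the project).
# Instead of inverting a sediment record into climate variables, the climate
# model state is mapped forward into proxy space and compared with the data.
import numpy as np

def forward_proxy(temperature_C, water_isotope):
    """Map simulated bottom-water temperature and water isotopic composition
    to a synthetic proxy value with a simple linear sensitivity (hypothetical)."""
    alpha, beta = -0.25, 1.0                      # illustrative sensitivities
    return alpha * temperature_C + beta * water_isotope

# Simulated model states (hypothetical numbers) and an observed proxy record
model_temperature = np.array([2.1, 1.4, 0.8])     # degrees C
model_water_isotope = np.array([0.1, 0.3, 0.4])   # per mil
observed_proxy = np.array([-0.40, 0.05, 0.25])    # per mil

synthetic = forward_proxy(model_temperature, model_water_isotope)
misfit = np.sqrt(np.mean((synthetic - observed_proxy) ** 2))
print("synthetic proxy:", synthetic)
print("RMS model-data misfit:", round(misfit, 3))
```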
Expected results include:
• The elucidation of the mechanisms explaining rapid changes in ocean circulation and climate over the last 40 ky,
• Improved climate model physics and parameterizations,
• The first projections of future climate changes obtained with a model able to reproduce the highly non-linear behavior of the climate system observed over the last 40 ky.
Max ERC Funding
3 000 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym Actanthrope
Project Computational Foundations of Anthropomorphic Action
Researcher (PI) Jean Paul Laumond
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE7, ERC-2013-ADG
Summary Actanthrope intends to promote a neuro-robotics perspective to explore original models of anthropomorphic action. The project targets contributions to humanoid robot autonomy (for rescue and service robotics), to advanced human body simulation (for applications in ergonomics), and to a new theory of embodied intelligence (by promoting a motion-based semiotics of the human action).
Actions take place in the physical space while they originate in the –robot or human– sensory-motor space. Geometry is the core abstraction that makes the link between these spaces. Considering that the structure of actions inherits from that of the body, the underlying intuition is that actions can be segmented within discrete sub-spaces lying in the entire continuous posture space. Such sub-spaces are viewed as symbols bridging deliberative reasoning and reactive control. Actanthrope argues that geometric approaches to motion segmentation and generation are promising and innovative routes to explore embodied intelligence:
- Motion segmentation: what are the sub-manifolds that define the structure of a given action?
- Motion generation: among all the solution paths within a given sub-manifold, what is the underlying law that makes the selection?
In Robotics these questions are related to the competition between abstract symbol manipulation and physical signal processing. In Computational Neuroscience the questions refer to the quest of motion invariants. The ambition of the project is to promote a dual perspective: exploring the computational foundations of human action to make better robots, while simultaneously doing better robotics to better understand human action.
A unique “Anthropomorphic Action Factory” supports the methodology. It aims at attracting to a single lab researchers with complementary know-how and a solid mathematical background. All of them will benefit from unique equipment, while being stimulated by four challenges dealing with locomotion and manipulation actions.
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym ACTAR TPC
Project Active Target and Time Projection Chamber
Researcher (PI) Gwen Grinyer
Host Institution (HI) GRAND ACCELERATEUR NATIONAL D'IONS LOURDS
Call Details Starting Grant (StG), PE2, ERC-2013-StG
Summary The active target and time projection chamber (ACTAR TPC) is a novel gas-filled detection system that will permit new studies into the structure and decays of the most exotic nuclei. The use of a gas volume that acts as a sensitive detection medium and as the reaction target itself (an “active target”) offers considerable advantages over traditional nuclear physics detectors and techniques. In high-energy physics, TPC detectors have found profitable applications but their use in nuclear physics has been limited. With the ACTAR TPC design, individual detection pad sizes of 2 mm are the smallest ever attempted in either discipline but are a requirement for high-efficiency and high-resolution nuclear spectroscopy. The corresponding large number of electronic channels (16000 from a surface of only 25×25 cm) requires new developments in high-density electronics and data-acquisition systems that are not yet available in the nuclear physics domain. New experiments in regions of the nuclear chart that cannot be presently contemplated will become feasible with ACTAR TPC.
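The quoted channel count follows directly from the pad size and pad-plane surface; a quick arithmetic check using the values given in the summary above:

```python
# Quick consistency check of the channel count quoted above:
# a 25 cm x 25 cm pad plane segmented into 2 mm x 2 mm pads.
side_cm = 25.0
pad_mm = 2.0

pads_per_side = int(side_cm * 10 / pad_mm)   # 250 mm / 2 mm = 125
total_pads = pads_per_side ** 2              # 125 * 125 = 15625

print(pads_per_side, "pads per side,", total_pads, "pads in total (~16000 channels)")
```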
Max ERC Funding
1 290 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACTIVIA
Project Visual Recognition of Function and Intention
Researcher (PI) Ivan Laptev
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "Computer vision is concerned with the automated interpretation of images and video streams. Today's research is (mostly) aimed at answering queries such as ""Is this a picture of a dog?"", (classification) or sometimes ""Find the dog in this photo"" (detection). While categorisation and detection are useful for many tasks, inferring correct class labels is not the final answer to visual recognition. The categories and locations of objects do not provide direct understanding of their function i.e., how things work, what they can be used for, or how they can act and react. Such an understanding, however, would be highly desirable to answer currently unsolvable queries such as ""Am I in danger?"" or ""What can happen in this scene?"". Solving such queries is the aim of this proposal.
My goal is to uncover the functional properties of objects and the purpose of actions by addressing visual recognition from a different and yet unexplored perspective. The main novelty of this proposal is to leverage observations of people, i.e., their actions and interactions to automatically learn the use, the purpose and the function of objects and scenes from visual data. The project is timely as it builds upon the two key recent technological advances: (a) the immense progress in visual recognition of objects, scenes and human actions achieved in the last ten years, as well as (b) the emergence of a massive amount of public image and video data now available to train visual models.
ACTIVIA addresses fundamental research issues in automated interpretation of dynamic visual scenes, but its results are expected to serve as a basis for ground-breaking technological advances in practical applications. The recognition of functional properties and intentions as explored in this project will directly support high-impact applications such as detection of abnormal events, which are likely to revolutionise today's approaches to crime protection, hazard prevention, elderly care, and many others.
Max ERC Funding
1 497 420 €
Duration
Start date: 2013-01-01, End date: 2018-12-31
Project acronym ADAPT
Project Theory and Algorithms for Adaptive Particle Simulation
Researcher (PI) Stephane Redon
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "During the twentieth century, the development of macroscopic engineering has been largely stimulated by progress in digital prototyping: cars, planes, boats, etc. are nowadays designed and tested on computers. Digital prototypes have progressively replaced actual ones, and effective computer-aided engineering tools have helped cut costs and reduce production cycles of these macroscopic systems.
The twenty-first century is most likely to see a similar development at the atomic scale. Indeed, the recent years have seen tremendous progress in nanotechnology - in particular in the ability to control matter at the atomic scale. Similar to what has happened with macroscopic engineering, powerful and generic computational tools will be needed to engineer complex nanosystems, through modeling and simulation. As a result, a major challenge is to develop efficient simulation methods and algorithms.
NANO-D, the INRIA research group I started in January 2008 in Grenoble, France, aims at developing
efficient computational methods for modeling and simulating complex nanosystems, both natural and artificial. In particular, NANO-D develops SAMSON, a software application which gathers all algorithms designed by the group and its collaborators (SAMSON: Software for Adaptive Modeling and Simulation Of Nanosystems).
In this project, I propose to develop a unified theory, and associated algorithms, for adaptive particle simulation. The proposed theory will avoid problems that plague current popular multi-scale or hybrid simulation approaches by simulating a single potential throughout the system, while allowing users to finely trade precision for computational speed.
I believe the full development of the adaptive particle simulation theory will have an important impact on current modeling and simulation practices, and will enable practical design of complex nanosystems on desktop computers, which should significantly boost the emergence of generic nano-engineering.
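To illustrate the general precision-for-speed idea in the loosest possible terms, the cartoon below freezes particles whose kinetic energy drops under a threshold and skips them in the update. It is only a sketch of the concept under invented parameters; it is not the adaptive algorithm proposed in the project nor the SAMSON implementation.

```python
# Deliberately simplified cartoon of trading precision for speed in a particle
# simulation: particles whose kinetic energy falls below a user-chosen threshold
# are temporarily frozen and skipped in the update. This is NOT the project's
# adaptive algorithm; the toy potential and threshold are invented for this sketch.
import numpy as np

rng = np.random.default_rng(0)
n, dt, k = 64, 1e-2, 1.0                  # particles, time step, spring constant
pos = rng.normal(size=(n, 3))
vel = rng.normal(scale=0.1, size=(n, 3))
threshold = 0.005                         # kinetic-energy threshold (arbitrary)

for step in range(100):
    kinetic = 0.5 * np.sum(vel**2, axis=1)
    active = kinetic > threshold          # only "fast" particles are advanced
    force = -k * pos                      # toy harmonic restoring force
    vel[active] += dt * force[active]
    pos[active] += dt * vel[active]

print(f"{np.count_nonzero(active)} of {n} particles still active at the last step")
```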
Max ERC Funding
1 476 882 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym ADDECCO
Project Adaptive Schemes for Deterministic and Stochastic Flow Problems
Researcher (PI) Remi Abgrall
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary The numerical simulation of complex compressible flow problems is still a challenge nowadays, even for simple models. In our opinion, the most important difficulties that currently need to be tackled and solved concern how to obtain stable, scalable, very accurate, easy-to-code and easy-to-maintain schemes on complex geometries. The method should easily handle mesh refinement, even near the boundary where the most interesting engineering quantities have to be evaluated. Unsteady uncertainties in the model, for example in the geometry or the boundary conditions, should be represented efficiently. This proposal's goal is to design, develop and evaluate solutions to each of the above problems. Our work program will lead to significant breakthroughs for flow simulations. More specifically, we propose to work on three connected problems: 1) A class of very high-order numerical schemes able to deal easily with the geometry of boundaries while still solving steep problems. The geometry is generally defined by CAD tools. The output is used to generate a mesh which is then used by the scheme. Hence, any mesh refinement process is disconnected from the CAD, a situation that prevents the spread of mesh adaptation techniques in industry! 2) A class of very high-order numerical schemes which can use possibly solution-dependent basis functions in order to lower the number of degrees of freedom, for example to compute boundary layers accurately at low resolution. 3) A general non-intrusive technique for handling uncertainties, able to deal with irregular probability density functions (pdfs) and with pdfs that may evolve in time, for example through an optimisation loop. The curse of dimensionality will be dealt with thanks to Harten's multiresolution method combined with sparse-grid methods. Currently, and to our knowledge, no scheme has all of these properties. This research program will have an impact on numerical schemes and industrial applications.
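As a toy illustration of what "non-intrusive" means in point 3 above, the sketch below reuses a deterministic solver as a black box at Gauss-Legendre quadrature nodes of an uncertain parameter and assembles the output statistics afterwards. The stand-in solver and the uniform distribution are assumptions made for the example; the sparse-grid and multiresolution machinery mentioned in the summary is deliberately omitted.

```python
# Toy illustration of a NON-INTRUSIVE treatment of uncertainty: an existing
# deterministic solver is reused unchanged at quadrature nodes of the uncertain
# parameter, and output statistics are assembled afterwards. The "solver" below
# is a stand-in function, not one of the project's schemes.
import numpy as np

def deterministic_solver(xi):
    """Stand-in for an expensive flow solver; xi is an uncertain inflow parameter."""
    return np.sin(1.0 + 0.3 * xi) ** 2

# Gauss-Legendre nodes/weights on [-1, 1]; assume xi is uniformly distributed there.
nodes, weights = np.polynomial.legendre.leggauss(8)
outputs = np.array([deterministic_solver(x) for x in nodes])

mean = 0.5 * np.sum(weights * outputs)               # factor 1/2 = uniform density
variance = 0.5 * np.sum(weights * (outputs - mean) ** 2)
print(f"mean = {mean:.4f}, variance = {variance:.4f}")
```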
Max ERC Funding
1 432 769 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym ADEQUATE
Project Advanced optoelectronic Devices with Enhanced QUAntum efficiency at THz frEquencies
Researcher (PI) Carlo Sirtori
Host Institution (HI) UNIVERSITE PARIS DIDEROT - PARIS 7
Call Details Advanced Grant (AdG), PE3, ERC-2009-AdG
Summary The aim of this project is the realisation of efficient mid-infrared and THz optoelectronic emitters. This work is motivated by the fact that the spontaneous emission in this frequency range is characterized by an extremely long lifetime when compared to non-radiative processes, giving rise to devices with very low quantum efficiency. To this end we want to develop hybrid light-matter systems, already well known in quantum optics, within optoelectronic devices that will be driven by electrical injection. With this project we want to extend the field of optoelectronics by introducing some of the concepts of quantum optics, particularly the light-matter strong coupling, into semiconductor devices. More precisely, this project aims at the implementation of novel optoelectronic emitters operating in the strong coupling regime between an intersubband excitation of a two-dimensional electron gas and a microcavity photonic mode. The quasiparticles issued from this coupling are called intersubband polaritons. The major difficulties and challenges of this project do not lie in the observation of these quantum effects, but in their exploitation for a specific function, in particular an efficient electrical to optical conversion. To obtain efficient quantum emitters in the THz frequency range we will follow two different approaches: - In the first case we will try to exploit the additional characteristic time of the system introduced by the light-matter interaction in the strong (or ultra-strong) coupling regime. - The second approach will exploit the fact that, under certain conditions, intersubband polaritons have a bosonic character; as a consequence they can undergo stimulated scattering, giving rise to polariton lasers as has been shown for excitonic polaritons.
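For reference, the textbook two-coupled-oscillator picture of the strong-coupling regime mentioned above reads as follows; one common convention for the Rabi frequency is assumed, and the project's full treatment is of course more involved.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Textbook two-coupled-oscillator picture of strong coupling: a cavity mode of
% energy E_c and an intersubband excitation of energy E_{21}, coupled with
% strength \hbar\Omega_R, hybridise into two polariton branches.
\begin{equation}
  E_{\pm} \;=\; \frac{E_c + E_{21}}{2}
  \;\pm\; \frac{1}{2}\sqrt{\left(E_c - E_{21}\right)^2 + \left(2\hbar\Omega_R\right)^2}
\end{equation}
% At resonance (E_c = E_{21}) the branches are split by the vacuum Rabi splitting
% 2\hbar\Omega_R; strong coupling requires this splitting to exceed the cavity
% and intersubband linewidths.
\end{document}
```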
Max ERC Funding
1 761 000 €
Duration
Start date: 2010-05-01, End date: 2015-04-30
Project acronym AdOC
Project Advance Optical Clocks
Researcher (PI) Sebastien André Marcel Bize
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE2, ERC-2013-CoG
Summary "The proposed research program has three main objectives. The first and second objectives are to seek extreme precisions in optical atomic spectroscopy and optical clocks, and to use this quest as a mean of exploration in atomic physics. The third objective is to explore new possibilities that stem from extreme precision. These goals will be pursued via three complementary activities: #1: Search for extreme precisions with an Hg optical lattice clock. #2: Explore and exploit the rich Hg system, which is essentially unexplored in the cold and ultra-cold regime. #3: Identify new applications of clocks with extreme precision to Earth science. Clocks can measure directly the gravitational potential via Einstein’s gravitational redshift, leading to the idea of “clock-based geodesy”.
The first two activities are experimental and build on an existing setup, where we demonstrated the feasibility of an Hg optical lattice clock. Hg is chosen for its potential to surpass competing systems. We will investigate the unexplored physics of the Hg clock. This includes interactions between Hg atoms, lattice-induced light shifts, and sensitivity to external fields which are specific to the atomic species. Beyond this, we will explore the fundamental limits of the optical lattice scheme. We will exploit other remarkable features of Hg associated with the high atomic number and the diversity of stable isotopes. These features enable tests of fundamental physical laws, ultra-precise measurements of isotope shifts, measurement of collisional properties toward evaporative cooling and quantum gases of Hg, and investigation of forbidden transitions promising for measuring the nuclear anapole moment of Hg.
The third activity is theoretical and is aimed at initiating collaborations with experts in modelling Earth gravity. With this expertise, we will identify the most promising and realistic approaches for clocks and emerging remote comparison methods to contribute to geodesy, hydrology, oceanography, etc.
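The back-of-the-envelope relation behind "clock-based geodesy" is the first-order gravitational redshift; near the Earth's surface it amounts to roughly one part in 10^16 per metre of height difference.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% First-order gravitational redshift between two clocks: the fractional
% frequency difference equals the gravitational potential difference over c^2,
% which near the Earth's surface is about 1e-16 per metre of elevation.
\begin{equation}
  \frac{\Delta\nu}{\nu} \simeq \frac{\Delta U}{c^{2}} \simeq \frac{g\,\Delta h}{c^{2}}
  \approx \frac{9.8\ \mathrm{m\,s^{-2}}}{\left(3.0\times10^{8}\ \mathrm{m\,s^{-1}}\right)^{2}}\,\Delta h
  \approx 1.1\times10^{-16}\ \text{per metre.}
\end{equation}
\end{document}
```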
Max ERC Funding
1 946 432 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym ADORA
Project Asymptotic approach to spatial and dynamical organizations
Researcher (PI) Benoit PERTHAME
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE1, ERC-2016-ADG
Summary The understanding of spatial, social and dynamical organization of large numbers of agents is presently a fundamental issue in modern science. ADORA focuses on problems motivated by biology because, more than anywhere else, access to precise and abundant data has opened the route to novel and complex biomathematical models. The problems we address are written in terms of nonlinear partial differential equations. The flux-limited Keller-Segel system, the integrate-and-fire Fokker-Planck equation, kinetic equations with internal state, nonlocal parabolic equations and constrained Hamilton-Jacobi equations are among examples of the equations under investigation.
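For concreteness, the simplest textbook (parabolic-elliptic) form of the Keller-Segel system mentioned above is recalled below; the project studies, among other models, a flux-limited variant of it.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Classical parabolic-elliptic Keller-Segel model of chemotaxis, in its simplest
% textbook form: \rho is the cell density and c the concentration of the
% chemoattractant that the cells both emit and follow.
\begin{align}
  \partial_t \rho &= \Delta \rho - \nabla\!\cdot\!\left(\rho\,\nabla c\right), \\
  -\Delta c &= \rho .
\end{align}
\end{document}
```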
The role of mathematics is not only to understand the analytical structure of these new problems, but it is also to explain the qualitative behavior of solutions and to quantify their properties. The challenge arises here because these goals should be achieved through a hierarchy of scales. Indeed, the problems under consideration share the common feature that the large scale behavior cannot be understood precisely without access to a hierarchy of finer scales, down to the individual behavior and sometimes its molecular determinants.
Major difficulties arise because the numerous scales present in these equations have to be discovered and singularities appear in the asymptotic process which yields deep compactness obstructions. Our vision is that the complexity inherent to models of biology can be enlightened by mathematical analysis and a classification of the possible asymptotic regimes.
However, an enormous effort is needed to uncover the equations' intimate mathematical structures and to bring them to the level of conceptual understanding they deserve, given the applications motivating these questions, which range from medical science and neuroscience to cell biology.
Max ERC Funding
2 192 500 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym AdS-CFT-solvable
Project Origins of integrability in AdS/CFT correspondence
Researcher (PI) Vladimir Kazakov
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE2, ERC-2012-ADG_20120216
Summary Fundamental interactions in nature are well described by quantum gauge fields in 4 space-time dimensions (4d). When the strength of the gauge interaction is weak, Feynman perturbation techniques are very efficient for the description of most of the experimentally observable consequences of the Standard Model and for the study of high-energy processes in QCD.
But in the intermediate and strong coupling regime, such as at the relatively small energies relevant in QCD, perturbation theory fails, leaving us with no reliable analytic methods (except Monte-Carlo simulation). The project aims at working out new analytic and computational methods for strongly coupled gauge theories in 4d. We will employ for that two important discoveries: 1) the gauge-string duality (AdS/CFT correspondence) relating certain strongly coupled gauge Conformal Field
Theories to the weakly coupled string theories on anti-de Sitter space; 2) the solvability, or integrability, of maximally supersymmetric (N=4) 4d super Yang-Mills (SYM) theory in the multicolor limit. Integrability made possible pioneering exact numerical and analytic results in the N=4 multicolor SYM at any coupling, effectively summing up all 4d Feynman diagrams. Recently, we conjectured a system of functional equations – the AdS/CFT Y-system – for the exact spectrum of anomalous dimensions of all local operators in N=4 SYM. The conjecture has passed all available checks. My project is aimed at understanding the origins of this still mysterious integrability. Deriving the AdS/CFT Y-system from first principles on both sides of the gauge-string duality should provide a long-awaited proof of the AdS/CFT correspondence itself. I plan to use the Y-system to study the systematic weak and strong coupling expansions and the so-called BFKL limit, as well as for the calculation of multi-point correlation functions of N=4 SYM. We hope for new insights into the strong coupling dynamics of less supersymmetric gauge theories and of QCD.
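For orientation, the generic functional form of a Y-system is recalled below, with Y^± denoting shifts of the spectral parameter by ±i/2; the AdS/CFT Y-system for the N=4 SYM spectrum has this structure on a T-shaped domain of the indices (a,s), with additional analyticity data not shown here.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Generic form of a Y-system: functional equations on a two-dimensional lattice
% of functions Y_{a,s}(u), where Y^{\pm} denotes the shift u -> u \pm i/2.
\begin{equation}
  Y_{a,s}^{+}\, Y_{a,s}^{-}
  = \frac{\left(1 + Y_{a,s+1}\right)\left(1 + Y_{a,s-1}\right)}
         {\left(1 + 1/Y_{a+1,s}\right)\left(1 + 1/Y_{a-1,s}\right)} .
\end{equation}
\end{document}
```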
Max ERC Funding
1 456 140 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym AEROFLEX
Project AEROelastic instabilities and control of FLEXible Structures
Researcher (PI) Olivier Pierre MARQUET
Host Institution (HI) OFFICE NATIONAL D'ETUDES ET DE RECHERCHES AEROSPATIALES
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary Aeroelastic instabilities are at the origin of large deformations of structures and limit the capabilities of products in various industrial branches such as aeronautics, the marine industry, or wind electricity production. If suppressing aeroelastic instabilities is an ultimate goal, a paradigm shift in the technological development is to take advantage of these instabilities to achieve other objectives, such as reducing the drag of these flexible structures. The ground-breaking challenges addressed in this project are to design fundamentally new theoretical methodologies for (i) describing mathematically aeroelastic instabilities, (ii) suppressing them and (iii) using them to reduce the mean drag of structures at a low energetic cost. To that aim, two types of aeroelastic phenomena will be specifically studied: flutter, which arises as a result of an unstable coupling between two stable dynamics, that of the structure and that of the flow, and vortex-induced vibrations, which appear when the fluid dynamics is unstable. An aeroelastic global stability analysis will first be developed and applied to problems of increasing complexity, starting from two-dimensional free-vibrating rigid structures and progressing towards three-dimensional free-deforming elastic structures. The control of these aeroelastic instabilities will then be addressed with two different objectives: their suppression or their use for flow control. A theoretical passive control methodology will be established for suppressing linear aeroelastic instabilities, and extended to high Reynolds number flows and experimental configurations. New perturbation methods for solving strongly nonlinear problems and adjoint-based control algorithms will make it possible to use these aeroelastic instabilities for drag reduction. This project will allow innovative control solutions to emerge, not only in flutter or vortex-induced vibration problems, but also in a much broader class of fluid-structure problems.
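A schematic way to phrase the flutter mechanism described above: two blocks that are stable in isolation become unstable through their coupling. The block notation below is generic and introduced only for illustration, not taken from the proposal.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Schematic flutter mechanism: the linearised fluid-structure dynamics couple a
% stable structural block A_s and a stable flow block A_f through off-diagonal
% terms. Flutter corresponds to an eigenvalue \lambda of the coupled operator
% crossing into the right half-plane even though A_s and A_f are stable alone.
\begin{equation}
  \frac{\mathrm d}{\mathrm dt}
  \begin{pmatrix} q_s \\ q_f \end{pmatrix}
  =
  \begin{pmatrix} A_s & C_{sf} \\ C_{fs} & A_f \end{pmatrix}
  \begin{pmatrix} q_s \\ q_f \end{pmatrix},
  \qquad
  \operatorname{Re}\lambda > 0 \ \Rightarrow\ \text{aeroelastic instability.}
\end{equation}
\end{document}
```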
Max ERC Funding
1 377 290 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym AGENSI
Project A Genetic View into Past Sea Ice Variability in the Arctic
Researcher (PI) Stijn DE SCHEPPER
Host Institution (HI) NORCE NORWEGIAN RESEARCH CENTRE AS
Call Details Consolidator Grant (CoG), PE10, ERC-2018-COG
Summary Arctic sea ice decline is the most visible expression of the rapidly transforming Arctic climate. The ensuing local and global implications can be understood by studying past climate transitions, yet few methods are available to examine past Arctic sea ice cover, severely restricting our understanding of sea ice in the climate system. The decline in Arctic sea ice cover is a ‘canary in the coalmine’ for the state of our climate, and if greenhouse gas emissions remain unchecked, summer sea ice loss may pass a critical threshold that could drastically transform the Arctic. Because historical observations are limited, it is crucial to have reliable proxies for assessing natural sea ice variability, its stability and sensitivity to climate forcing on different time scales. Current proxies address aspects of sea ice variability, but are limited due to a selective fossil record, preservation effects, regional applicability, or being semi-quantitative. With such constraints on our knowledge about natural variations and drivers, major uncertainties about the future remain.
I propose to develop and apply a novel sea ice proxy that exploits genetic information stored in marine sediments, sedimentary ancient DNA (sedaDNA). This innovation uses the genetic signature of phytoplankton communities from surface waters and sea ice as it gets stored in sediments. This wealth of information has not been explored before for reconstructing sea ice conditions. Preliminary results from my cross-disciplinary team indicate that our unconventional approach can provide a detailed, qualitative account of past sea ice ecosystems and quantitative estimates of sea ice parameters. I will address fundamental questions about past Arctic sea ice variability on different timescales, information essential to provide a framework upon which to assess the ecological and socio-economic consequences of a changing Arctic. This new proxy is not limited to sea ice research and can transform the field of paleoceanography.
Max ERC Funding
2 615 858 €
Duration
Start date: 2019-08-01, End date: 2024-07-31
Project acronym AIRSEA
Project Air-Sea Exchanges driven by Light
Researcher (PI) Christian George
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2011-ADG_20110209
Summary The scientific motivation of this project is the significant presence of organic compounds at the surface of the ocean. They form the link between ocean biogeochemistry, through the physico-chemical processes near the water-air interface, and primary and secondary aerosol formation and evolution in the air aloft, and ultimately the climate impact of marine boundary layer aerosols. However, their photochemistry and photosensitizer properties have only been suggested and discussed but never fully addressed because they were beyond reach. This project proposes to go significantly beyond this state of affairs through a combination of innovative tools and the development of new ideas.
This project is therefore devoted to new laboratory investigations of processes occurring at the air-sea interface to predict the emission, formation and evolution of halogenated radicals and aerosols from this vast interface between oceans and atmosphere. It progresses from fundamental laboratory measurements, marine science, surface chemistry, photochemistry … and is therefore interdisciplinary in nature.
It will lead to the development of innovative techniques for characterising chemical processing at the air-sea interface (e.g., a multiphase atmospheric simulation chamber and a time-resolved fluorescence technique for probing the interface directly). It will allow the assessment of new emerging ideas, such as a quantitative description of the importance of photosensitized reactions in the visible at the air-sea interface as a major source of halogenated radicals and aerosols in the marine environment.
This new understanding will improve our ability to describe atmospheric chemistry in the marine environment, which has a strong impact not only on the urban air quality of coastal regions (which are highly populated) but also on climate change, by providing new input for global climate models.
Max ERC Funding
2 366 276 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym AlgTateGro
Project Constructing line bundles on algebraic varieties --around conjectures of Tate and Grothendieck
Researcher (PI) François CHARLES
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary The goal of this project is to investigate two conjectures in arithmetic geometry pertaining to the geometry of projective varieties over finite and number fields. These two conjectures, formulated by Tate and Grothendieck in the 1960s, predict which cohomology classes are Chern classes of line bundles. They both form an arithmetic counterpart of a theorem of Lefschetz, proved in the 1940s, which itself is the only known case of the Hodge conjecture. These two long-standing conjectures are one aspect of a more general web of questions regarding the topology of algebraic varieties which have been emphasized by Grothendieck and have since had a central role in modern arithmetic geometry. Special cases of these conjectures, appearing for instance in the work of Tate, Deligne, Faltings, Schneider-Lang, Masser-Wüstholz, have all had important consequences.
My goal is to investigate different lines of attack towards these conjectures, building on recent work of myself and Jean-Benoît Bost on related problems. The two main directions of the proposal are as follows. Over finite fields, the Tate conjecture is related to finiteness results for certain cohomological objects. I want to understand how to relate these to hidden boundedness properties of algebraic varieties that have appeared in my recent geometric proof of the Tate conjecture for K3 surfaces. The existence and relevance of a theory of Donaldson invariants for moduli spaces of twisted sheaves over finite fields seems to be a promising and novel direction. Over number fields, I want to combine the geometric insight above with algebraization techniques developed by Bost. In a joint project, we want to investigate how these can be used to first understand geometrically major results in transcendence theory and then attack the Grothendieck period conjecture for divisors via a number-theoretic and complex-analytic understanding of universal vector extensions of abelian schemes over curves.
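For orientation, one common formulation of the Tate conjecture for divisors (the case at stake here) is recalled below; X is a smooth projective variety over a field k finitely generated over its prime field, ℓ is a prime invertible in k, and G_k = Gal(k̄/k):
\[
c_{1} \colon \mathrm{Pic}(X) \otimes_{\mathbb{Z}} \mathbb{Q}_{\ell} \;\longrightarrow\; H^{2}_{\text{ét}}\bigl(X_{\bar{k}},\, \mathbb{Q}_{\ell}(1)\bigr)^{G_{k}} \quad \text{is surjective,}
\]
i.e. every Galois-invariant class in degree-2 étale cohomology (with a Tate twist) should come from a line bundle.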
Max ERC Funding
1 222 329 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym ALKAGE
Project Algebraic and Kähler geometry
Researcher (PI) Jean-Pierre, Raymond, Philippe Demailly
Host Institution (HI) UNIVERSITE GRENOBLE ALPES
Call Details Advanced Grant (AdG), PE1, ERC-2014-ADG
Summary The purpose of this project is to study basic questions in algebraic and Kähler geometry. It is well known that the structure of projective or Kähler manifolds is governed by positivity or negativity properties of the curvature tensor. However, many fundamental problems are still wide open. Since the mid-1980s, I have developed a large number of key concepts and results that have led to important progress in transcendental algebraic geometry. Let me mention the discovery of holomorphic Morse inequalities, systematic applications of L² estimates with singular Hermitian metrics, and a much improved understanding of Monge-Ampère equations and of singularities of plurisubharmonic functions. My first goal will be to investigate the Green-Griffiths-Lang conjecture asserting that an entire curve drawn in a variety of general type is algebraically degenerate. The subject is intimately related to important questions concerning Diophantine equations, especially higher dimensional generalizations of Faltings' theorem - the so-called Vojta program. One can rely here on a breakthrough I made in 2010, showing that all such entire curves must satisfy algebraic differential equations. A second closely related area of research of this project is the analysis of the structure of projective or compact Kähler manifolds. It can be seen as a generalization of the classification theory of surfaces by Kodaira, and of the more recent results for dimension 3 (Kawamata, Kollár, Mori, Shokurov, ...) to other dimensions. My plan is to combine powerful recent results obtained on the duality of positive cohomology cones with an analysis of the instability of the tangent bundle, i.e. of the Harder-Narasimhan filtration. On these ground-breaking questions, I intend to go much further and to enhance my national and international collaborations. These subjects already attract many young researchers and postdocs throughout the world, and the grant could be used to create even stronger interactions.
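For the reader's convenience, the Green-Griffiths-Lang conjecture referred to above can be stated as follows (a standard formulation, recalled here as background):
\[
X \ \text{complex projective, of general type} \;\Longrightarrow\; \text{every non-constant entire curve } f \colon \mathbb{C} \to X \ \text{has image contained in a proper algebraic subvariety } Y \subsetneq X.
\]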
Max ERC Funding
1 809 345 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym ALLEGRO
Project Active large-scale learning for visual recognition
Researcher (PI) Cordelia Schmid
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE6, ERC-2012-ADG_20120216
Summary A massive and ever-growing amount of digital image and video content is available today, on sites such as Flickr and YouTube, in audiovisual archives such as those of BBC and INA, and in personal collections. In most cases, it comes with additional information, such as text, audio or other metadata, that forms a rather sparse and noisy, yet rich and diverse source of annotation, ideally suited to emerging weakly supervised and active machine learning technology. The ALLEGRO project will take visual recognition to the next level by using this largely untapped source of data to automatically learn visual models. The main research objective of our project is the development of new algorithms and computer software capable of autonomously exploring evolving data collections, selecting the relevant information, and determining the visual models most appropriate for different object, scene, and activity categories. An emphasis will be put on learning visual models from video, a particularly rich source of information, and on the representation of human activities, one of today's most challenging problems in computer vision. Although this project addresses fundamental research issues, it is expected to result in significant advances in high-impact applications that range from visual mining of the Web and automated annotation and organization of family photo and video albums to large-scale information retrieval in television archives.
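To illustrate the kind of weak supervision targeted here, the following is a minimal, self-contained sketch on purely synthetic toy data (not the project's models or datasets): only video-level tags are observed, and a frame-level linear scorer is trained with a max-pooling, multiple-instance logistic loss.

```python
# Minimal multiple-instance sketch: videos carry only a weak video-level tag
# ("the concept appears somewhere"), yet a frame-level scorer is learned.
import numpy as np

rng = np.random.default_rng(0)
D, n_vid, n_frm = 16, 300, 10
direction = rng.normal(size=D)
direction /= np.linalg.norm(direction)

frames = rng.normal(size=(n_vid, n_frm, D))            # synthetic frame features
relevant = rng.random((n_vid, n_frm)) < 0.12           # hidden frame-level truth
frames += 3.0 * relevant[..., None] * direction        # shift relevant frames
labels = relevant.any(axis=1).astype(float)            # observed weak tag

w, b, lr = np.zeros(D), 0.0, 0.05
for _ in range(1000):
    scores = frames @ w + b                            # (n_vid, n_frm)
    idx = scores.argmax(axis=1)                        # most confident frame
    s = scores[np.arange(n_vid), idx]                  # bag score = max over frames
    p = 1.0 / (1.0 + np.exp(-np.clip(s, -30, 30)))     # sigmoid
    err = p - labels                                   # BCE gradient w.r.t. s
    w -= lr * (err[:, None] * frames[np.arange(n_vid), idx]).mean(axis=0)
    b -= lr * err.mean()

acc = (((frames @ w + b).max(axis=1) > 0) == (labels > 0.5)).mean()
print(f"video-level training accuracy: {acc:.2f}")
```

The max-pooling step is what lets a video-level tag supervise frame-level scores; the project's actual models operate on far richer representations and noisier metadata.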
Max ERC Funding
2 493 322 €
Duration
Start date: 2013-04-01, End date: 2019-03-31
Project acronym AlmaCrypt
Project Algorithmic and Mathematical Cryptology
Researcher (PI) Antoine Joux
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE6, ERC-2014-ADG
Summary Cryptology is a foundation of information security in the digital world. Today's internet is protected by a form of cryptography based on complexity-theoretic hardness assumptions. Ideally, they should be strong to ensure security and versatile to offer a wide range of functionalities and allow efficient implementations. However, these assumptions are largely untested and internet security could be built on sand.
The main ambition of Almacrypt is to remedy this issue by challenging the assumptions through an advanced algorithmic analysis.
In particular, this proposal questions the two pillars of public-key encryption: factoring and discrete logarithms. Recently, the PI contributed to showing that in some cases the discrete logarithm problem is considerably weaker than previously assumed. A main objective is to assess the security of other cases of the discrete logarithm problem, including elliptic curves, and of factoring. We will study the generalization of the recent techniques and search for new algorithmic options with comparable or better efficiency.
We will also study hardness assumptions based on codes and subset-sum, two candidates for post-quantum cryptography. We will consider the applicability of recent algorithmic and mathematical techniques to the resolution of the corresponding putative hard problems, refine the analysis of the algorithms and design new algorithmic tools.
Cryptology is not limited to the above assumptions: other hard problems have been proposed to aim at post-quantum security and/or to offer extra functionalities. Should the security of these other assumptions become critical, they would be added to Almacrypt's scope. They could also serve to demonstrate other applications of our algorithmic progress.
In addition to its scientific goal, Almacrypt also aims at seeding a strengthened research community dedicated to algorithmic and mathematical cryptology.
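As background on what such a hardness assumption means operationally, the sketch below (a generic textbook method, not one of the advanced algorithms this project targets) shows the baby-step giant-step attack, which computes a discrete logarithm in a group of order n in roughly sqrt(n) operations and hence merely forces larger key sizes rather than breaking the assumption.

```python
# Baby-step giant-step: solves g^x = h (mod p), p prime, in ~sqrt(p) time and space.
from math import isqrt

def bsgs(g, h, p):
    m = isqrt(p - 1) + 1
    baby = {pow(g, j, p): j for j in range(m)}      # baby steps g^j, j = 0..m-1
    step = pow(g, -m, p)                            # g^(-m) mod p (Python >= 3.8)
    gamma = h % p
    for i in range(m):
        if gamma in baby:                           # h * g^(-i*m) equals some g^j
            return i * m + baby[gamma]
        gamma = (gamma * step) % p
    return None                                     # h not in the subgroup <g>

assert bsgs(2, pow(2, 37, 101), 101) == 37          # toy example
```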
Max ERC Funding
2 403 125 €
Duration
Start date: 2016-01-01, End date: 2021-12-31
Project acronym ALOGLADIS
Project From Anderson localization to Bose, Fermi and spin glasses in disordered ultracold gases
Researcher (PI) Laurent Sanchez-Palencia
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2010-StG_20091028
Summary The field of disordered quantum gases is developing rapidly. Dramatic progress has been achieved recently, and the first experimental observation of one-dimensional Anderson localization (AL) of matter waves has been reported using Bose-Einstein condensates in controlled disorder (in our group at Institut d'Optique and at LENS; Nature, 2008). This success results from joint theoretical and experimental efforts to which we have contributed. Most importantly, it opens unprecedented routes to pursue several outstanding challenges in the multidisciplinary field of disordered systems, which, after fifty years of Anderson localization, is more active than ever.
This theoretical project aims at further developing the emerging field of disordered quantum gases towards novel challenges. Our aim is twofold. First, we will propose and analyze schemes where experiments on ultracold atoms can address unsolved issues: AL in dimensions higher than one, effects of inter-atomic interactions on AL, strongly-correlated disordered gases and quantum simulators for spin systems (spin glasses). Second, by taking into account specific features of ultracold atoms, beyond standard toy models, we will raise and study new questions which have not been addressed before (e.g., long-range correlations of speckle potentials, finite-size effects, controlled interactions). Both aspects would open new frontiers for disordered quantum gases and offer new possibilities to shed light on highly debated issues.
Our main concerns are thus to (i) study situations relevant to experiments, (ii) develop new approaches applicable to ultracold atoms, (iii) identify key observables, and (iv) propose new challenging experiments. In this project, we will benefit from the original situation of our theory team: it is independent but forms part of a larger group (led by A. Aspect) which is a world leader in experiments on disordered quantum gases and with which we have already developed a close collaborative relationship.
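As a point of reference, the paradigmatic lattice model behind the Anderson localization physics invoked above is the tight-binding Hamiltonian with random on-site energies (the project's continuous speckle potentials and interacting gases go beyond this toy model):
\[
\hat{H} \;=\; -J \sum_{\langle i,j \rangle} \bigl( \hat{a}^{\dagger}_{i}\hat{a}_{j} + \mathrm{h.c.} \bigr) \;+\; \sum_{i} \epsilon_{i}\, \hat{a}^{\dagger}_{i}\hat{a}_{i}, \qquad \epsilon_{i} \ \text{random, e.g. uniform in } [-W/2,\, W/2],
\]
for which all single-particle eigenstates are exponentially localized in one dimension for arbitrarily weak disorder, while a mobility edge appears in three dimensions.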
Max ERC Funding
985 200 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym aLzINK
Project Alzheimer's disease and Zinc: the missing link ?
Researcher (PI) Christelle Sandrine Florence HUREAU-SABATER
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary Alzheimer's disease (AD) is one of the most serious diseases mankind is now facing, as its social and economic impacts are increasing rapidly. AD is very complex, and the amyloid-β (Aβ) peptide as well as metallic ions (mainly copper and zinc) have been linked to its aetiology. While the deleterious impact of Cu is widely acknowledged, the involvement of Zn is certain but its role still needs to be elucidated.
The main objective of the present proposal, which is strongly anchored in the bio-inorganic chemistry field at the interface with spectroscopy and biochemistry, is to design, synthesize and study new drug candidates (ligands L) capable of (i) targeting Cu(II) bound to Aβ within the synaptic cleft, where Zn is co-localized, ultimately to achieve Zn-driven Cu(II) removal from Aβ, and (ii) disrupting the aberrant Cu(II)-Aβ interactions involved in ROS production and Aβ aggregation, two deleterious events in AD. The drug candidates will thus have high Cu(II) over Zn selectivity to preserve the crucial physiological role of Zn in the neurotransmission process. Zn is always underestimated (if not completely neglected) in current therapeutic approaches targeting Cu(II), despite the known interference of Zn with Cu(II) binding.
To reach this objective, it is absolutely necessary to first understand metal-ion trafficking in the presence of Aβ alone at the molecular level (i.e. without the drug candidates). This includes: (i) determination of the Zn binding site on Aβ and its impact on Aβ aggregation and cell toxicity, and (ii) determination of the mutual influence of Zn and Cu on their coordination to Aβ and its impact on Aβ aggregation, ROS production and cell toxicity.
The methods used will span from organic synthesis to studies of neuronal model cells, with a major contribution from a wide panel of spectroscopic techniques including NMR, EPR, mass spectrometry, fluorescence, UV-Vis, circular dichroism, X-ray absorption spectroscopy...
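Schematically (an illustrative way to phrase the design target, not a project result), the Zn-driven Cu(II) removal sought here corresponds to the metal-swap equilibrium
\[
\mathrm{Cu^{II}}\text{-}\mathrm{A\beta} \;+\; \mathrm{Zn^{II}} \;+\; \mathrm{L} \;\rightleftharpoons\; \mathrm{Cu^{II}}\text{-}\mathrm{L} \;+\; \mathrm{Zn^{II}}\text{-}\mathrm{A\beta},
\]
which is pulled to the right when L binds Cu(II) more strongly than Aβ does while leaving Zn(II) on Aβ, i.e. when L shows the high Cu(II)/Zn(II) selectivity stated above.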
Max ERC Funding
1 499 948 €
Duration
Start date: 2015-03-01, End date: 2020-02-29
Project acronym AMPERE
Project Accounting for Metallicity, Polarization of the Electrolyte, and Redox reactions in computational Electrochemistry
Researcher (PI) Mathieu Eric Salanne
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Consolidator Grant (CoG), PE4, ERC-2017-COG
Summary Applied electrochemistry plays a key role in many technologies, such as batteries, fuel cells, supercapacitors or solar cells. It is therefore at the core of many research programs all over the world. Yet, fundamental electrochemical investigations remain scarce. In particular, electrochemistry is among the fields for which the gap between theory and experiment is the largest. From the computational point of view, there is no molecular dynamics (MD) software devoted to the simulation of electrochemical systems, while other fields such as biochemistry (GROMACS) or materials science (LAMMPS) have dedicated tools. This is due to the difficulty of accounting for complex effects arising from (i) the degree of metallicity of the electrode (i.e. from semimetals to perfect conductors), (ii) the mutual polarization occurring at the electrode/electrolyte interface and (iii) the redox reactivity through explicit electron transfers. Current understanding therefore relies on standard theories that derive from an inaccurate molecular-scale picture. My objective is to fill this gap by introducing a whole set of new methods for simulating electrochemical systems. They will be provided to the computational electrochemistry community as a cutting-edge MD software adapted to supercomputers. First applications will aim at the discovery of new electrolytes for energy storage. Here I will focus on (1) “water-in-salts”, to understand why these revolutionary liquids enable much higher voltages than conventional solutions, and (2) redox reactions inside a nanoporous electrode, to support the development of future capacitive energy storage devices. These selected applications are timely and rely on collaborations with leading experimental partners. The results are expected to shed unprecedented light on the importance of polarization effects on the structure and the reactivity of electrode/electrolyte interfaces, establishing MD as a prominent tool for solving complex electrochemistry problems.
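As a point of reference, one widely used way of accounting for electrode metallicity and mutual polarization in MD (recalled here as background; the project's methods go beyond it) treats the electrode-atom charges q_i as dynamical variables fixed at each time step by a constant-potential condition:
\[
\frac{\partial U\!\left(\{q_i\},\{\mathbf{r}_j\}\right)}{\partial q_i} \;=\; \Psi_i \qquad \text{for every electrode atom } i,
\]
where U is the total electrostatic energy of electrode charges and electrolyte, and Ψ_i is the potential applied to the electrode containing atom i; solving these equations self-consistently lets the electrode charge distribution respond to the instantaneous configuration of the electrolyte.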
Max ERC Funding
1 588 769 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym ANADEL
Project Analysis of Geometrical Effects on Dispersive Equations
Researcher (PI) Danela Oana IVANOVICI
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2017-STG
Summary We are concerned with localization properties of solutions to hyperbolic PDEs, especially problems with a geometric component: how do boundaries and heterogeneous media influence the spreading and concentration of solutions? While our first focus is on wave and Schrödinger equations on manifolds with boundary, strong connections exist with phase-space localization for (clusters of) eigenfunctions, which are of independent interest. Motivations come from nonlinear dispersive models (in physically relevant settings), properties of eigenfunctions in quantum chaos (related both to the physics of optical fiber design and to number-theoretic questions), and harmonic analysis on manifolds.
Wave propagation in real-life physics occurs in media which are neither homogeneous nor spatially infinite. The birth of radar/sonar technologies (and the rise of computed tomography) greatly motivated numerous developments in microlocal analysis and the linear theory. Only recently have toy nonlinear models been studied on a curved background, sometimes compact or rough. Understanding how to extend such tools, dealing with wave dispersion or focusing, will allow us to significantly progress in our mathematical understanding of physically relevant models. There, boundaries appear naturally, and most earlier developments related to propagation of singularities in this context have limited scope with respect to crucial dispersive effects. Despite great progress over the last decade, driven by the study of quasilinear equations, our knowledge is still very limited. Going beyond this recent activity requires new tools whose development is at the heart of this proposal, including good approximate solutions (parametrices) going over arbitrarily large numbers of caustics, sharp pointwise bounds on Green functions, the development of efficient wave-packet methods, and quantitative refinements of propagation of singularities (with direct applications in control theory), to name only a few important ones.
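As a benchmark for what dispersion means quantitatively, recall the classical estimate for the free Schrödinger flow on Euclidean space (the flat, boundaryless model case that the geometric settings of this project perturb):
\[
\bigl\| e^{it\Delta} u_{0} \bigr\|_{L^{\infty}(\mathbb{R}^{d})} \;\le\; C\, |t|^{-d/2}\, \bigl\| u_{0} \bigr\|_{L^{1}(\mathbb{R}^{d})}, \qquad t \neq 0.
\]
Boundaries, caustics and variable geometry typically degrade or modify such decay rates, and quantifying this loss is one way to phrase the project's central questions.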
Max ERC Funding
1 293 763 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym analysisdirac
Project The analysis of the Dirac operator: the hypoelliptic Laplacian and its applications
Researcher (PI) Jean-Michel Philippe Marie-José Bismut
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Advanced Grant (AdG), PE1, ERC-2011-ADG_20110209
Summary This proposal is devoted to the applications of a new hypoelliptic Dirac operator,
whose analytic properties have been studied by Lebeau and myself. Its construction connects classical Hodge theory with the geodesic flow, and more generally any geometrically defined Hodge Laplacian with a dynamical system on the cotangent bundle. The proper description of this object can be given in analytic, index-theoretic and probabilistic terms, which explains both its many potential applications and its complexity.
Max ERC Funding
1 112 400 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ANAMORPHISM
Project Asymptotic and Numerical Analysis of MOdels of Resonant Physics Involving Structured Materials
Researcher (PI) Sebastien Roger Louis Guenneau
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary One already available method to expand the range of material properties is to adjust the composition of materials at the molecular level using chemistry. We would like to develop the alternative approach of homogenization, which broadens the definition of a material to include artificially structured media (fluids and solids) in which the effective electromagnetic, hydrodynamic or elastic responses result from a macroscopic patterning or arrangement of two or more distinct materials. This project will explore the latter avenue in order to markedly enhance control of surface water waves and elastodynamic waves propagating within artificially structured fluids and solid materials, hereafter called acoustic metamaterials.
Pendry's perfect lens, the paradigm of electromagnetic metamaterials, is a slab of negative refractive index material that takes rays of light and causes them to converge with unprecedented resolution. This flat lens is a combination of periodically arranged resonant electric and magnetic elements. We will draw systematic analogies with resonant mechanical systems in order to achieve similar control of hydrodynamic and elastic waves. This will allow us to extend the design of metamaterials to acoustics to go beyond the scope of Snell-Descartes' laws of optics and Newton's laws of mechanics.
Acoustic metamaterials allow the construction of invisibility cloaks for non-linear surface water waves (e.g. tsunamis) propagating in structured fluids, as well as seismic waves propagating in thin structured elastic plates.
Maritime and civil engineering applications include the protection of harbours and off-shore platforms, as well as anti-earthquake passive systems. Acoustic cloaks for enhanced control of pressure waves in fluids will also be designed for underwater camouflaging.
The interplay of light and sound will finally be analysed in order to design controllable metamaterials, with a special emphasis on undetectable microstructured fibres (acoustic wormholes).
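As background for the lens analogy above, recall that in a homogenized medium whose effective permittivity and permeability are both negative, the refractive index takes the negative branch of the square root, so Snell's law bends rays to the "wrong" side of the normal:
\[
n_{\mathrm{eff}} \;=\; -\sqrt{\varepsilon_{\mathrm{eff}}\,\mu_{\mathrm{eff}}} \quad (\varepsilon_{\mathrm{eff}} < 0,\ \mu_{\mathrm{eff}} < 0), \qquad n_{1}\sin\theta_{1} \;=\; n_{2}\sin\theta_{2}.
\]
The programme here is to engineer analogous effective parameters (mass density, bulk modulus, flexural rigidity) for water and elastic waves.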
Max ERC Funding
1 280 391 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym ANDLICA
Project Anderson Localization of Light by Cold Atoms
Researcher (PI) Robin KAISER
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE2, ERC-2018-ADG
Summary I propose to use large clouds of cold Ytterbium atoms to observe Anderson localization of light in three dimensions, which has challenged theoreticians and experimentalists for many decades.
After Anderson's prediction of a disorder-induced conductor-to-insulator transition for electrons, light was proposed as an ideal non-interacting wave for exploring coherent transport properties in the absence of interactions. Developments in experiment and theory over the past several years have shown a route towards the experimental realization of this phase transition.
Previous studies on Anderson localization of light using semiconductor powders or dielectric particles have shown that intrinsic material properties, such as absorption or inelastic scattering of light, need to be taken into account in the interpretation of experimental signatures of Anderson localization. Laser-cooled clouds of atoms avoid the problems of samples used so far to study Anderson localization of light. Ab initio theoretical models, available for cold Ytterbium atoms, have shown that the mere high spatial density of the scattering sample is not sufficient to allow for Anderson localization of photons in three dimensions, but that an additional magnetic field or additional disorder on the level shifts can induce a phase transition in three dimensions.
The role of disorder in atom-light interactions has important consequences for the next generation of high precision atomic clocks and quantum memories. By connecting the mesoscopic physics approach to quantum optics and cooperative scattering, this project will allow better control of cold atoms as building blocks of future quantum technologies. Time-resolved transport experiments will connect super- and subradiant assisted transmission with the extended and localized eigenstates of the system.
Having pioneered studies on weak localization and cooperative scattering enables me to diagnose strong localization of light by cold atoms.
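A standard benchmark for the onset of Anderson localization of light, recalled here as background rather than a project result, is the Ioffe-Regel criterion relating the light wave vector k to the transport mean free path ℓ*:
\[
k\,\ell^{*} \;\lesssim\; 1, \qquad k = 2\pi/\lambda.
\]
For resonant point scatterers this requires atomic densities of order k³, which is why high spatial density is the natural first target and why the additional control knobs mentioned above (magnetic fields, disorder on the level shifts) become decisive once density alone proves insufficient.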
Max ERC Funding
2 490 717 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym ANISOTROPIC UNIVERSE
Project The anisotropic universe -- a reality or fluke?
Researcher (PI) Hans Kristian Kamfjord Eriksen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE9, ERC-2010-StG_20091028
Summary "During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few ""blemishes"" have turned up, apparently not consistent with the standard model: While the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies considering nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints towards unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of the reports can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is currently about to change, as the next generation of new and far more powerful experiments are coming online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys, and in the future Euclid, a proposed galaxy survey satellite also funded by ESA. By combining the world s best data from both CMB and LSS measurements, I will in the proposed project attempt to settle this question: Is our universe really anisotropic? Or are these recent claims only the results of systematic errors or statistical flukes? If the claims turn out to hold against this tide of new and high-quality data, then cosmology as a whole may need to be re-written."
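For readers outside the field, the statistical statement being tested can be phrased via the standard spherical-harmonic decomposition of the CMB temperature field:
\[
\frac{\Delta T}{T}(\hat{n}) = \sum_{\ell, m} a_{\ell m}\, Y_{\ell m}(\hat{n}), \qquad \langle a_{\ell m}\, a^{*}_{\ell' m'} \rangle = C_{\ell}\, \delta_{\ell \ell'}\, \delta_{m m'} \ \ \text{(statistical isotropy)},
\]
so any significant coupling between different (ℓ, m) modes, or a preferred direction in the measured coefficients, signals a departure from the isotropic concordance model.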
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym ANT
Project Automata in Number Theory
Researcher (PI) Boris Adamczewski
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE1, ERC-2014-CoG
Summary Finite automata are fundamental objects in Computer Science, of great importance on one hand for theoretical aspects (formal language theory, decidability, complexity) and on the other for practical applications (parsing). In number theory, finite automata are mainly used as simple devices for generating sequences of symbols over a finite set (e.g., digital representations of real numbers), and for recognizing some sets of integers or more generally of finitely generated abelian groups or monoids. One of the main features of these automatic structures comes from the fact that they are highly ordered without necessarily being trivial (i.e., periodic). With their rich fractal nature, they lie somewhere between order and chaos, even if, in most respects, their rigidity prevails. Over the last few years, several ground-breaking results have led to a greatly renewed interest in the study of automatic structures in arithmetics.
A primary objective of the ANT project is to exploit this opportunity by developing new directions and interactions between automata and number theory. In this proposal, we outline three lines of research concerning fundamental number theoretical problems that have baffled mathematicians for decades. They include the study of integer base expansions of classical constants, of arithmetical linear differential equations and their link with enumerative combinatorics, and of arithmetics in positive characteristic. At first glance, these topics may seem unrelated, but, surprisingly enough, the theory of finite automata will serve as a natural guideline. We stress that this new point of view on classical questions is a key part of our methodology: we aim at creating a powerful synergy between the different approaches we propose to develop, placing automata theory and related methods at the heart of the subject. This project provides a unique opportunity to create the first international team focusing on these different problems as a whole.
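As a concrete, classical illustration of the automatic sequences discussed above (background only, not a project result), the Thue-Morse sequence is generated by a two-state automaton that reads the base-2 digits of n.

```python
# Thue-Morse is 2-automatic: a 2-state automaton reading the binary digits of n
# outputs t(n) = (number of 1-bits of n) mod 2.
def thue_morse(n):
    state = 0                      # state = parity of the 1-digits read so far
    for bit in bin(n)[2:]:         # feed the base-2 digits of n
        if bit == '1':
            state ^= 1             # the transition on digit 1 flips the state
    return state                   # output map: identity on {0, 1}

print([thue_morse(n) for n in range(16)])
# -> [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
```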
Max ERC Funding
1 438 745 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym ANTICS
Project Algorithmic Number Theory in Computer Science
Researcher (PI) Andreas Enge
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "During the past twenty years, we have witnessed profound technological changes, summarised under the terms of digital revolution or entering the information age. It is evident that these technological changes will have a deep societal impact, and questions of privacy and security are primordial to ensure the survival of a free and open society.
Cryptology is a main building block of any security solution, and at the heart of projects such as electronic identity and health cards, access control, digital content distribution or electronic voting, to mention only a few important applications. During the past decades, public-key cryptology has established itself as a research topic in computer science; tools of theoretical computer science are employed to “prove” the security of cryptographic primitives such as encryption or digital signatures and of more complex protocols. It is often forgotten, however, that all practically relevant public-key cryptosystems are rooted in pure mathematics, in particular, number theory and arithmetic geometry. In fact, the socalled security “proofs” are all conditional to the algorithmic untractability of certain number theoretic problems, such as factorisation of large integers or discrete logarithms in algebraic curves. Unfortunately, there is a large cultural gap between computer scientists using a black-box security reduction to a supposedly hard problem in algorithmic number theory and number theorists, who are often interested in solving small and easy instances of the same problem. The theoretical grounds on which current algorithmic number theory operates are actually rather shaky, and cryptologists are generally unaware of this fact.
The central goal of ANTICS is to rebuild algorithmic number theory on the firm grounds of theoretical computer science."
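To make the "shaky theoretical grounds" concrete: even a textbook factoring method such as Pollard's rho (sketched below purely as background) is only heuristically analysed; its expected O(n^(1/4)) running time rests on unproven pseudo-randomness assumptions about the iteration x ↦ x² + c mod n.

```python
# Pollard's rho factoring sketch (Floyd cycle detection); assumes n is composite.
from math import gcd
import random

def pollard_rho(n):
    if n % 2 == 0:
        return 2
    while True:
        c = random.randrange(1, n)
        f = lambda x: (x * x + c) % n
        x, y, d = 2, 2, 1
        while d == 1:
            x = f(x)                     # tortoise: one step
            y = f(f(y))                  # hare: two steps
            d = gcd(abs(x - y), n)
        if d != n:                       # d is a non-trivial factor of n
            return d                     # otherwise retry with a new constant c

print(pollard_rho(101 * 103))            # prints 101 or 103
```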
Max ERC Funding
1 453 507 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym APOGEE
Project Atomic-scale physics of single-photon sources.
Researcher (PI) GUILLAUME ARTHUR FRANCOIS SCHULL
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2017-COG
Summary Single-photon sources (SPSs) are systems capable of emitting photons one by one. These sources are of major importance for quantum-information science and applications. SPS experiments generally rely on the optical excitation of two-level systems of atomic-scale dimensions (single molecules, vacancies in diamond…). Many fundamental questions related to the nature of these sources and the impact of their environment remain to be explored:
Can SPSs be addressed with atomic-scale spatial accuracy? How do the nanometer-scale distance or the orientation between two (or more) SPSs affect their emission properties? Does coherence emerge from the proximity between the sources? Do these structures still behave as SPSs, or do they lead to the emission of correlated photons? How can we then control the degree of entanglement between the sources? Can we remotely excite the emission of these sources by using molecular chains as charge-carrying wires? Can we couple SPSs embedded in one- or two-dimensional arrays? How does mechanical stress or localised plasmons affect the properties of an electrically driven SPS?
Answering these questions requires probing, manipulating and exciting SPSs with atomic-scale precision, which is beyond what is attainable with all-optical methods. Since electrons can be confined to atomic-scale pathways, we propose to use them rather than photons to excite the SPSs. This unconventional approach provides direct access to the atomic-scale physics of SPSs and is relevant for the implementation of these sources in hybrid devices combining electronic and photonic components. To this end, a scanning probe microscope will be developed that provides simultaneous spatial, chemical, spectral and temporal resolution. Single molecules and defects in monolayer transition metal dichalcogenides are the SPSs that will be studied in the project; they are of interest for fundamental and more applied issues, respectively.
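For reference, the standard figure of merit for single-photon emission (a textbook definition, not specific to this project) is the normalized second-order intensity correlation function:

```latex
g^{(2)}(\tau) \;=\; \frac{\langle I(t)\, I(t+\tau) \rangle}{\langle I(t) \rangle^{2}},
\qquad
g^{(2)}(0) = 0 \ \text{(ideal SPS)},
\qquad
g^{(2)}(0) < \tfrac{1}{2} \ \text{(dominant single emitter)}.
```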
Max ERC Funding
1 996 848 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym APPL
Project Anionic PhosPhoLipids in plant receptor kinase signaling
Researcher (PI) Yvon Jaillais
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS3, ERC-2013-StG
Summary "In plants, receptor kinases form the largest family of plasma membrane (PM) receptors and they are involved in virtually all aspects of the plant life, including development, immunity and reproduction. In animals, key molecules that orchestrate the recruitment of signaling proteins to membranes are anionic phospholipids (e.g. phosphatidylinositol phosphate or PIPs). Besides, recent reports in animal and yeast cells suggest the existence of PM nanodomains that are independent of cholesterol and lipid phase and rely on anionic phospholipids as well as electrostatic protein/lipid interactions. Strikingly, we know very little on the role of anionic phospholipids in plant signaling. However, our preliminary data suggest that BKI1, an inhibitory protein of the steroid receptor kinase BRI1, interacts with various PIPs in vitro and is likely targeted to the PM by electrostatic interactions with these anionic lipids. These results open the possibility that BRI1, but also other receptor kinases, might be regulated by anionic phospholipids in plants. Here, we propose to analyze the function of anionic phospholipids in BRI1 signaling, using the root epidermis as a model system. First, we will ask what are the lipids that control membrane surface charge in this tissue and recruit BR-signaling component to the PM. Second, we will probe the presence of PIP-enriched nanodomains at the plant PM using super-resolution microscopy techniques and investigate the roles of these domains in BRI1 signaling. Finally, we will analyze the function of the BKI1-related plant-specific family of anionic phospholipid effectors in plant development. In summary, using a transversal approach ranging from in vitro studies to in vivo validation and whole organism physiology, this work will unravel the interplay between anionic phospholipids and receptor signaling in plants."
Max ERC Funding
1 797 840 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym AQUARAMAN
Project Pipet Based Scanning Probe Microscopy Tip-Enhanced Raman Spectroscopy: A Novel Approach for TERS in Liquids
Researcher (PI) Aleix Garcia Guell
Host Institution (HI) ECOLE POLYTECHNIQUE
Call Details Starting Grant (StG), PE4, ERC-2016-STG
Summary Tip-enhanced Raman spectroscopy (TERS) is often described as the most powerful tool for optical characterization of surfaces and their proximities. It combines the intrinsic spatial resolution of scanning probe techniques (AFM or STM) with the chemical information content of vibrational Raman spectroscopy. Capable of revealing surface heterogeneity at the nanoscale, TERS is currently playing a fundamental role in the understanding of interfacial physicochemical processes in key areas of science and technology such as chemistry, biology and materials science.
Unfortunately, the undeniable potential of TERS as a label-free tool for nanoscale chemical and structural characterization is currently limited to air and vacuum environments, as it fails to operate in a reliable and systematic manner in liquid. The reasons are more technical than fundamental: the application of TERS in water is hindered by, among other issues, the low stability and poor consistency of the probes. Fields of science and technology where the presence of water/electrolyte is unavoidable, such as biology and electrochemistry, remain unexplored with this powerful technique.
We propose a revolutionary approach to TERS in liquids founded on pipet-based scanning probe microscopy techniques (pb-SPM) as an alternative to AFM and STM. The use of recent but well-established pb-SPM brings the opportunity to develop unprecedented pipet-based TERS probes (beyond the classic and limited metallized solid probes from AFM and STM), together with the implementation of ingenious and innovative measures to enhance tip stability, sensitivity and reliability, unattainable with current techniques.
We will thus possess a unique nano-spectroscopy platform capable of experiments in liquids, able to follow dynamic processes in situ and to address fundamental questions, bringing insight into interfacial phenomena spanning materials science, physics, chemistry and biology.
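As background (a commonly quoted approximation in the SERS/TERS literature, not a claim taken from the proposal), the Raman enhancement at the tip apex scales roughly as the fourth power of the local field enhancement:

```latex
\mathrm{EF} \;\approx\;
\left|\frac{E_{\mathrm{loc}}(\omega_{\mathrm{exc}})}{E_{0}}\right|^{2}
\left|\frac{E_{\mathrm{loc}}(\omega_{\mathrm{R}})}{E_{0}}\right|^{2}
\;\approx\;
\left|\frac{E_{\mathrm{loc}}}{E_{0}}\right|^{4},
```

where E_loc is the plasmonically enhanced field at the apex and E_0 the incident field.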
Max ERC Funding
1 528 442 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym ARCHEIS
Project Understanding the onset and impact of Aquatic Resource Consumption in Human Evolution using novel Isotopic tracerS
Researcher (PI) Klervia Marie Madalen JAOUEN
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE10, ERC-2018-STG
Summary The onset of the systematic consumption of marine resources is thought to mark a turning point for the hominin lineage. To date, this onset cannot be traced, since classic isotope markers are not preserved beyond 50–100 ky. Aquatic food products are essential in human nutrition as the main source of polyunsaturated fatty acids in hunter-gatherer diets. The exploitation of marine resources is also thought to have reduced human mobility and enhanced social and technological complexification. Systematic aquatic food consumption could well have been a distinctive feature of Homo sapiens among its fellow hominins, and has been linked to the astonishing leap in human intelligence and consciousness. Yet, this hypothesis is challenged by the existence of mollusk and marine mammal bone remains at Neanderthal archeological sites. Recent work demonstrated the sensitivity of the Zn isotope composition of bioapatite, the mineral part of bones and teeth, to dietary Zn. By combining classic (C and C/N isotope analyses) and innovative techniques (compound-specific C/N and bulk Zn isotope analyses), I will develop a suite of sensitive tracers for shellfish, fish and marine mammal consumption. Shellfish consumption will be investigated by comparing various South American and European prehistoric populations from the Atlantic coast associated with shell middens and fish mounds. Marine mammal consumption will be traced using an Inuit population of Arctic Canada and the Wairau Bar population of New Zealand. C/N/Zn isotope compositions of various aquatic products will also be assessed, as well as isotope fractionation during intestinal absorption. I will then use the fully calibrated isotope tools to detect and characterize the onset of marine food exploitation in human history, which will answer the question of its specificity to our species. Neanderthal, early modern human and possibly other hominin remains from coastal and inland sites will be compared for that purpose.
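For reference, isotope compositions of this kind are conventionally reported in delta notation, expressed in per mil (a standard geochemical convention; the isotope pair and reference standard below are generic, not necessarily those used in the project):

```latex
\delta^{66}\mathrm{Zn} \;=\;
\left(\frac{\left(^{66}\mathrm{Zn}/^{64}\mathrm{Zn}\right)_{\mathrm{sample}}}
{\left(^{66}\mathrm{Zn}/^{64}\mathrm{Zn}\right)_{\mathrm{standard}}} - 1\right) \times 1000 .
```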
Max ERC Funding
1 361 991 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym ARENA
Project Arrays of entangled atoms
Researcher (PI) Antoine Browaeys
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2009-StG
Summary The goal of this project is to prepare in a deterministic way, and then to characterize, various entangled states of up to 25 individual atoms held in an array of optical tweezers. Such a system provides a new arena to explore quantum entangled states of a large number of particles. Entanglement is the existence of quantum correlations between different parts of a system, and it is recognized as an essential property that distinguishes the quantum and the classical worlds. It is also a resource in various areas of physics, such as quantum information processing, quantum metrology, correlated quantum systems and quantum simulation. In the proposed design, each site is individually addressable, which enables single-atom manipulation and detection. This will provide the largest entangled state ever produced and fully characterized at the individual-particle level. The experiment will be implemented by combining two crucial novel features that I was able to demonstrate very recently: first, the manipulation of quantum bits written on long-lived hyperfine ground states of single ultra-cold atoms trapped in microscopic optical tweezers; second, the generation of entanglement by using the strong long-range interactions between Rydberg states. These interactions lead to the so-called dipole blockade, and enable the preparation of various classes of entangled states, such as states carrying only one excitation (W states) and states analogous to Schrödinger's cats (GHZ states). Finally, I will also explore strategies to protect these states against decoherence, developed in the framework of fault-tolerant and topological quantum computing. This project therefore combines an experimental challenge and the exploration of entanglement in a mesoscopic system.
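For reference, the three-qubit representatives of the two classes of states mentioned above take the standard textbook forms (with the obvious N-atom generalisations):

```latex
|\mathrm{W}\rangle \;=\; \tfrac{1}{\sqrt{3}}\left(|100\rangle + |010\rangle + |001\rangle\right),
\qquad
|\mathrm{GHZ}\rangle \;=\; \tfrac{1}{\sqrt{2}}\left(|000\rangle + |111\rangle\right).
```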
Max ERC Funding
1 449 600 €
Duration
Start date: 2009-12-01, End date: 2014-11-30
Project acronym ARFMEMBRANESENSORS
Project Membrane sensors in the Arf orbit
Researcher (PI) Bruno Antonny
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS3, ERC-2010-AdG_20100317
Summary Cellular organelles are continuously remodelled by numerous cytosolic proteins that associate transiently with their lipid membrane. Some distort the bilayer, others change its composition, extract lipids or bridge membranes at a distance. Previous work from my laboratory has underlined the importance of membrane sensors, i.e. elements within proteins that help to organize membrane-remodelling events by sensing the physical and chemical state of the underlying membrane. A membrane sensor is not necessarily a well-folded domain that interacts with a specific lipid polar head: some intrinsically unfolded motifs harboring deceptively simple sequences can display remarkable membrane-adhesive properties. Among these are some amphipathic helices: the ALPS motif, with a polar face made mostly of small uncharged polar residues; the Spo20 helix, with several histidines in its polar face; and, like a mirror image of the ALPS motif, the alpha-synuclein helix, with very small hydrophobic residues. Using biochemistry and molecular dynamics, we will compare the membrane-binding properties of these sequences (effect of curvature, charge, lipid unsaturation); using bioinformatics, we will look for new motifs; using cell biology, we will assess the adaptation of these motifs to the physical and chemical features of organelle membranes. Concurrently, we will use reconstitution approaches on artificial membranes to dissect how membrane sensors contribute to the organization of vesicle tethering by golgins and sterol transport by ORP proteins. We surmise that the combination of a molecular “switch”, a small G protein of the Arf family, and of membrane sensors permits the organization of these complex reactions in time and space.
Max ERC Funding
1 997 321 €
Duration
Start date: 2011-05-01, End date: 2015-04-30
Project acronym ARPEMA
Project Anionic redox processes: A transformational approach for advanced energy materials
Researcher (PI) Jean-Marie Tarascon
Host Institution (HI) COLLEGE DE FRANCE
Call Details Advanced Grant (AdG), PE5, ERC-2014-ADG
Summary Redox chemistry provides the fundamental basis for numerous energy-related electrochemical devices, among which Li-ion batteries (LIB) have become the premier energy storage technology for portable electronics and vehicle electrification. Throughout its history, LIB technology has relied on cationic redox reactions as the sole source of energy storage capacity. This is no longer true. In 2013 we demonstrated that Li-driven reversible formation of (O2)n peroxo-groups in new layered oxides led to extraordinary increases in energy storage capacity. This finding, which is receiving worldwide attention, represents a transformational approach for creating advanced energy materials for not only energy storage, but also water splitting applications as both involve peroxo species. However, as is often the case with new discoveries, the fundamental science at work needs to be rationalized and understood. Specifically, what are the mechanisms for ion and electron transport in these Li-driven anionic redox reactions?
To address these seminal questions and to widen the spectrum of materials (transition metals and anions) showing anionic redox chemistry, we propose a comprehensive research program that combines experimental and computational methods. The experimental methods include structural and electrochemical analyses (both ex situ and in situ), and computational modeling will be based on first-principles DFT to identify the fundamental processes that enable anionic redox activity. The knowledge gained from these studies, in combination with our expertise in inorganic synthesis, will enable us to design a new generation of Li-ion battery materials that exhibit substantial increases (20-30%) in energy storage capacity, with additional impacts on the development of Na-ion batteries and the design of water-splitting catalysts, with the potential to surpass current water-splitting efficiencies via novel (O2)n-based electrocatalysts.
Max ERC Funding
2 249 196 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym ARTHUS
Project Advances in Research on Theories of the Dark Universe - Inhomogeneity Effects in Relativistic Cosmology
Researcher (PI) Thomas BUCHERT
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Advanced Grant (AdG), PE9, ERC-2016-ADG
Summary The project ARTHUS aims at determining the physical origin of Dark Energy: in addition to the energy sources of the standard model of cosmology, effective terms arise through spatially averaging inhomogeneous cosmological models in General Relativity. It has been demonstrated that these additional terms can play the role of Dark Energy on large scales (but they can also mimic Dark Matter on scales of mass accumulations). The underlying rationale is that fluctuations in the Universe generically couple to spatially averaged intrinsic properties of space, such as its averaged scalar curvature, thus changing the global evolution of the effective (spatially averaged) cosmological model. At present, we understand these so-called backreaction effects only qualitatively. The project ARTHUS is directed towards a conclusive quantitative evaluation of these effects by developing generic and non-perturbative relativistic models of structure formation, by statistically measuring the key variables of the models in observations and in simulation data, and by reinterpreting observational results in light of the new models. It is to be emphasized that there is no doubt about the existence of backreaction effects; the question is whether they are even capable of getting rid of the dark sources (as some models discussed in the literature suggest), or whether their impact is substantially smaller. The project thus addresses an essential issue of current cosmological research: to find pertinent answers concerning the quantitative impact of inhomogeneity effects, a necessary, worldwide recognized step toward high-precision cosmology. If the project objectives are attained, the results will have a far-reaching impact on theoretical and observational cosmology, on the interpretation of astronomical experiments such as Planck and Euclid, as well as on a wide spectrum of particle physics theories and experiments.
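For orientation, spatial averaging of an irrotational dust universe leads to averaged evolution equations in which a kinematical backreaction term Q_D enters as an additional source; in the form usually quoted in the literature (generic notation, not the project's own):

```latex
3\,\frac{\ddot{a}_D}{a_D} \;=\; -4\pi G\,\langle \varrho \rangle_D + Q_D ,
\qquad
3\left(\frac{\dot{a}_D}{a_D}\right)^{2} \;=\; 8\pi G\,\langle \varrho \rangle_D
- \tfrac{1}{2}\langle \mathcal{R} \rangle_D - \tfrac{1}{2} Q_D ,
\qquad
Q_D \;:=\; \tfrac{2}{3}\left(\langle \theta^{2} \rangle_D - \langle \theta \rangle_D^{2}\right)
- 2\,\langle \sigma^{2} \rangle_D ,
```

where a_D is the volume scale factor of the averaging domain D, θ the local expansion rate, σ² the shear scalar, ⟨R⟩_D the averaged spatial scalar curvature, and ⟨·⟩_D denotes spatial averaging over D.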
Max ERC Funding
2 091 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ARTISTIC
Project Advanced and Reusable Theory for the In Silico-optimization of composite electrode fabrication processes for rechargeable battery Technologies with Innovative Chemistries
Researcher (PI) Alejandro Antonio FRANCO
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2017-COG
Summary The aim of this project is to develop and to demonstrate a novel theoretical framework devoted to rationalizing the formulation of composite electrodes containing next-generation material chemistries for high energy density secondary batteries. The framework will be established through the combination of discrete particle and continuum mathematical models within a multiscale computational workflow integrating the individual models and mimicking the different steps along the electrode fabrication process, including slurry preparation, drying and calendering. Strongly complemented by dedicated experimental characterizations devoted to its validation, the goal of this framework is to provide insights into the impacts of material properties and fabrication process parameters on the electrode mesostructures and their correlation to the resulting electrochemical performance. It targets self-organization mechanisms of material mixtures in slurries by considering the interactions between the active and conductive materials, solvent, binders and dispersants, and the relationship between material properties such as surface chemistry and wettability. Optimal electrode formulations and fabrication processes, and the resulting electrode mesostructures, can then be achieved. Additionally, the framework will be integrated into an online and open-access infrastructure, allowing predictive direct and reverse engineering for optimized electrode designs that attain high-quality electrochemical performance. Through the demonstration of a multidisciplinary, flexible and transferable framework, this project has tremendous potential to provide insights leading to proposals of new and highly efficient industrial techniques for the fabrication of cheaper and reliable next-generation secondary battery electrodes for a wide spectrum of applications, including Electric Transportation.
Max ERC Funding
1 976 445 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym aSCEND
Project Secure Computation on Encrypted Data
Researcher (PI) Hoe Teck Wee
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Recent trends in computing have prompted users and organizations to store an increasingly large amount of sensitive data at third-party locations in the cloud, outside of their direct control. Storing data remotely poses an acute security threat, as these data are outside our control and could potentially be accessed by untrusted parties. Indeed, the reality of these threats has been borne out by the Snowden leaks and hundreds of data breaches each year. In order to protect our data, we will need to encrypt it.
Functional encryption is a novel paradigm for public-key encryption that enables both fine-grained access control and selective computation on encrypted data, as is necessary to protect big, complex data in the cloud. Functional encryption also enables searches on encrypted travel records and surveillance video, as well as medical studies on encrypted medical records, in a privacy-preserving manner; we can give out restricted secret keys that reveal only the outcome of specific searches and tests. These mechanisms allow us to maintain public safety without compromising on civil liberties, and to facilitate medical breakthroughs without compromising on individual privacy.
The goals of the aSCEND project are (i) to design pairing and lattice-based functional encryption that are more efficient and ultimately viable in practice; and (ii) to obtain a richer understanding of expressive functional encryption schemes and to push the boundaries from encrypting data to encrypting software. My long-term vision is the ubiquitous use of functional encryption to secure our data and our computation, just as public-key encryption is widely used today to secure our communication. Realizing this vision requires new advances in the foundations of functional encryption, which is the target of this project.
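To make the interface concrete, the sketch below is a deliberately insecure mock (our own illustration, not code or an API from the project) of the functional-encryption syntax, in which a key sk_f generated for a function f decrypts an encryption of x to f(x) and nothing more:

```python
# Toy mock of the functional-encryption interface: Setup, KeyGen(f), Enc(x), Dec(sk_f, ct).
# It only illustrates the API shape -- the "ciphertext" here is not protected at all.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class FunctionalKey:
    f: Callable[[Any], Any]          # a real scheme would bind a cryptographic key to f

class ToyFunctionalEncryption:
    def setup(self):
        return "mpk", "msk"          # master public/secret keys (placeholders)

    def keygen(self, msk, f):
        return FunctionalKey(f)      # real schemes derive sk_f from msk and a description of f

    def encrypt(self, mpk, x):
        return {"payload": x}        # real schemes output a ciphertext hiding x

    def decrypt(self, sk_f, ct):
        return sk_f.f(ct["payload"])  # correctness: Dec(sk_f, Enc(x)) = f(x), and nothing else leaks

# Example: a key that reveals only whether a record matches a search predicate.
fe = ToyFunctionalEncryption()
mpk, msk = fe.setup()
sk_match = fe.keygen(msk, lambda record: "flu" in record)
ct = fe.encrypt(mpk, "patient 17: flu, recovered")
print(fe.decrypt(sk_match, ct))      # True -- only the predicate outcome is revealed
```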
Max ERC Funding
1 253 893 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym ATMO
Project Atmospheres across the Universe
Researcher (PI) Pascal TREMBLIN
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE9, ERC-2017-STG
Summary Which molecules are present in the atmosphere of exoplanets? What are their mass, radius and age? Do they have clouds, convection (atmospheric turbulence), fingering convection, or a circulation induced by irradiation? These questions are fundamental in exoplanetology in order to study issues such as planet formation and exoplanet habitability.
Yet, the impact of fingering convection and of the circulation induced by irradiation remains poorly understood:
- Fingering convection (triggered by gradients of mean-molecular-weight) has already been suggested to happen in stars (accumulation of heavy elements) and in brown dwarfs and exoplanets (chemical transition e.g. CO/CH4). A large-scale efficient turbulent transport of energy through the fingering instability can reduce the temperature gradient in the atmosphere and explain many observed spectral properties of brown dwarfs and exoplanets. Nonetheless, this large-scale efficiency is not yet characterized and standard approximations (Boussinesq) cannot be used to achieve this goal.
- The interaction between atmospheric circulation and the fingering instability is an open question in the case of irradiated exoplanets. Fingering convection can change the location and magnitude of the hot spot induced by irradiation, whereas the hot deep atmosphere induced by irradiation can change the location of the chemical transitions that trigger the fingering instability.
This project will characterize the impact of fingering convection in the atmosphere of stars, brown dwarfs, and exoplanets and its interaction with the circulation in the case of irradiated planets. By developing innovative numerical models, we will characterize the reduction of the temperature gradient of the atmosphere induced by the instability and study the impact of the circulation. We will then predict and interpret the mass, radius, and chemical composition of exoplanets that will be observed with future missions such as the James Webb Space Telescope (JWST).
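For context, the linear theory of double-diffusive convection (a textbook result quoted here under one common convention, not a statement of the project) predicts that fingering convection develops when the stratification is thermally stable but compositionally destabilising, for density ratios in the range

```latex
1 \;\lesssim\; R_{0} \;\equiv\; \frac{\nabla - \nabla_{\mathrm{ad}}}{\nabla_{\mu}} \;\lesssim\; \frac{1}{\tau},
\qquad
\tau \;\equiv\; \frac{\kappa_{\mu}}{\kappa_{T}} ,
```

where ∇ and ∇_ad are the actual and adiabatic temperature gradients, ∇_μ the mean-molecular-weight gradient, and κ_μ, κ_T the compositional and thermal diffusivities.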
Max ERC Funding
1 500 000 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym ATMOFLEX
Project Turbulent Transport in the Atmosphere: Fluctuations and Extreme Events
Researcher (PI) Jérémie Bec
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE3, ERC-2009-StG
Summary A major part of the physical and chemical processes occurring in the atmosphere involves the turbulent transport of tiny particles. Current studies and models use a formulation in terms of mean fields, where the strong variations in the dynamical and statistical properties of the particles are neglected and where the underlying fluctuations of the fluid flow velocity are oversimplified. Devising an accurate understanding of the influence of air turbulence and of the extreme fluctuations that it generates in the dispersed phase remains a challenging issue. This project aims at coordinating and integrating theoretical, numerical, experimental, and observational efforts to develop a new statistical understanding of the role of fluctuations in atmospheric transport processes. The proposed work will cover individual as well as collective behaviors and will provide a systematic and unified description of targeted specific processes involving suspended drops or particles: the dispersion of pollutants from a source, the growth by condensation and coagulation of droplets and ice crystals in clouds, the scavenging, settling and re-suspension of aerosols, and the radiative and climatic effects of particles. The proposed approach is based on the use of tools borrowed from statistical physics and field theory, and from the theory of large deviations and of random dynamical systems, in order to design new observables that will be simultaneously tractable analytically in simplified models and of relevance for the quantitative handling of such physical mechanisms. One of the outcomes will be to provide a new framework for improving and refining the methods used in meteorology and atmospheric sciences and to answer the long-standing question of the effects of suspended particles on climate.
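A minimal sketch of the kind of dispersed-phase model underlying such studies (our own toy example, assuming Stokes drag and gravity in a prescribed 2-D cellular flow, not the project's actual model) is:

```python
import numpy as np

# Heavy inertial particle with Stokes drag in a steady 2-D cellular flow:
#   dx/dt = v,   dv/dt = (u(x) - v)/tau_p + g
# tau_p is the particle response time; the flow u is a toy stand-in for turbulence.
def flow(x):
    return np.array([np.sin(x[0]) * np.cos(x[1]),
                     -np.cos(x[0]) * np.sin(x[1])])

def advance(x, v, tau_p, g=np.array([0.0, -0.1]), dt=1e-2, steps=5000):
    traj = [x.copy()]
    for _ in range(steps):
        v = v + dt * ((flow(x) - v) / tau_p + g)   # explicit Euler step for the velocity
        x = x + dt * v                              # and for the position
        traj.append(x.copy())
    return np.array(traj)

traj = advance(np.array([0.5, 0.5]), np.zeros(2), tau_p=0.3)
print(traj[-1])   # final particle position after settling through the cells
```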
Max ERC Funding
1 200 000 €
Duration
Start date: 2009-11-01, End date: 2014-10-31
Project acronym ATOMAG
Project From Attosecond Magnetism towards Ultrafast Spin Photonics
Researcher (PI) Jean-Yves Bigot
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE3, ERC-2009-AdG
Summary We propose to investigate a new frontier in physics: the study of magnetic systems using attosecond laser pulses. The main disciplines concerned are ultrafast laser science, magnetism and spin-photonics, and relativistic quantum electrodynamics. Three issues of modern magnetism are addressed: 1. How fast can one modify and control the magnetization of a magnetic system? 2. What are the role and essence of the coherent interaction between light and spins? 3. How far can spin-photonics bring us towards the real world of data acquisition and storage? First, we want to provide solid experimental ground, unravelling the mechanisms involved in the demagnetization induced by laser pulses in a variety of magnetic materials (ferromagnetic nanostructures, aggregates and molecular magnets). We will explore the ultrafast magnetization dynamics of magnets using an attosecond laser source. Second, we want to explore how the photon field interacts with the spins. We will investigate the dynamical regime in which the potential of the atoms is dressed by the Coulomb potential induced by the laser field. Strong support from relativistic quantum electrodynamics is necessary towards that goal. Third, even though our general approach is fundamental, we want to provide a benchmark of what is realistically possible in ultrafast spin-photonics, challenging the conventional view that spin-photonics is hard to implement at the application level. We will realize ultimate devices combining magneto-optical microscopy with conventional magnetic recording. This new field will raise the interest of a number of competitive laboratories at the international level. Due to the overlapping disciplines, the project also carries a large educational impact, both fundamental and applied.
Max ERC Funding
2 492 561 €
Duration
Start date: 2010-05-01, End date: 2015-04-30
Project acronym Atto-Zepto
Project Ultrasensitive Nano-Optomechanical Sensors
Researcher (PI) Olivier ARCIZET
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE2, ERC-2018-COG
Summary By enabling the conversion of forces into measurable displacements, mechanical oscillators have always played a central role in experimental physics. Recent developments in the PI's group demonstrated the possibility of realizing ultrasensitive and vectorial force-field sensing by using suspended SiC nanowires and optical readout of their transverse vibrations. Astonishing sensitivities were obtained at room and dilution temperatures, at the attonewton to zeptonewton level, for which the electron-electron interaction becomes detectable at a distance of 100 µm.
The goal of the project is to push forward those ultrasensitive nano-optomechanical force sensors, to realize even more challenging explorations of novel fundamental interactions at the quantum-classical interface.
We will develop universal advanced sensing protocols to explore the vectorial structure of fundamental optical, electrostatic or magnetic interactions, and investigate Casimir force fields above nanostructured surfaces, in geometries where the force was recently predicted to become repulsive. The second research axis is cavity nano-optomechanics: inserting the ultrasensitive nanowire in a high-finesse optical microcavity should enhance the light-nanowire interaction up to the point where a single cavity photon can displace the nanowire by more than its zero-point quantum fluctuations. We will investigate this so-called ultrastrong optomechanical coupling regime, and further explore novel regimes in cavity optomechanics, where optical non-linearities at the single-photon level become accessible. The last part is dedicated to the exploration of hybrid qubit-mechanical systems, in which nanowire vibrations are magnetically coupled to the spin of a single nitrogen-vacancy defect in diamond. We will focus on the exploration of spin-dependent forces, aiming at mechanically detecting qubit excitations, opening a novel road towards the generation of non-classical states of motion and mechanically enhanced quantum sensors.
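Two textbook expressions set the relevant scales (generic notation, quoted for orientation only, not the project's own figures): the zero-point motion of a mechanical mode of effective mass M, angular frequency Ω and quality factor Q, and the thermomechanical force noise that limits its sensitivity at temperature T,

```latex
x_{\mathrm{zpf}} \;=\; \sqrt{\frac{\hbar}{2 M \Omega}},
\qquad
S_{F}^{1/2} \;=\; \sqrt{\frac{4 k_{B} T\, M \Omega}{Q}} .
```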
Max ERC Funding
2 067 905 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym AUGURY
Project Reconstructing Earth’s mantle convection
Researcher (PI) Nicolas Coltice
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Consolidator Grant (CoG), PE10, ERC-2013-CoG
Summary Knowledge of the state of the Earth's mantle and its temporal evolution is fundamental to a variety of disciplines in Earth Sciences, from internal dynamics to its many expressions in the geological record (postglacial rebound, sea-level change, ore deposits, tectonics or geomagnetic reversals). Mantle convection theory is the centerpiece for unravelling the present and past state of the mantle. For the past 40 years considerable efforts have been made to improve the quality of numerical models of mantle convection. However, they are still sparsely used to estimate the convective history of the solid Earth, in comparison to ocean or atmospheric models for weather and climate prediction. The main shortcoming is their inability to produce Earth-like seafloor spreading and continental drift self-consistently. Recent convection models have begun to successfully predict these processes (Coltice et al., Science 336, 335-33, 2012). Such a breakthrough opens the opportunity to combine high-level data assimilation methodologies and convection models together with advanced tectonic datasets to retrieve Earth's mantle history. The scope of this project is to produce a new generation of tectonic and convection reconstructions, which are key to improving our understanding and knowledge of the evolution of the solid Earth. The development of sustainable high-performance numerical models will set new standards for geodynamic data assimilation. The outcome of the AUGURY project will be a new generation of models crucial to a wide variety of disciplines.
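As a purely illustrative sketch of the kind of sequential data assimilation alluded to above (a generic ensemble Kalman analysis step, not the AUGURY methodology; the array shapes, the linear observation operator and the toy usage at the end are assumptions made for the example), in Python:

import numpy as np

def enkf_update(ensemble, obs, obs_operator, obs_cov, rng):
    # One perturbed-observation ensemble Kalman analysis step.
    # ensemble: (n_members, n_state) forecast states
    # obs: (n_obs,) observation vector; obs_operator: (n_obs, n_state) linear map
    # obs_cov: (n_obs, n_obs) observation-error covariance
    m = ensemble.shape[0]
    anomalies = ensemble - ensemble.mean(axis=0)
    obs_ens = ensemble @ obs_operator.T
    obs_anom = obs_ens - obs_ens.mean(axis=0)
    cross_cov = anomalies.T @ obs_anom / (m - 1)             # P H^T
    innov_cov = obs_anom.T @ obs_anom / (m - 1) + obs_cov    # H P H^T + R
    gain = cross_cov @ np.linalg.inv(innov_cov)              # Kalman gain
    perturbed = obs + rng.multivariate_normal(np.zeros(obs.size), obs_cov, size=m)
    return ensemble + (perturbed - obs_ens) @ gain.T         # analysis ensemble

# toy usage: 100-member ensemble of a 50-variable state, observing the first 5 variables
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))
H = np.eye(5, 50)
R = 0.1 * np.eye(5)
y = rng.normal(size=5)
X_analysis = enkf_update(X, y, H, R, rng)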
Max ERC Funding
1 994 000 €
Duration
Start date: 2014-03-01, End date: 2020-02-29
Project acronym BACEMO
Project Bacterial Cell Morphogenesis
Researcher (PI) Rut Carballido Lopez
Host Institution (HI) INSTITUT NATIONAL DE LA RECHERCHE AGRONOMIQUE
Call Details Starting Grant (StG), LS3, ERC-2012-StG_20111109
Summary In bacteria, the tough external cell wall and the intracellular actin-like (MreB) cytoskeleton are major determinants of cell shape. The biosynthetic pathways and chemical composition of the cell wall, a three-dimensional polymer network that is one of the most prominent targets for antibiotics, are well understood. However, despite decades of study, little is known about the complex cell wall ultrastructure and the molecular mechanisms that control cell wall morphogenesis in time and space. In rod-shaped bacteria, MreB homologues assemble into dynamic structures thought to control shape by serving as organizers for the movement and assembly of macromolecular machineries that effect sidewall elongation. However, the mechanistic details used by the MreB cytoskeleton to fulfil this role remain to be elucidated. Furthermore, the development of high-resolution microscopy techniques has led to new breakthroughs this year, published by our lab and others, which are shaking the model developed over the last decade and re-questioning the MreB “actin cytoskeleton” designation.
The aim of this project is to combine powerful genetic, biochemical, genomic and systems biology approaches available in the model bacterium Bacillus subtilis with modern high-resolution light microscopy techniques to study the dynamics and mechanistic details of the MreB cytoskeleton and of cell wall assembly. Parameters measured by the different approaches will be combined to quantitatively describe the features of bacterial cell morphogenesis.
Max ERC Funding
1 650 050 €
Duration
Start date: 2013-02-01, End date: 2019-01-31
Project acronym BACTIN
Project Shaping the bacterial cell wall: the actin-like cytoskeleton, from single molecules to morphogenesis and antimicrobials
Researcher (PI) Rut CARBALLIDO LOPEZ
Host Institution (HI) INSTITUT NATIONAL DE LA RECHERCHE AGRONOMIQUE
Call Details Consolidator Grant (CoG), LS3, ERC-2017-COG
Summary One of the ultimate goals in cell biology is to understand how cells determine their shape. In bacteria, the cell wall and the actin-like (MreB) cytoskeleton are major determinants of cell shape. As a hallmark of microbial life, the external cell wall is the most conspicuous macromolecule expanding in concert with cell growth and one of the most prominent targets for antibiotics. Despite decades of study, the mechanism of cell wall morphogenesis remains poorly understood. In rod-shaped bacteria, actin-like MreB proteins assemble into disconnected membrane-associated structures (patches) that move processively around the cell periphery and are thought to control shape by spatiotemporally organizing macromolecular machineries that effect sidewall elongation. However, the ultrastructure of MreB assemblies and the mechanistic details underlying their morphogenetic function remain to be elucidated.
The aim of this project is to combine ground-breaking light microscopy and spectroscopy techniques with cutting-edge genetic, biochemical and systems biology approaches available in the model rod-shaped bacterium Bacillus subtilis to elucidate how MreB and cell wall biosynthetic enzymes collectively act to build a cell. Within this context, new features of MreB assemblies will be determined in vivo and in vitro, and a “toolbox” of approaches to determine the modes of action of antibiotics targeting cell wall processes will be developed. Parameters measured by the different approaches will be used to refine a mathematical model aiming to quantitatively describe the features of bacterial cell wall growth. The long-term goals of BActin are to understand general principles of bacterial cell morphogenesis and to provide mechanistic templates and new reporters for the screening of novel antibiotics.
Max ERC Funding
1 902 195 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym BALLISTOP
Project Revealing 1D ballistic charge and spin currents in second order topological insulators
Researcher (PI) helene BOUCHIAT
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE3, ERC-2018-ADG
Summary One of the greatest recent achievements in condensed matter physics is the discovery of a new class of materials, Topological Insulators (TIs), whose bulk is insulating while the edges conduct current in a quasi-ideal way. In particular, the 1D edges of 2DTIs realize the Quantum Spin Hall state, where current is carried dissipationlessly by two counter-propagating ballistic edge states with a spin orientation locked to the propagation direction (a helical edge state). This opens many possibilities, ranging from dissipationless charge and spin transport at room temperature to new avenues for quantum computing. We propose to investigate charge and spin currents in a newly discovered class of TIs, Second Order Topological Insulators (SOTIs), i.e. 3D crystals with insulating bulk and surfaces but perfectly conducting (topologically protected) 1D helical “hinge” states. Bismuth, despite its well-known semimetallic character, has recently been shown theoretically to belong to this class of materials, explaining our recent intriguing findings on nanowires. Our goal is to reveal, characterize and exploit the unique properties of SOTIs, in particular the high-velocity, ballistic, and dissipationless hinge currents. We will probe crystalline bismuth samples with refined new experimental tools. The superconducting proximity effect will reveal the spatial distribution of conduction paths and test the ballisticity of the hinge modes (which may coexist with non-topological surface modes). High-frequency and tunnel spectroscopies of hybrid superconductor/Bi circuits will probe their topological nature, including the existence of Majorana modes. We will use high-sensitivity magnetometers to detect the orbital magnetism of SOTI platelets, which should be dominated by topological edge currents. Lastly, we propose to detect the predicted equilibrium spin currents in 2DTIs and SOTIs via the generated electric field, using single-electron-transistor-based electrometers.
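As a textbook yardstick for the ballisticity tests described above (a standard Landauer relation, not a project result), each fully transmitting, spin-resolved channel carries one conductance quantum,
\[
G = \frac{e^{2}}{h} \sum_{n} T_{n} \;\longrightarrow\; N\,\frac{e^{2}}{h} \quad (T_{n}\to 1), \qquad \frac{e^{2}}{h} \approx 38.7\ \mu\mathrm{S},
\]
so a single helical edge or hinge (one Kramers pair of counter-propagating modes) is expected to contribute \( e^{2}/h \), and a 2DTI contacted across its two edges \( 2e^{2}/h \).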
Max ERC Funding
2 432 676 €
Duration
Start date: 2020-04-01, End date: 2025-03-31
Project acronym BEBOP
Project Bacterial biofilms in porous structures: from biomechanics to control
Researcher (PI) Yohan, Jean-Michel, Louis DAVIT
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2018-STG
Summary The key ideas motivating this project are that: 1) precise control of the properties of porous systems can be obtained by exploiting bacteria and their fantastic abilities; 2) conversely, porous media (large surface to volume ratios, complex structures) could be a major part of bacterial synthetic biology, as a scaffold for growing large quantities of microorganisms in controlled bioreactors.
The main scientific obstacle to precise control of such processes is the lack of understanding of biophysical mechanisms in complex porous structures, even in the case of single-strain biofilms. The central hypothesis of this project is that a better fundamental understanding of biofilm biomechanics and physical ecology will yield a novel theoretical basis for engineering and control.
The first scientific objective is thus to gain insight into how fluid flow, transport phenomena and biofilms interact within connected multiscale heterogeneous structures - a major scientific challenge with wide-ranging implications. To this end, we will combine microfluidic and 3D printed micro-bioreactor experiments; fluorescence and X-ray imaging; high performance computing blending CFD, individual-based models and pore network approaches.
The second scientific objective is to create the primary building blocks toward a control theory of bacteria in porous media and innovative designs of microbial bioreactors. Building upon the previous objective, we first aim to extract from the complexity of biological responses the most universal engineering principles applying to such systems. We will then design a novel porous micro-bioreactor to demonstrate how the permeability and solute residence times can be controlled in a dynamic, reversible and stable way - an initial step toward controlling reaction rates.
We envision that this will unlock a new generation of biotechnologies and novel bioreactor designs enabling translation from proof-of-concept synthetic microbiology to industrial processes.
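To make the pore-network ingredient above concrete, the following minimal sketch (an assumed regular lattice with uniform Hagen-Poiseuille throats and illustrative dimensions, not the project's code) solves the pore-pressure balance and extracts a Darcy permeability:

import numpy as np

def throat_conductance(radius, length, viscosity):
    # Hagen-Poiseuille hydraulic conductance of a cylindrical throat
    return np.pi * radius**4 / (8.0 * viscosity * length)

def lattice_permeability(nx=10, ny=5, radius=10e-6, length=100e-6, viscosity=1e-3, dp=1.0):
    # Regular nx-by-ny pore lattice; flow from the left column (pressure dp) to the right column (0).
    node = lambda i, j: i * ny + j
    n = nx * ny
    g = throat_conductance(radius, length, viscosity)
    edges = [(node(i, j), node(i + 1, j)) for i in range(nx - 1) for j in range(ny)]
    edges += [(node(i, j), node(i, j + 1)) for i in range(nx) for j in range(ny - 1)]
    lap = np.zeros((n, n))                       # network "Laplacian": mass balance at each pore
    for a, b in edges:
        lap[a, a] += g; lap[b, b] += g
        lap[a, b] -= g; lap[b, a] -= g
    inlet = [node(0, j) for j in range(ny)]
    outlet = [node(nx - 1, j) for j in range(ny)]
    fixed = inlet + outlet
    free = [k for k in range(n) if k not in set(fixed)]
    p = np.zeros(n)
    p[inlet] = dp                                # Dirichlet pressures on inlet/outlet pores
    rhs = -lap[np.ix_(free, fixed)] @ p[fixed]
    p[free] = np.linalg.solve(lap[np.ix_(free, free)], rhs)
    inlet_set = set(inlet)
    q = sum(g * (p[a] - p[b]) for a, b in edges if a in inlet_set and b not in inlet_set)
    area = ny * length * length                  # cross-section (unit-cell depth = length)
    return q * viscosity * (nx - 1) * length / (area * dp)

print("toy permeability: %.2e m^2" % lattice_permeability())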
Max ERC Funding
1 649 861 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym Big Mac
Project Microfluidic Approaches mimicking BIoGeological conditions to investigate subsurface CO2 recycling
Researcher (PI) SAMUEL CHARLES GEORGES MARRE
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2016-COG
Summary The management of anthropogenic CO2 will be one of the main challenges of this century, given the dramatic impact of greenhouse gases on our living environment. A fascinating strategy for recovering value from stored CO2 as a raw material would be to exploit a slow biological upgrading process of CO2 in deep geological formations.
Significantly, the recent development of microfluidic tools to study pore-scale phenomena under high pressure opens new avenues to investigate such strategies. Thus, the strategic objective of this project is to develop and use “Biological Geological Laboratories on a Chip - BioGLoCs” mimicking reservoir conditions in order to gain a greater understanding of the mechanisms associated with the biogeological conversion of CO2 to methane in CGS environments at the pore scale.
The specific objectives are: (1) to determine the experimental conditions for the development of competent micro-organisms (methanogens) and to establish the methane production rates depending on the operating parameters, (2) to evaluate the feasibility of an in situ H2 production strategy (required to sustain the methanogenesis process), (3) to investigate the full bioconversion process in 2D and 3D, (4) to demonstrate the process scaling from pore scale to liter scale and (5) to evaluate the overall process performance.
This multidisciplinary project gathering expertise in chemical engineering and geomicrobiology will be the first ever use of microfluidics approaches to investigate a biogeological transformation taking into account the thermo-hydro-bio-chemical processes. It will result in the identification of efficient geomicrobiological methods and materials to accelerate the CO2 to methane biogeoconversion process. New generic lab scale tools will be also made available for investigating geological-related topics (enhanced oil recovery, deep geothermal energy, bioremediation of groundwater, shale gas recovery).
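For context, the underlying reaction is the standard hydrogenotrophic methanogenesis stoichiometry (a textbook relation, not a project-specific result):
\[
\mathrm{CO_2} + 4\,\mathrm{H_2} \;\longrightarrow\; \mathrm{CH_4} + 2\,\mathrm{H_2O},
\]
i.e. four moles of H2 are consumed per mole of CH4 produced (the reaction is exergonic, roughly -130 kJ per mole of CH4 under standard conditions), which is why the in situ H2 production strategy of objective (2) is a prerequisite for sustaining the bioconversion.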
Max ERC Funding
1 995 354 €
Duration
Start date: 2017-11-01, End date: 2022-10-31
Project acronym BigFastData
Project Charting a New Horizon of Big and Fast Data Analysis through Integrated Algorithm Design
Researcher (PI) Yanlei DIAO
Host Institution (HI) ECOLE POLYTECHNIQUE
Call Details Consolidator Grant (CoG), PE6, ERC-2016-COG
Summary This proposal addresses a pressing need from emerging big data applications such as genomics and data center monitoring: besides the scale of processing, big data systems must also enable perpetual, low-latency processing for a broad set of analytical tasks, referred to as big and fast data analysis. Today’s technology falls severely short of such needs due to the lack of support for complex analytics with scale, low latency, and strong guarantees of user performance requirements. To bridge the gap, this proposal tackles a grand challenge: “How do we design an algorithmic foundation that enables the development of all necessary pillars of big and fast data analysis?” This proposal considers three pillars:
1) Parallelism: There is a fundamental tension between data parallelism (for scale) and pipeline parallelism (for low latency). We propose new approaches based on intelligent use of memory and workload properties to integrate both forms of parallelism.
2) Analytics: The literature lacks a large body of algorithms for critical order-related analytics to be run under data and pipeline parallelism. We propose new algorithmic frameworks to enable such analytics.
3) Optimization: When running analytics, today's big data systems are best-effort only. We transform such systems into a principled optimization framework that suits the new characteristics of big data infrastructure and adapts to meet user performance requirements.
The scale and complexity of the proposed algorithm design make this project high-risk and, at the same time, high-gain: it will lay a solid foundation for big and fast data analysis, enabling a new integrated parallel processing paradigm, algorithms for critical order-related analytics, and a principled optimizer with strong performance guarantees. It will also broadly enable accelerated information discovery in emerging domains such as genomics, as well as economic benefits of early, well-informed decisions and reduced user payments.
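A toy sketch of how the first pillar's two forms of parallelism can be combined (generic Python with invented stage names and data, not the proposed system): chunks are parsed by data-parallel workers inside one stage, while each finished chunk is streamed immediately to a downstream aggregation stage so that partial results appear with low latency.

import queue
import threading
from concurrent.futures import ThreadPoolExecutor

SENTINEL = None  # marks the end of the stream

def parse_chunk(chunk):
    # toy "parse": split a comma-separated string of integers
    return [int(x) for x in chunk.split(",")]

def stage1_parse(chunks, out_q, workers=4):
    # data parallelism inside the stage: chunks are parsed concurrently;
    # pipeline parallelism across stages: each result is emitted as soon as it is ready
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for parsed in pool.map(parse_chunk, chunks):
            out_q.put(parsed)
    out_q.put(SENTINEL)

def stage2_aggregate(in_q):
    total = 0
    while True:
        item = in_q.get()
        if item is SENTINEL:
            break
        total += sum(item)
        print("running total:", total)   # incremental, low-latency output

if __name__ == "__main__":
    q = queue.Queue(maxsize=8)           # bounded queue provides back-pressure
    consumer = threading.Thread(target=stage2_aggregate, args=(q,))
    consumer.start()
    stage1_parse(["1,2,3", "4,5", "6,7,8,9"], q)
    consumer.join()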
Max ERC Funding
2 472 752 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BinD
Project Mitotic Bookmarking, Stem Cells and early Development
Researcher (PI) Pablo Navarro Gil
Host Institution (HI) INSTITUT PASTEUR
Call Details Consolidator Grant (CoG), LS3, ERC-2017-COG
Summary The goal of this proposal is to deliver a new theoretical framework to understand how transcription factors (TFs) sustain cell identity during developmental processes. Recognised as key drivers of cell fate acquisition, TFs are currently not considered to directly contribute to the mitotic inheritance of chromatin states. Instead, these are passively propagated through cell division by a variety of epigenetic marks. Recent discoveries, including by our lab, challenge this view: developmental TFs may impact the propagation of regulatory information from mother to daughter cells through a process known as mitotic bookmarking. This hypothesis, largely overlooked by mainstream epigenetic research during the last two decades, will be investigated in embryo-derived stem cells and during early mouse development. Indeed, these immature cell identities are largely independent from canonical epigenetic repression; hence, current models cannot account for their properties. We will comprehensively identify mitotic bookmarking factors in stem cells and early embryos, establish their function in stem cell self-renewal, cell fate acquisition and dissect how they contribute to chromatin regulation in mitosis. This will allow us to study the relationships between bookmarking factors and other mechanisms of epigenetic inheritance. To achieve this, unique techniques to modulate protein activity and histone modifications specifically in mitotic cells will be established. Thus, a mechanistic understanding of how mitosis influences gene regulation and of how mitotic bookmarking contributes to the propagation of immature cell identities will be delivered. Based on robust preliminary data, we anticipate the discovery of new functions for TFs in several genetic and epigenetic processes. This knowledge should have a wide impact on chromatin biology and cell fate studies as well as in other fields studying processes dominated by TFs and cell proliferation.
Max ERC Funding
1 900 844 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym BIOFUNCTION
Project Self assembly into biofunctional molecules, translating instructions into function
Researcher (PI) Nicolas Winssinger
Host Institution (HI) UNIVERSITE DE STRASBOURG
Call Details Starting Grant (StG), PE4, ERC-2007-StG
Summary The overall objective of the proposal is to develop enabling chemical technologies to address two important problems in biology: first, to detect gene expression or microRNA sequences in vivo in a nondestructive fashion and, secondly, to study the role of multivalency and spatial organization in carbohydrate recognition. Both of these projects exploit the programmable pre-organization of peptide nucleic acid (PNA) to induce a chemical reaction in the first case or modulate a ligand-receptor interaction in the second case. For nucleic acid detection, a DNA or RNA fragment will be utilized to bring two PNA fragments bearing reactive functionalities into close proximity, thereby promoting a reaction. Two types of reactions are proposed, the first one to release a fluorophore for imaging purposes and the second one to release a drug as an “intelligent” therapeutic. If affinities are programmed such that hybridization is reversible, the template can work catalytically, leading to large amplifications. As a proof of concept, this method will be used to measure the transcription level of genes implicated in stem cell differentiation and to detect mutations in oncogenes. For the purpose of studying multivalent carbohydrate ligand architectures, the challenge of chemical synthesis has been a limiting factor. A supramolecular approach is proposed herein in which different arrangements of carbohydrates can be displayed in a well-organized fashion by hybridizing PNA-tagged carbohydrates to DNA templates. This will be used not only to control the distance between multiple ligands or to create combinatorial arrangements of hetero ligands but also to access more complex architectures such as Holliday junctions. The oligosaccharide units will be prepared using de novo organocatalytic reactions. This technology will first be applied to probe the recognition events between HIV and dendritic cells that promote HIV infection.
Max ERC Funding
1 249 980 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym BIOLOCHANICS
Project Localization in biomechanics and mechanobiology of aneurysms: Towards personalized medicine
Researcher (PI) Stéphane Henri Anatole Avril
Host Institution (HI) ASSOCIATION POUR LA RECHERCHE ET LE DEVELOPPEMENT DES METHODES ET PROCESSUS INDUSTRIELS
Call Details Consolidator Grant (CoG), PE8, ERC-2014-CoG
Summary Rupture of Aortic Aneurysms (AA), which kills more than 30 000 persons every year in Europe and the USA, is a complex phenomenon that occurs when the wall stress exceeds the local strength of the aorta due to degraded properties of the tissue. The state of the art in AA biomechanics and mechanobiology reveals that major scientific challenges still have to be addressed to permit patient-specific computational predictions of AA rupture and enable localized repair of the structure with targeted pharmacologic treatment. A first challenge relates to ensuring an objective prediction of the localized mechanisms preceding rupture. A second challenge relates to modelling the patient-specific evolutions of material properties leading to the localized mechanisms preceding rupture. Addressing these challenges is the aim of the BIOLOCHANICS proposal. We will take into account internal length-scales controlling localization mechanisms preceding AA rupture by implementing an enriched, also named nonlocal, continuum damage theory in the computational models of AA biomechanics and mechanobiology. We will also develop very advanced experiments, based on full-field optical measurements, aimed at characterizing localization mechanisms occurring in aortic tissues and at identifying local distributions of material properties at different stages of AA progression. A first in vivo application will be performed on genetic and pharmacological models of mouse and rat AA. Eventually, a retrospective clinical study involving more than 100 patients at the Saint-Etienne University Hospital will permit calibrating estimates of AA rupture risk thanks to our novel approaches and infusing them into future clinical practice. Through the achievements of BIOLOCHANICS, nonlocal mechanics may be extended to other soft tissues for applications in orthopaedics, oncology, sport biomechanics, interventional surgery, human safety, cell biology, etc.
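One standard integral-type formulation illustrates how such an internal length enters a nonlocal damage theory (a common textbook form, given here with generic symbols; the enriched theory to be developed in BIOLOCHANICS may differ): the local equivalent strain is averaged over a neighbourhood of size \( \ell \) and damage is driven by that average,
\[
\bar{\varepsilon}(\mathbf{x}) = \frac{\displaystyle\int_{\Omega} \alpha(\mathbf{x},\boldsymbol{\xi})\, \varepsilon_{\mathrm{eq}}(\boldsymbol{\xi})\, \mathrm{d}\boldsymbol{\xi}}{\displaystyle\int_{\Omega} \alpha(\mathbf{x},\boldsymbol{\xi})\, \mathrm{d}\boldsymbol{\xi}},
\qquad
\alpha(\mathbf{x},\boldsymbol{\xi}) = \exp\!\left(-\frac{\lVert\mathbf{x}-\boldsymbol{\xi}\rVert^{2}}{2\,\ell^{2}}\right),
\qquad
\boldsymbol{\sigma} = \bigl(1 - D(\bar{\varepsilon})\bigr)\, \mathbb{C} : \boldsymbol{\varepsilon},
\]
so that the internal length \( \ell \) regularizes localization and sets the width of the damage band.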
Max ERC Funding
1 999 396 €
Duration
Start date: 2015-05-01, End date: 2020-04-30
Project acronym BIOMECAMORPH
Project The Biomechanics of Epithelial Cell and Tissue Morphogenesis
Researcher (PI) Thomas Marie Michel Lecuit
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS3, ERC-2012-ADG_20120314
Summary Tissue morphogenesis is a complex process that emerges from spatially controlled patterns of cell shape changes. Dedicated genetic programmes regulate cell behaviours, exemplified in animals by the specification of apical constriction in invaginating epithelial tissues, or the orientation of cell intercalation during tissue extension. This genetic control is constrained by physical properties of cells that dictate how they can modify their shape. A major challenge is to understand how biochemical pathways control subcellular mechanics in epithelia, such as how forces are produced by interactions between actin filaments and myosin motors, and how these forces are transmitted at cell junctions. The major objective of our project is to investigate the fundamental principles of epithelial mechanics and to understand how intercellular signals and mechanical coupling between cells coordinate individual behaviours at the tissue level.
We will study early Drosophila embryogenesis and combine quantitative cell biological studies of cell dynamics, biophysical characterization of cell mechanics and genetic control of cell signalling to answer the following questions: i) how are forces generated, in particular what underlies deformation and stabilization of cell shape by actomyosin networks, and pulsatile contractility; ii) how are forces transmitted at junctions, what are the feedback interactions between tension generation and transmission; iii) how are individual cell mechanics orchestrated at the tissue level to yield collective tissue morphogenesis?
We expect to encapsulate the information-based, cell biological and physical descriptions of morphogenesis in a single, coherent framework. The project should impact more broadly on morphogenesis in other organisms and shed light on the mechanisms underlying robustness and plasticity in epithelia.
Max ERC Funding
2 473 313 €
Duration
Start date: 2013-05-01, End date: 2018-04-30
Project acronym BIOMIM
Project Biomimetic films and membranes as advanced materials for studies on cellular processes
Researcher (PI) Catherine Cecile Picart
Host Institution (HI) INSTITUT POLYTECHNIQUE DE GRENOBLE
Call Details Starting Grant (StG), PE5, ERC-2010-StG_20091028
Summary The main objective nowadays in the field of biomaterials is to design high-performing bioinspired materials by learning from natural processes. Importantly, biochemical and physical cues are key parameters that can affect cellular processes. Controlling the processes that occur at the cell/material interface is also of prime importance to guide the cell response. The main aim of the current project is to develop novel functional bio-nanomaterials for in vitro biological studies. Our strategy is based on two related projects.
The first project deals with the rational design of smart films with foreseen applications in musculoskeletal tissue engineering. We will gain knowledge of key cellular processes by designing well-defined self-assembled thin coatings. These multi-functional surfaces with bioactivity (incorporation of growth factors), mechanical (film stiffness) and topographical properties (spatial control of the film's properties) will serve as tools to mimic the complexity of the natural materials in vivo and to present bioactive molecules in the solid phase. We will get a better fundamental understanding of how cellular functions, including adhesion and differentiation of muscle cells, are affected by the material's surface properties.
In the second project, we will investigate at the molecular level a crucial aspect of cell adhesion and motility, which is the intracellular linkage between the plasma membrane and the cell cytoskeleton. We aim to elucidate the role of ERM proteins, especially ezrin and moesin, in the direct linkage between the plasma membrane and actin filaments. Here again, we will use a well-defined microenvironment in vitro to simplify the complexity of the interactions that occur in cellulo. To this end, lipid membranes containing PIP2, a key regulatory lipid from the phosphoinositide family, will be employed in conjunction with purified proteins to investigate actin regulation by ERM proteins in the presence of PIP2-membranes.
Max ERC Funding
1 499 996 €
Duration
Start date: 2011-06-01, End date: 2016-05-31
Project acronym BIOMOFS
Project Bioapplications of Metal Organic Frameworks
Researcher (PI) Christian Serre
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2007-StG
Summary This project will focus on the use of nanoporous metal organic frameworks (Fe, Zn, Ti) for bioapplications. These systems are exciting porous solids, built up from inorganic clusters and polycarboxylates. This results in open-framework solids with different pore shapes and dimensions, and applications such as catalysis, separation and storage of gases. I have recently initiated the synthesis of new trivalent transition metal carboxylates. Among them, the metal carboxylates MIL-100 and MIL-101 (MIL: Materials of Institut Lavoisier) are spectacular solids with giant pores (25-34 Å), accessible metal sites and huge surface areas (3100-5900 m2.g-1). Recently, it was shown that these solids could be used for drug delivery with a loading of 1.4 g of Ibuprofen per gram of MIL-101 solid and a total release in six days. This project will concentrate on the implication of MOFs for drug release and other bioapplications. Whereas research on drug delivery is currently focused either on the use of bio-compatible polymers or mesoporous materials, our method will combine the advantages of both routes, including a high loading and a slow release of therapeutic molecules. A second application will use solids with accessible metal sites to coordinate NO for its controlled delivery. This would provide exogenous NO for prophylactic and therapeutic processes, anti-thrombogenic medical devices, improved dressings for wounds and ulcers, and the treatment of fungal and bacterial infections. Finally, other applications will be envisaged, such as the purification of physiological fluids. The project, which will consist of a systematic study of the relation between these properties and both the composition and structure of the hybrid solids, will be assisted by a strong modelling effort including state-of-the-art computational methods (QSAR and QSPKR). This high-impact project will be realised by assembling experienced researchers in multidisciplinary areas including materials science, biology and modelling. It will involve P. Horcajada (Institut Lavoisier), whose background in pharmaceutical science will complement my experience in inorganic chemistry, and G. Maurin (Institut Gerhardt, Montpellier), an expert in computational chemistry.
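As simple arithmetic on the figures quoted above (nothing beyond the numbers already given): a loading of 1.4 g of ibuprofen per 1 g of MIL-101 corresponds to a drug mass fraction of
\[
\frac{1.4}{1.4 + 1.0} \approx 0.58,
\]
i.e. the drug makes up roughly 58% of the loaded composite by weight.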
Max ERC Funding
1 250 000 €
Duration
Start date: 2008-06-01, End date: 2013-05-31
Project acronym bioSPINspired
Project Bio-inspired Spin-Torque Computing Architectures
Researcher (PI) Julie Grollier
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2015-CoG
Summary In the bioSPINspired project, I propose to use my experience and skills in spintronics, non-linear dynamics and neuromorphic nanodevices to realize bio-inspired spin torque computing architectures. I will develop a bottom-up approach to build spintronic data processing systems that perform low power ‘cognitive’ tasks on-chip and could ultimately complement our traditional microprocessors. I will start by showing that spin torque nanodevices, which are multi-functional and tunable nonlinear dynamical nano-components, are capable of emulating both neurons and synapses. Then I will assemble these spin-torque nano-synapses and nano-neurons into modules that implement brain-inspired algorithms in hardware. The brain displays many features typical of non-linear dynamical networks, such as synchronization or chaotic behaviour. These observations have inspired a whole class of models that harness the power of complex non-linear dynamical networks for computing. Following such schemes, I will interconnect the spin torque nanodevices by electrical and magnetic interactions so that they can couple to each other, synchronize and display complex dynamics. Then I will demonstrate that when perturbed by external inputs, these spin torque networks can perform recognition tasks by converging to an attractor state, or use the separation properties at the edge of chaos to classify data. In the process, I will revisit these brain-inspired abstract models to adapt them to the constraints of hardware implementations. Finally I will investigate how the spin torque modules can be efficiently connected together with CMOS buffers to perform higher level computing tasks. The table-top prototypes, hardware-adapted computing models and large-scale simulations developed in bioSPINspired will lay the foundations of spin torque bio-inspired computing and open the path to the fabrication of fully integrated, ultra-dense and efficient CMOS/spin-torque nanodevice chips.
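As a generic illustration of the synchronization behaviour invoked above (a Kuramoto phase model standing in for coupled spin-torque oscillators; the coupling strength, frequency spread and all-to-all topology are assumptions made for the example):

import numpy as np

def kuramoto(n=50, coupling=1.5, dt=0.01, steps=2000, seed=0):
    # Simulate n globally coupled phase oscillators; return the order parameter r(t),
    # which rises towards 1 as the population synchronizes.
    rng = np.random.default_rng(seed)
    omega = rng.normal(0.0, 1.0, n)             # natural frequencies
    theta = rng.uniform(0.0, 2.0 * np.pi, n)    # initial phases
    r_hist = np.empty(steps)
    for t in range(steps):
        z = np.exp(1j * theta).mean()           # complex order parameter r * exp(i psi)
        r, psi = np.abs(z), np.angle(z)
        theta += dt * (omega + coupling * r * np.sin(psi - theta))  # mean-field coupling
        r_hist[t] = r
    return r_hist

if __name__ == "__main__":
    r = kuramoto()
    print("final degree of synchronization r =", round(float(r[-1]), 2))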
Max ERC Funding
1 907 767 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym BIOTORQUE
Project Probing the angular dynamics of biological systems with the optical torque wrench
Researcher (PI) Francesco Pedaci
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE3, ERC-2012-StG_20111012
Summary "The ability to apply forces to single molecules and bio-polymers has fundamentally changed the way we can interact with and understand biological systems. Yet, for many cellular mechanisms, it is rather the torque that is the relevant physical parameter. Excitingly, novel single-molecule techniques that utilize this parameter are now poised to contribute to novel discoveries. Here, I will study the angular dynamical behavior and response to external torque of biological systems at the molecular and cellular levels using the new optical torque wrench that I recently developed.
In a first research line, I will unravel the angular dynamics of the E. coli flagellar motor, a complex and powerful rotary nano-motor that rotates the flagellum in order to propel the bacterium forwards. I will quantitatively study different aspects of torque generation of the motor, aiming to connect evolutionary, dynamical, and structural principles. In a second research line, I will develop an in-vivo manipulation technique based on the transfer of optical torque and force onto novel nano-fabricated particles. This new scanning method will allow me to map physical properties such as the local viscosity inside living cells and the spatial organization and topography of internal membranes, thereby expanding the capabilities of existing techniques towards in-vivo and ultra-low force scanning imaging.
This project is founded on a multidisciplinary approach in which fundamental optics, novel nanoparticle fabrication, and molecular and cellular biology are integrated. It has the potential to answer biophysical questions that have challenged the field for over two decades and to impact fields ranging from single-molecule biophysics to scanning-probe microscopy and nanorheology, provided ERC funding is granted."
Max ERC Funding
1 500 000 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym BITCRUMBS
Project Towards a Reliable and Automated Analysis of Compromised Systems
Researcher (PI) Davide BALZAROTTI
Host Institution (HI) EURECOM
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary "The vast majority of research in computer security is dedicated to the design of detection, protection, and prevention solutions. While these techniques play a critical role to increase the security and privacy of our digital infrastructure, it is enough to look at the news to understand that it is not a matter of ""if"" a computer system will be compromised, but only a matter of ""when"". It is a well known fact that there is no 100% secure system, and that there is no practical way to prevent attackers with enough resources from breaking into sensitive targets. Therefore, it is extremely important to develop automated techniques to timely and precisely analyze computer security incidents and compromised systems. Unfortunately, the area of incident response received very little research attention, and it is still largely considered an art more than a science because of its lack of a proper theoretical and scientific background.
The objective of BITCRUMBS is to rethink the Incident Response (IR) field from its foundations by proposing a more scientific and comprehensive approach to the analysis of compromised systems. BITCRUMBS will achieve this goal in three steps: (1) by introducing a new systematic approach to precisely measure the effectiveness and accuracy of IR techniques and their resilience to evasion and forgery; (2) by designing and implementing new automated techniques to cope with advanced threats and the analysis of IoT devices; and (3) by proposing a novel forensics-by-design development methodology and a set of guidelines for the design of future systems and software.
To provide the right context for these new techniques and show the impact of the project in different fields and scenarios, BITCRUMBS plans to address its objectives using real case studies borrowed from two different
domains: traditional computer software, and embedded systems.
"
Max ERC Funding
1 991 504 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym Bits2Cosmology
Project Time-domain Gibbs sampling: From bits to inflationary gravitational waves
Researcher (PI) Hans Kristian ERIKSEN
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The detection of primordial gravitational waves created during the Big Bang ranks among the greatest potential intellectual achievements in modern science. During the last few decades, the instrumental progress necessary to achieve this has been nothing short of breathtaking, and today we are able to measure the microwave sky with better than one-in-a-million precision. However, from the latest ultra-sensitive experiments such as BICEP2 and Planck, it is clear that instrumental sensitivity alone will not be sufficient to make a robust detection of gravitational waves. Contamination in the form of astrophysical radiation from the Milky Way, for instance thermal dust and synchrotron radiation, obscures the cosmological signal by orders of magnitude. Even more critical, though, are second-order interactions between this radiation and the instrument characterization itself, which lead to a highly non-linear and complicated problem.
I propose a ground-breaking solution to this problem that allows for joint estimation of cosmological parameters, astrophysical components, and instrument specifications. The engine of this method is called Gibbs sampling, which I have already applied extremely successfully to basic CMB component separation. The new and critical step is to apply this method to raw time-ordered observations recorded directly by the instrument, as opposed to pre-processed frequency maps. While representing a ~100-fold increase in input data volume, this step is unavoidable in order to break through the current foreground-induced systematics floor. I will apply this method to the best currently available and future data sets (WMAP, Planck, SPIDER and LiteBIRD), and thereby derive the world's tightest constraint on the amplitude of inflationary gravitational waves. Additionally, the resulting ancillary science in the form of robust cosmological parameters and astrophysical component maps will represent the state of the art in observational cosmology for years to come.
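Schematically, and with notation introduced here only for illustration, Gibbs sampling of a joint posterior P(s, θ, g | d) over a sky signal s, cosmological and spectral parameters θ, and instrument parameters g, given time-ordered data d, alternates draws from the full conditional distributions
\begin{align*}
s^{(i+1)} &\sim P(s \mid \theta^{(i)}, g^{(i)}, d), \\
\theta^{(i+1)} &\sim P(\theta \mid s^{(i+1)}, g^{(i)}, d), \\
g^{(i+1)} &\sim P(g \mid s^{(i+1)}, \theta^{(i+1)}, d),
\end{align*}
so that, after convergence, the chain samples the joint distribution and astrophysical components, cosmological parameters and instrument characteristics are estimated together rather than sequentially.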
Max ERC Funding
1 999 205 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym BIVAQUM
Project Bivariational Approximations in Quantum Mechanics and Applications to Quantum Chemistry
Researcher (PI) Simen Kvaal
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE4, ERC-2014-STG
Summary The standard variational principles (VPs) are cornerstones of quantum mechanics, and one can hardly overestimate their usefulness as tools for generating approximations to the time-independent and time-dependent Schrödinger equations. The aim of the proposal is to study and apply a generalization of these, the bivariational principles (BIVPs), which arise naturally when one does not assume a priori that the system Hamiltonian is Hermitian. This unconventional approach may have transformative impact on the development of ab initio methodology, both for electronic structure and dynamics.
The first objective is to establish the mathematical foundation for the BIVPs. This opens up a whole new axis of method development for ab initio approaches. For instance, it is a largely ignored fact that the popular traditional coupled cluster (TCC) method can be neatly formulated with the BIVPs, and TCC is both polynomially scaling with the number of electrons and size-consistent. No “variational” method enjoys these properties simultaneously; indeed, this seems to be incompatible with the standard VPs.
Armed with the BIVPs, the project aims to develop new, and understand existing, ab initio methods. The second objective is thus a systematic multireference coupled cluster theory (MRCC) based on the BIVPs. This is in itself a novel approach that carries large potential benefits and impact. The third and last objective is an implementation of a new coupled-cluster-type method in which the orbitals are bivariational parameters. This gives a size-consistent hierarchy of approximations to multiconfiguration Hartree-Fock.
The PI's broad contact with and background in scientific disciplines such as applied mathematics and nuclear physics, in addition to quantum chemistry, increase the feasibility of the project.
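For concreteness, the bivariational idea can be summarised in its textbook form (recalled here only as an illustration): one considers a functional of two independent trial states,
\[
\mathcal{E}[\tilde\psi, \psi] \;=\; \frac{\langle \tilde\psi \mid \hat H \mid \psi \rangle}{\langle \tilde\psi \mid \psi \rangle},
\]
whose stationary points satisfy \(\hat H \psi = E \psi\) and \(\hat H^\dagger \tilde\psi = \bar E \tilde\psi\), without assuming \(\tilde\psi = \psi\) or Hermiticity of \(\hat H\). Truncating the bra and ket states independently is what allows non-Hermitian parametrisations such as traditional coupled cluster to be interpreted through a variational-type principle.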
Max ERC Funding
1 499 572 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym BLACK
Project The formation and evolution of massive black holes
Researcher (PI) Marta Volonteri
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary "Massive black holes (MBHs) weighing million solar masses and above inhabit the centers of today's galaxies, weighing about a thousandth of the host bulge mass. MBHs also powered quasars known to exist just a few hundred million years after the Big Bang. Owing to observational breakthroughs and remarkable advancements in theoretical models, we do now that MBHs are out there and evolved with their hosts, but we do not know how they got there nor how, and when, the connection between MBHs and hosts was established.
To have a full view of MBH formation and growth we have to look at the global process where galaxies form, as determined by the large-scale structure, on Mpc scales. On the other hand, the region where MBHs dominate the dynamics of gas and stars, and accretion occurs, is merely pc-scale. To study the formation of MBHs and their fuelling we must bridge from Mpc to pc scale in order to follow how galaxies influence MBHs and how in turn MBHs influence galaxies.
BLACK aims to connect the cosmic context to the nuclear region where MBHs reside, and to study MBH formation, feeding and feedback on their hosts through a multi-scale approach following the thread of MBHs from cosmological, to galactic, to nuclear scales. Analytical work guides and tests numerical simulations, allowing us to probe a wide dynamical range.
Our theoretical work will be crucial for planning and interpreting current and future observations. Today and in the near future facilities at wavelengths spanning from radio to X-ray will widen and deepen our view of the Universe, making this an ideal time for this line of research."
Max ERC Funding
1 668 385 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym blackQD
Project Optoelectronic of narrow band gap nanocrystals
Researcher (PI) Emmanuel LHUILLIER
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE7, ERC-2017-STG
Summary Over the past decades, silicon has become the most widely used material for electronics; however, its indirect band gap limits its use for optics and optoelectronics. As a result, alternative semiconductors such as III-V and II-VI materials are used to address a broad range of complementary applications such as LEDs, laser diodes and photodiodes. However, in the infrared (IR), the materials challenge becomes far more complex.
New IR applications, such as flame detection or night-time car driving assistance, are emerging and require low-cost detectors. Current technologies, based on epitaxially grown semiconductors, are unlikely to bring a cost disruption, and organic electronics, often viewed as the alternative to silicon-based materials, is ineffective in the mid-IR. The blackQD project aims at transforming colloidal quantum dots (CQD) into the next generation of active material for IR detection. CQD are attracting high interest because of their size-tunable optical features, and the next challenge is their integration into optoelectronic devices, in particular for IR applications.
The project requires a combination of materials knowledge, clean-room nanofabrication and IR photoconduction which is unique in Europe. I organize blackQD in three main parts. The first part relates to the growth of mercury chalcogenide nanocrystals with unique tunable properties in the mid- and far-IR. To design devices with enhanced properties, more needs to be known about the electronic structure of these nanomaterials. In part II, I propose to develop original methods to probe static and dynamic aspects of the electronic structure. Finally, the main task of the project relates to the design of a new generation of transistors and IR detectors. I propose several demonstrator geometries which, for the first time, integrate from the outset the colloidal nature of the CQD and the constraints of IR photodetection. More generally, the project aims to develop a toolbox for the design of the next generation of low-cost IR detectors.
Max ERC Funding
1 499 903 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym BLOC
Project Mathematical study of Boundary Layers in Oceanic Motions
Researcher (PI) Anne-Laure Perrine Dalibard
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary Boundary layer theory is a large component of fluid dynamics. It is ubiquitous in Oceanography, where boundary layer currents, such as the Gulf Stream, play an important role in the global circulation. Comprehending the underlying mechanisms in the formation of boundary layers is therefore crucial for applications. However, the treatment of boundary layers in ocean dynamics remains poorly understood at a theoretical level, due to the variety and complexity of the forces at stake.
The goal of this project is to develop several tools to bridge the gap between the mathematical state of the art and the physical reality of oceanic motion. There are four points on which we will mainly focus: degeneracy issues, including the treatment of Stewartson boundary layers near the equator; rough boundaries (meaning boundaries with small-amplitude, high-frequency variations); the inclusion of the advection term in the construction of stationary boundary layers; and the linear and nonlinear stability of the boundary layers. We will address Ekman layers and western boundary layers separately, since they are ruled by equations whose mathematical behaviour is very different.
This project will allow us to have a better understanding of small scale phenomena in fluid mechanics, and in particular of the inviscid limit of incompressible fluids.
The team will be composed of the PI, two PhD students and three two-year postdocs over the whole period. We will also rely on the historical expertise of the host institution on fluid mechanics and asymptotic methods.
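To fix ideas on the simplest member of the first family (recalled here only as background), the linear Ekman layer for a rotating fluid with Coriolis parameter f and viscosity ν is governed, after subtracting the interior geostrophic flow, by
\[
- f\, v = \nu\, \partial_z^2 u, \qquad f\, u = \nu\, \partial_z^2 v,
\]
whose solutions decay over the Ekman depth \(\delta_E = \sqrt{2\nu/f}\) (the Ekman spiral), a scale that degenerates as f vanishes at the equator. Western boundary layers of Munk type, driven by the β-effect and lateral friction, have the different width \((\nu/\beta)^{1/3}\) and are governed by equations of a different mathematical nature, which is why the two families are treated separately in the project.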
Max ERC Funding
1 267 500 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym BLOWDISOL
Project "BLOW UP, DISPERSION AND SOLITONS"
Researcher (PI) Franck Merle
Host Institution (HI) UNIVERSITE DE CERGY-PONTOISE
Call Details Advanced Grant (AdG), PE1, ERC-2011-ADG_20110209
Summary "Many physical models involve nonlinear dispersive problems, like wave
or laser propagation, plasmas, ferromagnetism, etc. So far, the mathematical under-
standing of these equations is rather poor. In particular, we know little about the
detailed qualitative behavior of their solutions. Our point is that an apparent com-
plexity hides universal properties of these models; investigating and uncovering such
properties has started only recently. More than the equations themselves, these univer-
sal properties are essential for physical modelisation.
By considering several standard models such as the nonlinear Schrodinger, nonlinear
wave, generalized KdV equations and related geometric problems, the goal of this pro-
posal is to describe the generic global behavior of the solutions and the profiles which
emerge either for large time or by concentration due to strong nonlinear effects, if pos-
sible through a few relevant solutions (sometimes explicit solutions, like solitons). In
order to do this, we have to elaborate different mathematical tools depending on the
context and the specificity of the problems. Particular emphasis will be placed on
- large time asymptotics for global solutions, decomposition of generic solutions into
sums of decoupled solitons in non integrable situations,
- description of critical phenomenon for blow up in the Hamiltonian situation, stable
or generic behavior for blow up on critical dynamics, various relevant regularisations of
the problem,
- global existence for defocusing supercritical problems and blow up dynamics in the
focusing cases.
We believe that the PI and his team have the ability to tackle these problems at present.
The proposal will open whole fields of investigation in Partial Differential Equations in
the future, clarify and simplify our knowledge on the dynamical behavior of solutions
of these problems and provide Physicists some new insight on these models."
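Two of the model equations above, together with their explicit solitons (standard normalisations, recalled only for the reader's convenience), illustrate the objects at stake. The generalized KdV and focusing nonlinear Schrödinger equations read
\[
\partial_t u + \partial_x\big(\partial_x^2 u + u^p\big) = 0,
\qquad
i\,\partial_t u + \Delta u + |u|^{p-1} u = 0 .
\]
For gKdV the soliton is \(u(t,x) = Q_c(x - ct)\) with \(Q_c(x) = c^{1/(p-1)} Q(\sqrt{c}\, x)\), where \(Q'' + Q^p = Q\); for \(p = 2\) one has \(Q(x) = \tfrac{3}{2}\,\mathrm{sech}^2(x/2)\). For the one-dimensional cubic NLS (\(p = 3\)) the solitary wave is \(u(t,x) = e^{it}\sqrt{2}\,\mathrm{sech}(x)\), up to the symmetries of the equation. Large-time decomposition into such decoupled solitons, and blow up by concentration of these profiles, are precisely the regimes targeted by the proposal.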
Max ERC Funding
2 079 798 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym BODYBUILT
Project Building The Vertebrate Body
Researcher (PI) Olivier Pourquie
Host Institution (HI) CENTRE EUROPEEN DE RECHERCHE EN BIOLOGIE ET MEDECINE
Call Details Advanced Grant (AdG), LS3, ERC-2009-AdG
Summary My lab is interested in the development of the tissue that gives rise to vertebrae and skeletal muscles, called the paraxial mesoderm. A striking feature of this tissue is its segmental organization, and we have made major contributions to the understanding of the molecular control of the segmentation process. We identified a molecular oscillator associated with the rhythmic production of somites and proposed a model for vertebrate segmentation based on the integration of a rhythmic signaling pulse gated spatially by a system of traveling FGF and Wnt signaling gradients. We are also studying the differentiation of paraxial mesoderm precursors into the muscle, cartilage and dermis lineages. Our work identified the Wnt, FGF and Notch pathways as playing a prominent role in the patterning and differentiation of the paraxial mesoderm. In this application, we largely focus on the molecular control of paraxial mesoderm development. Using microarray and high-throughput sequencing-based approaches and bioinformatics, we will characterize the transcriptional network acting downstream of Wnt, FGF and Notch in the presomitic mesoderm (PSM). We will also use genetic and pharmacological approaches utilizing real-time imaging reporters to characterize the pacemaker of the segmentation clock in vivo, and also in vitro using differentiated embryonic stem cells. We further propose to characterize in detail a novel RA-dependent pathway that we identified and which controls somite left-right symmetry. Our work is expected to have a strong impact on the field of congenital spine anomalies, currently an understudied biomedical problem, and will be of utility in elucidating the etiology and eventual prevention of these disorders. This work is also expected to further our understanding of the Notch, Wnt, FGF and RA signalling pathways, which are involved in segmentation and in the establishment of the vertebrate body plan, and which play important roles in a wide array of human diseases.
Max ERC Funding
2 500 000 €
Duration
Start date: 2010-04-01, End date: 2015-03-31
Project acronym BoneImplant
Project Monitoring bone healing around endosseous implants: from multiscale modeling to the patient’s bed
Researcher (PI) Guillaume Loïc Haiat
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2015-CoG
Summary Implants are often employed in orthopaedic and dental surgeries. However, risks of failure, which are difficult to anticipate, are still experienced and may have dramatic consequences. Failures are due to degraded bone remodeling at the bone-implant interface, a multiscale phenomenon of an interdisciplinary nature which remains poorly understood. The objective of BoneImplant is to provide a better understanding of the multiscale and multitime mechanisms at work at the bone-implant interface. To do so, BoneImplant aims at studying the evolution of the biomechanical properties of bone tissue around an implant during the remodeling process. A methodology involving combined in vivo, in vitro and in silico approaches is proposed.
New modeling approaches will be developed in close synergy with the experiments. Molecular dynamics computations will be used to understand fluid flow in nanoscopic cavities, a phenomenon that determines the bone healing process. Generalized continuum theories will be necessary to model bone tissue due to the important strain field around implants. Isogeometric mortar formulations will make it possible to simulate the bone-implant interface in a stable and efficient manner.
In vivo experiments under standardized conditions will be carried out on the basis of feasibility studies. A multimodal and multiphysical experimental approach will be used to assess the biomechanical properties of newly formed bone tissue as a function of the implant environment. The experimental approach aims at estimating the effective adhesion energy and the potential of quantitative ultrasound imaging to assess different biomechanical properties of the interface.
Results will be used to design effective loading clinical procedures of implants and to optimize implant conception, leading to the development of therapeutic and diagnostic techniques. The development of quantitative ultrasonic techniques to monitor implant stability has a potential for industrial transfer.
Max ERC Funding
1 992 154 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym BPT
Project BEYOND PLATE TECTONICS
Researcher (PI) Trond Helge Torsvik
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE10, ERC-2010-AdG_20100224
Summary Plate tectonics characterises the complex and dynamic evolution of the outer shell of the Earth in terms of rigid plates. These tectonic plates overlie and interact with the Earth's mantle, which is slowly convecting owing to energy released by the decay of radioactive nuclides in the Earth's interior. Even though links between mantle convection and plate tectonics are becoming more evident, notably through subsurface tomographic images, advances in mineral physics and improved absolute plate motion reference frames, there is still no generally accepted mechanism that consistently explains plate tectonics and mantle convection in one framework. We will integrate plate tectonics into mantle dynamics and develop a theory that explains plate motions quantitatively and dynamically. This requires consistent and detailed reconstructions of plate motions through time (Objective 1).
A new model of plate kinematics will be linked to the mantle with the aid of a new global reference frame based on moving hotspots and on palaeomagnetic data. The global reference frame will be corrected for true polar wander in order to develop a global plate motion reference frame with respect to the mantle back to Pangea (ca. 320 million years) and possibly Gondwana assembly (ca. 550 million years). The resulting plate reconstructions will constitute the input to subduction models that are meant to test the consistency between the reference frame and subduction histories. The final outcome will be a novel global subduction reference frame, to be used to unravel links between the surface and deep Earth (Objective 2).
Max ERC Funding
2 499 010 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym BRAIN MICRO SNOOPER
Project A mimetic implant for low perturbation, stable stimulation and recording of neural units inside the brain.
Researcher (PI) Gaelle Offranc piret
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary Developing brain implants is crucial to better decipher neuronal information and to intervene on neural networks with great precision using microstimulations. This project aims to address two major challenges: to achieve a highly mechanically stable implant, allowing long-term connection between neurons and microelectrodes, and to provide neural implants with a high temporal and spatial resolution. To do so, the present project will develop implants with structural and mechanical properties that resemble those of the natural brain environment. According to the literature, using electrodes and electric leads with a size of a few microns allows for a better neural tissue reconstruction around the implant. Also, the mechanical mismatch between the usually stiff implant material and the soft brain tissue affects the adhesion between tissue cells and electrodes. With the objective of implanting a highly flexible free-floating microelectrode array in the brain tissue, we will develop a new method using micro- and nanotechnology steps as well as a combination of polymers. Moreover, the literature and preliminary studies indicate that some surface chemistries and nanotopographies can promote neurite outgrowth while limiting glial cell proliferation. Implants will therefore be nanostructured so as to promote neural tissue growth and will be given a highly adhesive surface, which will ensure stable contact with the brain neural tissue over time. Implants with different microelectrode configurations and numbers will be tested in vitro and in vivo for their biocompatibility and their ability to record and stimulate neurons with high stability. This project will produce high-performance generic implants that can be used for various fundamental studies and applications, including neural prostheses and brain-machine interfaces.
Max ERC Funding
1 499 850 €
Duration
Start date: 2015-08-01, End date: 2021-07-31
Project acronym BrainConquest
Project Boosting Brain-Computer Communication with high Quality User Training
Researcher (PI) Fabien LOTTE
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE7, ERC-2016-STG
Summary Brain-Computer Interfaces (BCIs) are communication systems that enable users to send commands to computers through brain signals only, by measuring and processing these signals. Making computer control possible without any physical activity, BCIs have promised to revolutionize many application areas, notably assistive technologies, e.g., for wheelchair control, and human-machine interaction. Despite this promising potential, BCIs are still barely used outside laboratories, due to their current poor reliability. For instance, BCIs only using two imagined hand movements as mental commands decode, on average, less than 80% of these commands correctly, while 10 to 30% of users cannot control a BCI at all.
A BCI should be considered a co-adaptive communication system: its users learn to encode commands in their brain signals (with mental imagery) that the machine learns to decode using signal processing. Most research efforts so far have been dedicated to decoding the commands. However, BCI control is a skill that users have to learn too. Unfortunately, how BCI users learn to encode the commands is essential but barely studied, i.e., fundamental knowledge about how users learn BCI control is lacking. Moreover, standard training approaches are only based on heuristics, without satisfying human learning principles. Thus, poor BCI reliability is probably largely due to highly suboptimal user training.
In order to obtain a truly reliable BCI we need to completely redefine user training approaches. To do so, I propose to study and statistically model how users learn to encode BCI commands. Then, based on human learning principles and this model, I propose to create a new generation of BCIs which ensure that users learn how to successfully encode commands with high signal-to-noise ratio in their brain signals, hence making BCIs dramatically more reliable. Such a reliable BCI could positively change human-machine interaction as BCIs have promised but failed to do so far.
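On the machine side mentioned above, decoding two mental-imagery commands typically amounts to extracting band-power-type features from multichannel brain signals and training a classifier. The sketch below illustrates only this generic pipeline on synthetic data; every signal and parameter is invented for the example, and it says nothing about the user-training models that are the core of the project.

    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(1)
    n_trials, n_channels, n_samples = 80, 8, 256

    # Synthetic "EEG" trials: class 1 carries slightly more oscillatory power
    # on half of the channels (a crude stand-in for motor-imagery modulation).
    labels = rng.integers(0, 2, n_trials)
    t = np.arange(n_samples)
    trials = rng.normal(0.0, 1.0, (n_trials, n_channels, n_samples))
    for i, y in enumerate(labels):
        amp = 1.0 + 0.8 * y
        trials[i, :4, :] += amp * np.sin(2 * np.pi * 11 * t / 128.0)  # ~11 Hz rhythm

    # Log-variance features per channel (a common band-power surrogate).
    features = np.log(trials.var(axis=2))

    # Train/test split and a linear classifier.
    split = n_trials // 2
    clf = LinearDiscriminantAnalysis().fit(features[:split], labels[:split])
    print("accuracy:", clf.score(features[split:], labels[split:]))

Real pipelines usually add band-pass filtering and spatial filtering (e.g. CSP) before the classifier.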
Max ERC Funding
1 498 751 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym BrainMicroFlow
Project Brain Microcirculation : Numerical simulation for inter-species translation with applications in human health
Researcher (PI) Sylvie, Jeanine Lejeune Ép Lorthois
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2013-CoG
Summary The cerebral microvascular system is essential to a large variety of physiological processes in the brain, including blood delivery and blood flow regulation as a function of neuronal activity (neuro-vascular coupling). It plays a major role in the associated mechanisms leading to disease (stroke, neurodegenerative diseases, …). In the last decade, cutting edge technologies, including two-photon scanning laser microscopy (TPSLM) and optical manipulation of blood flow, have produced huge amounts of anatomic and functional experimental data in normal and Alzheimer Disease (AD) mice. These require accurate, highly quantitative, physiologically informed modeling and analysis for any coherent understanding and for translating results between species.
In this context, our first aim is to develop a general methodological framework for physiologically informed microvascular fluid dynamics modeling, understood in a broad sense, i.e. blood flow, molecule transport and the resulting functional imaging signals or signal surrogates.
Our second aim is to validate this methodological framework by direct comparison of in vivo anatomical and functional TPSLM measurements with the simulation results based on the same anatomical data.
The third objective is to exploit these methodologies in order to identify the logic of the structure/function relationships of brain microcirculation and neurovascular coupling, in human health and disease, with a focus on the role of vascular factors in AD.
Specific hypotheses on how vascular changes in AD affect both vascular function and neurovascular coupling can be experimentally tested in animal models of AD. Crucially, similar anatomical (but not functional) data can be acquired in healthy and AD humans. This will enable us to model how AD-induced vascular alterations could affect human patients. Ultimately, it provides us with new avenues for design and/or evaluation of improved diagnosis/preventive/treatment strategies.
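A basic building block of such blood flow modelling is the network (Poiseuille) approximation: each vessel segment carries a flow proportional to the pressure drop across it, with hydraulic conductance g = πd⁴/(128 μL) for diameter d, length L and effective viscosity μ, and mass is conserved at every node. The toy script below solves this linear problem on a two-segment network; geometry, viscosity and boundary pressures are invented for the illustration, and no haematocrit or non-Newtonian effects are included.

    import numpy as np

    # Toy network: node 0 = arteriole inlet, node 2 = venule outlet, node 1 internal.
    # Segments: (node_a, node_b, length [m], diameter [m])
    segments = [(0, 1, 100e-6, 8e-6), (1, 2, 120e-6, 6e-6)]
    mu = 3e-3                                   # effective viscosity [Pa.s], illustrative
    p_in, p_out = 60.0 * 133.3, 20.0 * 133.3    # boundary pressures [Pa] (~60 / 20 mmHg)

    n_nodes = 3
    A = np.zeros((n_nodes, n_nodes))
    b = np.zeros(n_nodes)

    # Assemble the conservation (Kirchhoff-type) equations.
    for (i, j, L, d) in segments:
        g = np.pi * d**4 / (128.0 * mu * L)     # Poiseuille conductance
        A[i, i] += g; A[j, j] += g
        A[i, j] -= g; A[j, i] -= g

    # Impose boundary pressures at nodes 0 and 2 (Dirichlet conditions).
    for node, p_bc in [(0, p_in), (2, p_out)]:
        A[node, :] = 0.0
        A[node, node] = 1.0
        b[node] = p_bc

    p = np.linalg.solve(A, b)                   # nodal pressures [Pa]
    q = [np.pi * d**4 / (128.0 * mu * L) * (p[i] - p[j]) for (i, j, L, d) in segments]
    print("pressures [Pa]:", p)
    print("flows [m^3/s]:", q)                  # equal for a series network (mass conservation)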
Max ERC Funding
1 999 873 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym BREAD
Project Breaking the curse of dimensionality: numerical challenges in high dimensional analysis and simulation
Researcher (PI) Albert Cohen
Host Institution (HI) UNIVERSITE PIERRE ET MARIE CURIE - PARIS 6
Call Details Advanced Grant (AdG), PE1, ERC-2013-ADG
Summary "This project is concerned with problems that involve a very large number of variables, and whose efficient numerical treatment is challenged by the so-called curse of dimensionality, meaning that computational complexity increases exponentially in the variable dimension.
The PI intends to establish at his host institution a scientific leadership in the mathematical understanding and numerical treatment of these problems, and to contribute to the development of this area of research through international collaborations, the organization of workshops and research schools, and the training of postdocs and PhD students.
High-dimensional problems are ubiquitous in an increasing number of areas of scientific computing, among them statistical and active learning theory, parametric and stochastic partial differential equations, and parameter optimization in numerical codes. There is a high demand from the industrial world for efficient numerical methods to treat such problems.
The practical success of the various numerical algorithms developed in recent years in these application areas is often limited to moderate-dimensional settings.
In addition, these developments tend to be, as a rule, rather problem specific and not always founded on a solid mathematical analysis.
The central scientific objectives of this project are therefore: (i) to identify fundamental mathematical principles behind overcoming the curse of dimensionality, (ii) to understand how these principles enter in relevant instances of the above applications, and (iii) based on these principles, and going beyond particular problem classes, to develop broadly applicable numerical strategies that benefit from such mechanisms.
The performance of these strategies should be provably independent of the variable dimension, and in that sense break the curse of dimensionality. They will be tested on both synthetic benchmarks and real-world problems coming from the aforementioned applications."
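For a concrete sense of the exponential growth referred to above (a generic illustration, not taken from the proposal), consider the number of points in a tensor-product grid with n points per coordinate direction, which scales as n^d:

```python
# Generic illustration of the curse of dimensionality: a tensor-product grid
# with n points per coordinate direction contains n**d points in dimension d.
n = 10  # points per axis (assumed)
for d in (1, 2, 5, 10, 20, 50):
    print(f"d = {d:3d}:  {n**d:.3e} grid points")
# Already at d = 20, a grid with only 10 points per axis holds 1e20 points,
# far beyond any realistic storage or computational budget.
```

Strategies that aim to break the curse typically replace such exhaustive discretizations with sparser representations (sparse grids, low-rank or sparse expansions) whose cost grows only mildly with d.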
Max ERC Funding
1 848 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym BRIDGING
Project The function of membrane tethering in plant intercellular communication
Researcher (PI) Emmanuelle Maria Françoise Bayer
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), LS3, ERC-2017-COG
Summary Intercellular communication is critical for multicellularity. It coordinates the activities within individual cells to support the function of an organism as a whole. Plants have developed remarkable cellular machines -the Plasmodesmata (PD) pores- which interconnect every single cell within the plant body, establishing direct membrane and cytoplasmic continuity, a situation unique to plants. PD are indispensable for plant life. They control the flux of molecules between cells and are decisive for development, environmental adaptation and defence signalling. However, how PD integrate signalling to coordinate responses at a multicellular level remains unclear.
A striking feature of PD organisation, setting them apart from animal cell junctions, is a strand of endoplasmic reticulum (ER) running through the pore, tethered extremely tightly (~10 nm) to the plasma membrane (PM) by unidentified “spokes”. To date, the function of ER-PM contacts at PD remains a complete enigma. We don’t know how and why the two organelles come together at PD cellular junctions.
I recently proposed that ER-PM tethering is in fact central to PD function. In this project I will investigate the question of how integrated cellular responses benefit from organelle cross-talk at PD. The project integrates proteomic/bioinformatic approaches, biophysical/modelling methods and ultra-high resolution 3D imaging into molecular cell biology of plant cell-to-cell communication and will, for the first time, directly address the mechanism and function of ER-PM contacts at PD. We will pursue three complementary objectives to attain our goal: 1) identify the mechanisms of PD membrane-tethering at the molecular level; 2) elucidate the dynamics and 3D architecture of ER-PM contact sites at PD; and 3) uncover the function of ER-PM apposition for plant intercellular communication. Overall, the project will pioneer a radically new perspective on PD-mediated cell-to-cell communication, a fundamental aspect of plant biology.
Max ERC Funding
1 999 840 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym BrightSens
Project Ultrabright Turn-on Fluorescent Organic Nanoparticles for Amplified Molecular Sensing in Living Cells
Researcher (PI) Andrii Andrey Klymchenko
Host Institution (HI) UNIVERSITE DE STRASBOURG
Call Details Consolidator Grant (CoG), PE5, ERC-2014-CoG
Summary Existing fluorescent molecular probes, due to limited brightness, do not allow imaging individual biomolecules directly in living cells, whereas bright fluorescent nanoparticles are unable to respond to single molecular stimuli and their inorganic core is not biodegradable. The aim of BrightSens is to develop ultrabright fluorescent organic nanoparticles (FONs) capable of converting single molecular stimuli into a collective turn-on response of >100 encapsulated dyes, and to apply them in amplified molecular sensing of specific targets at the cell surface (receptors) and in the cytosol (mRNA). The project is composed of three work packages. (1) Synthesis of FONs: Dye-doped polymer and micellar FONs will be obtained by self-assembly. Molecular design of dyes and the use of bulky hydrophobic counterions will enable precise control of dye organization inside FONs, which will resolve the fundamental problems of self-quenching and cooperative on/off switching in dye ensembles. (2) Synthesis of nanoprobes: Using cooperative Förster Resonance Energy Transfer from FONs to an originally designed acceptor-sensor unit, we propose the synthesis of the first nanoprobes that (a) undergo a complete turn-on or colour switch in response to single molecular targets and (b) harvest light energy into photochemical disruption of cell membrane barriers. (3) Cellular applications: The obtained nanoprobes will be applied in 2D and 3D cultures of cancer cells for background-free single-molecule detection of membrane receptors and intracellular mRNA, which are important markers of cancer and apoptosis. An original concept of amplified photochemical internalization is proposed to trigger, by light, the entry of nanoprobes into the cytosol. This high-risk/high-gain multidisciplinary project will result in new organic nanomaterials with unique photophysical properties that will enable the visualization of biomolecules at work in living cells, with expected impact on cancer research.
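For background (a textbook relation rather than a project-specific result), the efficiency of Förster Resonance Energy Transfer between a donor and an acceptor at distance r is

\[ E_{\mathrm{FRET}} = \frac{1}{1 + (r/R_0)^{6}} , \]

where the Förster radius R_0, typically a few nanometres, is the distance at which half of the donor energy is transferred; this steep distance dependence is what makes energy transfer an effective on/off switch at the molecular scale.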
Max ERC Funding
1 999 750 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym BubbleBoost
Project Microfluidic bubbles for novel applications: acoustic laser and ultrasonically controlled swimming microrobots
Researcher (PI) Philippe, Guy, Marie Marmottant
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2013-CoG
Summary Microfluidic techniques developed since the year 2000 have now matured to provide a unique tool to produce large amounts of microbubbles that are not only finely tuned in size, but that can also be embedded in tiny microfabricated structures.
In the present proposal, we plan to take advantage of these novel microfabrication techniques to develop two innovative acoustic applications. These applications, which were out of reach without current techniques, are based on the use of microbubbles with a huge acoustic resonance. The project is structured in two parts that only differ in the way bubbles are embedded in microfluidic environments:
1) Arrays of bubbles: Acoustic Laser
This first part is the development of an acoustic laser, based on microbubbles trapped in a microfluidic circuit. To obtain the conditions for an acoustic laser, arrays of microbubbles will be designed so that the bubbles pulsate in phase, re-emitting their energy coherently. The applications are novel systems for high ultrasonic emission power, or meta-materials that store vibration energy.
2) Mobile “armoured” bubbles: swimming micro-robots remotely powered by ultrasound
The second part is the conception of ultrasonically activated microswimming devices, with microbubbles embedded within freely moving objects. Their application is to act as carriers, such as drug carriers activated at a distance, or as active tracers that enhance mixing. Microswimmers are mechanical analogues of RFID devices (where electromagnetic vibration is converted into current): here, sound is converted into motion at small scales.
Both parts include the same three complementary steps: step 1 is the 3D microfabrication of the geometry where bubbles are embedded, step 2 is their ultrasonic activation, and then step 3 is the optimisation of their resonance by a study of individual resonators.
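As background on the strong acoustic resonance of microbubbles (a classical result, not specific to this proposal), the Minnaert frequency of a gas bubble of rest radius R_0 in a liquid of density ρ at ambient pressure p_0, neglecting surface tension and damping, is approximately

\[ f_M \approx \frac{1}{2\pi R_0}\sqrt{\frac{3\gamma p_0}{\rho}} , \]

which in water at atmospheric pressure gives f_M R_0 ≈ 3 m/s: a bubble of radius 10 µm therefore resonates near 0.3 MHz, well within the ultrasonic range.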
Max ERC Funding
1 856 542 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym ByoPiC
Project The Baryon Picture of the Cosmos
Researcher (PI) nabila AGHANIM
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE9, ERC-2015-AdG
Summary The cosmological paradigm of structure formation is both extremely successful and plagued by many enigmas. Not only is the nature of the main matter component, dark matter, which shapes the structural skeleton in the form of a cosmic web, mysterious; half of the ordinary matter (i.e. baryons) at late times in cosmic history also remains unobserved, or hidden! ByoPiC focuses on this key and currently unresolved issue in astrophysics and cosmology: Where and how are half of the baryons hidden at late times? ByoPiC will answer that central question by detecting, mapping, and assessing the physical properties of hot ionised baryons at large cosmic scales and at late times. This will give a completely new picture of the cosmic web, in addition to its standard tracers, i.e. galaxies made of cold and dense baryons. To this end, ByoPiC will perform the first statistically consistent, joint analysis of complementary multiwavelength data: Planck observations tracing hot, ionised baryons via the Sunyaev-Zeldovich effect, optimally combined with optical and near infrared galaxy surveys as tracers of cold baryons. This joint analysis will rely on innovative statistical tools to recover all the (cross)information contained in these data in order to detect most of the hidden baryons in cosmic web elements such as (super)clusters and filaments. These newly detected elements will then be assembled to reconstruct the cosmic web as traced by both hot ionised baryons and galaxies. Thanks to this, ByoPiC will perform the most complete and detailed assessment of the census and contribution of hot ionised baryons to the total baryon budget, and identify the main physical processes driving their evolution in the cosmic web. Catalogues of new (super)clusters and filaments, and innovative tools, will be key deliverable products, allowing for an optimal preparation of future surveys.
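For reference (a standard definition, not a result of the project), the thermal Sunyaev-Zeldovich signal used here to trace hot ionised baryons is set by the Compton parameter, i.e. the electron pressure integrated along the line of sight:

\[ y = \frac{\sigma_T}{m_e c^{2}} \int k_B T_e \, n_e \,\mathrm{d}l , \]

where σ_T is the Thomson cross-section and n_e, T_e are the electron density and temperature; because y measures integrated pressure rather than emission, it is well suited to detecting diffuse, low-density hot gas.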
Max ERC Funding
2 488 350 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym C0PEP0D
Project Life and death of a virtual copepod in turbulence
Researcher (PI) Christophe ELOY
Host Institution (HI) ECOLE CENTRALE DE MARSEILLE EGIM
Call Details Advanced Grant (AdG), PE8, ERC-2018-ADG
Summary Life is tough for planktonic copepods, constantly washed by turbulent flows. Yet, these millimetric crustaceans dominate the oceans in numbers. What has made them so successful? Copepod antennae are covered with hydrodynamic and chemical sensing hairs that allow copepods to detect prey, predators and mates, although they are blind. How do copepods process this sensing information? How do they extract a meaningful signal from turbulence noise? Today, we do not know.
C0PEP0D hypothesises that reinforcement learning tools can decipher how copepods process hydrodynamic and chemical sensing. Copepods face a problem similar to speech recognition or object detection, two common applications of reinforcement learning. However, copepods only have 1000 neurons, far fewer than most artificial neural networks. To approach the simple brain of copepods, we will use Darwinian evolution together with reinforcement learning, with the goal of finding minimal neural networks able to learn.
If we are to build a learning virtual copepod, challenging problems are ahead: we need fast methods to simulate turbulence and animal-flow interactions, new models of hydrodynamic signalling at finite Reynolds number, innovative reinforcement learning algorithms that embrace evolution and experiments with real copepods in turbulence. With these theoretical, numerical and experimental tools, we will address three questions:
Q1: Mating. How do male copepods follow the pheromone trail left by females?
Q2: Finding. How do copepods use hydrodynamic signals to ‘see’?
Q3: Feeding. What are the best feeding strategies in turbulent flow?
C0PEP0D will decipher how copepods process sensing information, but not only that. Because evolution is explicitly considered, it will offer a new perspective on marine ecology and evolution that could inspire artificial sensors. The evolutionary approach of reinforcement learning also offers a promising tool to tackle complex problems in biology and engineering.
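Purely as an illustration of the "evolution plus reinforcement learning" idea (the task, network size and hyperparameters below are invented placeholders, not the C0PEP0D models), a simple evolution strategy can already shape the weights of a very small neural controller:

```python
# Minimal sketch: a (mu, lambda)-style evolution strategy optimising the weights of a
# tiny feed-forward controller on a stand-in task. Everything here (task, sizes,
# hyperparameters) is an illustrative assumption, not the project's model.
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hidden, n_out = 4, 8, 2                 # a "small brain": a few dozen weights
n_params = n_in*n_hidden + n_hidden*n_out

def policy(params, obs):
    W1 = params[:n_in*n_hidden].reshape(n_in, n_hidden)
    W2 = params[n_in*n_hidden:].reshape(n_hidden, n_out)
    return np.tanh(np.tanh(obs @ W1) @ W2)      # bounded "swimming" commands

def fitness(params):
    # Stand-in task: track a noisy 2-D target (a crude proxy for following a cue).
    score, state = 0.0, np.zeros(2)
    target = np.array([1.0, -0.5])
    for _ in range(50):
        obs = np.concatenate([state, target - state]) + 0.05*rng.standard_normal(4)
        state = state + 0.1*policy(params, obs)
        score -= np.sum((target - state)**2)    # higher is better
    return score

pop_size, n_gen, sigma = 64, 100, 0.1
parent = np.zeros(n_params)
for gen in range(n_gen):
    offspring = parent + sigma*rng.standard_normal((pop_size, n_params))
    scores = np.array([fitness(p) for p in offspring])
    elite = offspring[np.argsort(scores)[-8:]]  # keep the best 8
    parent = elite.mean(axis=0)                 # recombine into the next parent
print("best fitness:", fitness(parent))
```

In the project's setting, the stand-in tracking task would be replaced by behaviour in simulated turbulence, and the evolved controllers would additionally be required to learn, as stated above.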
Max ERC Funding
2 215 794 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym C4T
Project Climate change across Cenozoic cooling steps reconstructed with clumped isotope thermometry
Researcher (PI) Anna Nele Meckler
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Starting Grant (StG), PE10, ERC-2014-STG
Summary The Earth's climate system contains a highly complex interplay of numerous components, such as atmospheric greenhouse gases, ice sheets, and ocean circulation. Due to nonlinearities and feedbacks, changes to the system can result in rapid transitions to radically different climate states. In light of rising greenhouse gas levels there is an urgent need to better understand climate at such tipping points. Reconstructions of profound climate changes in the past provide crucial insight into our climate system and help to predict future changes. However, all proxies we use to reconstruct past climate depend on assumptions that are in addition increasingly uncertain back in time. A new kind of temperature proxy, the carbonate ‘clumped isotope’ thermometer, has great potential to overcome these obstacles. The proxy relies on thermodynamic principles, taking advantage of the temperature-dependence of the binding strength between different isotopes of carbon and oxygen, which makes it independent of other variables. Yet, widespread application of this technique in paleoceanography is currently prevented by the required large sample amounts, which are difficult to obtain from ocean sediments. If applied to the minute carbonate shells preserved in the sediments, this proxy would allow robust reconstructions of past temperatures in the surface and deep ocean, as well as global ice volume, far back in time. Here I propose to considerably decrease sample amount requirements of clumped isotope thermometry, building on recent successful modifications of the method and ideas for further analytical improvements. This will enable my group and me to thoroughly ground-truth the proxy for application in paleoceanography and for the first time apply it to aspects of past climate change across major climate transitions in the past, where clumped isotope thermometry can immediately contribute to solving long-standing first-order questions and allow for major progress in the field.
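For orientation (the generic functional form of published calibrations; the coefficients are left symbolic because they depend on the calibration and reference frame used), carbonate clumped isotope thermometry rests on a relation of the type

\[ \Delta_{47} \approx \frac{A}{T^{2}} + B , \]

where Δ47 quantifies the excess abundance of the doubly substituted 13C–18O isotopologue in CO2 extracted from the carbonate, T is the formation temperature in kelvin, and A, B are empirically calibrated constants; inverting this relation yields temperature without requiring knowledge of the isotopic composition of the water from which the carbonate grew.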
Max ERC Funding
1 877 209 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym Calcyan
Project A living carbonate factory: how do cyanobacteria make rocks? (Calcification in Cyanobacteria)
Researcher (PI) Karim Benzerara
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE10, ERC-2012-StG_20111012
Summary This interdisciplinary proposal stems from our recent discovery of deep-branching cyanobacteria that form intracellular Ca-Mg-Sr-Ba carbonates. Until now, calcification by cyanobacteria had been considered exclusively extracellular, hence dependent on external conditions. The existence of intracellularly calcifying cyanobacteria may thus deeply modify our view of the role of cyanobacteria in the formation of modern and past carbonate deposits and of the degree of control they exert on this geochemically significant process. Moreover, since these cyanobacteria selectively concentrate Sr and Ba over Ca, their existence suggests processes that can alter the message conveyed by proxies such as Sr/Ca ratios in carbonates, classically used for paleoenvironmental reconstruction. Finally, such a biomineralization process, if globally significant, may change our view of how an ecosystem responds to external CO2 changes, most likely by affecting a key parameter such as the balance between organic carbon fixed by photosynthesis and inorganic carbon fixed by CaCO3 precipitation.
Here, I aim to bring about a qualitative leap in the understanding of this process. The core of this project is to provide a detailed picture of intracellular calcification by cyanobacteria. This will be achieved by studying laboratory cultures of cyanobacteria, field samples of modern calcifying biofilms and ancient microbialites. Diverse tools from molecular biology, biochemistry, mineralogy and geochemistry will be used. Altogether, these techniques will help to unveil the molecular and mineralogical mechanisms involved in cyanobacterial intracellular calcification, to assess the phylogenetic diversity of these cyanobacteria, and to evaluate the preservability of their traces in ancient rocks. My goal is to establish a unique expertise in the study of calcification by cyanobacteria, the scope of which can be developed and broadened in the future for the study of interactions between life and minerals.
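As chemical background (a standard definition, not a project result), whether a carbonate phase such as CaCO3 can precipitate is governed by its saturation state with respect to the solution:

\[ \Omega = \frac{a_{\mathrm{Ca^{2+}}}\, a_{\mathrm{CO_3^{2-}}}}{K_{sp}} , \]

with precipitation thermodynamically favourable only for Ω > 1; intracellular carbonate formation therefore implies that the cells maintain local supersaturation in the compartment where the minerals nucleate.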
Max ERC Funding
1 659 478 €
Duration
Start date: 2013-02-01, End date: 2018-01-31
Project acronym CALENDS
Project Clusters And LENsing of Distant Sources
Researcher (PI) Johan Pierre Richard
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Starting Grant (StG), PE9, ERC-2013-StG
Summary Some of the primary questions in extragalactic astronomy concern the formation and evolution of galaxies in the distant Universe. In particular, little is known about the less luminous (and therefore less massive) galaxy populations, which are currently missed from large observing surveys and could contribute significantly to the overall star formation happening at early times. One way to overcome the current observing limitations prior to the arrival of the future James Webb Space Telescope or the European Extremely Large Telescopes is to use the natural magnification of strong lensing clusters to look at distant sources with an improved sensitivity and resolution.
The aim of CALENDS is to build and study in great detail a large sample of accurately-modelled, strongly lensed galaxies at high redshift (1<z<5) selected in the fields of massive clusters, and compare them with the more luminous or lower redshift populations. We will develop novel techniques in this process, in order to improve the accuracy of strong-lensing models and precisely determine the mass content of these clusters. By performing a systematic modelling of the cluster sample, we will look into the relative distribution of baryons and dark matter as well as the amount of substructure in cluster cores. Regarding the population of lensed galaxies, we will study their global properties through a multiwavelength analysis covering the optical to millimeter domains, including spectroscopic information from MUSE and KMOS on the VLT, and ALMA.
We will look for scaling relations between the stellar, gas and dust parameters, and compare them with known relations for lower redshift and more massive galaxy samples. For the most extended sources, we will be able to spatially resolve their inner properties, and compare the results of individual regions with predictions from simulations. We will look into key physical processes: star formation, gas accretion, inflows and outflows, in these distant sources.
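For context (standard gravitational lensing relations, not project results), the cluster acts as a magnifying lens described by the lens equation and its Jacobian:

\[ \vec{\beta} = \vec{\theta} - \vec{\alpha}(\vec{\theta}), \qquad \mu = \frac{1}{\det A}, \qquad A_{ij} = \frac{\partial \beta_i}{\partial \theta_j} , \]

where β is the unlensed source position, θ the observed image position, α the scaled deflection angle and μ the magnification; since lensing conserves surface brightness, a magnification μ increases both the received flux and the apparent solid angle of the source by the same factor, which is what gives access to intrinsically faint, small galaxies.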
Max ERC Funding
1 450 992 €
Duration
Start date: 2013-09-01, End date: 2019-08-31
Project acronym CARB-City
Project Physico-Chemistry of Carbonaceous Aerosol Pollution in Evolving Cities
Researcher (PI) Alma Hodzic
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE10, ERC-2018-COG
Summary Carbonaceous aerosols (organic and black carbon) remain a major unresolved issue in atmospheric science, especially in urban centers, where they are one of the dominant aerosol constituents and among the most toxic to human health. The challenge is twofold: first, our understanding of the sources, sinks and physico-chemical properties of the complex mixture of carbonaceous species is still incomplete; and second, the representation of urban heterogeneities in air quality models is inadequate, as these models are designed for regional applications.
The CARB-City project proposes the development of an innovative modeling framework that will address both issues by combining molecular-level chemical constraints and city-scale modeling to achieve the following objectives: (WP1) to develop and apply new chemical parameterizations, constrained by an explicit chemical model, for carbonaceous aerosol formation from urban precursors, and (WP2) to examine whether urban heterogeneities in sources and mixing can enhance non-linearities in chemistry of carbonaceous compounds and modify their predicted composition. The new modeling framework will then be applied (WP3) to quantify the contribution of traditional and emerging urban aerosol precursor sources to chemistry and toxicity of carbonaceous aerosols; and (WP4) to assess the effectiveness of greener-city strategies in removing aerosol pollutants.
This work will enhance fundamental scientific understanding as to how key physico-chemical processes control the lifecycle of carbonaceous aerosols in cities, and will improve the predictability of air quality models in terms of composition and toxicity of urban aerosols, and their sensitivity to changes in energy and land use that cities are currently experiencing. The modeling framework will have the required chemical and spatial resolution for assessing human exposure to urban aerosols. This will allow policy makers to optimize urban emission reductions and sustainable urban development.
Max ERC Funding
1 727 009 €
Duration
Start date: 2020-01-01, End date: 2024-12-31
Project acronym CARBONFIX
Project Towards a Self-Amplifying Carbon-Fixing Anabolic Cycle
Researcher (PI) Joseph Moran
Host Institution (HI) CENTRE INTERNATIONAL DE RECHERCHE AUX FRONTIERES DE LA CHIMIE FONDATION
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary How can simple molecules self-organize into a growing synthetic reaction network like biochemical metabolism? This proposal takes a novel synthesis-driven approach to the question by mimicking a central self-amplifying CO2-fixing biochemical reaction cycle known as the reductive tricarboxylic acid cycle. The intermediates of this cycle are the synthetic precursors to all major classes of biomolecules and are built from CO2, an anhydride and electrons from simple reducing agents. Based on the nature of the reactions in the cycle and the specific structural features of the intermediates that comprise it, we propose that the entire cycle may be enabled in a single reaction vessel with a surprisingly small number of simple, mutually compatible catalysts from the recent synthetic organic literature. However, since one of the required reactions does not yet have an efficient synthetic equivalent in the literature and since those that do have not yet been carried out sequentially in a single reaction vessel, we will first independently develop the new reaction and sequences before attempting to combine them into the entire cycle. The new reaction and sequences will be useful green synthetic methods in their own right. Most significantly, this endeavour could provide the first experimental evidence of an exciting new alternative model for early biochemical evolution that finally illuminates the origins and necessity of biochemistry’s core reactions.
Max ERC Funding
1 500 000 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym CASTLES
Project Charge And Spin in TopologicaL Edge States
Researcher (PI) ERWANN YANN EMILE BOCQUILLON
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE3, ERC-2017-STG
Summary Topology provides mathematical tools to sort objects according to global properties regardless of local details, and manifests itself in various fields of physics. In solid-state physics, specific topological properties of the band structure, such as a band inversion, can for example robustly enforce the appearance of spin-polarized conducting states at the boundaries of the material, while its bulk remains insulating. The boundary states of these ‘topological insulators’ in fact provide a support system to encode information non-locally in ‘topological quantum bits’ robust to local perturbations. The emerging ‘topological quantum computation’ is as such an envisioned solution to decoherence problems in the realization of quantum computers. Despite immense theoretical and experimental efforts, the rise of these new materials has however been hampered by strong difficulties to observe robust and clear signatures of their predicted properties such as spin-polarization or perfect conductance.
These challenges strongly motivate my proposal to study two-dimensional topological insulators, and in particular explore the unknown dynamics of their topological edge states in normal and superconducting regimes. First it is possible to capture information both on charge and spin dynamics, and more clearly highlight the basic properties of topological edge states. Second, the dynamics reveals the effects of Coulomb interactions, an unexplored aspect that may explain the fragility of topological edge states. Finally, it enables the manipulation and characterization of quantum states on short time scales, relevant to quantum information processing. This project relies on the powerful toolbox offered by radiofrequency and current-correlations techniques and promises to open a new field of dynamical explorations of topological materials.
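For reference (a textbook expectation for two-dimensional topological insulators rather than a statement about the project's own measurements), ballistic helical edge states should produce a quantized two-terminal conductance

\[ G = 2\,\frac{e^{2}}{h} \approx 77.5\ \mu\mathrm{S} , \]

one conductance quantum per edge, insensitive to sample details as long as time-reversal symmetry forbids backscattering within each edge; deviations from this value are among the difficulties that motivate the dynamical measurements proposed here.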
Max ERC Funding
1 499 940 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym CELLO
Project From Cells to Organs on Chips: Development of an Integrative Microfluidic Platform
Researcher (PI) Jean-Louis Viovy
Host Institution (HI) INSTITUT CURIE
Call Details Advanced Grant (AdG), PE3, ERC-2012-ADG_20120216
Summary We shall develop a microfluidic and microsystems toolbox allowing the construction and study of complex cellular assemblies (“tissue or organ mimics on chip”), in a highly controlled and parallelized way. This platform will allow the selection of specific cells from one or several populations, their deterministic positioning and/or connection relative to each other, yielding functional assemblies with a degree of complexity, determinism and physiological realism unavailable to current in vitro systems. We shall in particular develop “semi-3D” architectures, reproducing the local 3D arrangement of tissues, but presenting at mesoscale a planar and periodic arrangement facilitating high resolution stimulation and recording. This will provide biologists and clinicians with new experimental models able to bridge the gap between current in vitro systems, in which cells can be observed in parallel at high resolution, but lack the highly ordered architecture present in living systems, and in vivo models, in which observation and stimulation means are more limited. This development will follow a functional approach, and gather competences and concepts from micro/nano-systems, surface science, hydrodynamics, soft matter and biology. We shall validate it on three specific applications: the sorting and study of circulating tumour cells for understanding metastases; the creation of “miniguts”, artificial intestinal tissue, for applications in developmental biology and carcinogenesis; and the in vitro construction of active and connected neuron arrays, for studying the molecular mechanisms of Alzheimer’s disease and signal processing by neuron networks. This platform will also open new routes for drug testing (replacing animal models and reducing the health and economic risks of clinical tests), developmental biology, stem cell research and regenerative medicine.
Max ERC Funding
2 260 000 €
Duration
Start date: 2013-07-01, End date: 2018-06-30
Project acronym CEMYSS
Project Cosmochemical Exploration of the first two Million Years of the Solar System
Researcher (PI) Marc Chaussidon
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary One of the major outcomes of recent studies on the formation of the Solar System is the recognition of the fundamental importance of processes which took place during the first 10 thousand to 2 or 3 million years of the lifetime of the Sun and its accretion disk. Astrophysical observations in the optical to infrared wavelengths of circumstellar disks around young stars have shown the existence in the inner disk of high-temperature processing of the dust. X-ray observations of T-Tauri stars revealed that they exhibit X-ray flare enhancements by several orders of magnitude. The work we have performed over the last years on the isotopic analysis of either solar wind trapped in lunar soils or of Ca-, Al-rich inclusions and chondrules from primitive chondrites has allowed us to link some of these astrophysical observations around young stars with processes, such as irradiation by energetic particles and UV light, which took place around the T-Tauri Sun. The aim of this project is to make decisive progress in our understanding of the early solar system through the development of in situ high-precision isotopic measurements by ion microprobe in extra-terrestrial matter. The project will be focused on the exploration of the variations in the isotopic composition of O and Mg and in the concentration of short-lived radioactive nuclides, such as 26Al and 10Be, with half-lives shorter than 1.5 million years. A special emphasis will be put on the search for nuclides with very short half-lives such as 32Si (650 years) and 14C (5730 years), nuclides which have not yet been discovered in meteorites. These new data will bring critical information on, for instance, the astrophysical context for the formation of the Sun and the first solids in the accretion disk, or the timing and the processes by which protoplanets were formed and destroyed close to the Sun during the first 2 million years of the lifetime of the Solar System.
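To make explicit how such short-lived nuclides date early solar system events (standard decay relations, not project results), the abundance of a now-extinct nuclide like 26Al is inferred from the excess of its daughter isotope, and relative ages between two objects follow directly from the decay law:

\[ \left(\frac{^{26}\mathrm{Al}}{^{27}\mathrm{Al}}\right)_{\!t} = \left(\frac{^{26}\mathrm{Al}}{^{27}\mathrm{Al}}\right)_{\!0} e^{-\lambda t}, \qquad \Delta t = \frac{1}{\lambda}\,\ln\!\frac{(^{26}\mathrm{Al}/^{27}\mathrm{Al})_{1}}{(^{26}\mathrm{Al}/^{27}\mathrm{Al})_{2}}, \qquad \lambda = \frac{\ln 2}{t_{1/2}} , \]

so that, with a half-life of about 0.7 Myr for 26Al, a factor-of-two difference in inferred initial ratios corresponds to roughly 0.7 Myr of elapsed time; nuclides with much shorter half-lives, such as 32Si or 14C, would date correspondingly faster processes.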
Max ERC Funding
1 270 419 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym CENNS
Project Probing new physics with Coherent Elastic Neutrino-Nucleus Scattering and a tabletop experiment
Researcher (PI) Julien Billard
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2018-STG
Summary Ever since the Higgs boson was discovered at the LHC in 2012, we have had confirmation that the Standard Model (SM) of particle physics has to be extended. In parallel, the long-standing Dark Matter (DM) problem, supported by a wealth of evidence ranging from precision cosmology to local astrophysical observations, suggests that new particles should exist. Unfortunately, neither the LHC nor dedicated DM experiments have so far detected any significant exotic signal pointing toward a particular new-physics extension of the SM.
With this proposal, I want to take a new path in the quest for new physics by providing the first high-precision measurement of neutral-current Coherent Elastic Neutrino-Nucleus Scattering (CENNS). By focusing on sub-100 eV CENNS-induced nuclear recoils, my goal is to reach unprecedented sensitivities to various exotic physics scenarios with major implications from cosmology to particle physics, beyond the reach of existing particle physics experiments. These include, for instance, the existence of sterile neutrinos and of new mediators, which could be related to the DM problem, and the possibility of Non-Standard Interactions, which would have tremendous implications for the global neutrino physics program.
To this end, I propose to build a kg-scale cryogenic tabletop neutrino experiment with outstanding sensitivity to low-energy nuclear recoils, called CryoCube, that will be deployed at an optimal nuclear reactor site. The key feature of the proposed detector technology is the combination of two target materials: semiconducting Ge and superconducting Zn metal. I want to push these two detector techniques beyond state-of-the-art performance to reach sub-100 eV energy thresholds with unparalleled background rejection capabilities.
As the proposed CryoCube detector will reach a 5-sigma CENNS detection significance in a single day, it will be uniquely positioned to probe new-physics extensions beyond the SM.
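As a back-of-the-envelope illustration of how a one-day, 5-sigma detection could follow from a high event rate, the Python sketch below evaluates a naive counting significance s / sqrt(s + b); the signal and background rates are hypothetical placeholders chosen for illustration only and are not taken from the CryoCube proposal.

import math

def counting_significance(signal: float, background: float) -> float:
    """Naive counting-experiment significance, s / sqrt(s + b)."""
    return signal / math.sqrt(signal + background)

# Hypothetical rates (events per kg per day) -- illustrative placeholders,
# not values from the CENNS/CryoCube proposal.
SIGNAL_RATE = 50.0      # assumed CENNS rate close to a reactor core
BACKGROUND_RATE = 10.0  # assumed residual background after rejection cuts

DETECTOR_MASS_KG = 1.0  # "kg-scale" detector, as described in the summary
EXPOSURE_DAYS = 1.0

s = SIGNAL_RATE * DETECTOR_MASS_KG * EXPOSURE_DAYS
b = BACKGROUND_RATE * DETECTOR_MASS_KG * EXPOSURE_DAYS
print(f"Significance after one day: {counting_significance(s, b):.1f} sigma")

Under these assumed rates the estimate already exceeds 5 sigma after a single day of exposure, which conveys how a sufficiently low energy threshold and background level translate directly into detection significance.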
Max ERC Funding
1 495 000 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym CENTROSTEMCANCER
Project Investigating the link between centrosomes, stem cells and cancer
Researcher (PI) Renata Homem De Gouveia Xavier De Basto
Host Institution (HI) INSTITUT CURIE
Call Details Starting Grant (StG), LS3, ERC-2009-StG
Summary Centrosomes are cytoplasmic organelles found in most animal cells, with important roles in the establishment and maintenance of polarity. Theodor Boveri's pioneering work first suggested that extra centrosomes could contribute to genetic instability and consequently to tumourigenesis. Although many human tumours do exhibit centrosome amplification (extra centrosomes) or centrosome abnormalities, the exact contribution of centrosomes to tumour initiation in vertebrate organisms remains to be determined. I have recently shown that Drosophila flies carrying extra centrosomes, following over-expression of the centriole replication kinase Sak, did not exhibit chromosome segregation errors and were able to maintain a stable diploid genome over many generations. Surprisingly, however, their neural stem cells frequently fail to align the mitotic spindle with their polarity axis during asymmetric division. Moreover, I have found that centrosome amplification is permissive to tumour formation in flies. So far, however, we do not know the molecular mechanisms that allow transformation when extra centrosomes are present, and elucidating these mechanisms is the aim of the work presented in this proposal. Here, I describe a series of complementary approaches that will help us to decipher the link between centrosomes, stem cells and tumour biology. In addition, I wish to pursue the original observations made in Drosophila and investigate the consequences of centrosome amplification in mammals.
Max ERC Funding
1 550 000 €
Duration
Start date: 2010-01-01, End date: 2015-06-30