Project acronym 3DICE
Project 3D Interstellar Chemo-physical Evolution
Researcher (PI) Valentine Wakelam
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2013-StG
Summary
At the end of their lives, stars spread their inner material into the diffuse interstellar medium. This diffuse medium becomes locally denser and forms dark clouds (also called dense or molecular clouds), whose innermost parts are shielded from the external UV field by dust, allowing molecules to grow and become more complex. Gravitational collapse occurs inside these dense clouds, forming protostars and their surrounding disks, and eventually planetary systems like (or unlike) our solar system. The formation and evolution of molecules, minerals, ices and organics from the diffuse medium to planetary bodies, and their alteration or preservation throughout this cosmic chemical history, set the initial conditions for building planets, atmospheres and possibly the first bricks of life. The current view of interstellar chemistry is based on fragmentary studies of the key observed steps of this sequence. The objective of this proposal is to follow the fractionation of the elements between the gas phase and interstellar grains, from the most diffuse medium to protoplanetary disks, in order to constrain the chemical composition of the material in which planets are formed. The expected outcome of this project is a consistent and more accurate description of the chemical evolution of interstellar matter. To achieve this objective, I will improve our chemical model by adding new grain-surface processes relevant under diffuse-medium conditions. This upgraded gas-grain model will be coupled to 3D dynamical models of the formation of dense clouds from the diffuse medium and of protoplanetary disks from dense clouds. The computed chemical composition will also be used with 3D radiative transfer codes to study the chemical tracers of the physics of protoplanetary disk formation. The robustness of the model predictions will be assessed with sensitivity analyses. Finally, model results will be confronted with observations to address some of the current challenges.
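At the core of such gas-grain models is a system of coupled rate equations evolving the abundance of every species in the gas phase and on grain surfaces. As a generic illustration (the processes and coefficients shown are standard textbook terms, not the project's actual reaction network), the gas-phase abundance n_i of species i evolves as

\frac{dn_i}{dt} = \sum_{j,k} k_{jk}\, n_j n_k \;-\; n_i \sum_{j} k_{ij}\, n_j \;-\; k_{\mathrm{acc},i}\, n_i \;+\; k_{\mathrm{des},i}\, n_i^{\mathrm{s}},

where the first two terms are chemical formation and destruction, and the last two are accretion onto grains and desorption of the surface counterpart n_i^s. Adding new grain-surface processes amounts to extending this system with further terms and rate coefficients.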
Max ERC Funding
1 166 231 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym A-BINGOS
Project Accreting binary populations in Nearby Galaxies: Observations and Simulations
Researcher (PI) Andreas Zezas
Host Institution (HI) IDRYMA TECHNOLOGIAS KAI EREVNAS
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary "High-energy observations of our Galaxy offer a good, albeit not complete, picture of the X-ray source populations, in particular the accreting binary sources. Recent ability to study accreting binaries in nearby galaxies has shown that we would be short-sighted if we restricted ourselves to our Galaxy or to a few nearby ones. I propose an ambitious project that involves a comprehensive study of all the galaxies within 10 Mpc for which we can study in detail their X-ray sources and stellar populations. The study will combine data from a unique suite of observatories (Chandra, XMM-Newton, HST, Spitzer) with state-of-the-art theoretical modelling of binary systems. I propose a novel approach that links the accreting binary populations to their parent stellar populations and surpasses any current studies of X-ray binary populations, both in scale and in scope, by: (a) combining methods and results from several different areas of astrophysics (compact objects, binary systems, stellar populations, galaxy evolution); (b) using data from almost the whole electromagnetic spectrum (infrared to X-ray bands); (c) identifying and studying the different sub-populations of accreting binaries; and (d) performing direct comparison between observations and theoretical predictions, over a broad parameter space. The project: (a) will answer the long-standing question of the formation efficiency of accreting binaries in different environments; and (b) will constrain their evolutionary paths. As by-products the project will provide eagerly awaited input to the fields of gravitational-wave sources, γ-ray bursts, and X-ray emitting galaxies at cosmological distances and it will produce a heritage multi-wavelength dataset and library of models for future studies of galaxies and accreting binaries."
Max ERC Funding
1 242 000 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym A2C2
Project Atmospheric flow Analogues and Climate Change
Researcher (PI) Pascal Yiou
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary "The A2C2 project treats two major challenges in climate and atmospheric research: the time dependence of the climate attractor to external forcings (solar, volcanic eruptions and anthropogenic), and the attribution of extreme climate events occurring in the northern extra-tropics. The main difficulties are the limited climate information, the computer cost of model simulations, and mathematical assumptions that are hardly verified and often overlooked in the literature.
A2C2 proposes a practical framework to overcome those three difficulties, linking the theory of dynamical systems and statistics. We will generalize the methodology of flow analogues to multiple databases in order to obtain probabilistic descriptions of analogue decompositions.
The project is divided into three work packages (WP). WP1 embeds the analogue method in the theory of dynamical systems in order to provide a metric of attractor deformation in time. The key methodological step is to detect trends or persistent outliers in the dates and scores of analogues when the system is subject to time-varying forcings. This will be done with idealized models and full-scale climate models in which the forcings (anthropogenic and natural) are known.
A2C2 will create an open-source toolkit to compute flow analogues from a wide array of databases (WP2). WP3 treats the two scientific challenges with the analogue method and multiple model ensembles, allowing uncertainty estimates under realistic mathematical hypotheses. The flow-analogue methodology allows a systematic and quasi-real-time analysis of extreme events, which is currently out of reach of conventional climate modeling approaches.
The major breakthrough of A2C2 is to bridge the gap between operational needs (the immediate analysis of climate events) and the understanding of long-term climate change. A2C2 opens new research horizons for the exploitation of ensembles of simulations and reliable estimates of uncertainty.
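To make the analogue methodology concrete, here is a minimal sketch of the core computation that the WP2 toolkit would automate. The choice of field (e.g. daily sea-level pressure), the Euclidean metric and the number of analogues k are illustrative assumptions, not the project's final design.

import numpy as np

def flow_analogues(target, library, dates, k=20):
    """Return the k days of `library` whose circulation pattern is closest
    to `target`, using a Euclidean distance over the gridded field."""
    dist = np.sqrt(((library - target) ** 2).sum(axis=(1, 2)))
    best = np.argsort(dist)[:k]            # indices of the k best analogues
    return dates[best], dist[best]

# usage: analogues of one day within ~30 years of daily fields (placeholders)
rng = np.random.default_rng(0)
library = rng.standard_normal((10957, 37, 73))   # (day, lat, lon)
dates = np.arange(10957)
days, scores = flow_analogues(library[123], library, dates, k=5)
# the target day itself returns with distance 0; in practice it and its
# immediate temporal neighbours would be excluded from the search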
Max ERC Funding
1 491 457 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ACCLIMATE
Project Elucidating the Causes and Effects of Atlantic Circulation Changes through Model-Data Integration
Researcher (PI) Claire Waelbroeck
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary Rapid changes in ocean circulation and climate have been observed in marine sediment and ice cores, notably over the last 60 thousand years (ky), highlighting the non-linear character of the climate system and underlining the possibility of rapid climate shifts in response to anthropogenic greenhouse gas forcing.
To date, these rapid changes in climate and ocean circulation are still not fully explained. Two main obstacles prevent going beyond the current state of knowledge:
- Paleoclimatic proxy data are in essence only indirect indicators of the climatic variables, and thus cannot be directly compared with model outputs;
- A 4-D (latitude, longitude, water depth, time) reconstruction of Atlantic water masses over the past 40 ky is lacking: previous studies have generated isolated records with disparate timescales which do not allow the causes of circulation changes to be identified.
Overcoming these two major limitations will lead to major breakthroughs in climate research. Concretely, I will create the first database of Atlantic deep-sea records covering the last 40 ky, and extract the full climatic information from these records through an innovative model-data integration scheme using an isotopic proxy forward-modeling approach. The novelty and exceptional potential of this scheme is twofold: (i) it avoids hypotheses on proxy interpretation and hence eliminates or strongly reduces errors in the interpretation of paleoclimatic records; (ii) it produces states of the climate system that best explain the observations over the last 40 ky, while being consistent with the model physics.
Expected results include:
• The elucidation of the mechanisms explaining rapid changes in ocean circulation and climate over the last 40 ky,
• Improved climate model physics and parameterizations,
• The first projections of future climate changes obtained with a model able to reproduce the highly non-linear behavior of the climate system observed over the last 40 ky.
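The forward-modeling idea can be stated compactly: rather than inverting a proxy record into a climate variable, one simulates the proxy itself from the model state and measures the misfit directly in proxy space. Below is a minimal sketch assuming a linearised palaeotemperature relation with illustrative coefficients; it is not the project's actual proxy system model.

import numpy as np

def forward_d18o_calcite(temp_c, d18o_seawater):
    """Forward-model the d18O of foraminiferal calcite (per mil) from
    simulated temperature (deg C) and seawater d18O, via a linearised
    palaeotemperature equation (coefficients are illustrative)."""
    return d18o_seawater + (16.9 - temp_c) / 4.38

# model state at one core site over the last 40 ky (placeholder values)
temp_model = np.array([4.0, 3.2, 2.1, 2.8])    # bottom-water temperature
d18o_sw    = np.array([0.2, 0.6, 1.1, 0.8])    # seawater d18O (ice volume)
pseudo_proxy = forward_d18o_calcite(temp_model, d18o_sw)

# model-data misfit computed in proxy space, with no inversion of the record
d18o_core = np.array([3.1, 3.8, 4.6, 4.0])     # measured in the sediment core
misfit = np.sum((pseudo_proxy - d18o_core) ** 2)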
Max ERC Funding
3 000 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym Actanthrope
Project Computational Foundations of Anthropomorphic Action
Researcher (PI) Jean Paul Laumond
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE7, ERC-2013-ADG
Summary Actanthrope intends to promote a neuro-robotics perspective to explore original models of anthropomorphic action. The project targets contributions to humanoid robot autonomy (for rescue and service robotics), to advanced human body simulation (for applications in ergonomics), and to a new theory of embodied intelligence (by promoting a motion-based semiotics of the human action).
Actions take place in physical space while they originate in the (robot or human) sensory-motor space. Geometry is the core abstraction that links these spaces. Considering that the structure of actions inherits from that of the body, the underlying intuition is that actions can be segmented within discrete sub-spaces lying in the entire continuous posture space. Such sub-spaces are viewed as symbols bridging deliberative reasoning and reactive control. Actanthrope argues that geometric approaches to motion segmentation and generation are promising and innovative routes to explore embodied intelligence:
- Motion segmentation: what are the sub-manifolds that define the structure of a given action?
- Motion generation: among all the solution paths within a given sub-manifold, what is the underlying law that makes the selection?
In Robotics these questions are related to the competition between abstract symbol manipulation and physical signal processing. In Computational Neuroscience the questions refer to the quest of motion invariants. The ambition of the project is to promote a dual perspective: exploring the computational foundations of human action to make better robots, while simultaneously doing better robotics to better understand human action.
A unique “Anthropomorphic Action Factory” supports the methodology. It aims to attract to a single lab researchers with complementary know-how and solid mathematical backgrounds. All of them will benefit from unique equipment, while being stimulated by four challenges dealing with locomotion and manipulation actions.
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym ACTAR TPC
Project Active Target and Time Projection Chamber
Researcher (PI) Gwen Grinyer
Host Institution (HI) GRAND ACCELERATEUR NATIONAL D'IONS LOURDS
Call Details Starting Grant (StG), PE2, ERC-2013-StG
Summary The active target and time projection chamber (ACTAR TPC) is a novel gas-filled detection system that will permit new studies into the structure and decays of the most exotic nuclei. The use of a gas volume that acts as a sensitive detection medium and as the reaction target itself (an “active target”) offers considerable advantages over traditional nuclear physics detectors and techniques. In high-energy physics, TPC detectors have found profitable applications but their use in nuclear physics has been limited. With the ACTAR TPC design, individual detection pad sizes of 2 mm are the smallest ever attempted in either discipline but is a requirement for high-efficiency and high-resolution nuclear spectroscopy. The corresponding large number of electronic channels (16000 from a surface of only 25×25 cm) requires new developments in high-density electronics and data-acquisition systems that are not yet available in the nuclear physics domain. New experiments in regions of the nuclear chart that cannot be presently contemplated will become feasible with ACTAR TPC.
Max ERC Funding
1 290 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym AdOC
Project Advance Optical Clocks
Researcher (PI) Sebastien André Marcel Bize
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE2, ERC-2013-CoG
Summary "The proposed research program has three main objectives. The first and second objectives are to seek extreme precisions in optical atomic spectroscopy and optical clocks, and to use this quest as a mean of exploration in atomic physics. The third objective is to explore new possibilities that stem from extreme precision. These goals will be pursued via three complementary activities: #1: Search for extreme precisions with an Hg optical lattice clock. #2: Explore and exploit the rich Hg system, which is essentially unexplored in the cold and ultra-cold regime. #3: Identify new applications of clocks with extreme precision to Earth science. Clocks can measure directly the gravitational potential via Einstein’s gravitational redshift, leading to the idea of “clock-based geodesy”.
The first two activities are experimental and build on an existing setup with which we demonstrated the feasibility of an Hg optical lattice clock. Hg is chosen for its potential to surpass competing systems. We will investigate the unexplored physics of the Hg clock. This includes interactions between Hg atoms, lattice-induced light shifts, and the sensitivity to external fields that is specific to this atomic species. Beyond this, we will explore the fundamental limits of the optical lattice scheme. We will exploit other remarkable features of Hg associated with its high atomic number and the diversity of its stable isotopes. These features enable tests of fundamental physical laws, ultra-precise measurements of isotope shifts, measurements of collisional properties toward evaporative cooling and quantum gases of Hg, and the investigation of forbidden transitions promising for measuring the nuclear anapole moment of Hg.
The third activity is theoretical and aims at initiating collaborations with experts in modelling Earth's gravity. With this expertise, we will identify the most promising and realistic approaches for clocks and emerging remote comparison methods to contribute to geodesy, hydrology, oceanography, etc.
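The link between clocks and geodesy rests on the gravitational redshift. Near the Earth's surface,

\frac{\Delta\nu}{\nu} = \frac{\Delta U}{c^2} \approx \frac{g\,\Delta h}{c^2} \approx 1.1\times 10^{-16}\ \text{per metre of height},

so comparing two clocks at the 10^{-18} fractional-frequency level resolves a height (i.e. geopotential) difference of about 1 cm.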
Max ERC Funding
1 946 432 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym AUGURY
Project Reconstructing Earth’s mantle convection
Researcher (PI) Nicolas Coltice
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Consolidator Grant (CoG), PE10, ERC-2013-CoG
Summary Knowledge of the state of the Earth mantle and its temporal evolution is fundamental to a variety of disciplines in Earth Sciences, from the internal dynamics to its many expressions in the geological record (postglacial rebound, sea level change, ore deposit, tectonics or geomagnetic reversals). Mantle convection theory is the centerpiece to unravel the present and past state of the mantle. For the past 40 years considerable efforts have been made to improve the quality of numerical models of mantle convection. However, they are still sparsely used to estimate the convective history of the solid Earth, in comparison to ocean or atmospheric models for weather and climate prediction. The main shortcoming is their inability to successfully produce Earth-like seafloor spreading and continental drift self-consistently. Recent convection models have begun to successfully predict these processes (Coltice et al., Science 336, 335-33, 2012). Such breakthrough opens the opportunity to combine high-level data assimilation methodologies and convection models together with advanced tectonic datasets to retrieve Earth's mantle history. The scope of this project is to produce a new generation of tectonic and convection reconstructions, which are key to improve our understanding and knowledge of the evolution of the solid Earth. The development of sustainable high performance numerical models will set new standards for geodynamic data assimilation. The outcome of the AUGURY project will be a new generation of models crucial to a wide variety of disciplines.
Max ERC Funding
1 994 000 €
Duration
Start date: 2014-03-01, End date: 2020-02-29
Project acronym BLACK
Project The formation and evolution of massive black holes
Researcher (PI) Marta Volonteri
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary "Massive black holes (MBHs) weighing million solar masses and above inhabit the centers of today's galaxies, weighing about a thousandth of the host bulge mass. MBHs also powered quasars known to exist just a few hundred million years after the Big Bang. Owing to observational breakthroughs and remarkable advancements in theoretical models, we do now that MBHs are out there and evolved with their hosts, but we do not know how they got there nor how, and when, the connection between MBHs and hosts was established.
To have a full view of MBH formation and growth we have to look at the global process where galaxies form, as determined by the large-scale structure, on Mpc scales. On the other hand, the region where MBHs dominate the dynamics of gas and stars, and accretion occurs, is merely pc-scale. To study the formation of MBHs and their fuelling we must bridge from Mpc to pc scale in order to follow how galaxies influence MBHs and how in turn MBHs influence galaxies.
BLACK aims to connect the cosmic context to the nuclear region where MBHs reside, and to study MBH formation, feeding and feedback on their hosts through a multi-scale approach following the thread of MBHs from cosmological, to galactic, to nuclear scales. Analytical work guides and tests numerical simulations, allowing us to probe a wide dynamical range.
Our theoretical work will be crucial for planning and interpreting current and future observations. Today and in the near future, facilities at wavelengths spanning from radio to X-ray will widen and deepen our view of the Universe, making this an ideal time for this line of research.
Max ERC Funding
1 668 385 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym BrainMicroFlow
Project Brain Microcirculation : Numerical simulation for inter-species translation with applications in human health
Researcher (PI) Sylvie, Jeanine Lejeune Ép Lorthois
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2013-CoG
Summary The cerebral microvascular system is essential to a large variety of physiological processes in the brain, including blood delivery and blood flow regulation as a function of neuronal activity (neuro-vascular coupling). It plays a major role in the associated mechanisms leading to disease (stroke, neurodegenerative diseases, …). In the last decade, cutting edge technologies, including two-photon scanning laser microscopy (TPSLM) and optical manipulation of blood flow, have produced huge amounts of anatomic and functional experimental data in normal and Alzheimer Disease (AD) mice. These require accurate, highly quantitative, physiologically informed modeling and analysis for any coherent understanding and for translating results between species.
In this context, our first aim is to develop a general methodological framework for physiologically informed microvascular fluid dynamics modeling, understood in a broad sense, i.e. blood flow, molecule transport and the resulting functional imaging signals or signal surrogates.
Our second aim is to validate this methodological framework by direct comparison of in vivo anatomical and functional TPSLM measurements with the simulation results based on the same anatomical data.
The third objective is to exploit these methodologies in order to identify the logic of the structure/function relationships of brain microcirculation and neurovascular coupling, in human health and disease, with a focus on the role of vascular factors in AD.
Specific hypotheses on how vascular changes in AD affect both vascular function and neurovascular coupling can be experimentally tested in animal models of AD. Crucially, similar anatomical (but not functional) data can be acquired in healthy and AD humans. This will enable us to model how AD-induced vascular alterations could affect human patients. Ultimately, it provides us with new avenues for design and/or evaluation of improved diagnosis/preventive/treatment strategies.
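As an indication of what the blood-flow part of such a framework computes, here is a minimal network sketch: Poiseuille conductances on vessel segments, conservation of flow at nodes, and a linear solve for nodal pressures. All geometry, viscosity and boundary values are placeholders, and real microvascular models add effects (e.g. haematocrit-dependent rheology) that this sketch omits.

import numpy as np

# toy network: 4 nodes, 4 vessel segments as (node_i, node_j, radius_m, length_m)
segments = [(0, 1, 4e-6, 100e-6), (1, 2, 3e-6, 80e-6),
            (1, 3, 3e-6, 120e-6), (2, 3, 2e-6, 60e-6)]
mu, n = 3e-3, 4              # blood viscosity (Pa s, placeholder), node count

# Poiseuille conductance of each segment: G = pi r^4 / (8 mu L)
G = np.zeros((n, n))
for i, j, r, L in segments:
    g = np.pi * r**4 / (8 * mu * L)
    G[i, j] += g
    G[j, i] += g

# flow conservation at the nodes (graph Laplacian), with pressures imposed
# at the inlet (node 0) and the outlet (node 3)
A = np.diag(G.sum(axis=1)) - G
b = np.zeros(n)
for node, p_fixed in [(0, 8000.0), (3, 2700.0)]:   # Pa (~60 and ~20 mmHg)
    A[node, :] = 0.0
    A[node, node] = 1.0
    b[node] = p_fixed
p = np.linalg.solve(A, b)

# volumetric flow in each segment (m^3/s) from the pressure drop
flows = [(i, j, G[i, j] * (p[i] - p[j])) for i, j, r, L in segments]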
Max ERC Funding
1 999 873 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym BREAD
Project Breaking the curse of dimensionality: numerical challenges in high dimensional analysis and simulation
Researcher (PI) Albert Cohen
Host Institution (HI) UNIVERSITE PIERRE ET MARIE CURIE - PARIS 6
Call Details Advanced Grant (AdG), PE1, ERC-2013-ADG
Summary "This project is concerned with problems that involve a very large number of variables, and whose efficient numerical treatment is challenged by the so-called curse of dimensionality, meaning that computational complexity increases exponentially in the variable dimension.
The PI intends to establish at his host institution a scientific leadership on the mathematical understanding and numerical treatment of these problems, and to contribute to the development of this area of research through international collaborations, the organization of workshops and research schools, and the training of postdocs and PhD students.
High-dimensional problems are ubiquitous in an increasing number of areas of scientific computing, among which statistical and active learning theory, parametric and stochastic partial differential equations, and parameter optimization in numerical codes. There is high demand from industry for efficient numerical methods to treat such problems.
The practical success of various numerical algorithms that have been developed in recent years in these application areas is often limited to moderate-dimensional settings.
In addition, these developments tend to be, as a rule, rather problem-specific and not always founded on a solid mathematical analysis.
The central scientific objectives of this project are therefore: (i) to identify fundamental mathematical principles behind overcoming the curse of dimensionality, (ii) to understand how these principles enter in relevant instances of the above applications, and (iii) to develop, based on these principles and beyond particular problem classes, broadly applicable numerical strategies that benefit from such mechanisms.
The performance of these strategies should be provably independent of the variable dimension, and in that sense break the curse of dimensionality. They will be tested on both synthetic benchmarks and real-world problems coming from the aforementioned applications.
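A compact illustration of the curse, and of one classical way around it (a sketch only; the project targets far more general mechanisms than Monte Carlo integration):

import numpy as np

# a tensor-product grid with n points per axis needs n**d nodes
n = 10
for d in (1, 2, 5, 10, 20):
    print(f"d = {d:2d}: {float(n**d):.1e} grid points")
# already 1e+20 nodes at d = 20: brute-force resolution is hopeless

# plain Monte Carlo, by contrast, converges like sigma / sqrt(N) regardless
# of d: it exploits structure (here, integration) instead of resolving the cube
rng = np.random.default_rng(0)
d, N = 20, 100_000
x = rng.random((N, d))                       # N samples in [0, 1]^20
estimate = np.exp(-x.mean(axis=1)).mean()    # approximates E[exp(-mean(x))]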
Max ERC Funding
1 848 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym BubbleBoost
Project Microfluidic bubbles for novel applications: acoustic laser and ultrasonically controlled swimming microrobots
Researcher (PI) Philippe, Guy, Marie Marmottant
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2013-CoG
Summary Microfluidic techniques developed since the year 2000 have now matured to provide a unique tool to produce large amounts of microbubbles that are not only finely tuned in size, but that can also be embedded in tiny microfabricated structures.
In the present proposal, we plan to take advantage of these novel microfabrication techniques to develop two innovative acoustic applications. These applications, which were out of reach without current techniques, are based on the use of microbubbles with a huge acoustic resonance. The project is structured in two parts that only differ in the way bubbles are embedded in microfluidic environments:
1) Arrays of bubbles: Acoustic Laser
This first part is the development of an acoustic laser, based on microbubbles trapped in a microfluidic circuit. To obtain the conditions for an acoustic laser, arrays of microbubbles will be designed so that the bubbles pulsate in phase, re-emitting their energy coherently. The applications are novel systems for high ultrasonic emission power, or meta-materials that store vibration energy.
2) Mobile “armoured” bubbles: swimming micro-robots remotely powered by ultrasound
The second part is the design of ultrasonically activated microswimming devices, with microbubbles embedded within freely moving objects. These would serve as carriers, such as drug carriers, activated at a distance, or as active tracers that enhance mixing. Microswimmers are mechanical analogues of RFID devices (where electromagnetic vibration is converted into current); here, sound is converted into motion at small scales.
Both parts include the same three complementary steps: step 1 is the 3D microfabrication of the geometry where bubbles are embedded, step 2 is their ultrasonic activation, and then step 3 is the optimisation of their resonance by a study of individual resonators.
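The strong acoustic resonance that both parts rely on is captured by the classical Minnaert frequency of a gas bubble (standard background; surface tension and shell effects, neglected here, matter for the smallest bubbles):

f_0 = \frac{1}{2\pi R_0}\sqrt{\frac{3\gamma p_0}{\rho}},

which for an air bubble in water (\gamma \approx 1.4, p_0 = 10^5\ \mathrm{Pa}, \rho = 10^3\ \mathrm{kg\,m^{-3}}) gives f_0 R_0 \approx 3.3\ \mathrm{m\,s^{-1}}: a bubble of radius 10 µm resonates near 330 kHz.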
Max ERC Funding
1 856 542 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym CALENDS
Project Clusters And LENsing of Distant Sources
Researcher (PI) Johan Pierre Richard
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Starting Grant (StG), PE9, ERC-2013-StG
Summary Some of the primary questions in extragalactic astronomy concern the formation and evolution of galaxies in the distant Universe. In particular, little is known about the less luminous (and therefore less massive) galaxy populations, which are currently missed from large observing surveys and could contribute significantly to the overall star formation happening at early times. One way to overcome the current observing limitations prior to the arrival of the future James Webb Space Telescope or the European Extremely Large Telescopes is to use the natural magnification of strong lensing clusters to look at distant sources with an improved sensitivity and resolution.
The aim of CALENDS is to build and study in great detail a large sample of accurately modelled, strongly lensed galaxies at high redshift (1<z<5) selected in the fields of massive clusters, and to compare them with the more luminous or lower-redshift populations. We will develop novel techniques in this process, in order to improve the accuracy of strong-lensing models and precisely determine the mass content of these clusters. By performing a systematic modelling of the cluster sample we will look into the relative distribution of baryons and dark matter as well as the amount of substructure in cluster cores. Regarding the population of lensed galaxies, we will study their global properties through a multiwavelength analysis covering the optical to millimeter domains, including spectroscopic information from MUSE and KMOS on the VLT, and from ALMA.
We will look for scaling relations between the stellar, gas and dust parameters, and compare them with known relations for lower redshift and more massive galaxy samples. For the most extended sources, we will be able to spatially resolve their inner properties, and compare the results of individual regions with predictions from simulations. We will look into key physical processes: star formation, gas accretion, inflows and outflows, in these distant sources.
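The gain from lensing can be quantified simply (standard lensing background, stated here for orientation): a magnification \mu multiplies the received flux, so the effective depth of an observation improves by

\Delta m = 2.5\,\log_{10}\mu \ \text{magnitudes}

(2.5 mag for \mu = 10), while the image is stretched on the sky, typically strongly along one direction for tangential arcs, improving the effective spatial resolution on the source by a comparable factor.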
Max ERC Funding
1 450 992 €
Duration
Start date: 2013-09-01, End date: 2019-08-31
Project acronym CAUSALPATH
Project Next Generation Causal Analysis: Inspired by the Induction of Biological Pathways from Cytometry Data
Researcher (PI) Ioannis Tsamardinos
Host Institution (HI) PANEPISTIMIO KRITIS
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary Discovering the causal mechanisms of a complex system of interacting components is necessary in order to control it. Computational Causal Discovery (CD) is a field that offers the potential to discover causal relations under certain conditions from observational data alone or with a limited number of interventions/manipulations.
An important, challenging biological problem that may take decades of experimental work is the induction of biological cellular pathways; pathways are informal causal models indispensable in biological research and drug design. Recent exciting advances in flow/mass cytometry biotechnology allow the generation of large-sample datasets containing measurements on single cells, thus making the problem of pathway learning amenable to CD methods.
CAUSALPATH builds upon and further advances recent breakthrough developments in CD methods to enable the induction of biological pathways from cytometry and other omics data. As a testbed problem we focus on the differentiation of human T-cells; these are involved in autoimmune and inflammatory diseases, as well as cancer, and are thus targets of new drug development for a range of chronic diseases. The biological problem acts as our proving ground for developing general novel formalisms, practical algorithms, and useful tools, pointing to fundamental CD problems: the presence of feedback cycles, the presence of latent confounding variables, CD from time-course data, Integrative Causal Analysis (INCA) of heterogeneous datasets, and others.
Three features complement CAUSALPATH’s approach: (A) methods development will co-evolve with biological wet-lab experiments periodically testing the algorithmic postulates, (B) open-source tools will be developed for the non-expert, and (C) commercial exploitation of the results will be sought.
CAUSALPATH brings together an interdisciplinary team committed to this vision. It builds upon the PI's group's recent important results on INCA algorithms.
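As an indication of the constraint-based machinery this family of CD methods builds on, here is a minimal PC-style skeleton search: an edge is removed whenever some small conditioning set renders two variables independent (here tested via Gaussian partial correlation). This is a textbook sketch, not the project's algorithms, which must additionally handle feedback cycles, latent confounders and heterogeneous datasets.

import numpy as np
from itertools import combinations
from scipy import stats

def indep(data, i, j, S, alpha=0.01):
    """Fisher-z test of (Gaussian) conditional independence X_i _||_ X_j | X_S."""
    idx = [i, j] + list(S)
    P = np.linalg.inv(np.corrcoef(data[:, idx], rowvar=False))  # precision matrix
    r = -P[0, 1] / np.sqrt(P[0, 0] * P[1, 1])                   # partial correlation
    z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(len(data) - len(S) - 3)
    return 2 * (1 - stats.norm.cdf(abs(z))) > alpha             # True = independent

def pc_skeleton(data, alpha=0.01, max_cond=2):
    """Start fully connected; delete edge (i, j) as soon as some conditioning
    set of size <= max_cond makes i and j conditionally independent."""
    d = data.shape[1]
    adj = {(i, j) for i in range(d) for j in range(i + 1, d)}
    for size in range(max_cond + 1):
        for (i, j) in sorted(adj):
            others = [k for k in range(d) if k not in (i, j)]
            if any(indep(data, i, j, S, alpha) for S in combinations(others, size)):
                adj.discard((i, j))
    return adj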
Max ERC Funding
1 724 000 €
Duration
Start date: 2015-01-01, End date: 2019-12-31
Project acronym CIRQUSS
Project Circuit Quantum Electrodynamics with Single Electronic and Nuclear Spins
Researcher (PI) Patrice Emmanuel Bertet
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Consolidator Grant (CoG), PE3, ERC-2013-CoG
Summary "Electronic spins are usually detected by their interaction with electromagnetic fields at microwave frequencies. Since this interaction is very weak, only large ensembles of spins can be detected. In circuit quantum electrodynamics (cQED) on the other hand, artificial superconducting atoms are made to interact strongly with microwave fields at the single photon level, and quantum-limited detection of few-photon microwave signals has been developed.
The goal of this project is to apply the concepts and techniques of cQED to the detection and manipulation of electronic and nuclear spins, in order to reach a novel regime in which a single electronic spin strongly interacts with single microwave photons. This will lead to
1) A considerable enhancement of the sensitivity of spin detection by microwave methods. We plan to detect resonantly single electronic spins in a few milliseconds. This could enable A) to perform electron spin resonance spectroscopy on few-molecule samples B) to measure the magnetization of various nano-objects at millikelvin temperatures, using the spin as a magnetic sensor with nanoscale resolution.
2) Applications in quantum information science. Strong interaction with microwave fields at the quantum level will enable the generation of entangled states of distant individual electronic and nuclear spins, using superconducting qubits, resonators and microwave photons, as “quantum data buses” mediating the entanglement. Since spins can have coherence times in the seconds range, this could pave the way towards a scalable implementation of quantum information processing protocols.
These ideas will be primarily implemented with NV centers in diamond, which are electronic spins with properties suitable for the project."
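For concreteness, the sketch below (textbook Jaynes-Cummings physics in NumPy, not the project’s actual device; the frequency and coupling values are placeholders) diagonalizes a resonant spin-resonator Hamiltonian and checks that the single-excitation levels are split by twice the coupling g, the vacuum Rabi splitting that signals strong coupling once 2g exceeds the spin and cavity linewidths:

import numpy as np

N = 10                      # photon-number truncation of the resonator
g = 2 * np.pi * 1e4         # spin-photon coupling (rad/s), placeholder value
wc = ws = 2 * np.pi * 5e9   # resonant cavity and spin frequencies (rad/s)

a = np.diag(np.sqrt(np.arange(1, N)), 1)   # photon annihilation operator
sm = np.array([[0., 1.], [0., 0.]])        # spin lowering operator, basis (|g>, |e>)

# Jaynes-Cummings Hamiltonian (hbar = 1):  H = wc a†a + ws σ+σ− + g (a† σ− + a σ+)
H = (wc * np.kron(a.T @ a, np.eye(2))
     + ws * np.kron(np.eye(N), sm.T @ sm)
     + g * (np.kron(a.T, sm) + np.kron(a, sm.T)))

E = np.linalg.eigvalsh(H)
print((E[2] - E[1]) / (2 * g))   # -> 1.0: vacuum Rabi doublet split by exactly 2g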
Max ERC Funding
1 999 995 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym CO2Recycling
Project A Diagonal Approach to CO2 Recycling to Fine Chemicals
Researcher (PI) Thibault Matthias Daniel Cantat
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE5, ERC-2013-StG
Summary Because fossil resources are a limited feedstock and their use results in the accumulation of atmospheric CO2, the organic chemistry industry will face major challenges in the coming decades in finding alternative feedstocks. New methods for the recycling of CO2 are therefore needed, to use CO2 as a carbon source for the production of organic chemicals. Yet CO2 is difficult to transform, and only three chemical processes recycling CO2 have been industrialized to date. To tackle this problem, my idea is to design novel catalytic transformations where CO2 is reacted, in a single step, with a functionalizing reagent and a reductant that can be independently modified, to produce a large spectrum of molecules. The proof of concept for this new “diagonal approach” was established in 2012, in my team, with a new reaction able to co-recycle CO2 and a waste product of the silicone industry (polymethylhydrosiloxane, PMHS) to convert amines into formamides. The goal of this proposal is to develop new diagonal reactions to enable the use of CO2 for the synthesis of amines, esters and amides, which are currently obtained from fossil materials. The novel catalytic reactions will be applied to the production of important molecules: methylamines, acrylamide and methyladipic acid. The methodology will rely on the development of molecular catalysts able to promote the reductive functionalization of CO2 in the presence of H2 or hydrosilanes. Rational design of efficient catalysts, based on theoretical and experimental mechanistic investigations, will be performed and applied to the production of industrially important chemicals. Overall, this proposal will contribute to achieving sustainability in the chemical industry. The results will also increase our understanding of CO2 activation and provide invaluable insights into the basic modes of action of organocatalysts in reduction chemistry. They will serve the scientific community involved in the fields of organocatalysis, green chemistry and energy storage.
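For orientation, the 2012 proof of concept can be summarized by the idealized overall equation below (a schematic balance with a generic hydrosilane written R'3Si–H; actual catalysts, intermediates and by-products are beyond the scope of this summary):

\[
\mathrm{R_2NH} \;+\; \mathrm{CO_2} \;+\; \mathrm{R'_3Si{-}H} \;\xrightarrow{\ \text{cat.}\ }\; \mathrm{R_2N{-}CHO} \;+\; \mathrm{R'_3Si{-}OH}
\]

The “diagonal” character lies in the fact that the functionalizing reagent (here the amine) and the reductant (here the hydrosilane, e.g. PMHS) can be varied independently, so that one mode of CO2 activation opens onto a whole family of products.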
Max ERC Funding
1 494 734 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym CORPHO
Project Theory of strongly correlated photonic systems
Researcher (PI) Cristiano Ciuti
Host Institution (HI) UNIVERSITE PARIS DIDEROT - PARIS 7
Call Details Consolidator Grant (CoG), PE3, ERC-2013-CoG
Summary "The physics of complex quantum systems with controllable interactions is emerging as a fundamental topic for a broad community, providing an opportunity to test theories of strongly correlated quantum many-body systems and opening interesting applications such as quantum simulators. Recently, in solid-state structures with effective photon-photon interactions the rich physics of quantum fluids of light has been explored, albeit not yet in the regime of strong photonic correlations. Exciting advances in cavity Quantum Electro-Dynamics (QED) and superconducting circuit QED make strong photon-photon interactions now accessible. A growing interest is focusing on lattices of coupled resonators, implementing Hubbard-like Hamiltonians for photons injected by pump driving fields. Similarly to electronic systems, the physics of large two-dimensional (2D) photonic lattices is a fundamental theoretical challenge in the regime of strong correlations. CORPHO has the ambition to develop novel scalable theoretical methods for 2D lattices of cavities, including spatially inhomogeneous driving and dissipation. The proposed methods are based on a hybrid strategy combining cluster mean-field theory and Wave Function Monte Carlo on a physical ‘Corner’ of the Hilbert space in order to calculate the steady-state density matrix and the properties of the non-equilibrium phases. We will study 2D lattices with complex unit cells and ‘fractional’ driving (only a fraction of the sites is pumped), a configuration that, according to recent preliminary studies, is expected to dramatically enhance and enrich quantum correlations. We will also investigate the interplay between driving and geometric frustration in 2D lattices with polarization-dependent interactions. Finally, the quantum control of strongly correlated photonic systems will be explored, including quantum feedback processes, cooling of thermal fluctuations and switching between multi-stable phases."
Max ERC Funding
1 378 440 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym CORRELMAT
Project Predictive electronic structure calculations for materials with strong electronic correlations: long-range Coulomb interactions and many-body screening
Researcher (PI) Silke Biermann
Host Institution (HI) ECOLE POLYTECHNIQUE
Call Details Consolidator Grant (CoG), PE3, ERC-2013-CoG
Summary "Materials with strong electronic Coulomb correlations present unique electronic properties such as exotic magnetism, charge or orbital order, or unconventional optical or transport properties, including superconductivity, thermoelectricity or metal-insulator transitions. The concerted behavior of the electrons in these ``correlated materials"" moreover leads to an extreme sensitivity to external stimuli such as changes in temperature, pressure, or external fields. This tuneability of even fundamental properties is both a harbinger for technological applications and a challenge to currently available theoretical methods: Indeed, these properties are the result of strong electron-electron interactions and subtle quantum correlations, and cannot be understood without a proper description of excited states.
The aim of the present project is to elaborate, implement and test new approaches to investigate the spectral and optical properties of correlated materials ``from first principles"", that is, without adjustable parameters. I will build on the success of state-of-the-art dynamical mean field-based electronic structure techniques, but aim at developing them into truly first-principles methods, where a full treatment of the long-range Coulomb interactions replaces the current practice of purely local Hubbard interaction parameters. My target materials are among the most interesting for modern technologies, such as transition metal oxides (with potential applications ranging from oxide electronics to battery materials) and rare earth compounds used as environmentally-responsible pigments. Establishing first-principles techniques with truly predictive power for these classes of materials will bring us closer to the final goal of tailoring correlated materials with preassigned properties."
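To fix ideas on what “purely local Hubbard interaction parameters” means (standard textbook notation, quoted here for context, not a result of the project): the lattice model treated by dynamical mean-field techniques is

\[
H \;=\; -\sum_{ij,\sigma} t_{ij}\, c^{\dagger}_{i\sigma} c_{j\sigma} \;+\; U \sum_i n_{i\uparrow} n_{i\downarrow},
\]

closed by the DMFT self-consistency condition on the local Green's function,

\[
G_{\mathrm{loc}}(i\omega_n) \;=\; \sum_{\mathbf{k}} \bigl[\, i\omega_n + \mu - \varepsilon_{\mathbf{k}} - \Sigma(i\omega_n) \,\bigr]^{-1},
\]

with a local self-energy \(\Sigma\). The full treatment of the long-range Coulomb interaction envisaged here amounts to replacing the adjustable local U by the interaction \(\tfrac{1}{2}\sum_{i\neq j} V_{ij}\, n_i n_j\) and its dynamical (frequency-dependent) many-body screening, computed from first principles.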
Max ERC Funding
1 713 600 €
Duration
Start date: 2014-07-01, End date: 2019-06-30
Project acronym COXINEL
Project COherent Xray source INferred from Electrons accelerated by Laser
Researcher (PI) Marie-Emmanuelle Couprie
Host Institution (HI) SYNCHROTRON SOLEIL SOCIETE CIVILE
Call Details Advanced Grant (AdG), PE7, ERC-2013-ADG
Summary "Since the first laser discovery in 1960 and the first Free Electron Laser (FEL) in 1977, Linac based fourth generation light sources provide intense coherent fs pulses in the X-ray range for multidisciplinary investigations of matter. In parallel, Laser Wakefield Accelerator (LWFA) by using intense laser beams interacting with cm long plasmas can now provide high quality electron beams of very short bunches (few fs) with high peak currents (few kA). The so-called 5th generation light source aims at reducing the size and the cost of these FELs by replacing the linac by LWFA. Indeed, spontaneous emission from LWFA has already been observed, but the presently still rather large energy spread (1 %) and divergence (mrad) prevent from the FEL amplification. In 2012, two novel schemes in the transport proposed in the community, including my SOLEIL group, predict a laser gain increase by 3 or 4 orders of magnitudes. COXINEL aims at demonstrating the first lasing of an LWFA FEL and its detailed study in close interaction with future potential users. The key concept relies on an innovative electron beam longitudinal and transverse manipulation in the transport towards an undulator: a ""demixing"" chicane sorts the electrons in energy and reduces the spread from 1 % to a slice one of 0.1%, and the transverse density is maintained constant all along the undulator (supermatching). Simulations based on the performance of the 60 TW laser of the Laboratoire d’Optique Appliquée and existing undulators from SOLEIL suggest that the conditions for lasing are fulfilled. The SOLEIL environment also possesses the engineering fabrication capability for the actual realization of these theoretical ideas, with original undulators and innovative variable permanent compact magnets for the transport. COXINEL will enable to master in Europe advanced schemes scalable to shorter wavelengths and pulses, paving the way towards FEL light sources on laboratory size, for fs time resolved experiments."
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym CryptoCloud
Project Cryptography for the Cloud
Researcher (PI) David Daniel Rene Pointcheval
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary Many companies have already started migrating to the Cloud, and many individuals share their personal information on social networks. Unfortunately, in the current access model, the provider first authenticates the client and then grants or denies access according to the client's rights in the access-control list. The provider itself therefore not only has total access to the data, but also knows which data are accessed, by whom, and how: privacy, which includes secrecy of the data (confidentiality), of the identities (anonymity), and of the requests (obliviousness), should be enforced.
The Cloud industry introduces a new implicit trust requirement: nobody has any idea where or how their data are stored and manipulated, yet everybody is expected to trust the providers blindly. Privacy-compliant procedures cannot be left to the responsibility of the provider: however trustworthy the provider may be, any system or human vulnerability can be exploited against privacy. This is too great a threat to tolerate. Control over the distribution of data and the secrecy of actions must be given back to the users. This requires promoting privacy as a global security notion.
A new generation of secure multi-party computation protocols is required to protect everybody in an appropriate way, combining privacy and efficiency: interactive protocols will be the core approach to providing privacy in practical systems.
Privacy for the Cloud will have a huge societal impact, since it will revolutionize the trust model: users will be able to make safe use of outsourced storage, notably for personal, financial and medical data, without having to worry about failures or attacks on the server. It will also have a strong economic impact, conferring a competitive advantage on Cloud providers implementing these tools.
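As a taste of the multi-party machinery involved (a deliberately minimal sketch of additive secret sharing over a prime field, not one of CryptoCloud's protocols; the modulus and values are illustrative), the snippet below splits a secret among n providers so that any n−1 of them jointly learn nothing, while sums can still be computed share by share:

import secrets

P = 2**127 - 1  # public prime modulus (illustrative choice)

def share(x, n):
    # Split x into n additive shares modulo P; any n-1 shares are uniform noise.
    parts = [secrets.randbelow(P) for _ in range(n - 1)]
    parts.append((x - sum(parts)) % P)
    return parts

def reconstruct(parts):
    return sum(parts) % P

# Two users outsource secret values; each of 3 providers stores one share of each.
a_shares, b_shares = share(1234, 3), share(5678, 3)

# Each provider adds its two shares locally: no provider learns a, b or a + b.
sum_shares = [(sa + sb) % P for sa, sb in zip(a_shares, b_shares)]
assert reconstruct(sum_shares) == 1234 + 5678

Real protocols must also handle multiplication, malicious parties and verifiability, which is where the privacy-versus-efficiency challenge mentioned above arises.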
Max ERC Funding
2 168 261 €
Duration
Start date: 2014-06-01, End date: 2019-05-31