Project acronym ADDECCO
Project Adaptive Schemes for Deterministic and Stochastic Flow Problems
Researcher (PI) Remi Abgrall
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary The numerical simulation of complex compressible flow problems is still a challenge nowadays, even for simple models. In our opinion, the most important hard points that currently need to be tackled and solved are how to obtain stable, scalable, very accurate schemes that are easy to code and to maintain on complex geometries. The method should easily handle mesh refinement, even near the boundary, where the most interesting engineering quantities have to be evaluated. Unsteady uncertainties in the model, for example in the geometry or the boundary conditions, should be represented efficiently. The goal of this proposal is to design, develop and evaluate solutions to each of the above problems. Our work program will lead to significant breakthroughs for flow simulations. More specifically, we propose to work on 3 connected problems: 1-A class of very high order numerical schemes able to easily deal with the geometry of boundaries while still able to solve steep problems. The geometry is generally defined by CAD tools. The output is used to generate a mesh which is then used by the scheme. Hence, any mesh refinement process is disconnected from the CAD, a situation that prevents the spread of mesh adaptation techniques in industry! 2-A class of very high order numerical schemes which can utilize possibly solution-dependent basis functions in order to lower the number of degrees of freedom, for example to compute boundary layers accurately at low resolutions. 3-A general non-intrusive technique for handling uncertainties in order to deal with irregular probability density functions (pdfs) and also to handle pdfs that may evolve in time, for example through an optimisation loop. The curse of dimensionality will be dealt with thanks to Harten's multiresolution method combined with sparse grid methods. Currently, and to our knowledge, no scheme has all of these properties. This research program will have an impact on numerical schemes and industrial applications.
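To illustrate the multiresolution idea invoked for the uncertainty problem, here is a minimal one-level sketch in Python, assuming the simplest piecewise-constant prediction (the project's schemes and their sparse-grid combination are far more elaborate): fine-grid cell averages are split into coarse averages plus "details" that vanish where the solution is smooth, which is what makes adaptive truncation possible.

```python
import numpy as np

def harten_decompose(u_fine):
    """One level of Harten-style multiresolution for 1D cell averages."""
    u_coarse = 0.5 * (u_fine[0::2] + u_fine[1::2])  # exact coarsening
    details = u_fine - np.repeat(u_coarse, 2)       # prediction error
    return u_coarse, details

def harten_reconstruct(u_coarse, details):
    """Exact inverse of the decomposition."""
    return np.repeat(u_coarse, 2) + details

# usage: details are large only near the steep front, tiny elsewhere
x = np.linspace(0.0, 1.0, 64)
u = np.tanh(20.0 * (x - 0.5))                       # steep profile
uc, d = harten_decompose(u)
assert np.allclose(harten_reconstruct(uc, d), u)
print("largest detail:", np.abs(d).max())
```

Truncating the small details compresses smooth regions of the representation; repeating the decomposition over several levels and several stochastic dimensions is where the combination with sparse grids comes in.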
Max ERC Funding
1 432 769 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym AlterMateria
Project Designer Quantum Materials Out of Equilibrium
Researcher (PI) Andrea Caviglia
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Starting Grant (StG), PE3, ERC-2015-STG
Summary Recently, ‘designer’ quantum materials, synthesised layer by layer, have been realised, sparking ground-breaking new scientific insights. These artificial materials, such as oxide heterostructures, are interesting building blocks for a new generation of technologies, provided that one is able to access, study and ultimately control their quantum phases under practical conditions such as room temperature and high speeds.
On the other hand, an independent research area is emerging that uses ultra-short bursts of light to stimulate changes in the macroscopic electronic properties of solids at unprecedented speeds.
Here I propose to bridge the gap between material design and ultrafast control of solids. This new synergy will allow us to explore fundamental research questions on the non-equilibrium dynamics of quantum materials with competing ground states. Specifically, I will utilize intense THz and mid-infrared electromagnetic fields to manipulate the electronic properties of artificial quantum materials on pico- to femto-second time scales. Beyond the development of novel techniques to generate THz electric fields of unprecedented intensity, I will investigate metal-insulator and magnetic transitions in oxide heterostructures as they unfold in time. This research programme takes oxide electronics in a new direction and establishes a new methodology for the control of quantum phases at high temperature and high speed.
Max ERC Funding
1 499 982 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym ARTISYM
Project Artificial endosymbiosis
Researcher (PI) Jan van Hest
Host Institution (HI) TECHNISCHE UNIVERSITEIT EINDHOVEN
Call Details Advanced Grant (AdG), PE5, ERC-2015-AdG
Summary Living organisms have acquired new functionalities by uptake and integration of species to create symbiotic life-forms. This process of endosymbiosis has intrigued scientists over the years, albeit mostly from an evolutionary biology perspective. With the advance of chemical and synthetic biology, our ability to create molecular life-like systems has increased tremendously, which enables us to build cell- and organelle-like structures. However, these advances have not yet been taken to a level that allows us to study comprehensively whether endosymbiosis can be applied to non-living systems, or to integrate living with non-living matter. The aim of the research described in the ARTISYM proposal is to establish the field of artificial endosymbiosis. Two lines of research will be followed. First, we will incorporate artificial organelles in living cells to design hybrid cells with acquired functionality. This investigation is scientifically of great interest, as it will show us how to introduce novel compartmentalized pathways into living organisms. It also serves an important societal goal, as with these compartments dysfunctional cellular processes can be corrected. We will follow both a transient and a permanent approach. In the transient route, biodegradable nanoreactors are introduced to supply living cells temporarily with novel functions. Functionality is introduced permanently using genetic engineering to express protein-based nanoreactors in living cells, or via organelle transplantation of healthy mitochondria into diseased living cells. Second, I aim to create artificial cells with the ability to perform endosymbiosis; the uptake and presence of artificial organelles in synthetic vesicles allows them to respond dynamically to their environment. Responses that are envisaged are shape changes, motility, and growth and division. Furthermore, the incorporation of natural organelles in liposomes provides biocatalytic cascades with the necessary cofactors to function in an artificial cell.
Max ERC Funding
2 500 000 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym ATUNE
Project Attenuation Tomography Using Novel observations of Earth's free oscillations
Researcher (PI) Arwen Fedora Deuss
Host Institution (HI) UNIVERSITEIT UTRECHT
Call Details Consolidator Grant (CoG), PE10, ERC-2015-CoG
Summary Tectonic phenomena at the Earth's surface, like volcanic eruptions and earthquakes,are driven by convection deep in the mantle. Seismic tomography has been very successful in elucidating the Earth's internal velocity structure. However, seismic velocity is insufficient to obtain robust estimates of temperature and composition, and make direct links with mantle convection. Thus, fundamental questions remain unanswered: Do subducting slabs bring water into the lower mantle? Are the large low-shear velocity provinces under the Pacific and Africa mainly thermal or compositional? Is there any partial melt or water near the mantle transition zone or core mantle boundary?
Seismic attenuation, or loss of energy, is key to mapping melt, water and temperature variations, and answering these questions. Unfortunately, attenuation has only been imaged using short- and intermediate-period seismic data, showing little similarity even for the upper mantle, and no reliable lower-mantle models exist. The aim of ATUNE is to develop novel full-spectrum techniques and apply them to Earth's long-period free oscillations to observe global-scale regional variations in seismic attenuation from the lithosphere to the core-mantle boundary. Scattering and focussing - problematic for shorter-period techniques - are easily included using cross-coupling (or resonance) between free oscillations, without requiring approximations. The recent occurrence of large earthquakes, the increase in computer power and my world-leading expertise in free oscillations now make it possible to measure the frequency dependence of attenuation over a much wider frequency band, allowing us to distinguish between scattering (redistribution of energy) and intrinsic attenuation. ATUNE will deliver the first ever full-waveform global tomographic model of 3D attenuation variations in the lower mantle, providing essential constraints on melt, water and temperature for understanding the complex dynamics of our planet.
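As background, in standard normal-mode seismology (textbook theory, not the novel cross-coupling techniques proposed here), intrinsic attenuation is quantified by the quality factor Q of each free oscillation:

$$ u(t) \propto \exp\!\left(-\frac{\omega_0 t}{2Q}\right)\cos(\omega_0 t) $$

A mode of angular frequency ω0 thus rings down over a time of order 2Q/ω0, and its spectral peak is broadened by δω ≈ ω0/Q; measuring such decay rates and peak widths across many modes is what turns long-period spectra into attenuation constraints.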
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym BIO-ORIGAMI
Project Meta-biomaterials: 3D printing meets Origami
Researcher (PI) Amir Abbas Zadpoor
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Starting Grant (StG), PE8, ERC-2015-STG
Summary Meta-materials, best known for their extraordinary properties (e.g. negative stiffness), are halfway between materials and structures: their unusual properties are direct results of their complex 3D structures. This project introduces a new class of meta-materials called meta-biomaterials. Meta-biomaterials go beyond meta-materials by adding an extra dimension to the complex 3D structure, i.e. complex and precisely controlled surface nano-patterns. The 3D structure gives rise to unprecedented or rare combinations of mechanical (e.g. stiffness), mass transport (e.g. permeability, diffusivity), and biological (e.g. tissue regeneration rate) properties. Those properties optimize the distribution of mechanical loads and the transport of nutrients and oxygen while providing geometrical shapes preferable for tissue regeneration (e.g. higher curvatures). Surface nano-patterns communicate with (stem) cells, control their differentiation behavior, and enhance tissue regeneration.
There is one important problem: meta-biomaterials cannot be manufactured with current technology. 3D printing can create complex shapes, while nanolithography creates complex surface nano-patterns down to a few nanometers, but only on flat surfaces. There is, however, no way of combining complex shapes with complex surface nano-patterns. The groundbreaking nature of this project lies in solving that deadlock using the Origami concept (the ancient Japanese art of paper folding). In this approach, I first decorate flat 3D-printed sheets with nano-patterns. Then, I apply Origami techniques to fold the decorated flat sheet and create complex 3D shapes. The sheet knows how to self-fold to the desired structure when subjected to compression, owing to pre-designed joints, crease patterns, and thickness/material distributions that control its mechanical instability. I will demonstrate the added value of meta-biomaterials in improving bone tissue regeneration using in vitro cell culture assays and animal models.
Max ERC Funding
1 499 600 €
Duration
Start date: 2016-02-01, End date: 2021-01-31
Project acronym BioCircuit
Project Programmable BioMolecular Circuits: Emulating Regulatory Functions in Living Cells Using a Bottom-Up Approach
Researcher (PI) Tom Antonius Franciscus de Greef
Host Institution (HI) TECHNISCHE UNIVERSITEIT EINDHOVEN
Call Details Starting Grant (StG), PE4, ERC-2015-STG
Summary Programmable biomolecular circuits have received increasing attention in recent years as the scope of chemistry expands from the synthesis of individual molecules to the construction of chemical networks that can perform sophisticated functions such as logic operations and feedback control. Rationally engineered biomolecular circuits that robustly execute higher-order spatiotemporal behaviours typically associated with intracellular regulatory functions present a unique and uncharted platform to systematically explore the molecular logic and physical design principles of the cell. The experience gained by in-vitro construction of artificial cells displaying advanced system-level functions deepens our understanding of regulatory networks in living cells and allows theoretical assumptions and models to be refined in a controlled setting. This proposal combines elements from systems chemistry, in-vitro synthetic biology and micro-engineering and explores generic strategies to investigate the molecular logic of biology’s regulatory circuits by applying a physical chemistry-driven bottom-up approach. Progress in this field requires 1) proof-of-principle systems where in-vitro biomolecular circuits are designed to emulate characteristic system-level functions of regulatory circuits in living cells and 2) novel experimental tools to operate biochemical networks under out-of-equilibrium conditions. Here, a comprehensive research program is proposed that addresses these challenges by engineering three biochemical model systems that display elementary signal transduction and information processing capabilities. In addition, an open microfluidic droplet reactor is developed that will allow, for the first time, high-throughput analysis of biomolecular circuits encapsulated in water-in-oil droplets. An integral part of the research program is to combine the computational design of in-vitro circuits with novel biochemistry and innovative micro-engineering tools.
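As a flavour of the regulatory motifs such in-vitro circuits aim to emulate, here is a minimal sketch in Python (parameters are arbitrary illustrations, not taken from the proposal) of a Goodwin-type delayed negative-feedback loop, a classic model in which the end product of a three-step cascade represses the first step; Griffith showed that such a loop can only sustain oscillations when the repression is sufficiently steep (Hill coefficient above ~8):

```python
import numpy as np
from scipy.integrate import solve_ivp

def goodwin(t, s, a=10.0, K=1.0, n=10, b=0.5):
    """Goodwin loop: end product z represses synthesis of x; x -> y -> z."""
    x, y, z = s
    return [a / (1.0 + (z / K) ** n) - b * x,  # repressed synthesis
            x - b * y,                          # intermediate step
            y - b * z]                          # end product

sol = solve_ivp(goodwin, (0.0, 100.0), [0.1, 0.1, 0.1], max_step=0.1)
print(sol.y[:, -3:])  # inspect late-time behaviour of x, y, z
```

Building and perturbing motifs like this out of real biochemistry in water-in-oil droplets, rather than only in silico, is the gap the proposed microfluidic platform targets.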
Max ERC Funding
1 887 180 €
Duration
Start date: 2016-08-01, End date: 2021-07-31
Project acronym bioSPINspired
Project Bio-inspired Spin-Torque Computing Architectures
Researcher (PI) Julie Grollier
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2015-CoG
Summary In the bioSPINspired project, I propose to use my experience and skills in spintronics, non-linear dynamics and neuromorphic nanodevices to realize bio-inspired spin-torque computing architectures. I will develop a bottom-up approach to build spintronic data processing systems that perform low-power ‘cognitive’ tasks on-chip and could ultimately complement our traditional microprocessors. I will start by showing that spin-torque nanodevices, which are multi-functional and tunable nonlinear dynamical nano-components, are capable of emulating both neurons and synapses. Then I will assemble these spin-torque nano-synapses and nano-neurons into modules that implement brain-inspired algorithms in hardware. The brain displays many features typical of non-linear dynamical networks, such as synchronization or chaotic behaviour. These observations have inspired a whole class of models that harness the power of complex non-linear dynamical networks for computing. Following such schemes, I will interconnect the spin-torque nanodevices by electrical and magnetic interactions so that they can couple to each other, synchronize and display complex dynamics. Then I will demonstrate that, when perturbed by external inputs, these spin-torque networks can perform recognition tasks by converging to an attractor state, or use the separation properties at the edge of chaos to classify data. In the process, I will revisit these brain-inspired abstract models to adapt them to the constraints of hardware implementations. Finally, I will investigate how the spin-torque modules can be efficiently connected together with CMOS buffers to perform higher-level computing tasks. The table-top prototypes, hardware-adapted computing models and large-scale simulations developed in bioSPINspired will lay the foundations of spin-torque bio-inspired computing and open the path to the fabrication of fully integrated, ultra-dense and efficient CMOS/spin-torque nanodevice chips.
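A standard abstraction for the synchronization physics invoked here is the Kuramoto model of coupled phase oscillators, often used to reason about arrays of spin-torque oscillators; the Python sketch below is conceptual only (real devices also have amplitude dynamics and device-specific coupling). With these illustrative parameters the coupling K exceeds the critical value (~1.6 for a unit-width Gaussian frequency spread), so the phases partially lock:

```python
import numpy as np

# Mean-field Kuramoto model: d(theta_i)/dt = omega_i + K*r*sin(psi - theta_i)
rng = np.random.default_rng(0)
N, K, dt, steps = 200, 2.0, 0.01, 5000
omega = rng.normal(0.0, 1.0, N)            # spread of natural frequencies
theta = rng.uniform(0.0, 2.0 * np.pi, N)   # random initial phases

for _ in range(steps):
    z = np.mean(np.exp(1j * theta))        # complex order parameter
    r, psi = np.abs(z), np.angle(z)
    theta += dt * (omega + K * r * np.sin(psi - theta))

print(f"phase coherence r = {np.abs(np.mean(np.exp(1j * theta))):.2f}")
```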
Max ERC Funding
1 907 767 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym BoneImplant
Project Monitoring bone healing around endosseous implants: from multiscale modeling to the patient’s bed
Researcher (PI) Guillaume Loïc Haiat
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2015-CoG
Summary Implants are often employed in orthopaedic and dental surgery. However, risks of failure, which are difficult to anticipate, are still experienced and may have dramatic consequences. Failures are due to degraded bone remodeling at the bone-implant interface, a multiscale phenomenon of an interdisciplinary nature which remains poorly understood. The objective of BoneImplant is to provide a better understanding of the multiscale and multi-timescale mechanisms at work at the bone-implant interface. To do so, BoneImplant aims at studying the evolution of the biomechanical properties of bone tissue around an implant during the remodeling process. A methodology involving combined in vivo, in vitro and in silico approaches is proposed.
New modeling approaches will be developed in close synergy with the experiments. Molecular dynamics computations will be used to understand fluid flow in nanoscopic cavities, a phenomenon determining the bone healing process. Generalized continuum theories will be necessary to model bone tissue due to the large strain field around implants. An isogeometric mortar formulation will make it possible to simulate the bone-implant interface in a stable and efficient manner.
In vivo experiments under standardized conditions will be performed on the basis of feasibility studies. A multimodality and multi-physical experimental approach will be carried out to assess the biomechanical properties of newly formed bone tissue as a function of the implant environment. The experimental approach aims at estimating the effective adhesion energy and the potential of quantitative ultrasound imaging to assess different biomechanical properties of the interface.
Results will be used to design effective clinical loading procedures for implants and to optimize implant design, leading to the development of therapeutic and diagnostic techniques. The development of quantitative ultrasonic techniques to monitor implant stability has potential for industrial transfer.
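For context, the simplest physical handle behind such quantitative ultrasound is the textbook 1-D plane-wave result (general acoustics, not the project's full models): an interface between media of acoustic impedances Z1 = ρ1c1 and Z2 = ρ2c2 reflects with amplitude coefficient

$$ r = \frac{Z_2 - Z_1}{Z_2 + Z_1} $$

As bone matures at the implant surface, its density and stiffness, and hence its impedance, evolve, shifting the reflected echo; this sensitivity is what makes ultrasound a candidate probe of implant stability.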
Max ERC Funding
1 992 154 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym BUILDUP
Project Galaxy Buildup in the Young Universe: from the First Billion Years through the Peak Activity Epoch
Researcher (PI) Karina Caputi
Host Institution (HI) RIJKSUNIVERSITEIT GRONINGEN
Call Details Consolidator Grant (CoG), PE9, ERC-2015-CoG
Summary Deep galaxy surveys are the most valuable asset to understand the history of our Universe. They are key to test galaxy formation models which, based on the Cold Dark Matter framework, are successful at reproducing general aspects of galaxy evolution with cosmic time. However, important discrepancies still exist between models and observations, most notably at high redshifts. This Project will reconstruct the history of galaxy buildup from the first billion years of cosmic time through the peak activity epoch of the Universe, which occurred 10 billion years ago, providing a fundamental constraint for galaxy formation models.
I am leading the largest ultra-deep galaxy survey that will ever be conducted with the Spitzer Space Telescope. In this Project, I will exploit my new Spitzer program to do a groundbreaking study of galaxy buildup in the young Universe, paving the way for further galaxy evolution studies with the forthcoming James Webb Space Telescope (JWST). My main objectives are: 1) quantifying galaxy stellar mass assembly beyond the peak activity epoch, through the study of the galaxy stellar mass function up to z~7; 2) measuring, for the first time, galaxy clustering with stellar mass information up to such high redshifts; 3) linking galaxy growth to dust-obscured star formation using Spitzer and new APEX/AMKID sub-millimetre data; 4) unveiling the first steps of galaxy buildup at z>7 with JWST; 5) optimizing the official JWST Mid Infrared Instrument (MIRI) data reduction pipeline for the analysis of deep galaxy surveys. The delivery of an optimized MIRI pipeline is an important added value to the scientific outcome of this Project, which will benefit the general Astronomical community.
This is the right time for this Project to make a maximum impact. We are now at a turning point for IR Astronomy, and this opportunity should not be missed. This Project will have a long-lasting legacy, bridging current and next generations of IR galaxy surveys.
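For reference, a standard parameterization in this field (not a project-specific result): galaxy stellar mass functions such as those targeted in objective 1 are commonly fit with a Schechter form,

$$ \Phi(M)\,\mathrm{d}M = \Phi^{*}\left(\frac{M}{M^{*}}\right)^{\alpha} e^{-M/M^{*}}\,\frac{\mathrm{d}M}{M^{*}} $$

where M* marks the knee of the function, α the low-mass slope and Φ* the normalization; following these parameters with redshift out to z~7 is what quantifies stellar mass assembly.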
Max ERC Funding
1 902 235 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym ByoPiC
Project The Baryon Picture of the Cosmos
Researcher (PI) Nabila Aghanim
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE9, ERC-2015-AdG
Summary The cosmological paradigm of structure formation is both extremely successful and plagued by many enigmas. Not only is the nature of the main matter component, dark matter, which shapes the structure skeleton in the form of a cosmic web, mysterious; half of the ordinary matter (i.e. baryons) also remains unobserved, or hidden, at late times of cosmic history! ByoPiC focuses on this key and currently unresolved issue in astrophysics and cosmology: where and how are half of the baryons hidden at late times? ByoPiC will answer that central question by detecting, mapping, and assessing the physical properties of hot ionised baryons at large cosmic scales and at late times. This will give a completely new picture of the cosmic web, complementing its standard tracers, i.e. galaxies made of cold and dense baryons. To this end, ByoPiC will perform the first statistically consistent, joint analysis of complementary multiwavelength data: Planck observations tracing hot, ionised baryons via the Sunyaev-Zeldovich effect, optimally combined with optical and near-infrared galaxy surveys as tracers of cold baryons. This joint analysis will rely on innovative statistical tools to recover all the (cross-)information contained in these data in order to detect most of the hidden baryons in cosmic web elements such as (super)clusters and filaments. These newly detected elements will then be assembled to reconstruct the cosmic web as traced by both hot ionised baryons and galaxies. Thanks to that, ByoPiC will perform the most complete and detailed assessment of the census and contribution of hot ionised baryons to the total baryon budget, and identify the main physical processes driving their evolution in the cosmic web. Catalogues of new (super)clusters and filaments, and innovative tools, will be key deliverable products, allowing for an optimal preparation of future surveys.
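The thermal Sunyaev-Zeldovich signal exploited here has a standard non-relativistic form (textbook background, not a ByoPiC result): the Compton parameter integrates the electron pressure along the line of sight,

$$ y = \frac{\sigma_T}{m_e c^2}\int n_e\, k_B T_e\, \mathrm{d}l, \qquad \frac{\Delta T}{T_{\mathrm{CMB}}} = f(x)\,y, \quad f(x) = x\coth(x/2) - 4, \quad x = \frac{h\nu}{k_B T_{\mathrm{CMB}}} $$

so hot ionised baryons imprint a characteristic frequency-dependent distortion on the CMB, which Planck's multi-frequency maps can separate from other sky components.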
Max ERC Funding
2 488 350 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym CAT
Project Climbing the Asian Water Tower
Researcher (PI) Wouter Willem Immerzeel
Host Institution (HI) UNIVERSITEIT UTRECHT
Call Details Starting Grant (StG), PE10, ERC-2015-STG
Summary The water cycle in the Himalaya is poorly understood because of its extreme topography, which results in complex interactions between climate and water stored in snow and glaciers. Hydrological extremes in the greater Himalayas regularly cause great damage, e.g. the Pakistan floods in 2010, while the Himalayas also supply water to over 25% of the global population. So, the stakes are high and an accurate understanding of the Himalayan water cycle is imperative. The discovery of the monumental error on the future of the Himalayan glaciers in the fourth assessment report of the IPCC is exemplary of the scientific misconceptions associated with the Himalayan glaciers and their water-supplying function. The underlying reason is the huge scale gap that exists between studies of individual glaciers, which are not representative of the entire region, and hydrological modelling studies that represent the variability in Himalayan climates. In CAT, I will bridge this knowledge gap and explain spatial differences in Himalayan glacio-hydrology at an unprecedented level of detail by combining high-altitude observations, the latest remote sensing technology and state-of-the-art atmospheric and hydrological models. I will generate high-altitude meteorological observations and employ drones to monitor glacier dynamics. The data will be used to parameterize key processes in hydro-meteorological models such as cloud-resolving mechanisms, glacier dynamics and the ice and snow energy balance. The results will be integrated into atmospheric and glacio-hydrological models for two representative but contrasting catchments, in combination with the systematic inclusion of the newly developed algorithms. CAT will unambiguously reveal spatial differences in Himalayan glacio-hydrology necessary to project future changes in water availability and extreme events. As such, CAT may provide the scientific base for climate change adaptation policies in this vulnerable region.
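To make the modelling hierarchy concrete: the crudest glacier melt parameterization, which energy-balance approaches like those planned here improve upon, is the positive degree-day (PDD) scheme, sketched below in Python (a toy baseline with an assumed degree-day factor, not CAT's models):

```python
import numpy as np

def pdd_melt(daily_temp_c, ddf=8.0):
    """Seasonal melt (mm w.e.) from daily mean air temperatures (degC).

    Melt is proportional to the sum of positive daily temperatures;
    ddf is a degree-day factor in mm w.e. per degC per day (~7-9 is
    typical for ice, but it hides the whole energy balance).
    """
    return ddf * np.clip(daily_temp_c, 0.0, None).sum()

# usage: one synthetic 120-day melt season around +2 degC
rng = np.random.default_rng(1)
temps = rng.normal(2.0, 3.0, 120)
print(f"seasonal melt ~ {pdd_melt(temps):.0f} mm w.e.")
```

A full energy-balance model replaces the single tuned factor with explicit radiation, turbulent-flux and albedo terms, which is why it transfers across contrasting catchments where a calibrated PDD factor does not.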
Max ERC Funding
1 499 631 €
Duration
Start date: 2016-02-01, End date: 2021-01-31
Project acronym CatASus
Project Cleave and couple: Fully sustainable catalytic conversion of renewable resources to amines
Researcher (PI) Katalin Barta Weissert
Host Institution (HI) RIJKSUNIVERSITEIT GRONINGEN
Call Details Starting Grant (StG), PE5, ERC-2015-STG
Summary Amines are a crucially important class of chemicals, widely present in pharmaceuticals, agrochemicals and surfactants. Yet, surprisingly, a systematic approach to obtaining this essential class of compounds from renewables has not been realized to date.
The aim of this proposal is to enable chemical pathways for the production of amines via alcohols from renewable resources, preferably lignocellulose waste. Two key scientific challenges will be addressed: the development of efficient cleavage reactions of complex renewable resources by novel heterogeneous catalysts; and finding new homogeneous catalysts based on earth-abundant metals for the atom-economic coupling of the derived alcohol building blocks directly with ammonia, as well as possible further functionalization reactions. The program is divided into 3 interrelated but not mutually dependent work packages, each addressing a key challenge in its respective field:
WP1: Lignin conversion to aromatics; WP2: Cellulose-derived platform chemicals to aromatic and aliphatic diols and solvents. WP3: New iron-based homogeneous catalysts for the direct, atom-economic C-O to C-N transformations.
The approach taken will embrace the inherent complexity present in the renewable feedstock. A unique balance between cleavage and coupling pathways will give access to the chemical diversity in products necessary to achieve economic competitiveness with current fossil-fuel-based pathways, and will permit rapid conversion to higher-value products such as functionalized amines that can enter the chemical supply chain at a much later stage than bulk chemicals derived from petroleum. The proposed high-risk/high-gain research will push the frontiers of sustainable and green chemistry and reach well beyond the state of the art in this area. This universal, flexible and iterative approach is anticipated to give rise to a variety of similar systems targeting diverse product outcomes starting from renewables.
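Schematically, the WP3 target is the direct, atom-economic amination of an alcohol with ammonia, with water as the only by-product:

$$ \text{R--OH} + \text{NH}_3 \;\xrightarrow{\text{cat.}}\; \text{R--NH}_2 + \text{H}_2\text{O} $$

One established route to such couplings (stated here as general background, not as the project's chosen mechanism) is the borrowing-hydrogen sequence: the catalyst dehydrogenates the alcohol to a carbonyl compound, which condenses with ammonia to an imine that is then re-hydrogenated to the amine.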
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym CEMYSS
Project Cosmochemical Exploration of the first two Million Years of the Solar System
Researcher (PI) Marc Chaussidon
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary One of the major outcomes of recent studies on the formation of the Solar System is the recognition of the fundamental importance of processes which took place during the first 10 thousand to 2-3 million years of the lifetime of the Sun and its accretion disk. Astrophysical observations at optical to infrared wavelengths of circumstellar disks around young stars have shown the existence in the inner disk of high-temperature processing of the dust. X-ray observations of T-Tauri stars revealed that they exhibit X-ray flare enhancements by several orders of magnitude. The work we have performed over the last years on the isotopic analysis of either solar wind trapped in lunar soils or of Ca-, Al-rich inclusions and chondrules from primitive chondrites has allowed us to link some of these astrophysical observations around young stars with processes, such as irradiation by energetic particles and UV light, which took place around the T-Tauri Sun. The aim of this project is to make decisive progress in our understanding of the early solar system through the development of in situ high-precision isotopic measurements by ion microprobe in extra-terrestrial matter. The project will be focused on the exploration of the variations in the isotopic composition of O and Mg and in the concentration of short-lived radioactive nuclides, such as 26Al and 10Be, with half-lives shorter than 1.5 million years. A special emphasis will be put on the search for nuclides with very short half-lives such as 32Si (650 years) and 14C (5730 years), nuclides which have not yet been detected in meteorites. These new data will bring critical information on, for instance, the astrophysical context for the formation of the Sun and the first solids in the accretion disk, or the timing and the processes by which protoplanets were formed and destroyed close to the Sun during the first 2 million years of the lifetime of the Solar System.
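The chronometric use of such short-lived nuclides rests on a standard relation (general cosmochemistry, not a CEMYSS-specific result). For 26Al, with decay constant λ = ln 2 / t₁/₂ and t₁/₂ ≈ 0.72 Myr, an initial ratio decays as

$$ \left(\frac{^{26}\mathrm{Al}}{^{27}\mathrm{Al}}\right)_{t} = \left(\frac{^{26}\mathrm{Al}}{^{27}\mathrm{Al}}\right)_{0} e^{-\lambda t} \quad\Longrightarrow\quad \Delta t = \frac{1}{\lambda}\,\ln\frac{R_{1}}{R_{2}} $$

so the relative formation ages of two objects follow from their inferred ratios R1 and R2; this is why nuclides with half-lives of only centuries to millennia (32Si, 14C) would open a window on the very shortest timescales of the early Solar System.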
Max ERC Funding
1 270 419 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym CHAMPAGNE
Project Charge orders, Magnetism and Pairings in High Temperature Superconductors
Researcher (PI) Catherine Marie Elisabeth Pepin
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE3, ERC-2015-AdG
Summary For nearly thirty years, the search for a room-temperature superconductor has focused on exotic materials known as cuprates, obtained by doping a parent Mott insulator, which can carry currents without losing energy as heat at temperatures up to 164 Kelvin. Conventionally, three main players were identified as being crucial: i) the Mott insulating phase, ii) the anti-ferromagnetic order and iii) the superconducting (SC) phase. Recently, a body of experimental probes suggested the presence of a fourth, forgotten player, charge ordering, as a direct competitor for superconductivity. In this project we propose that the relationship between charge ordering and superconductivity is more intimate than previously thought and is protected by an emerging SU(2) symmetry relating the two. The beauty of our theory lies in the fact that it can be encapsulated in one simple and universal “gap equation” which, in contrast to the strong-coupling approaches used up to now, can easily be connected to experiments. In the first part of this work, we will refine the theoretical model in order to shape it for comparison with experiments and consistently test the SU(2) symmetry. In the second part of the work, we will search for the experimental signatures of our theory through back-and-forth interaction with experimental groups. We expect our theory to generate new insights and experimental developments, and to lead to a major breakthrough if it correctly explains the origin of anomalous superconductivity in these materials.
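For orientation only, the weak-coupling BCS gap equation is the prototype of the kind of self-consistency relation meant here (the proposal's universal gap equation generalises beyond this form to treat the charge channel on the same footing under the SU(2) symmetry):

$$ \Delta_{\mathbf{k}} = -\sum_{\mathbf{k}'} V_{\mathbf{k}\mathbf{k}'}\, \frac{\Delta_{\mathbf{k}'}}{2E_{\mathbf{k}'}}\, \tanh\!\frac{E_{\mathbf{k}'}}{2k_B T}, \qquad E_{\mathbf{k}} = \sqrt{\xi_{\mathbf{k}}^2 + \Delta_{\mathbf{k}}^2} $$

The appeal of such an equation, shared by the proposed one, is that its solution Δk connects directly to measurable quantities such as the spectral gap and the transition temperature.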
Max ERC Funding
1 318 145 €
Duration
Start date: 2016-08-01, End date: 2021-07-31
Project acronym chemech
Project From Chemical Bond Forces and Breakage to Macroscopic Fracture of Soft Materials
Researcher (PI) Costantino CRETON
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE5, ERC-2015-AdG
Summary Soft materials are irreplaceable in engineering applications where large reversible deformations are needed, and in life sciences to mimic ever more closely or replace a variety of living tissues. While mechanical strength may not be essential for all applications, excessive brittleness is a strong limitation. Yet predicting whether a soft material will be tough or brittle from its molecular composition or structure still relies on empirical concepts, due to the lack of proper tools to detect the damage occurring in the material before it breaks. Taking advantage of recent advances in materials science and mechanochemistry, we propose a ground-breaking method to investigate the mechanisms of fracture of tough soft materials. To achieve this objective we will use a series of model materials containing a variable population of internal sacrificial bonds that break before the material fails macroscopically, and use a combination of advanced characterization techniques and molecular probes to map stress, strain, bond breakage and structure in a region ~100 µm in size ahead of the propagating crack. By using mechanoluminescent and mechanophore molecules incorporated in the model material in selected positions, confocal laser microscopy, digital image correlation and small-angle X-ray scattering, we will gain an unprecedented molecular understanding of where and when bonds break as the material fails and the crack propagates, and will then be able to establish a direct relation between the architecture of soft polymer networks and their fracture energy, leading to a new molecular and multi-scale vision of the macroscopic fracture of soft materials. Such advances will be invaluable in guiding materials chemists to design and develop better and more finely tuned soft but tough, and sometimes self-healing, materials to replace living tissues (in bioengineering) and to make lightweight, tough and flexible parts for energy-efficient transport.
Max ERC Funding
2 251 026 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym CHRiSHarMa
Project Commutators, Hilbert and Riesz transforms, Shifts, Harmonic extensions and Martingales
Researcher (PI) Stefanie Petermichl
Host Institution (HI) UNIVERSITE PAUL SABATIER TOULOUSE III
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary This project aims to develop two arrays of questions at the heart of harmonic
analysis, probability and operator theory:
Multi-parameter harmonic analysis.
Through the use of wavelet methods in harmonic analysis, we plan to shed new
light on characterizations for boundedness of multi-parameter versions of
classical Hankel operators in a variety of settings. Nehari's classical theorem on
the disk (1957) has found an important generalization to Hilbert space
valued functions, known as Page's theorem. A relevant extension of Nehari's
theorem to the bi-disk had been a long standing problem, finally solved in
2000, through novel harmonic analysis methods. Its operator analog remains
unknown and constitutes part of this proposal.
Sharp estimates for Calderon-Zygmund operators and martingale
inequalities.
We make use of the interplay between objects central to
harmonic analysis, such as the Hilbert transform, and objects central to
probability theory, martingales. This connection has seen many faces, such as
in the UMD space classification by Bourgain and Burkholder or in the formula
of Gundy-Varopoulos, which uses orthogonal martingales to model the behavior of
the Hilbert transform. Martingale methods in combination with optimal control
have advanced an array of questions in harmonic analysis in recent years. In
this proposal we wish to continue this direction as well as exploit advances
in dyadic harmonic analysis for use in questions central to probability. There
is some focus on weighted estimates in non-commutative and scalar settings, on the
understanding of discretizations of classical operators such as the Hilbert
transform, and on the role they play when acting on functions defined on
discrete groups. From a martingale
standpoint, jump processes come into play. Another direction is the use of
numerical methods in combination with harmonic analysis achievements for martingale estimates.
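For reference (the summary takes it as known), the Hilbert transform at the centre of both themes is the singular integral

\[
(Hf)(x) = \frac{1}{\pi}\,\mathrm{p.v.}\!\int_{-\infty}^{\infty} \frac{f(y)}{x-y}\,dy,
\]

whose martingale models, via representations such as the Gundy-Varopoulos formula, drive the sharp-estimate programme described above.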
Max ERC Funding
1 523 963 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym CIRCUS
Project An end-to-end verification architecture for building Certified Implementations of Robust, Cryptographically Secure web applications
Researcher (PI) Karthikeyan Bhargavan
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary The security of modern web applications depends on a variety of critical components including cryptographic libraries, Transport Layer Security (TLS), browser security mechanisms, and single sign-on protocols. Although these components are widely used, their security guarantees remain poorly understood, leading to subtle bugs and frequent attacks.
Rather than fixing one attack at a time, we advocate the use of formal security verification to identify and eliminate entire classes of vulnerabilities in one go. With the aid of my ERC starting grant, I have built a team that has already achieved landmark results in this direction. We built the first TLS implementation with a cryptographic proof of security. We discovered high-profile vulnerabilities such as the recent Triple Handshake and FREAK attacks, both of which triggered critical security updates to all major web browsers and TLS libraries.
So far, our security theorems only apply to carefully-written standalone reference implementations. CIRCUS proposes to take on the next great challenge: verifying the end-to-end security of web applications running in mainstream software. The key idea is to identify the core security components of web browsers and servers and replace them by rigorously verified components that offer the same functionality but with robust security guarantees.
Our goal is ambitious and there are many challenges to overcome, but we believe this is an opportune time for this proposal. In response to the Snowden reports, many cryptographic libraries and protocols are currently being audited and redesigned. Standards bodies and software developers are inviting researchers to help analyse their designs and code. Responding to their call requires a team of researchers who are willing to deal with the messy details of nascent standards and legacy code, and at the same time prove strong security theorems based on precise cryptographic assumptions. We are able, we are willing, and the time is now.
Max ERC Funding
1 885 248 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym CLEAN-ICE
Project Detailed chemical kinetic models for cleaner internal combustion engines
Researcher (PI) Frederique Battin-Leclerc
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE8, ERC-2008-AdG
Summary The key objective of this project is to promote cleaner and more efficient combustion technologies through the development of theoretically grounded and more accurate chemical models. This is motivated by the fact that the current models developed for the combustion of constituents of gasoline, kerosene, and diesel fuels do a reasonable job in predicting auto-ignition and flame propagation parameters, and the formation of the main regulated pollutants. However, their success rate deteriorates sharply in the prediction of the formation of minor products (alkenes, dienes, aromatics, aldehydes) and soot nano-particles, which have a deleterious impact on both the environment and human health. At the same time, despite an increasing emphasis on shifting from hydrocarbon fossil fuels to bio-fuels (particularly bioethanol and biodiesel), there is a great lack of chemical models for the combustion of oxygenated reactants. The main scientific focus will then be to enlarge and deepen the understanding of the reaction mechanisms and pathways associated with the combustion of an increased range of fuels (hydrocarbons and oxygenated compounds) and to elucidate the formation of a large number of hazardous minor pollutants. The core of the project is to describe more accurately, at a fundamental level, the reactive chemistry of minor pollutants within extensively validated detailed mechanisms, for not only traditional fuels but also innovative surrogates describing the complex chemistry of new environmentally important bio-fuels. At the level of individual reactions, rate constants, generalized rate-constant classes and molecular data will be enhanced by using techniques based on quantum mechanics and statistical mechanics. Experimental data for validation will be obtained in well-defined laboratory reactors by using analytical methods of increased accuracy.
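Detailed mechanisms of this kind conventionally store each rate constant in modified Arrhenius form, k(T) = A T^n exp(-Ea/RT). A minimal sketch of how one such expression is evaluated follows; the coefficients are invented for illustration and are not taken from the project.

```python
import numpy as np

R = 8.314  # J mol^-1 K^-1, universal gas constant

def modified_arrhenius(T, A, n, Ea):
    """Modified Arrhenius rate constant k(T) = A * T**n * exp(-Ea / (R*T))."""
    return A * T**n * np.exp(-Ea / (R * T))

# Hypothetical coefficients for a single abstraction reaction (illustrative only).
A, n, Ea = 1.0e8, 1.5, 30e3  # pre-exponential, temperature exponent, activation energy (J/mol)

for T in (600.0, 1000.0, 1500.0):  # temperatures spanning auto-ignition conditions
    print(f"T = {T:6.0f} K   k = {modified_arrhenius(T, A, n, Ea):.3e}")
```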
Max ERC Funding
1 869 450 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym CLIM
Project Computational Light fields IMaging
Researcher (PI) Christine GUILLEMOT
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE7, ERC-2015-AdG
Summary Light field technology holds great promise in computational imaging. Light field cameras capture light rays as they interact with physical objects in the scene. The recorded flow of rays (the light field) yields a rich description of the scene, enabling advanced image creation capabilities from a single capture. This technology is expected to bring disruptive changes in computational imaging. However, the trajectory to the deployment of light fields remains cumbersome. Bottlenecks need to be alleviated before its potential can be fully exploited. The barriers that CLIM addresses are the huge amount of high-dimensional (4D/5D) data produced by light fields, the limitations of capturing devices, and editing and image creation capabilities from compressed light fields. These barriers cannot be overcome by a simple application of the methods which have made the success of digital imaging in past decades. The 4D/5D sampling of the geometric distribution of light rays striking the camera sensors implies radical changes in the signal processing chain compared to traditional imaging systems.
The ambition of CLIM is to lay new algorithmic foundations for the 4D/5D light field processing chain, from representation and compression to rendering. Data processing becomes tougher as dimensionality increases, which is the case for light fields compared to 2D images. This leads to the first challenge of CLIM: the development of methods for low-dimensional embedding and sparse representations of 4D/5D light fields. The second challenge is to develop a coding/decoding architecture for light fields which will exploit their geometrical models while preserving the structures that are critical for advanced image creation capabilities. CLIM targets ground-breaking solutions which should open new horizons for a number of consumer and professional markets (photography, augmented reality, light field microscopy, medical imaging, particle image velocimetry).
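One concrete instance of such image-creation capabilities is synthetic refocusing of a 4D light field L(u, v, s, t) by the classical shift-and-add method. The sketch below is illustrative only (integer-pixel shifts, toy random data), not CLIM's algorithm.

```python
import numpy as np

def refocus(lf, alpha):
    """Shift-and-add refocusing of a 4D light field lf[u, v, s, t].

    Each sub-aperture image is shifted in proportion to its angular
    coordinates (u, v) and the results are averaged; alpha selects the
    synthetic focal plane. Integer shifts via np.roll keep the sketch
    minimal (real implementations interpolate sub-pixel shifts).
    """
    U, V, S, T = lf.shape
    out = np.zeros((S, T))
    for u in range(U):
        for v in range(V):
            du = int(round(alpha * (u - U // 2)))
            dv = int(round(alpha * (v - V // 2)))
            out += np.roll(lf[u, v], shift=(du, dv), axis=(0, 1))
    return out / (U * V)

# Toy light field: 5x5 angular samples of a 64x64 random scene.
rng = np.random.default_rng(0)
lf = rng.random((5, 5, 64, 64))
image = refocus(lf, alpha=1.0)  # one synthetic focal plane
```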
Max ERC Funding
2 461 086 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym CoBCoM
Project Computational Brain Connectivity Mapping
Researcher (PI) Rachid DERICHE
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE6, ERC-2015-AdG
Summary One third of the burden of all diseases in Europe is due to diseases affecting the brain. Although exceptional progress has been made in exploring it during the past decades, the brain is still terra incognita and calls for specific research efforts to better understand its architecture and functioning.
CoBCoM is our response to this great challenge of modern science with the overall goal to develop a joint Dynamical Structural-Functional Brain Connectivity Network (DSF-BCN) solidly grounded on advanced and integrated methods for diffusion Magnetic Resonance Imaging (dMRI) and Electro & Magneto-Encephalography (EEG & MEG).
To take up this grand challenge and achieve new frontiers for brain connectivity mapping, we will develop a new generation of computational models and methods for identifying and characterizing the structural and functional connectivities that will be at the heart of the DSF-BCN. Our strategy is to break with the tradition of contributing incrementally and separately to structure or function, and to develop a global approach involving strong interactions between structural and functional connectivities. To overcome the limited view of the brain provided by any single imaging modality, our models will be developed within a rigorous computational framework integrating complementary non-invasive imaging modalities: dMRI, EEG and MEG.
CoBCoM will push far forward the state-of-the-art in these modalities, developing innovative models and ground-breaking processing tools to ultimately provide a joint DSF-BCN solidly grounded on a detailed mapping of brain connectivity, both in space and time.
Capitalizing on the strengths of the dMRI, MEG & EEG methodologies and building on the biophysical and mathematical foundations of our new generation of computational models, CoBCoM will be applied to high-impact diseases, and its ground-breaking computational nature and added clinical value will open new perspectives in neuroimaging.
Max ERC Funding
2 469 123 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym COHEGRAPH
Project Electron quantum optics in Graphene
Researcher (PI) Séverin Preden Roulleau
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE3, ERC-2015-STG
Summary Quantum computing is based on the manipulation of quantum bits (qubits) to enhance the efficiency of information processing. In solid-state systems, two approaches have been explored:
• static qubits, coupled to quantum buses used for manipulation and information transmission,
• flying qubits which are mobile qubits propagating in quantum circuits for further manipulation.
Research on flying qubits led to the recent emergence of the field of electron quantum optics, where electrons play the role of photons in quantum-optics-like experiments. This has recently led to the development of electronic quantum interferometry as well as single-electron sources. As yet, such experiments have only been successfully implemented in semiconductor heterostructures cooled to extremely low temperatures. Realizing electron quantum optics experiments in graphene, an inexpensive material showing a high degree of quantum coherence even at moderately low temperatures, would be strong evidence that quantum computing in graphene is within reach.
One of the most elementary building blocks necessary to perform electron quantum optics experiments is the electron beam splitter, the electronic analog of a beam splitter for light. However, the usual scheme for electron beam splitters in semiconductor heterostructures is not available in graphene because of its gapless band structure. I propose a breakthrough in this direction where a pn junction plays the role of the electron beam splitter. This will lead to the following achievements, considered as important steps towards quantum computing:
• electronic Mach-Zehnder interferometry used to study the quantum coherence properties of graphene,
• two-electron Aharonov-Bohm interferometry used to generate entangled states as an elementary quantum gate (see the flux relation below),
• the implementation of on-demand electronic sources in the GHz range for graphene flying qubits.
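For orientation, the phase difference probed by such two-path interferometers is set by the magnetic flux Φ enclosed between the paths, a standard relation quoted here as background rather than taken from the proposal:

\[
\varphi_{AB} = 2\pi \,\frac{\Phi}{\Phi_0}, \qquad \Phi_0 = \frac{h}{e},
\]

so sweeping the flux traces out interference fringes whose visibility measures the quantum coherence of the device.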
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym COHERENCE
Project Exploiting light coherence in photoacoustic imaging
Researcher (PI) Emmanuel Bossy
Host Institution (HI) UNIVERSITE GRENOBLE ALPES
Call Details Consolidator Grant (CoG), PE7, ERC-2015-CoG
Summary Photoacoustic imaging is an emerging multi-wave imaging modality that couples light excitation to acoustic detection, via the photoacoustic effect (sound generation via light absorption). Photoacoustic imaging provides images of optical absorption (as opposed to optical scattering). In addition, as photoacoustic imaging relies on detecting ultrasound waves that are very weakly scattered in biological tissue, it provides acoustic-resolution images of optical absorption non-invasively at large depths (up to several cm), where purely optical techniques have poor resolution because of multiple scattering. As for conventional purely optical approaches, optical-resolution photoacoustic microscopy can also be performed non-invasively at shallow depths (< 1 mm), or invasively at depth by endoscopic approaches. However, photoacoustic imaging suffers from several limitations. For imaging at greater depths, non-invasive photoacoustic imaging in the acoustic-resolution regime is limited to a depth-to-resolution ratio of about 100, because ultrasound attenuation increases with frequency. Optical-resolution photoacoustic endoscopy has very recently been introduced as a complementary approach, but is currently limited in terms of resolution (> 6 µm) and footprint (diameter > 2 mm).
The overall objective of COHERENCE is to break the above limitations and reach diffraction-limited optical-resolution photoacoustic imaging at depth in tissue in vivo. To do so, the core concept of COHERENCE is to use and manipulate coherent light in photoacoustic imaging. Specifically, COHERENCE will develop novel methods based on speckle illumination, wavefront shaping and super-resolution imaging. COHERENCE will result in two prototypes for tissue imaging, an optical-resolution photoacoustic endoscope for minimally-invasive any-depth tissue imaging, and a non-invasive photoacoustic microscope with enhanced depth-to-resolution ratio, up to optical resolution in the multiply-scattered light regime.
Max ERC Funding
2 116 290 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym COSMIC-DANCE
Project Unraveling the origin of the Initial Mass Function
Researcher (PI) Herve Bouy
Host Institution (HI) UNIVERSITE DE BORDEAUX
Call Details Consolidator Grant (CoG), PE9, ERC-2015-CoG
Summary Despite the tremendous progress achieved over the past decade, the study of stellar formation is far from complete. We have not yet measured the minimum mass for star formation, nor the shape of the IMF down to the least massive free-floating planets, nor do we know how universal this shape is. Although clusters are the building blocks of galaxies, little is known about their early dynamical evolution and dispersal into the field. The main culprit for this state of affairs is the high level of contamination and incompleteness in the sub-stellar regime, even for the best photometric and astrometric surveys.
COSMIC-DANCE aims at overcoming these drawbacks and revealing the shape of the IMF with a precision and completeness surpassing current and foreseeable surveys of the next 15 years. We will:
1) Measure: using a groundbreaking, proven and so far unique method I designed, we will measure proper motions with an accuracy comparable to Gaia but 5 magnitudes deeper, reaching the planetary-mass domain, and, critically, piercing through the dust-obscured young clusters inaccessible to Gaia’s optical sensors.
2) Discover: feeding these proper motions and the multi-wavelength photometry to innovative hyper-dimensional data mining techniques, we will securely identify cluster members within the millions of sources of the COSMIC-DANCE database, complemented by Gaia at the bright end, to obtain the final census over the entire mass spectrum for 20 young nearby clusters, the end of a 60-year quest.
3) Understand: by providing conclusive empirical constraints, over a broad parameter space inaccessible to current state-of-the-art surveys, on the much debated respective contributions of evolutionary effects (dynamics, feedback and competitive accretion) and initial conditions (core properties) to the shape and bottom of the IMF, the most fundamental and informative product of star formation, with essential bearings on many areas of general astrophysics.
Max ERC Funding
1 859 413 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym COSMO_SIMS
Project Astrophysics for the Dark Universe: Cosmological simulations in the context of dark matter and dark energy research
Researcher (PI) Oliver Jens Hahn
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2015-STG
Summary The objective of this ambitious research proposal is to push forward the frontier of computational cosmology by significantly improving the precision of numerical models, keeping pace with the increasing richness and depth of the surveys that aim to shed light on the nature of dark matter and dark energy.
Using new phase-space techniques for the simulation and analysis of dark matter, completely new insights into its dynamics are possible. They allow, for the first time, the accurate simulation of dark matter cosmologies with suppressed small-scale power without artificial fragmentation. Using such techniques, I will establish highly accurate predictions for the properties of dark matter and baryons on small scales and investigate the formation of the first galaxies in non-CDM cosmologies.
Baryonic effects on cosmological observables are a severe limiting factor in interpreting cosmological measurements. I will investigate their impact by identifying the relevant astrophysical processes in relation to the multi-wavelength properties of galaxy clusters and the galaxies they host. This will be enabled by a statistical set of zoom simulations where it is possible to study how these properties correlate with one another, with the assembly history, and how we can derive better models for unresolved baryonic processes in cosmological simulations and thus, ultimately, how we can improve the power of cosmological surveys.
Finally, I will develop a completely unified framework for precision cosmological initial conditions (ICs) that is scalable to both the largest simulations and the highest-resolution zoom simulations. Bringing ICs into the ‘cloud’ will enable new statistical studies using zoom simulations and increase the reproducibility of simulations within the community.
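A minimal sketch of the core step behind such initial-conditions generators (a toy 2D power-law spectrum; production codes use proper transfer functions, physical units and multi-scale stitching): white noise is coloured by sqrt(P(k)) in Fourier space and transformed back to real space.

```python
import numpy as np

def gaussian_field(n, power_index=-2.0, seed=0):
    """Generate an n x n Gaussian random field with P(k) ~ k**power_index.

    White noise is coloured in Fourier space by sqrt(P(k)); a toy
    stand-in for the density perturbations seeding cosmological runs.
    """
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((n, n))
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k = np.sqrt(kx**2 + ky**2)
    k[0, 0] = 1.0                      # avoid division by zero at the DC mode
    amplitude = k**(power_index / 2.0)
    amplitude[0, 0] = 0.0              # zero out the mean
    field_k = np.fft.fft2(noise) * amplitude
    return np.fft.ifft2(field_k).real

delta = gaussian_field(256)  # one realization of the toy overdensity field
```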
My previous work in developing most of the underlying techniques puts me in an excellent position to lead a research group that is able to successfully approach such a wide-ranging and ambitious project.
Max ERC Funding
1 471 382 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym COSMOKEMS
Project EXPERIMENTAL CONSTRAINTS ON THE ISOTOPE SIGNATURES OF THE EARLY SOLAR SYSTEM
Researcher (PI) bernard BOURDON
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2015-AdG
Summary This project aims at simulating the processes that took place in the early Solar System to determine how these processes shaped the chemical and isotope compositions of the solids that accreted to ultimately form the terrestrial planets. Planetary materials exhibit mass-dependent and mass-independent isotope signatures, and their origin and relationships are not fully understood. This proposal will be based on new experiments reproducing the conditions of the solar nebula in its first few million years and on a newly designed Knudsen Effusion Mass Spectrometer (KEMS) that will be built for the purpose of this project. This project consists of three main subprojects: (1) we will simulate the effect of particle irradiation on solids to examine how isotopes can be fractionated by these processes and to identify whether this can explain chemical variations in meteorites; we will also examine whether particle irradiation can cause mass-independent fractionation; (2) the novel KEMS instrument will be used to determine the equilibrium isotope fractionation associated with reactions between gas and condensed phases at high temperature, as well as the kinetic isotope fractionation associated with the evaporation and condensation of solids; this will provide new constraints on the thermodynamic conditions, T, P and fO2, during the heating events that modified the chemical composition of planetary materials, and will also help identify the processes that caused the depletion in volatile elements and the fractionation in refractory elements observed in planetesimals and planets; (3) we will examine the effect of UV irradiation on chemical species in the vapour phase in an attempt to reproduce the isotope compositions observed in meteorites and their components. These results may radically change our view of how the protoplanetary disk evolved and how solids were transported and mixed.
Max ERC Funding
3 106 625 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym CoupledIceClim
Project Coupled climate and Greenland ice sheet evolution:past, present and future
Researcher (PI) Miren Vizcaino Trueba
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Starting Grant (StG), PE10, ERC-2015-STG
Summary The Greenland ice sheet (GrIS) is losing mass at an increasing pace, in response to atmospheric and ocean forcing. The mechanisms leading to the observed mass loss are poorly understood. It is not clear whether the current trends will be sustained into the future, nor how they are affected by regional and global climate variability. In addition, the impacts of Greenland deglaciation on the local and global climate are not well known. This project aims to explain the relationship between GrIS surface melt trends and climate variability, to determine the timing and impacts of a multi-century deglaciation of Greenland, and to explain the relationship between the ongoing deglaciation and previous ones during the last interglacial and the Holocene. For this purpose, we will use the Community Earth System Model (CESM), the first full-complexity global climate model to include interactive ice sheet flow and a realistic, physically based simulation of surface mass balance (the difference between surface accumulation and losses from runoff and sublimation). This tool will include, for the first time, a large range of temporal and spatial scales of ice sheet-climate interaction in the same model. Previous work has been done with oversimplified and/or uncoupled representations of ice sheet and climate processes, for instance with simplified ocean and/or atmospheric dynamics in Earth System Models of Intermediate Complexity, with fixed topography and prescribed ocean components in Regional Climate Models, or with highly parameterized snow albedo and/or melt schemes in General Circulation Models. This project will provide new insights into the coupling between the GrIS and climate change, will lead to the widespread integration of ice sheets as a new and indispensable component of complex Earth System Models, and will advance our understanding of present and past climate dynamics.
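Written out, the parenthetical definition of surface mass balance above is simply

\[
\mathrm{SMB} = \text{accumulation} - \text{runoff} - \text{sublimation},
\]

evaluated per unit area and integrated over the ice sheet surface.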
Max ERC Funding
1 677 282 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym CoVeCe
Project Coinduction for Verification and Certification
Researcher (PI) Damien Gabriel Jacques Pous
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Software and hardware bugs cost companies and administrations hundreds of millions of euros every year. Formal methods like verification provide automatic means of finding some of these bugs. Certification, using proof assistants like Coq or Isabelle/HOL, makes it possible to guarantee the absence of bugs (up to a certain point).
These two kinds of tools are crucial in order to design safer programs and machines. Unfortunately, state-of-the-art tools are not yet satisfactory. Verification tools often face state-explosion problems and require more efficient algorithms; certification tools need more automation: they currently require too much time and expertise, even for basic tasks that could be handled easily through verification.
In recent work with Bonchi, we have shown that an extremely simple idea from concurrency theory could give rise to algorithms that are often exponentially faster than the algorithms currently used in verification tools.
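The summary does not spell the idea out; as a hedged illustration of the coinductive style it builds on, here is a minimal on-the-fly equivalence check for deterministic automata, which grows a candidate bisimulation instead of minimizing the machines (the up-to-congruence optimizations responsible for the exponential speed-ups are deliberately omitted):

```python
from collections import deque

def equivalent(s1, s2, accept, delta, alphabet):
    """Coinductively check language equivalence of two DFA states.

    accept(q) -> bool and delta(q, a) -> q' define the automaton.
    Pairs of reachable states are assumed bisimilar until an
    acceptance mismatch refutes the assumption; if exploration
    terminates without a mismatch, the visited set is a bisimulation.
    """
    visited = set()
    todo = deque([(s1, s2)])
    while todo:
        p, q = todo.popleft()
        if (p, q) in visited:
            continue
        if accept(p) != accept(q):
            return False  # concrete counterexample: the languages differ
        visited.add((p, q))
        for a in alphabet:
            todo.append((delta(p, a), delta(q, a)))
    return True

# Two toy machines accepting words over {a, b} with an even number of 'a's:
# a 2-state counter and a redundant 4-state counter. States are (name, int).
def delta(state, a):
    name, q = state
    n = 2 if name == "m2" else 4
    return (name, (q + 1) % n) if a == "a" else state

def accept(state):
    return state[1] % 2 == 0

print(equivalent(("m2", 0), ("m4", 0), accept, delta, "ab"))  # True
```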
My claim is that this idea could scale to richer models, revolutionising existing verification tools and providing algorithms for problems whose decidability is still open.
Moreover, the expected simplicity of those algorithms will make it possible to implement them inside certification tools such as Coq, to provide powerful automation techniques based on verification techniques. In the end, we will thus provide efficient and certified verification tools going beyond the state-of-the-art, but also the ability to use such tools inside the Coq proof assistant, to alleviate the cost of certification tasks.
Max ERC Funding
1 407 413 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym CRESUCHIRP
Project Ultrasensitive Chirped-Pulse Fourier Transform mm-Wave Detection of Transient Species in Uniform Supersonic Flows for Reaction Kinetics Studies under Extreme Conditions
Researcher (PI) Ian SIMS
Host Institution (HI) UNIVERSITE DE RENNES I
Call Details Advanced Grant (AdG), PE4, ERC-2015-AdG
Summary This proposal aims to develop a combination of a chirped-pulse (sub)mm-wave rotational spectrometer with uniform supersonic flows generated by the expansion of gases through Laval nozzles, and to apply it to problems at the frontiers of reaction kinetics.
The CRESU (Reaction Kinetics in Uniform Supersonic Flow) technique, combined with laser photochemical methods, has been applied with great success to perform research in gas-phase chemical kinetics at low temperatures, of particular interest for astrochemistry and cold planetary atmospheres. Recently, the PI has been involved in the development of a new combination of the revolutionary chirped pulse broadband rotational spectroscopy technique invented by B. Pate and co-workers with a novel pulsed CRESU, which we have called Chirped Pulse in Uniform Flow (CPUF). Rotational cooling by frequent collisions with cold buffer gas in the CRESU flow at ca. 20 K drastically increases the sensitivity of the technique, making broadband rotational spectroscopy suitable for detecting a wide range of transient species, such as photodissociation or reaction products.
We propose to exploit the exceptional quality of the Rennes CRESU flows to build an improved CPUF instrument (only the second worldwide), and to use it for the quantitative determination of product branching ratios in elementary chemical reactions over a wide temperature range (data which are sorely lacking as input to models of gas-phase chemical environments), as well as for the detection of reactive intermediates and the testing of modern reaction kinetics theory. Low-temperature reactions will be targeted initially, as it is here that the need for data is greatest. A challenging development of the technique towards the study of high-temperature reactions is also proposed, exploiting existing expertise in high-enthalpy sources.
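A minimal numerical sketch of the Fourier-transform detection step at the heart of such a spectrometer (line frequencies, dephasing time and sampling rate below are invented, in arbitrary units): the free-induction decay radiated by the polarized molecules is digitized and its FFT recovers the rotational lines.

```python
import numpy as np

# Simulate a free-induction decay (FID): a sum of damped cosines at
# hypothetical rotational transition frequencies (arbitrary units).
fs = 2000.0                                # sampling rate
t = np.arange(0, 2.0, 1.0 / fs)            # time axis
lines = [(220.0, 1.0), (410.0, 0.6)]       # (frequency, amplitude), illustrative
fid = sum(a * np.cos(2 * np.pi * f * t) for f, a in lines) * np.exp(-t / 0.3)

# Fourier-transform the FID to recover the spectrum.
spectrum = np.abs(np.fft.rfft(fid))
freqs = np.fft.rfftfreq(len(fid), 1.0 / fs)
for f, a in lines:
    k = np.argmin(np.abs(freqs - f))
    print(f"peak near {freqs[k]:.1f} (input {f})")
```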
Max ERC Funding
2 100 230 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym DEEPVISION
Project Information-age microscopy for deep vision imaging of biological tissue
Researcher (PI) Ivo Micha Vellekoop
Host Institution (HI) UNIVERSITEIT TWENTE
Call Details Starting Grant (StG), PE7, ERC-2015-STG
Summary Modern biology could not exist without the optical microscope. Hundreds of years of research have seemingly developed microscopes to perfection, with one essential limitation: in turbid biological tissue, not even the most advanced microscope can penetrate deeper than a fraction of a millimetre. At larger depths light scattering prevents the formation of an image. DEEP VISION takes a radically new approach to microscopy in order to lift this final limitation.
Microscopes are based on the idea that light propagates along a straight line. In biological tissue, however, this picture is naive: light is scattered by every structure in the specimen. Since the amount of ‘non-scattered’ light decreases exponentially with depth, a significant improvement of the imaging depth is fundamentally impossible, unless scattered light itself is used for imaging.
In 2007, Allard Mosk and I pioneered the field of wavefront shaping. The game-changing message of wavefront shaping is that scattering is not a fundamental limitation for imaging: using a spatial light modulator, light can be focused even inside the most turbid materials, if ‘only’ the correct wavefront is known.
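The principle is simple enough to sketch. Below is a hedged toy model of the sequential optimisation used in early wavefront-shaping experiments: the field at a target speckle is taken to be one row of a random transmission matrix (an illustrative assumption, not the project's imaging model), and each SLM segment's phase is stepped in turn, keeping the value that maximizes the focus intensity.

```python
import numpy as np

# Toy model: the field at one target speckle behind a scattering layer is a
# random linear combination of the SLM segments (one transmission-matrix row,
# drawn as complex Gaussians). Assumed for illustration only.
rng = np.random.default_rng(0)
n_segments = 64
t = (rng.standard_normal(n_segments)
     + 1j * rng.standard_normal(n_segments)) / np.sqrt(2)

def focus_intensity(phases):
    return abs(np.sum(t * np.exp(1j * phases))) ** 2

# Sequential algorithm: step each segment's phase, keep the best value.
phases = np.zeros(n_segments)
trial_phases = np.linspace(0.0, 2.0 * np.pi, 8, endpoint=False)
for i in range(n_segments):
    scores = []
    for phi in trial_phases:
        phases[i] = phi
        scores.append(focus_intensity(phases))
    phases[i] = trial_phases[int(np.argmax(scores))]

# The achievable enhancement over the initial speckle grows roughly linearly
# with the number of controlled segments (about pi/4 * N in the ideal case).
print("intensity enhancement:", focus_intensity(phases) / n_segments)
```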
DEEP VISION aims to initiate a fundamental change in how we think about microscopy: to use scattered light rather than straight rays for imaging. The microscope of the future is no longer based on Newtonian optics. Instead, it combines new insights in scattering physics, wavefront shaping, and compressed sensing to extract all useful information from a specimen.
Whereas existing microscopes are ignorant of the nature of the specimen, DEEP VISION is inspired by information theory; imaging revolves around a model that integrates observations with statistical a priori information about the tissue. This model is used to calculate the wavefronts for focusing deeper into the specimen. Simulations indicate that my approach will penetrate at least four times deeper than existing microscopes, without loss of resolution.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym DigitalDoctors
Project Making Clinical Sense: A comparative study of how doctors learn in digital times
Researcher (PI) Anna Harris
Host Institution (HI) UNIVERSITEIT MAASTRICHT
Call Details Starting Grant (StG), SH3, ERC-2015-STG
Summary Digital technologies are reconfiguring medical practices in ways we still don’t understand. This research project seeks to examine the impact of the digital in medicine by studying the role of pedagogical technologies in how doctors learn the skills of their profession. It focuses on the centuries-old skill of physical examination: a sensing of the body, through the body. Increasingly, medical students are learning these skills away from the bedside, through videos, simulated models and in laboratories. My research team will interrogate how learning with these technologies affects how doctors learn to sense bodies. Through the rich case of doctors-in-training, the study addresses a key challenge in social scientific scholarship regarding how technologies, particularly those digital and virtual, are implicated in bodily, sensory knowing of the world. Our research takes a historically-attuned comparative anthropology approach, advancing the social study of medicine and medical education research in three new directions. First, a team of three ethnographers will attend to both spectacular and mundane technologies in medical education, recognising that everyday learning situations are filled with technologies old and new. Second, it offers the first comparative social study of medical education, with fieldwork in three materially and culturally different settings in Western and Eastern Europe and West Africa. Finally, the study brings historical and ethnographic research of technologies closer together, with a historian conducting oral histories and archival research at each site. Findings will have impact in the social sciences and education research by advancing understanding of how the digital and other technologies are implicated in skills learning. The study will develop novel digital-sensory methodologies and, boldly, a new theory of techno-perception. These academic contributions will have practical relevance by improving the training of doctors in digital times.
Max ERC Funding
1 361 507 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym DOPING-ON-DEMAND
Project Doping on Demand: precise and permanent control of the Fermi level in nanocrystal assemblies
Researcher (PI) Arjan Houtepen
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Starting Grant (StG), PE4, ERC-2015-STG
Summary The aim of the work proposed here is to develop a completely new method to electronically dope assemblies of semiconductor nanocrystals (a.k.a. quantum dots, QDs), and porous semiconductors in general. External dopants are added on demand in the form of electrolyte ions in the voids between QDs. These ions will be introduced via electrochemical charge injection, and will subsequently be immobilized by (1) freezing the electrolyte solvent at room temperature or (2) chemically linking the ions to ligands on the QD surface, or by a combination of both. Encapsulating doped QD films using atomic layer deposition will provide further stability. This will result in stable doped nanocrystal assemblies with a constant Fermi level that is controlled by the potential set during electrochemical charging.
QDs are small semiconductor crystals with size-tunable electronic properties that are considered promising materials for a range of opto-electronic applications. Electronic doping of QDs remains a big challenge even after two decades of research into this area. At the same time, it is highly desirable to dope QDs in a controlled way for applications such as LEDs, FETs and solar cells. This research project will provide unprecedented control over the doping level in QD films and will constitute a major step in the optimization of optoelectronic devices based on QDs. The “Doping-on-Demand” approach will be exploited to develop degenerately doped, low-threshold QD lasers that can be operated under continuous-wave excitation, and QD laser diodes that use electrical injection of charge carriers. The precise control of the Fermi level will further be used to optimize p-i-n junction QD solar cells and to develop, for the first time, QD p-n junction solar cells with precise control over the Fermi levels.
Max ERC Funding
1 497 842 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym DYNAMIQS
Project Relaxation dynamics in closed quantum systems
Researcher (PI) Marc Cheneau
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2015-STG
Summary Statistical mechanics, a century-old theory, is probably one of the most powerful constructions of physics. It predicts that the equilibrium properties of any system composed of a large number of particles depend only on a handful of macroscopic parameters, no matter how the particles interact with each other. But the question of how many-body systems relax towards such equilibrium states remains largely unsolved. This problem is especially acute for quantum systems, which evolve in a much larger mathematical space than the classical space-time and obey non-local equations of motion. Despite the formidable complexity of quantum dynamics, recent theoretical advances have put forward a very simple picture: the dynamics of closed quantum many-body systems would be essentially local, meaning that it would take a finite time for correlations between two distant regions of space to reach their equilibrium value. This locality would be an emergent collective property, similar to spontaneous symmetry breaking, and have its origin in the propagation of quasiparticle excitations. The fact is, however, that only a few observations directly confirm this scenario. In particular, the role played by dimensionality and interaction range is largely unknown. The concept of this project is to take advantage of the great versatility offered by ultracold atom systems to investigate experimentally the relaxation dynamics in regimes well beyond the boundaries of our current knowledge. We will focus our attention on two-dimensional systems with both short- and long-range interactions, whereas all previous experiments were restricted to one-dimensional systems. The realisation of the project will hinge on the construction of a new-generation quantum gas microscope experiment for strontium gases. Amongst the innovative techniques that we will implement is the electronic-state hybridisation with Rydberg states, known as Rydberg dressing.
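For context, the "light-cone" locality described above is commonly formalised through Lieb-Robinson-type bounds, a standard result of the field rather than a claim specific to this project: commutators of operators supported on regions X and Y remain exponentially small outside an effective cone of velocity v,

```latex
% Lieb-Robinson bound (standard form, quoted for context):
\big\| [A_X(t),\, B_Y] \big\| \;\le\; C\, \|A_X\|\, \|B_Y\|\,
  e^{-\mu \left( d(X,Y) - v\,|t| \right)}
```

In the quasiparticle picture this implies that correlations between two regions a distance d apart are expected to settle to their equilibrium value only after a time of order d/(2v), where v is the maximal group velocity of the excitations.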
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym E-motion
Project Electro-motion for the sustainable recovery of high-value nutrients from waste water
Researcher (PI) Louis Cornelia Patrick Maria de Smet
Host Institution (HI) WAGENINGEN UNIVERSITY
Call Details Consolidator Grant (CoG), PE8, ERC-2015-CoG
Summary Current water treatment technologies mainly aim to improve the quality of water. High-value nutrients, such as nitrate and phosphate ions, often remain present in waste streams. Electro-driven separation processes offer a sustainable way to recover these nutrients. Ion-selective polymer membranes are a strong candidate for achieving selectivity in such processes.
The aim of E-motion is to chemically modify porous electrodes with membranes to introduce selectivity in electro-driven separation processes. New, ultrathin ion-selective films will be designed, synthesized and characterized. The films will be made by successively adsorbing polycations and polyanions onto the electrodes. Selectivity will be introduced by the incorporation of ion-selective receptors. The adsorbed multilayer films will be studied in detail regarding their stability, selectivity and transport properties under varying experimental conditions of salinity, pH and applied electrical field, under both adsorption and desorption conditions.
The first main challenge is to optimize and to understand the film architecture in terms of 1) stability under an applied electrical field and 2) ability to facilitate ion transport. The influence of ion charge and ion size on the transport dynamics will also be addressed. E-motion focuses on phosphate ions, which are particularly challenging owing to their large size and pH-dependent speciation, and on the development of phosphate-selective materials. Theoretical modelling of the solubility equilibria and electrical double layers will be pursued to frame the details of the electrosorption of phosphate.
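To illustrate the pH-dependent speciation mentioned above, the short sketch below computes the equilibrium fractions of the four phosphate species from textbook acid dissociation constants (standard handbook values, not data from this project):

```python
import numpy as np

# Fractions of H3PO4 / H2PO4- / HPO4(2-) / PO4(3-) versus pH, from the
# three dissociation constants of phosphoric acid (textbook pKa values).
pKa = np.array([2.15, 7.20, 12.35])
Ka = 10.0 ** (-pKa)

def fractions(pH):
    h = 10.0 ** (-pH)
    # cumulative products: [1, Ka1/h, Ka1*Ka2/h^2, Ka1*Ka2*Ka3/h^3]
    terms = np.cumprod(np.concatenate(([1.0], Ka / h)))
    return terms / terms.sum()

for pH in (4.0, 7.0, 10.0):
    print(pH, np.round(fractions(pH), 3))
# Near neutral pH the mono- and dianion coexist in comparable amounts, which
# is why speciation (and hence ion selectivity) is so sensitive to operating pH.
```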
E-motion represents a major step forward in the selective recovery of nutrients from water in a cost-effective, chemical-free way at high removal efficiency. The proposed surface modification strategies, and the increased understanding of ion transport and ionic interactions in membrane media, also offer applications in the areas of batteries, fuel cells and solar fuel devices.
Max ERC Funding
1 950 000 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym ELAB4LIFE
Project eLab4Life: Electr(ochem)ical Labs-on-a-Chip for Life Sciences
Researcher (PI) Albert Van Den Berg
Host Institution (HI) UNIVERSITEIT TWENTE
Call Details Advanced Grant (AdG), PE7, ERC-2008-AdG
Summary We propose the development of new electrochemical techniques for health and life sciences applications in Lab-on-a-Chip devices. A Scanning ElectroChemical Microscope (SECM) will be used to study surface properties, such as the local consumption and/or release of electroactive chemical compounds by (single) cells, by electrochemical sensing; new detection methods for proteins will use redox cycling; and new separation methods for DNA will exploit nanoscale electrical field gradients. The ability to generate and control electrical fields (and gradients) at the scale of the size of biomolecules using nanostructures, and the simple translation of novel electrical methods into practical Lab-on-a-Chip devices, will create a breakthrough in bioanalytical methods. The knowledge and expertise obtained from SECM experimentation will be used to design and realize Labs-on-a-Chip that can be used for the efficient production of drugs by electrofused cells, for early biomarker detection using nanowires and nano-spaced electrodes (Point-of-Care application), and for rapid DNA analysis using nanofluidic structures. Besides this, the results can have great benefits for the study of embryonic cell growth and for advanced tissue engineering. The results will be translated into devices and systems that can be used in Point-of-Care (POC) applications and will bring this area a big step closer to successful commercialization.
Max ERC Funding
2 382 442 €
Duration
Start date: 2008-12-01, End date: 2013-10-31
Project acronym Emergent-BH
Project Emergent spacetime and maximally spinning black holes
Researcher (PI) Monica Guica
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE2, ERC-2015-STG
Summary One of the greatest challenges of theoretical physics is to understand the fundamental nature of gravity and how it is reconciled with quantum mechanics. Black holes indicate that gravity is holographic, i.e. it is emergent, together with some of the spacetime dimensions, from a lower-dimensional field theory. The emergence mechanism has just started to be understood in certain special contexts, such as AdS/CFT. However, very little is known about it for the spacetime backgrounds relevant to the real world, due mainly to our lack of knowledge of the underlying field theories.
My goal is to uncover the fundamental nature of spacetime and gravity in our universe by: i) formulating and working out the properties of the relevant lower-dimensional field theories and ii) studying the mechanism by which spacetime and gravity emerge from them. I will address the first problem by concentrating on the near-horizon regions of maximally spinning black holes, for which the dual field theories greatly simplify and can be studied using a combination of conformal field theory and string theory methods. To study the emergence mechanism, I plan to adapt the tools that were successfully used to understand emergent gravity in anti-de Sitter (AdS) spacetimes - such as holographic quantum entanglement and the conformal bootstrap - to non-AdS, more realistic spacetimes.
Max ERC Funding
1 495 476 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym EQEC
Project Engineering Quantum Error Correction
Researcher (PI) Barbara Terhal
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Consolidator Grant (CoG), PE2, ERC-2015-CoG
Summary This proposal will advance the theory of quantum error correction towards its application in real physical devices, in particular superconducting transmon qubit systems. The research will result in proposals for experiments: how to use physical qubits to redundantly represent logical quantum information and how error information can be obtained and classically processed. The research will consider novel ways of using transmon qubits to achieve a universal fault-tolerant surface code architecture. The research will produce a design of a universal fault-tolerant architecture in which qubits are encoded in the electromagnetic field of a (microwave) cavity. Research will also focus on mathematical and numerical studies in quantum error correction which are technology-independent, but shed light on coding overhead, decoding efficiency and logical universality.
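As a toy illustration of the "measure parities, then decode classically" loop that surface-code architectures scale up, the sketch below implements syndrome extraction and lookup decoding for the three-qubit bit-flip repetition code; it is a textbook example, not the architecture proposed here.

```python
# Three-qubit bit-flip repetition code: logical 0 -> 000, logical 1 -> 111.
# Two parity checks (Z1Z2 and Z2Z3) locate any single bit-flip error
# without revealing the encoded logical value itself.

def syndrome(bits):
    """Parities that ancilla-assisted measurements would report."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

# syndrome -> index of the flipped qubit (None: no correction needed)
LOOKUP = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}

def correct(bits):
    flip = LOOKUP[syndrome(bits)]
    if flip is not None:
        bits[flip] ^= 1
    return bits

print(correct([0, 1, 0]))  # single error on the middle qubit -> [0, 0, 0]
print(correct([1, 1, 0]))  # error on the last qubit of logical 1 -> [1, 1, 1]
```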
Max ERC Funding
1 786 563 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym EUREC4A
Project Elucidating the Role of Clouds-Circulation Coupling in Climate
Researcher (PI) Sandrine Bony
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2015-AdG
Summary This proposal focuses on two of climate science’s most fundamental questions: How sensitive is Earth's surface temperature to radiative forcing? and What governs the organization of the atmosphere into rain bands, cloud clusters and storms? These seemingly different questions are central to an ability to assess climate change on regional and global scales, and are in large part tied to a single and critical gap in our knowledge: A poor understanding of how clouds and atmospheric circulations interact.
To fill this gap, my goal is to answer three questions, which are critical to an understanding of cloud-circulation coupling and its role in climate: (i) How strongly is the low-cloud response to global warming controlled by atmospheric circulations within the first few kilometres of the atmosphere? (ii) What controls the propensity of the atmosphere to aggregate into clusters or rain bands, and what role does it play in the large-scale atmospheric circulation and in climate sensitivity? (iii) How much do cloud-radiative effects influence the frequency and strength of extreme events?
I will address these questions by organising the first airborne field campaign focused on elucidating the interplay between low-level clouds and the small-scale and large-scale circulations in which they are embedded, as this is key for questions (i) and (ii), by analysing data from other field campaigns and satellite observations, and by conducting targeted numerical experiments with a hierarchy of models and configurations.
This research stands a very good chance to reduce the primary source of the forty-year uncertainty in climate sensitivity, to demystify long-standing questions of tropical meteorology, and to advance the physical understanding and prediction of extreme events. EUREC4A will also support, motivate and train a team of young scientists to exploit the synergy between observational and modelling approaches to answer pressing questions of atmospheric and climate science.
Max ERC Funding
3 013 334 €
Duration
Start date: 2016-08-01, End date: 2021-07-31
Project acronym EXCITERS
Project Extreme Ultraviolet Circular Time-Resolved Spectroscopy
Researcher (PI) Yann Mairesse
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE2, ERC-2015-CoG
Summary Chiral molecules exist as two forms, so-called enantiomers, which have essentially the same physical and chemical properties and can only be distinguished via their interaction with a chiral system, such as circularly polarized light. Many biological processes are chiral-sensitive and unraveling the dynamical aspects of chirality is of prime importance for chemistry, biology and pharmacology. Studying the ultrafast electron dynamics of chiral processes requires characterization techniques at the attosecond (10^-18 s) time-scale.
Molecular attosecond spectroscopy has the potential to resolve the couplings between electronic and nuclear degrees of freedom in such chiral chemical processes. There are, however, two major challenges: the generation of chiral attosecond light pulses, and the development of highly sensitive chiral discrimination techniques for time-resolved spectroscopy in the gas phase.
This ERC research project aims at developing vectorial attosecond spectroscopy using elliptical strong fields and circular attosecond pulses, and to apply it for the investigation of chiral molecules. To achieve this, I will (1) establish a new type of highly sensitive chiroptical spectroscopy using high-order harmonic generation by elliptical laser fields; (2) create and characterize sources of circular attosecond pulses; (3) use trains of circularly polarized attosecond pulses to probe the dynamics of photoionization of chiral molecules and (4) deploy ultrafast dynamical measurements to address the link between nuclear geometry and electronic chirality.
The developments from this project will set a landmark in the field of chiral recognition. They will also completely change the way ellipticity is considered in attosecond science and have an impact far beyond the study of chiral compounds, opening new perspectives for the resolution of the fastest dynamics occurring in polyatomic molecules and solid state physics.
Max ERC Funding
1 691 865 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym ExCoMet
Project CONTROLLING AND MEASURING RELATIVISTIC MOTION OF MATTER WITH ULTRAINTENSE STRUCTURED LIGHT
Researcher (PI) Fabien, Hervé, Jean QUERE
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE2, ERC-2015-AdG
Summary Femtosecond lasers can now provide intensities such that the light field induces relativistic motion of large ensembles of electrons. The ultimate goal of this Ultra-High Intensity (UHI) Physics is the control of relativistic motion of matter with light, which requires a deep understanding of this extreme regime of laser-matter interaction. Such control holds the promise of major scientific and societal applications, by providing ultra-compact laser-driven particle accelerators and attosecond X-ray sources. Until now, advances in UHI Physics have relied on a quest for the highest laser intensities, pursued by focusing optimally-compressed laser pulses to their diffraction limit. In contrast, the goal of the ExCoMet project is to establish a new paradigm, by demonstrating the potential of driving UHI laser-plasma interactions with sophisticated structured laser beams, i.e. beams whose amplitude, phase or polarization are shaped in space-time.
Based on this new paradigm, we will show that unprecedented experimental insight can be gained on UHI laser-matter interactions. For instance, by using laser fields whose propagation direction rotates on a femtosecond time scale, we will temporally resolve the synchrotron emission of laser-driven relativistic electrons in plasmas, and thus gather direct information on their dynamics. We will also show that such structured laser fields can be exploited to introduce new physics in UHI experiments, and can provide advanced degrees of control that will be essential for future light and particle sources based on these interactions. Using Laguerre-Gauss beams, we will in particular investigate the transfer of orbital angular momentum from UHI lasers to plasmas, and its consequences on the physics and performance of laser-plasma accelerators. This project thus aims at bringing conceptual breakthroughs in UHI physics, at a time when major projects relying on this physics are being launched, in particular in Europe.
Max ERC Funding
2 250 000 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym EXO-ATMOS
Project Exploring the Plurality of New Worlds: Their Origins, Climate and Habitability
Researcher (PI) Jean-Michel Lucien-Bernard Desert
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Starting Grant (StG), PE9, ERC-2015-STG
Summary Recent surveys have revealed an amazing, and yet unexplained, diversity of planets orbiting other stars. The key to understanding and exploiting this diversity is to study their atmospheres. This is because exoplanets’ atmospheres are unique laboratories that hold the potential to transform our understanding of planet formation, physics, and habitability. This is a new opportunity to place the Solar System and the Earth’s ecosystem in a broader context; one of the main goals of modern astrophysics.
The aim of this proposal is to leverage exoplanet detections, as well as observational capabilities and theoretical frameworks, to deepen and broaden our understanding of planetary physics. This project will transform the field of exoplanet atmospheres by contributing to three major advances. We will: i) push exoplanet characterization to new frontiers by providing the largest in-depth study of atmospheres through the measurement of precise spectra, and the retrieval of their composition, in order to constrain their origins; ii) reveal global exo-climates for the first time through a novel method to probe atmospheric structure and dynamics; and iii) pioneer an innovative approach that uses robotic small telescopes to estimate the impact of stellar radiation on atmospheres, with a particular focus on their habitability. These objectives will be achieved via an ambitious portfolio of cutting-edge observations, combined with state-of-the-art modelling for their interpretation. Their accomplishment would be a major breakthrough, culminating in a comprehensive comparative exoplanetology, which in turn will open up new key discoveries in planetary formation and evolution. Our expertise will also enable predictions on the conditions for habitability and direct the search for atmospheric biosignatures with upcoming capabilities. The impact of our discoveries will go well beyond the scientific community, since the quest for our origins is of interest to all mankind.
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym EXOPLANETBIO
Project Exoplanet atmospheres as indicators of life: From hot gas giants to Earth-like planets
Researcher (PI) Ignas Snellen
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Advanced Grant (AdG), PE9, ERC-2015-AdG
Summary This is a proposal to use a new ground-breaking spectroscopic technique to study the atmospheres of extrasolar planets. Understanding planet atmospheric processes and their evolutionary histories is crucial for unambiguously identifying biomarker gases, and forms the main driver behind the enormous surge in exoplanet atmospheric research.
I propose to lead a program using the new VLT CRIRES+ instrument, which will focus on the new ground-breaking developments in ground-based high-dispersion spectroscopy, in which my work plays a leading role. We successfully determined the dominant spectroscopically-active species in hot Jupiter atmospheres (e.g. Brogi, Snellen et al. Nature 2012), provided the first evidence for high-altitude winds (Snellen et al. Nature 2010), and determined for the first time the spin-rotation rate of a young gas-giant planet (Snellen et al. Nature 2014), pioneering a technique that combines high-dispersion spectroscopy with high-contrast imaging.
The new CRIRES+ spectrograph at the VLT (2017) will have a revolutionary impact in the field, changing the main focus of current atmospheric research from hot 1000-1500 K gas giants to cooler 400-700 K Neptunes and super-Earths. With this new instrument, I will 1) make a large inventory of planet spin rates as a function of planet mass and age; 2) probe the atmospheres of cool super-Earths above the cloud deck for the first time, solving for their bulk compositions; 3) determine the vertical and longitudinal atmospheric temperature profiles of hot Jupiters, and obtain a complete inventory of the C- and O-bearing molecules in their upper atmospheres; and 4) for the first time probe isotope ratios in exoplanet atmospheres. This project will be an important stepping stone in developing high-dispersion spectroscopic techniques for studying Earth-like exoplanets with the European Extremely Large Telescope.
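For the non-specialist, the core of the high-dispersion technique is a cross-correlation between the observed spectrum and a Doppler-shifted molecular template: the planet's many weak lines add up coherently at its true radial velocity. The sketch below is a deliberately simplified illustration (the sinusoidal "template" and all numbers are assumptions, not project data):

```python
import numpy as np

c = 299792.458                        # speed of light (km/s)
wave = np.linspace(2.30, 2.35, 4000)  # wavelength grid (microns, K band)

def template(w):
    # Stand-in for a molecular line template (e.g. CO); purely illustrative.
    return np.sin(8000.0 * w)

rng = np.random.default_rng(1)
v_true = 15.0                         # injected planet radial velocity (km/s)
spectrum = 1e-3 * template(wave * (1.0 - v_true / c))  # weak, Doppler-shifted
spectrum += rng.normal(0.0, 0.01, wave.size)           # noise dominates per pixel

# Cross-correlate against the template shifted to each trial velocity; the
# buried signal emerges because thousands of pixels are summed coherently.
velocities = np.arange(-100.0, 100.0, 1.0)
ccf = np.array([np.dot(spectrum, template(wave * (1.0 - v / c)))
                for v in velocities])
print("CCF peaks near the injected velocity:", velocities[ccf.argmax()], "km/s")
```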
Max ERC Funding
2 300 962 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym FACTORY
Project New paradigms for latent factor estimation
Researcher (PI) Cédric Févotte
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary Data is often available in matrix form, in which columns are samples, and processing such data often entails finding an approximate factorisation of the matrix into two factors. The first factor yields recurring patterns characteristic of the data. The second factor describes in which proportions each data sample is made of these patterns. Latent factor estimation (LFE) is the problem of finding such a factorisation, usually under given constraints. LFE appears under other domain-specific names such as dictionary learning, low-rank approximation, factor analysis or latent semantic analysis. It is used for tasks such as dimensionality reduction, unmixing, soft clustering, coding or matrix completion in very diverse fields.
In this project, I propose to explore three new paradigms that push the frontiers of traditional LFE. First, I want to break beyond the ubiquitous Gaussian assumption, a practical choice that too rarely complies with the nature and geometry of the data. Estimation in non-Gaussian models is more difficult, but recent work in audio and text processing has shown that it pays off in practice. Second, in traditional settings the data matrix is often a collection of features computed from raw data. These features are computed with generic off-the-shelf transforms that loosely preprocess the data, setting a limit to performance. I propose a new paradigm in which an optimal low-rank-inducing transform is learnt together with the factors in a single step. Third, I show that the dominant deterministic approach to LFE should be reconsidered and I propose a novel statistical estimation paradigm, based on the marginal likelihood, with enhanced capabilities. The new methodology is applied to real-world problems with societal impact in audio signal processing (speech enhancement, music remastering), remote sensing (Earth observation, cosmic object discovery) and data mining (multimodal information retrieval, user recommendation).
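A minimal sketch of LFE in this matrix form, using nonnegative matrix factorisation with the classic multiplicative updates of Lee and Seung for the Euclidean objective; this is one standard (and Gaussian-noise) instance, chosen only to make the two-factor picture concrete, not the project's methodology:

```python
import numpy as np

rng = np.random.default_rng(0)
V = rng.random((50, 200))  # data matrix: 200 samples (columns), 50 features
K = 5                      # number of latent patterns

W = rng.random((50, K))    # factor 1: recurring patterns in the data
H = rng.random((K, 200))   # factor 2: proportions of each pattern per sample
eps = 1e-9                 # guards against division by zero

for _ in range(200):       # Lee-Seung multiplicative updates for ||V - WH||^2
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

print("relative error:", np.linalg.norm(V - W @ H) / np.linalg.norm(V))
```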
Max ERC Funding
1 931 776 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym FAILFLOW
Project Failure and Fluid Flow in Porous Quasibrittle Materials
Researcher (PI) Gilles Pijaudier-Cabot
Host Institution (HI) UNIVERSITE DE PAU ET DES PAYS DE L'ADOUR
Call Details Advanced Grant (AdG), PE8, ERC-2008-AdG
Summary This project focuses on fluid flow in porous materials with evolving microstructure, in the context of civil engineering applications and geomechanics. When the distribution of cracks and the distribution of pore sizes evolve in concrete and rocks, their influence on the permeability, for a single- or multiphase fluid flow, needs in-depth investigation. A recent review of the state of the art in modelling progressive mechanical breakdown and associated fluid flow in heterogeneous rock shows that little is known about the coupled effects between micro-cracking and the intrinsic permeability of the solid phase. The present project tackles this relationship between mechanical breakdown and associated fluid flow in the context of poromechanics extended to non-local modelling. In particular, we will investigate how the internal length, which plays a pivotal role at the inception and propagation of material failure, may interact with the permeability; what enhanced Darcy-like relationship might be derived in order to capture such effects; and how to model fluid flow in tight porous materials. The models will be extended to complex and multicomponent systems reproducing as closely as possible the behaviour of real fluids, in order to understand and describe the thermodynamic behaviour due to confinement, such as the modification of phase transitions and capillary condensation. The principal investigator of this project is a specialist in continuum damage mechanics and in failure due to strain and damage localisation. He founded, and has been among the major promoters of, non-local damage modelling, which is today a state-of-the-art model in computational structural failure analyses. After a decade of research on durability problems, for which he was elected to the Institut Universitaire de France, his research interests recently turned toward petroleum engineering, the focus of the research team he joined two years ago at the Université de Pau.
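For orientation, the enhanced Darcy-like relationship the project seeks can be written schematically as follows; the multiplicative damage dependence is an illustrative assumption on our part, since determining the actual form of this coupling is precisely one of the project's goals.

\[
\mathbf{q} \;=\; -\,\frac{k(d)}{\mu}\,\nabla p,
\qquad
k(d) \;=\; k_0\, f(d,\ell_c),
\]

where \(\mathbf{q}\) is the fluid flux, \(\mu\) the dynamic viscosity, \(p\) the pore pressure, \(k_0\) the intrinsic permeability of the undamaged material, \(d\) the non-local damage variable, and \(\ell_c\) the internal length whose interaction with permeability the project will investigate.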
Max ERC Funding
1 490 200 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym FALCONER
Project Forging Advanced Liquid-Crystal Coronagraphs Optimized for Novel Exoplanet Research
Researcher (PI) Frans Snik
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Starting Grant (StG), PE9, ERC-2015-STG
Summary The 39-m European Extremely Large Telescope (E-ELT) has the potential to directly observe and characterize habitable exoplanets, but current technologies are unable to sufficiently suppress the starlight very close to the star. I propose to develop a novel instrumental approach with breakthrough contrast performance by combining coronagraphs based on brand-new liquid crystal technology, and sensitive imaging polarimetry. The novel coronagraphs will provide an achromatic rejection of starlight even right next to the star such that exoplanets can be imaged efficiently in broadband light and characterized through spectropolarimetry. The coronagraphs will incorporate focal-plane wavefront sensing and polarimetry to achieve an ultimate contrast of 1E-9, which will enable the E-ELT to observe habitable exoplanets.
We will prototype coronagraph designs of increasing contrast performance, validate them in the lab, and apply them on-sky at 6-8 meter class telescopes. With coronagraphs that offer a tenfold contrast improvement over current systems in 360-degree dark holes, we will search for self-luminous exoplanets very close to stars at thermal infrared wavelengths, and characterize known targets with multi-wavelength observations. Through accurate photometry and polarimetry, we will study their atmospheric hazes. By combining liquid-crystal coronagraphy with sensitive polarimetry, we will study the inner regions of protoplanetary disks to find signs of planet formation.
By manipulating both phase and amplitude in pupil and focal planes, we will establish hybrid coronagraph systems that combine the strengths of individual concepts, and that can be adapted to the telescope mirror segmentation and the observational strategy. The proposed research will demonstrate the technologies necessary for building an instrument for the E-ELT that can successfully study rocky exoplanets in the habitable zones of nearby stars.
Max ERC Funding
1 499 522 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym FERLODIM
Project Atomic Fermi Gases in Lower Dimensions
Researcher (PI) Christophe Salomon
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE2, ERC-2008-AdG
Summary The complex interplay between Coulomb repulsion and Fermi statistics in two-dimensional systems is responsible for some of the most dramatic phenomena encountered in solid-state physics (high-critical-temperature superfluidity, the fractional quantum Hall effect, ...). However, despite decades of effort, many questions regarding these systems remain unsolved. In FERLODIM, we plan to take advantage of recent progress in ultracold gases to simulate several fundamental Hamiltonians describing these many-body systems in one and two dimensions. We will realize two ultracold-atom machines allowing for a full characterization of the many-body wave function of an ensemble of interacting fermions in periodic potentials, called optical lattices. Our experiments will rely on a high-resolution imaging system allowing both single-atom detection and the tailoring of optical potentials of arbitrary shape and geometry. This unique design will allow us to address a variety of physical situations, depending on the geometry of the light-induced potentials. One-dimensional problems will be addressed, from spin chains to Luttinger liquids. In purely two-dimensional configurations, we will investigate the link between the repulsive Hubbard model, superfluidity and the Mott insulator transition, as well as frustration effects in periodic potentials. Finally, we will explore the physics of interacting fermions under rotation in the lowest Landau level, and the connection with fractional quantum Hall systems.
Max ERC Funding
2 050 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym FIRSTSTEP
Project Synthesis of 2-D semiconductors with honeycomb nanogeometry, and study of their Dirac-type band structure and opto-electronic properties
Researcher (PI) Daniel Vanmaekelbergh
Host Institution (HI) UNIVERSITEIT UTRECHT
Call Details Advanced Grant (AdG), PE3, ERC-2015-AdG
Summary Graphene redirected the pathways of solid-state physics with a revival of 2-D materials showing Dirac physics due to their honeycomb geometry. The charge carriers are fundamentally different from those in conventional electronic systems: the energy vs. wave vector relationship is linear instead of quadratic, resulting in Dirac bands with massless carriers. A genuinely new class of materials will emerge provided that classic semiconductor compounds can be molded in the nanoscale honeycomb geometry: the Dirac-type band structure is then combined with the beneficial properties of semiconductors, e.g. a band gap, optical and electrical switching, and strong spin-orbit coupling. The PI recently prepared atomically coherent 2-D PbSe and CdSe semiconductors by nanocrystal assembly and epitaxial attachment. Moreover, he showed theoretically that these systems combine a semiconductor gap with Dirac-type valence and conduction bands, while the strong spin-orbit coupling results in the quantum spin Hall effect. The ERC Advanced Grant will allow him to develop a robust bottom-up synthesis platform for 2-D metal-chalcogenide semiconductor compounds with honeycomb nanoscale geometry. The PI will study their band structure and opto-electronic properties using several types of scanning tunnelling micro-spectroscopy and optical spectroscopy. The Fermi level will be controlled with an electrolyte-gated transistor in order to measure the carrier transport properties. The results will be compared directly with those obtained on the same 2-D semiconductors without honeycomb geometry, hence showing the conventional band structure. This should unambiguously reveal the Dirac features of honeycomb semiconductors: valence-band and conduction-band Dirac cones, non-trivial band openings at the K-points that may host the quantum spin Hall effect, and non-trivial flat bands. 2-D semiconductors with massless holes and electrons open new opportunities in opto-electronic devices and spintronics.
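The linear, Dirac-type dispersion invoked above is a direct consequence of nearest-neighbour hopping on a honeycomb lattice. As a minimal numerical illustration (with a placeholder hopping amplitude and lattice constant, unrelated to the proposed materials), the sketch below evaluates the standard two-band tight-binding spectrum and shows the gap closing linearly at a K point.

```python
import numpy as np

def honeycomb_bands(kx, ky, t=1.0, a=1.0):
    """Nearest-neighbour tight-binding bands on a honeycomb lattice.

    f(k) sums phases over the three nearest-neighbour vectors; the two
    bands E = +/- t|f(k)| touch linearly (Dirac cones) at the K points.
    """
    deltas = a * np.array([[1.0, 0.0],
                           [-0.5, np.sqrt(3) / 2],
                           [-0.5, -np.sqrt(3) / 2]])
    f = sum(np.exp(1j * (kx * d[0] + ky * d[1])) for d in deltas)
    return -t * np.abs(f), t * np.abs(f)

# Sample the bands near a K point: the gap shrinks ~linearly with dk.
K = (2 * np.pi / 3, 2 * np.pi / (3 * np.sqrt(3)))
for dk in (0.2, 0.1, 0.05):
    lo, hi = honeycomb_bands(K[0] + dk, K[1])
    print(f"dk={dk:4.2f}  gap={hi - lo:.4f}")
```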
Max ERC Funding
2 500 000 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym FLUDYCO
Project Fluid dynamics of planetary cores: formation, heterogeneous convection and rotational dynamics
Researcher (PI) Michael Le Bars
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2015-CoG
Summary Understanding the flows in planetary cores from their formation to their current dynamics is a tremendous interdisciplinary challenge. Beyond the challenge in fundamental fluid dynamics of understanding these extraordinary flows, which involve turbulence, rotation and buoyancy at typical scales well beyond our day-to-day experience, a global knowledge of the processes involved is fundamental to a better understanding of the initial state of planets, of their thermal and orbital evolution, and of magnetic field generation, all key ingredients for habitability. The purpose of the present project is to go beyond the state of the art in tackling three barriers at the current frontier of knowledge. It combines groundbreaking laboratory experiments, complementary pioneering numerical simulations, and fruitful collaborations with leaders in various fields of planetary sciences. Improving on the latest advances in the field, I will address the fluid dynamics of iron fragmentation during the later stages of planetary accretion, in order to produce innovative, dynamically reliable models of planet formation. Considering the latest published data for Earth, I will investigate the flows driven in a stratified layer at the top of a liquid core and their influence on the global convective dynamics and the related dynamo. Finally, building upon the recent emergence of alternative models for core dynamics, I will quantitatively examine the non-linear saturation and turbulent state of the flows driven by libration, as well as the shape and intensity of the corresponding dynamo. In the context of an international competition, the originality of my work comes from its multi-method and interdisciplinary character, building upon my successful past research. Beyond scientific advances, this high-risk/high-gain project will benefit a larger community through the dissemination of experimental and numerical improvements, and will promote science through an original outreach program.
Max ERC Funding
1 992 602 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym FOVEDIS
Project Formal specification and verification of distributed data structures
Researcher (PI) Constantin Enea
Host Institution (HI) UNIVERSITE PARIS DIDEROT - PARIS 7
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary The future of computing technology relies on fast access, transformation, and exchange of data across large-scale networks such as the Internet. The design of software systems that support high-frequency parallel accesses to high-quantity data is a fundamental challenge. As more scalable alternatives to traditional relational databases, distributed data structures (DDSs) are at the basis of a wide range of automated services, for now and for the foreseeable future.
This proposal aims to improve our understanding of the theoretical foundations of DDSs. The design and the usage of DDSs are based on new principles, for which we currently lack rigorous engineering methodologies. Specifically, we lack design procedures based on precise specifications, and automated reasoning techniques for enhancing the reliability of the engineering process.
The targeted breakthrough of this proposal is to develop automated formal methods for rigorous engineering of DDSs. A first objective is to define coherent formal specifications that provide precise requirements at design time and explicit guarantees during usage. Then, we will investigate practical programming principles, compatible with these specifications, for building applications that use DDSs. Finally, we will develop efficient automated reasoning techniques for debugging or validating DDS implementations against their specifications. The principles underlying automated reasoning are also important for identifying best practices in the design of these complex systems, to increase confidence in their correctness. The developed methodologies based on formal specifications will thus benefit both the conception and the automated validation of DDS implementations and of the applications that use them.
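As a toy illustration of validating a DDS implementation against its specification, the sketch below randomly exercises a small replicated grow-only counter and checks the convergence property its specification demands (replicas that have merged the same state agree). The counter and the property-based test are illustrative stand-ins, not the verification techniques the project will actually develop.

```python
import random

class GCounter:
    """Grow-only replicated counter: one entry per replica."""
    def __init__(self, n_replicas, rid):
        self.counts = [0] * n_replicas
        self.rid = rid

    def increment(self):
        self.counts[self.rid] += 1

    def merge(self, other):
        # Specification: merge is the entrywise maximum.
        self.counts = [max(a, b) for a, b in zip(self.counts, other.counts)]

    def value(self):
        return sum(self.counts)

# Property-based check: after all replicas have merged each other,
# every replica must report the same value (strong eventual consistency).
random.seed(0)
for trial in range(1000):
    reps = [GCounter(3, i) for i in range(3)]
    for _ in range(20):
        random.choice(reps).increment()
    for a in reps:
        for b in reps:
            a.merge(b)
    assert len({r.value() for r in reps}) == 1
print("convergence property held on 1000 random executions")
```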
Max ERC Funding
1 300 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym GLOBALSEIS
Project NEW GOALS AND DIRECTIONS FOR OBSERVATIONAL GLOBAL SEISMOLOGY
Researcher (PI) Augustinus Nolet
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2008-AdG
Summary One of the major paradoxes in the geosciences is the contrast between the geochemical evidence for limited mass exchange between the lower and upper mantle, and the geophysical arguments for significant mass exchange, needed to prevent the mantle from melting in the geological past. Seismic tomography, when ultimately combined with geodynamical modeling, needs to provide estimates of the present-day flux. Indeed, tomography has shown evidence for slabs penetrating into the lower mantle; but no quantitative information on the degree of mass exchange and heat flux can, as yet, reliably be obtained from tomographic images. It is crucial that the boundary between the upper and lower mantle be imaged with greater precision, certainly in the plume-rich southern hemisphere. This requires combined experimental and theoretical improvements. Much progress was recently made by my group in Princeton before I returned to Europe. I propose to build upon those accomplishments, and to (1) expand data acquisition to the oceans by developing hydrophone-equipped floats, with the goal of improving data coverage in regions that are important for investigating heat flux: the plume-rich southern hemisphere in particular, (2) combine different seismological data sets spanning a wide range of frequencies, with the goal of obtaining tomographic images that allow for a quantitative estimate of heat flux (both upwards through plumes and downwards through the sinking of slab fragments), with emphasis on the boundary between the upper and lower mantle, (3) exploit the extra resolution offered by the frequency-dependent sensitivity of body waves (multifrequency tomography), (4) incorporate wavelet expansions into the tomographic inversion, with the aim of resolving more detail in the model where the data allow a higher resolution, and (5) obtain a multidisciplinary interpretation of the new tomographic results through interaction with geodynamicists and geochemists.
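For readers unfamiliar with the machinery, tomographic imaging of the kind proposed here ultimately amounts to a large linear(ized) inverse problem d = G m relating travel-time (or multifrequency) data d to model perturbations m through sensitivity kernels G. The toy sketch below solves such a system with simple damping; the matrix sizes, noise level and damping parameter are arbitrary illustrative choices, not a model of the actual data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic linear tomography problem: each row of G integrates the
# slowness model m over one ray path; d holds the travel-time anomalies.
n_rays, n_cells = 60, 30
G = rng.random((n_rays, n_cells))          # toy ray sensitivities
m_true = np.zeros(n_cells)
m_true[10:15] = 0.05                       # a slow anomaly ("slab")
d = G @ m_true + 0.001 * rng.standard_normal(n_rays)

# Damped least squares: minimize ||G m - d||^2 + lam * ||m||^2.
lam = 0.1
m_est = np.linalg.solve(G.T @ G + lam * np.eye(n_cells), G.T @ d)

print("anomaly recovered in cells 10-14:", np.round(m_est[10:15], 3))
```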
Max ERC Funding
2 500 000 €
Duration
Start date: 2009-02-01, End date: 2015-01-31
Project acronym HIGHZ
Project HIGHZ: Elucidating galaxy formation and evolution from very deep Near-IR imaging
Researcher (PI) Marijn Franx
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary "Studies of high redshift galaxies require very deep Near-IR imaging. This allows the study of z=2-4 galaxies redward of the Balmer/4000 Angstrom break, and the detection of UV-bright galaxies at z>7. Two new facilities wil revolutionize these studies: the VISTA telescope built for ESO, and the Near-IR channel on WF3 for HST. They will become available at the start of the grant period. We propose to build a group to analyze the imaging data from these facilities. We will make use of the fact that I am Co-PI on the ultra-deep ""ULTRA-VISTA"" survey on the VISTA telescope, and we will analyze public and privately proposed data from WF3. The following science questions will be addressed: (1) what is the origin and evolution of the Hubble sequence out to z=3, (2) what is the evolution of the Luminosity Function of UV bright galaxies between z=6 to z=11, and what galaxies cause re-ionization, (3) how does the mass function of quiescent and star forming galaxies evolve to z=4, and how do the correlation functions of subpopulations evolve as a function of redshift. A crucial component of this proposal is the request for support for a junior faculty position. This person will take on the lead for the highly specialized data processing, and will supervise the analysis of the selection effects, and other crucial components needed for a proper analysis."
Max ERC Funding
1 471 200 €
Duration
Start date: 2009-09-01, End date: 2014-08-31
Project acronym HITSUPERJU
Project Higher-dimensional topological solids realized with multiterminal superconducting junctions
Researcher (PI) Iouli Vyacheslavovitch Nazarov
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Advanced Grant (AdG), PE3, ERC-2015-AdG
Summary "Recently I revealed a deep operational analogy between an exotic material and an electronic device, i.e. between a 3-dimensional topological solid and a 4-terminal superconducting junction. Specifically, the 3d Weyl singularities revealed in the energy spectrum of this quantum device give rise to quantized trans-conductance in two leads that is typical for 2-dimensional topological Quantum Hall materials. The quantized value can be tuned with the third control phase.
I propose to capitalize on this breakthrough by realizing artificial n-dimensional (topological) solid materials by (n+1)-terminal superconducting junctions. This seemed to be fundamentally forbidden so far. In particular, in the framework of one research direction I will address the realization of higher Chern numbers. The edges and interfaces are important in topological solids, they need to be structured. For the artificial topological materials made with multi-terminal superconducting junctions such structuring is impossible in geometric coordinate space. However, the fact that the charge and superconducting phase are quantum-conjugated quantities provide the unique possibility for the structuring in multi-dimensional charge space that I will access in the framework of another direction. These two research directions will be supplemented by a more technical effort devoted to computational (quantum) dynamics of multi-terminal superconducting junctions.
The proposed way to ""conquer"" higher dimensions for condensed matter physics is of clear fundamental importance. Exciting applications are at the horizon, too. The exotic quantum states under consideration can be topologically protected and thus useful for quantum information processing. Quantized trans-resistance as well as other topological invariants may be important in metrology. More generally, the research proposed will boost the whole field of electronic devices wherever topology guarantees the discrete stability of device characteristics"
Max ERC Funding
1 522 810 €
Duration
Start date: 2016-08-01, End date: 2021-07-31
Project acronym HOWTOCONTROLGRAPHENE
Project Search for mechanisms to control massless electrons in graphene
Researcher (PI) Carlo Beenakker
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Advanced Grant (AdG), PE3, ERC-2008-AdG
Summary Conduction electrons in the carbon monolayer known as graphene have zero effective mass. This property offers unique opportunities for fast electronics, if we can somehow learn to control the dynamics of particles which have a charge but no mass. Fresh ideas are needed for this purpose, since an electric field is incapable of stopping a massless electron (its velocity being energy independent).
The applicant and his group at the Lorentz Institute for Theoretical Physics at Leiden University started exploring the new physics of graphene soon after the discovery of massless electrons in this material was announced two years ago. We have identified several promising control mechanisms, and are now ready to embark on a systematic search. Our objective is to discover ways to manipulate, in a controlled manner, three independent electronic degrees of freedom: charge, spin, and valley.
The charge is the primary carrier of classical information, being strongly coupled to the environment, while the spin is the primary carrier of quantum information, in view of its weak coupling to the environment. The valley degree of freedom (which defines the chirality of the massless particles) is intermediate between charge and spin with regard to the coupling to the environment, and provides some unique opportunities for control. In particular, we have the idea that by acting on the valley rather than on the charge it would be possible to fully block the electronic current (something which an electric field by itself is incapable of). To study these effects we will need to develop new methodologies, since the established methods to model quantum transport in nanostructures are unsuitable for massless carriers.
Max ERC Funding
1 563 800 €
Duration
Start date: 2009-06-01, End date: 2013-10-31
Project acronym ICARUS
Project Towards Innovative cost-effective astronomical instrumentation
Researcher (PI) Emmanuel Hugot
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2015-STG
Summary Enabling disruptive technologies has always been crucial to triggering revolutionary science discoveries. The daring challenges in astronomy and astrophysics are extremely demanding in terms of high angular resolution and high-contrast imaging, and require extreme stability and image quality. Instruments based on current classical designs tend to get bigger and more complex, and face ever-increasing difficulties in meeting science requirements.
This proposal has the clear objective of proposing breakthrough compact optical architectures for the next generation of giant observatories. The project focuses on the niche of active components and is structured in two main research pillars: (I) enabling the use of additive manufacturing (3D printing) to produce affordable deformable mirrors for VIS or NIR observations, and (II) paving the road for a common use of curved and deformable detectors. Extensive finite element analysis will allow us to cover the parameter space, and broad prototyping will demonstrate and characterize the performance of such devices.
Both pillars are extremely challenging, the fields of detectors and optical fabrication being driven by the market. We will therefore orient the activities towards a mass-production method.
To maximize the impact of this high-gain R&D, the pillars are surrounded by two transverse activities: (i) the design and optimization of a new zoo of optical systems using active mirrors and flexible detectors, and (ii) building a solid plan of technology transfer to end-user industrial companies, through a patenting and licensing strategy, to maximize the financial return and thereby perpetuate the activities.
The pathway proposed here is essential for developing affordable components in the near future, and will enable compact and high-performance instrumentation. These high-potential activities will dramatically reduce the complexity of instruments in the era of giant observatories, simplify the operability of systems and offer increased performance.
Max ERC Funding
1 747 667 €
Duration
Start date: 2016-08-01, End date: 2021-07-31
Project acronym IMPACT
Project The giant impact and the Earth and Moon formation
Researcher (PI) Razvan Caracas
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE10, ERC-2015-CoG
Summary Very little is understood of the physics governing the Giant Impact and the subsequent formation of the Moon. According to this model, an impactor hit the proto-Earth; the resulting energy was enough to melt and partially vaporize the two bodies, generating a large protolunar disk from which the Earth-Moon pair formed. Hydrodynamic simulations of the impact and the subsequent evolution of the protolunar disk are currently based on models of equations of state and phase diagrams that are unconstrained by experiments or calculations. Estimates of the positions of critical points, when available at all, vary by one order of magnitude in both temperature and density. Here we propose to compute the thermodynamics of the major rock-forming minerals and rock aggregates, and to use it to study the formation and evolution of the protolunar disk. For this we employ a unique combination of state-of-the-art atomistic ab initio simulations. We use large-scale density-functional theory (DFT) molecular dynamics to study bulk fluids, coupled with Green's-function (GW) and time-dependent DFT techniques to analyze atomic clusters and molecular species. We compute the vaporization curves, position the supercritical points, and characterize the sub-critical and supercritical regimes. We construct equations of state of the rocks at the conditions of the giant impact, which are beyond current experimental capabilities. We employ a multiscale approach to bridge the gap between atomic, geological-sample, and planetary scales via thermodynamics; we simulate the thermal profile through the disk, the ratio between liquid and vapor, and the speciation. From speciation we predict elemental and isotopic partitioning during condensation. Plausible impact scenarios, and features of the impactor and of the proto-Earth, will be constrained with a feedback loop, until convergence between predictions of the final Earth-Moon compositions and observations is reached.
Max ERC Funding
1 900 000 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym IsoMS
Project Mass Spectrometry of Isomeric Ions
Researcher (PI) Jana Roithova
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Consolidator Grant (CoG), PE4, ERC-2015-CoG
Summary Mass spectrometry (MS) in combination with electrospray ionization (ESI) is one of the principal tools currently used to gain insight into newly developed catalytic reactions. It is used to identify key reaction intermediates and to study their structure and reactivity. This proposal is based on the combination of modern MS approaches with novel experiments in a unique cryo-trapping instrument. This combination allows the study of short-lived ionic species that cannot be studied by other known methods. Our distinguishing feature is the in situ helium-tagging of ions, which allows us to record their infrared spectra via a pre-dissociation technique. Here, we will go beyond this state-of-the-art approach in two directions:
(1) The unparalleled advantage of ESI-MS is its high sensitivity to low-abundance and reactive species. The pertinent question at the heart of all reaction mechanism investigations via MS is how the ions found in the gas phase relate to the condensed-phase reaction. We will address this question using “Delayed Reactant Labelling”, which will directly link condensed-phase kinetics to the abundance of isolated gaseous ions (see the kinetics sketch after this summary).
(2) We will take advantage of long storage times in our cryogenic linear quadrupole trap and expand the portfolio of the methods available to address mixtures of ions with the same mass. Isobaric mixtures are resolved in MS by differences in ion mobilities, i.e. the ions are separated by their mass-to-charge ratios and by their shapes. We will perform ion mobility separation directly in the trap by excitation of the ion secular motion using a resonant dipolar electric field. Further, we will combine cryo-trapping experiments with the probing or modifying of the stored ions by reactive collisions with neutral molecules. The mobility experiments and the reactivity probing will be routinely combined with spectroscopic experiments.
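A minimal illustration of the kinetic link targeted in point (1): if delayed reactant labelling monitors how an unlabelled intermediate is progressively replaced after a labelled reactant is introduced, a lifetime can be extracted from a pseudo-first-order fit of ion abundances versus labelling delay. The sketch below fits such a decay to synthetic numbers; the data, rate constant and simple exponential model are all illustrative assumptions, not results or the project's actual analysis pipeline.

```python
import numpy as np
from scipy.optimize import curve_fit

# Synthetic "delayed reactant labelling" data: relative abundance of the
# unlabelled intermediate ion as a function of the labelling delay (s).
delays = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
ratios = np.array([1.00, 0.78, 0.61, 0.37, 0.14, 0.02])

def decay(t, k):
    """Pseudo-first-order decay of the unlabelled population."""
    return np.exp(-k * t)

(k_fit,), _ = curve_fit(decay, delays, ratios, p0=[0.5])
print(f"k = {k_fit:.2f} 1/s, half-life = {np.log(2) / k_fit:.2f} s")
```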
Max ERC Funding
1 612 500 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym ISOREE
Project New insight into the origin of the Earth, its bulk composition and its early evolution
Researcher (PI) Maud Boyet
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE10, ERC-2015-CoG
Summary The main geochemical features of the mantles of terrestrial planets and asteroids can be attributed to differentiation events that occurred during or shortly after the formation of the Solar System. Numerous questions remain regarding the Earth’s bulk composition and the most likely scenario for its evolution prior to the last major differentiation event, caused by a giant impact leading to the formation of the Moon. The aim of this five-year project is to evaluate the state-of-the-art models of the Earth’s early evolution, with the following main objectives: (i) defining precisely the age of the Moon’s formation, (ii) refining the giant impact model and the Earth-Moon relationship, (iii) dating the successive magma ocean stages on Earth, and (iv) constraining the Earth mantle’s composition in terms of rare earth element concentrations. These different questions will be addressed using trace elements, radiogenic isotope systematics (146Sm-142Nd, 147Sm-143Nd, 138La-138Ce) and stable isotopes. ISOREE is a multi-disciplinary project that combines isotope and trace element geochemistry, experimental geochemistry and spectroscopy. A large number of samples will be analysed, including terrestrial rocks with ages up to 3.8 Ga, chondrites, achondrites and lunar samples.
This proposal will provide the tools to tackle a vast topic from various angles, using new methodologies and instrumentation and promoting innovation and creativity in European research. This research program is essential to further constrain the major events that occurred very early on in the Earth’s history, such as the Earth’s cooling, its crustal growth, the surface conditions and development of potential habitats for life.
Max ERC Funding
2 200 000 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym KERNEL
Project Ultimate Angular Resolution Astrophysics with kernel-phase and full-aperture interferometry
Researcher (PI) Frantz Martinache
Host Institution (HI) OBSERVATOIRE DE LA COTE D'AZUR (OCA)
Call Details Consolidator Grant (CoG), PE9, ERC-2015-CoG
Summary Astronomy requires large telescopes to improve the sensitivity and the angular resolution of its observations. Of these qualities, angular resolution is the most difficult to maintain in the optical and near-infrared, since the atmosphere reduces it to that of a 10 cm aperture, regardless of the telescope size. On the one hand, Adaptive Optics (AO) actively compensates for this effect, but the improvement is often only partial. On the other hand, interferometric techniques (most notably sparse aperture masking interferometry) passively allow the extraction of self-calibrating observables that boost the angular resolution, but severely affect the sensitivity of observations. However, a framework newly established by the PI of this proposal now makes it possible to extract generalized self-calibrating observables, called kernel-phases, from conventional AO-corrected images. The work outlined in this proposal will make it possible to scientifically exploit the high angular resolution imaging capability of this technique, to improve its robustness and to expand its capabilities. The framework offers a very general-purpose high angular resolution imaging tool for astronomers as well as wavefront control experts. This proposal is organized in five work-packages of increasing challenge: the reinterpretation of existing archival data with a super-resolution capability, the expansion of the technique's robustness to open up new, more challenging use-cases, a special focus on the development of a very high-dynamic-range mode, the adaptation of interferometric image reconstruction techniques, and the development of new advanced AO concepts. The consequences of this project will have a major impact on the design and scientific exploitation of future high angular resolution instrumentation on the existing generation of 8-10 meter class telescopes as well as on the upcoming generation of 30-40 meter giants, championed by Europe and its E-ELT.
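The linear-algebraic idea behind kernel-phases can be illustrated in a few lines. In the high-Strehl regime, the Fourier phases measured on a discrete model of the pupil depend linearly on residual pupil-plane aberrations through a transfer matrix A; any operator K whose rows span the left null space of A yields observables K·Phi that are immune to those aberrations to first order. The sketch below uses a toy one-dimensional redundant aperture, written purely for illustration (it is not the project's actual pipeline):

    import numpy as np

    # Toy pupil: five sub-apertures on a line at integer positions.
    n_ap = 5
    pairs = [(i, j) for i in range(n_ap) for j in range(i + 1, n_ap)]
    seps = sorted({j - i for i, j in pairs})
    row = {s: k for k, s in enumerate(seps)}

    # Transfer matrix A: the Fourier phase at each separation is the
    # redundancy-averaged sum of pupil phase differences (phi_j - phi_i).
    A = np.zeros((len(seps), n_ap))
    red = np.zeros(len(seps))
    for i, j in pairs:
        k = row[j - i]
        A[k, i] -= 1.0
        A[k, j] += 1.0
        red[k] += 1.0
    A /= red[:, None]

    # Kernel operator K: rows spanning the left null space of A (via SVD).
    U, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-10))
    K = U[:, rank:].T

    # Any pupil aberration phi is rejected to first order: K @ (A @ phi) = 0.
    phi = np.random.default_rng(0).standard_normal(n_ap)
    print(np.allclose(K @ (A @ phi), 0.0))   # True

On a real telescope the pupil model is two-dimensional and A maps a grid of sub-apertures onto the u-v plane, but the null-space construction is the same.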
Max ERC Funding
1 717 811 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym LEASP
Project Learning spatiotemporal patterns in longitudinal image data sets of the aging brain
Researcher (PI) Stanley Durrleman
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Time-series of multimodal medical images offer a unique opportunity to track anatomical and functional alterations of the brain in aging individuals. A collection of such time series for several individuals forms a longitudinal data set, each observation being a rich iconic-geometric representation of the brain's anatomy and function. These data are already extraordinarily complex and variable across individuals. Taking the temporal component into account adds further difficulty, in that each individual follows a different trajectory of changes, and at a different pace. Furthermore, a disease is here a progressive departure from an otherwise normal scenario of aging, so that one cannot think of normal and pathologic brain aging as distinct categories, as in the standard case-control paradigm.
Bio-statisticians lack a suitable methodological framework to extract from these data the typical trajectories and dynamics of brain alterations, and the effects of a disease on these trajectories, thus limiting the investigation of essential clinical questions. To change this situation, we propose to construct virtual dynamical models of brain aging by learning typical spatiotemporal patterns of alteration propagation from longitudinal iconic-geometric data sets.
By incorporating concepts from Riemannian geometry into Bayesian mixed-effects models, the project will introduce general principles to average complex individual trajectories of iconic-geometric changes and to align the pace at which these trajectories are followed. It will estimate a set of elementary spatiotemporal patterns, which combine to yield a personal aging scenario for each individual. Disease-specific patterns will be detected with increasing likelihood.
This new generation of statistical and computational tools will unveil clusters of patients sharing similar lesion propagation profiles, paving the way to design more specific treatments and to care for patients when treatments have the highest chance of success.
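To make the notion of "aligning the pace" of individual trajectories concrete, here is a minimal sketch of a time-warped mixed-effects model of the kind alluded to above (hypothetical parameter values, written for illustration rather than taken from the project): each subject follows the population curve gamma0, traversed at an individual pace alpha and with an individual onset shift tau.

    import numpy as np

    def gamma0(t, t0=70.0, s=8.0):
        # Population-average trajectory of a normalized biomarker (logistic).
        return 1.0 / (1.0 + np.exp(-(t - t0) / s))

    def individual(t, alpha, tau, t0=70.0):
        # Time-warped trajectory: subject-specific pace and onset shift.
        return gamma0(alpha * (t - t0 - tau) + t0)

    rng = np.random.default_rng(0)
    ages = np.linspace(55.0, 90.0, 8)
    for subject in range(3):
        alpha = np.exp(rng.normal(0.0, 0.3))    # random pace factor
        tau = rng.normal(0.0, 5.0)              # random time shift (years)
        y = individual(ages, alpha, tau) + rng.normal(0.0, 0.02, ages.size)
        # In the full model the random effects (alpha, tau) and gamma0 are
        # estimated jointly from all subjects; here they are merely sampled.
        print(f"subject {subject}: alpha={alpha:.2f}, tau={tau:+.1f}")

The statistical task, inverse to this sketch, is to estimate gamma0 and the random effects from sparse, noisy visits.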
Max ERC Funding
1 499 894 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym LENA
Project non-LinEar sigNal processing for solving data challenges in Astrophysics
Researcher (PI) Jérôme Bobin
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Astrophysics has arrived at a turning point where the scientific exploitation of data requires overcoming challenging analysis issues, which mandates the development of advanced signal processing methods. In this context, sparsity and sparse signal representations have played a prominent role in astrophysics. Indeed, thanks to sparsity, an extremely clean full-sky map of the Cosmic Microwave Background (CMB) has been derived from the data of Planck [Bobin14], a European space mission that observes the sky at microwave wavelengths. This led to a noticeable breakthrough: we showed that large-scale statistical studies of the CMB can be performed without masking the galactic centre, thanks to the high-quality component separation achieved [Rassat14].
Despite the undeniable success of sparsity, standard linear signal processing approaches are too simplistic to capture the intrinsically non-linear properties of physical data. For instance, the analysis of the Planck data in polarization requires new sparse representations that finely capture the properties of polarization vector fields (e.g. rotation invariance), which cannot be tackled by linear approaches. Shifting from the linear to the non-linear signal representation paradigm is an emerging area in signal processing, which builds upon new connections with fields such as deep learning [Mallat13].
Inspired by these active and fertile connections, the LENA project will: i) study a new non-linear signal representation framework to design non-linear models that can account for the underlying physics, and ii) develop new numerical methods that can exploit these models. We will further demonstrate the impact of the developed models and algorithms on data analysis challenges within the scope of the Planck mission and the European radio interferometer LOFAR. We expect the results of the LENA project to impact astrophysical data analysis as significantly as the introduction of sparsity to the field has.
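As a reminder of why sparsity is so effective, here is a minimal sketch of sparsity-based denoising, a toy stand-in for the component-separation machinery cited above (it is in no way the Planck pipeline): a signal that is sparse in a given dictionary, here Fourier, is recovered by soft-thresholding its coefficients, the proximal operator at the heart of most sparse solvers.

    import numpy as np

    def soft_threshold(x, lam):
        # Proximal operator of the l1 norm: shrink coefficients toward zero.
        return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

    rng = np.random.default_rng(1)
    n = 1024
    t = np.arange(n) / n
    clean = np.sin(2 * np.pi * 7 * t) + 0.5 * np.sin(2 * np.pi * 31 * t)
    noisy = clean + 0.5 * rng.standard_normal(n)

    # The clean signal is sparse in the Fourier dictionary; the noise is not.
    c = np.fft.rfft(noisy)
    sigma = np.median(np.abs(c.real)) / 0.6745        # crude noise level
    d = soft_threshold(c.real, 3 * sigma) + 1j * soft_threshold(c.imag, 3 * sigma)
    denoised = np.fft.irfft(d, n)

    print(np.std(noisy - clean), np.std(denoised - clean))  # error drops markedly

Iterating such thresholding inside a gradient loop gives ISTA-type algorithms; the project's point is precisely that fixed linear dictionaries of this kind fall short for polarization data, motivating non-linear representations.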
Max ERC Funding
1 497 411 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym LiKo
Project From Liouville to Kolmogorov: 2d quantum gravity, noise sensitivity and turbulent flows
Researcher (PI) Christophe Garban
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary This research project is organized along three seemingly unrelated directions:
(1) Mathematical Liouville gravity deals with the geometry of large random planar maps. Historically, conformal invariance was a key ingredient in the construction of Liouville gravity in the physics literature. Conformal invariance has recently been brought back into play through attempts to understand large random combinatorial planar maps once they are conformally embedded in the plane. The geometry induced by these embeddings is conjecturally described by the exponential of a highly oscillating distribution, the Gaussian Free Field. This conjecture is part of a broader program aimed at rigorously understanding the celebrated KPZ relation. The first major goal of my project is to make significant progress towards the completion of this program. To this end I will combine several tools, such as Liouville Brownian motion, circle packings, QLE processes and Bouchaud trap models.
(2) Euclidean statistical physics is closely related to area (1) through the above KPZ relation. I plan to push further the analysis of critical statistical physics models successfully initiated by the works of Schramm and Smirnov. I will focus in particular on dynamics at and near critical points, with a special emphasis on the so-called noise sensitivity of these systems.
(3) 3d turbulence. A more tractable ambition than solving the Navier-Stokes equations is to construct explicit stochastic vector fields that combine key features of experimentally observed velocity fields. I will make the mathematical framework precise by identifying four axioms that need to be satisfied. It has been observed recently that the exponential of a certain log-correlated field, as in (1), could be used to create such a realistic velocity field. I plan to construct and analyse this challenging object by relying on techniques from (1) and (2). This would be the first genuine stochastic model of turbulent flow in the spirit of what Kolmogorov was aiming at.
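The central object in (1) and (3), the exponential of a log-correlated field, can at least be caricatured numerically. The sketch below is a rough illustration only, with a heuristic normalization and none of the renormalization that makes the continuum object rigorous: it synthesizes an approximate discrete Gaussian free field by Fourier synthesis and forms the associated multiplicative-chaos mass.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 256
    kx = np.fft.fftfreq(n)[:, None]
    ky = np.fft.fftfreq(n)[None, :]
    k2 = kx**2 + ky**2
    k2[0, 0] = np.inf                 # discard the zero (constant) mode

    # Approximate discrete Gaussian free field: independent Fourier modes
    # with variance proportional to 1/|k|^2 (a log-correlated field).
    noise = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    h = np.real(np.fft.ifft2(noise / np.sqrt(k2))) * n   # heuristic scaling

    # "Liouville" mass: exponential of gamma * h, crudely normalized by the
    # field variance in place of the true renormalization procedure.
    gamma = 1.0
    mu = np.exp(gamma * h - 0.5 * gamma**2 * h.var())
    mu /= mu.sum()
    print(mu.max() / mu.mean())       # the mass concentrates on rare spikes

Making objects of this kind rigorous, and relating them to embedded planar maps and to turbulent velocity fields, is exactly the program outlined in (1) and (3).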
Max ERC Funding
935 000 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym LOFAR-AUGER
Project From Black Holes to Ultra-High Energy Cosmic Rays: Exploring the Extremes of the Universe with Low-Frequency Radio Interferometry
Researcher (PI) Heino Falcke
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary Black holes (BHs) and ultra-high energy cosmic rays (UHECRs) are two extremes of the universe that link particle physics and astrophysics. BHs are the most efficient power generators in the universe while UHECRs are the most energetic particles ever detected. As we showed previously, a major fraction of the power of BHs is channeled into radio-emitting plasma jets, which are also efficient particle accelerators. Are BHs also responsible for UHECRs? This long-standing question could be answered soon, through the dawn of cosmic ray astronomy. The giant Auger observatory has now shown for the first time that the arrival directions of UHECRs are non-isotropic, potentially pointing back to their sources of origin. BHs turned out to be major suspects, but other sources could still also be responsible. To address this conclusively and to establish cosmic ray astronomy as a productive new field in the coming years, we need to increase statistics, expand current observatories, and have complementary all-sky radio surveys available to identify sources, since radio emission traces particle acceleration sites. Here, techniques pioneered by the Low-Frequency Array (LOFAR) promise major advances. First of all, working on LOFAR we uncovered a new technique to detect UHECRs with radio antennas and verified it experimentally. The technique promises to increase the number of high-quality events by almost an order of magnitude and provides much improved energy and direction resolution. We now want to implement this technique in Auger, combining LOFAR and AUGER know-how. Secondly, LOFAR and soon other SKA pathfinders will significantly improve all-sky radio surveys with high sensitivity, resolution, and image quality. Hence, we will use LOFAR to understand the astrophysics of UHECR source candidates and compile a radio-based catalog thereof. We start with jets from BHs and move later to other sources. Together this will allow us to identify UHECR sources and study them in detail.
Max ERC Funding
3 460 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym MagneticYSOs
Project Interpreting Dust Polarization Maps to Characterize the Role of the Magnetic Field in Star Formation Processes
Researcher (PI) Anaëlle Julie Maury
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE9, ERC-2015-STG
Summary "Rotation and angular momentum transport play a critical role in the formation and evolution of astrophysical objects, including the fundamental bricks of astrophysical structures: stars. Stars like our Sun form when rotating dense cores, in the interstellar medium, collapse until they eventually reach temperatures at which nuclear fusion begins; while planets, including the Earth, form in the rotationally supported disks around these same young stars. One of the major challenges of modern astrophysics is the “angular momentum problem"": observations show that a typical star-forming cloud needs to reduce its specific angular momentum by 5 to 10 orders of magnitude to form a typical star such as our Sun. It is also crucial to solve the angular momentum problem to understand the formation of protoplanetary disks, stellar binaries and the initial mass function of newly formed stars. Magnetic fields are one of the key ways of transporting angular momentum in astrophysical structures: understanding how angular momentum is transported to allow star formation requires characterizing the role of magnetic fields in shaping the dynamics of star-forming structures. The MagneticYSOs project aims at characterizing the role of magnetic field in the earliest stage of star formation, during the main accretion phase.
The simultaneous major improvements of instrumental and computational facilities provide us, for the first time, with the opportunity to confront observational information with the predictions of magnetized models. Polarization capabilities on the latest generation of instruments at large facilities are producing sensitive observations of magnetic fields with a great level of detail, while numerical simulations of star formation now include most of the physical ingredients needed for a detailed description of protostellar collapse at all the relevant scales, such as resistive MHD, radiative transfer and chemical networks. These new tools will undoubtedly lead to major discoveries in the fields of planet and star formation in the coming years. It is necessary to conduct comprehensive projects able to combine theory and observations in a detailed fashion, which in turn requires a collaboration with access to cutting-edge observational datasets and numerical models. Through an ambitious multi-faceted program of dedicated observations probing magnetic fields (polarized dust emission and Zeeman effect maps), gas kinematics (molecular line emission maps), ionization rates and dust properties in Class 0 protostars, and their comparison to synthetic observations of MHD simulations of protostellar collapse (a cartoon of such polarization observables is sketched after this summary), we aim to transform our understanding of:
1) The long-standing problem of angular momentum in star formation
2) The origin of the stellar initial mass function
3) The formation of multiple stellar systems and circumstellar disks around young stellar objects (YSOs)
Not only will this project enable a major leap forward in our understanding of low-mass star formation, answering as yet unexplored questions with innovative methods, but it will also help spread the expertise in interpreting high angular resolution (sub-)mm polarization data. Although characterizing magnetic fields in astrophysical structures represents the next frontier in many fields (solar physics, evolved stars, compact objects and galactic nuclei are a few examples), only a handful of astronomers in the EU community are familiar with interferometric polarization data, mostly because of the absence of large European facilities providing such capabilities until the recent advent of ALMA. It is now crucial to strengthen the European position in this research field by training a new generation of physicists with strong expertise in tailoring, analyzing and interpreting high angular resolution polarization data.
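As promised above, here is a cartoon of what dust polarization observables encode. It assumes uniform grain properties, optically thin emission and an assumed intrinsic polarization fraction, all purely illustrative numbers, and simply integrates fractional Stokes q and u along a line of sight: an ordered plane-of-sky field yields strong fractional polarization, while a tangled field depolarizes by cancellation.

    import numpy as np

    def stokes_qu(density, chi, p0=0.15):
        # Column-integrated fractional Stokes q, u for optically thin dust
        # emission; chi is the plane-of-sky field angle along the sightline,
        # p0 an assumed intrinsic polarization fraction of the grains.
        I = density.sum()
        q = p0 * np.sum(density * np.cos(2 * chi)) / I
        u = p0 * np.sum(density * np.sin(2 * chi)) / I
        return q, u

    rng = np.random.default_rng(3)
    dens = np.ones(100)                    # uniform density along the sightline
    for chi in (np.zeros(100), rng.uniform(0.0, np.pi, 100)):
        q, u = stokes_qu(dens, chi)
        print(np.hypot(q, u))              # ~p0 if ordered, ~0 if tangled

Inverting maps of q and u for the field geometry, in combination with MHD simulations, is the hard problem the project addresses.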
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym MALIG
Project A mathematical approach to the liquid-glass transition: kinetically constrained models, cellular automata and mixed order phase transitions
Researcher (PI) cristina Toninelli
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary This proposal focuses on the mathematics of three cross-disciplinary, very active and deeply interlaced research themes: interacting particle systems with kinetic constraints, bootstrap percolation cellular automata and mixed order phase transitions. These topics belong to the fertile area of mathematics at the intersection of probability and mathematical statistical mechanics. They are also extremely important in physics. Indeed they are intimately connected to the fundamental problem of understanding the liquid-glass transition, one of the longstanding open questions in condensed matter physics.
The funding of this project will allow the PI to lead a highly qualified team with complementary expertise. Such diversity will allow a novel, interdisciplinary and potentially groundbreaking approach. Although research on each of the above topics has lately been quite lively, very few exchanges and little cross-fertilization have occurred among them. One of our main goals is to overcome the barriers among the three different research communities and to explore the interfaces of these as yet unconnected fields. We will open two novel and challenging chapters in the mathematics of interacting particle systems and cellular automata: interacting particle glassy systems, and bootstrap percolation models with mixed order critical and discontinuous transitions. In order to achieve our groundbreaking goals we will have to go well beyond present mathematical knowledge. We believe that the novel concepts and the unconventional approaches that we will develop will also have a deep impact in other areas, including combinatorics, the theory of randomized algorithms and complex systems.
The scientific background and expertise of the PI, with original and groundbreaking contributions to each of the above topics and with a broad and clear-cut vision of the mathematics of the proposed research as well as of the fundamental physical questions, make the PI the ideal leader of this project.
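Of the three themes, bootstrap percolation cellular automata are the easiest to convey concretely. The following minimal sketch (illustrative parameters, periodic boundaries for simplicity, not drawn from the project itself) runs the classical two-neighbour model on a square grid: sites are initially occupied at random with density p, and any site with at least two occupied nearest neighbours becomes occupied, iterated until nothing changes.

    import numpy as np

    def bootstrap_percolation(p, n=200, threshold=2, seed=0):
        # Two-neighbour bootstrap percolation on an n x n torus: start from
        # Bernoulli(p) occupation, then repeatedly occupy every site with at
        # least `threshold` occupied nearest neighbours, until stabilization.
        rng = np.random.default_rng(seed)
        occ = rng.random((n, n)) < p
        while True:
            nbrs = (np.roll(occ, 1, 0).astype(int) + np.roll(occ, -1, 0)
                    + np.roll(occ, 1, 1) + np.roll(occ, -1, 1))
            new = occ | (nbrs >= threshold)
            if np.array_equal(new, occ):
                return new.mean()
            occ = new

    # Final density as the initial density p crosses the (size-dependent)
    # critical value: almost no growth below it, complete occupation above.
    for p in (0.03, 0.05, 0.08):
        print(p, bootstrap_percolation(p))

The abrupt, size-dependent jump between negligible growth and complete occupation is a toy instance of the sharp transitions the project studies.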
Max ERC Funding
883 250 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym MEQUANO
Project Mesoscopic Quantum Noise: from few electron statistics to shot noise based photon detection
Researcher (PI) D. Christian Glattli
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE3, ERC-2008-AdG
Summary We propose innovative approaches to electronic quantum noise, ranging from very fundamental topics addressing the quantum statistics of few electrons transferred through conductors to direct applications with the realization of new types of versatile broadband photon detectors based on photon-assisted shot noise. We will develop electron counting tools which will not only allow full characterization of electron statistics but also open the way to new quantum interference experiments involving few electrons or fractional charge carriers, and will question our understanding of quantum statistics. Generation of few-electron bunches will be obtained by the never-yet-realized technique of short voltage pulses whose duration is limited to a few action quanta, one quantum for one electron. Detection of electron bunches will be done by an unprecedented cut-and-probe technique where carriers are suddenly isolated in the circuit for further sensitive charge detection. Using highly ballistic electron nanostructures such as graphene, III-V semiconductors with light carriers, carbon nanotubes or simply tunnel barriers, we will bring mesoscopic quantum noise effects to a higher temperature, energy and frequency range, and thus closer to applications. Inspired by the late R. Landauer's saying that 'the noise IS the signal', we will develop totally new detectors based on the universal effect of photon-assisted electron shot noise. These versatile broadband detectors will be used either for on-chip noise detection or for photon radiation detection, possibly including imaging. They will operate above liquid helium temperature and at THz frequencies, although projected operation includes room temperature and the far-infrared range, as no fundamental limitation is expected. The complete program, balanced between very fundamental quantum issues and applications of quantum effects, will open routes for new quantum investigations and offer to a broad community new applications of mesoscopic effects.
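The "one action quantum for one electron" rule for voltage pulses has a simple book-keeping behind it: the charge injected by a pulse is q = e (e/h) times the integral of V(t) dt, so a pulse carrying one Faraday flux quantum h/e injects exactly one electron charge (Lorentzian pulses of this area are the minimal-noise case). A quick numerical check of this arithmetic follows, with an illustrative pulse width; it is a sketch of the accounting, not of the device physics.

    import numpy as np

    e = 1.602176634e-19     # elementary charge (C)
    h = 6.62607015e-34      # Planck constant (J s)

    def electrons_per_pulse(V, t):
        # Number of electrons per pulse: (e/h) * integral of V(t) dt.
        return e * np.sum(0.5 * (V[1:] + V[:-1]) * np.diff(t)) / h

    # Lorentzian pulse of half-width w, scaled to carry one flux quantum h/e.
    w = 30e-12                                   # 30 ps (illustrative)
    t = np.linspace(-50 * w, 50 * w, 200001)
    V = (h / e) * (w / np.pi) / (t**2 + w**2)    # integrates to h/e over all t
    print(electrons_per_pulse(V, t))             # ~0.99: one electron per pulse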
Max ERC Funding
1 999 843 €
Duration
Start date: 2009-02-01, End date: 2015-01-31
Project acronym MicMactin
Project Dissecting active matter: Microscopic origins of macroscopic actomyosin activity
Researcher (PI) Martin Sylvain Peter Lenz
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE3, ERC-2015-STG
Summary "Biological motion and forces originate from mechanically active proteins operating at the nanometer scale. These individual active elements interact through the surrounding cellular medium, collectively generating structures spanning tens of micrometers whose mechanical properties are perfectly tuned to their fundamentally out-of-equilibrium biological function. While both individual proteins and the resulting cellular behaviors are well characterized, understanding the relationship between these two scales remains a major challenge in both physics and cell biology.
We will bridge this gap through multiscale models of the emergence of active material properties in the experimentally well-characterized actin cytoskeleton. We will thus investigate unexplored, strongly interacting nonequilibrium regimes. We will develop a complete framework for cytoskeletal activity by separately studying all three fundamental processes driving it out of equilibrium: actin filament assembly and disassembly, force exertion by branched actin networks, and the action of molecular motors. We will then recombine these approaches into a unified understanding of complex cell motility processes.
To tackle the cytoskeleton's disordered geometry and many-body interactions, we will design new nonequilibrium self-consistent methods in statistical mechanics and elasticity theory. Our findings will be validated through simulations and close experimental collaborations.
Our work will break new ground in both biology and physics. In the context of biology, it will establish a new framework to understand how the cell controls its architecture and mechanics through biochemical regulation. On the physics side, it will set up new paradigms for the emergence of original out-of-equilibrium collective behaviors in an experimentally well-characterized system, addressing the foundations of existing macroscopic "active matter" approaches.
Max ERC Funding
1 491 868 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym MICROLIPIDS
Project Microbial lipids: The three domain ‘lipid divide’ revisited
Researcher (PI) Jacobus Smede SINNINGHE DAMSTE
Host Institution (HI) STICHTING NEDERLANDSE WETENSCHAPPELIJK ONDERZOEK INSTITUTEN
Call Details Advanced Grant (AdG), PE10, ERC-2015-AdG
Summary Tremendous progress has been made in the last decade in the genetic characterization of microorganisms, both in culture and in the environment. However, our knowledge of microbial membrane lipids, essential building blocks of the cell, has only marginally improved. This is remarkable since there exists a dichotomy in the distribution of lipids between the three Domains of Life. Diacyl glycerols based on straight-chain fatty acids are produced by bacteria and eukaryotes, whereas archaea synthesize isoprenoidal glycerol ether lipids. From a microbial evolutionary perspective, this ‘lipid divide’ is enigmatic since it has recently become clear that eukaryotes evolved from the archaea. Preliminary results of my research group show that when novel analytical methodology is used, there is a large hidden diversity in microbial lipid composition that may resolve this fundamental question. Here I propose to systematically characterize prokaryotic intact polar lipids (IPLs) with state-of-the-art analytical techniques based on liquid chromatography and high-resolution mass spectrometry to bring our knowledge of microbial lipids to the next level. To this end, we will characterize (i) 250+ bacterial and archaeal cultures and (ii) 200+ environmental samples for IPLs by HPLC-MS, complemented by full identification of fatty acids and other lipids released after acid hydrolysis of total cells. This approach will be complemented by the characterisation of functional genes for lipid biosynthesis. This will involve both the mapping of known genes, based on the analysis of published whole (meta)genome data, and the identification of as yet unknown genes in selected groups of prokaryotes. The results are expected to make a fundamental contribution to (i) our understanding of the evolution of the biosynthesis of membrane lipids, (ii) their application as microbial markers in the environment, and (iii) the development and application of organic proxies in earth sciences.
Max ERC Funding
2 499 426 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym MM-PGT
Project Modern Methods for Perturbative Gauge Theories
Researcher (PI) David A. Kosower
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE2, ERC-2008-AdG
Summary Gauge theories are the basis of modern theories of high-energy physics. Perturbative calculations are crucial to developing our quantitative understanding of these theories, as well as seeking new and deeper structures in these theories. Precision higher-order calculations in the SU(3) component of the Standard Model, perturbative Quantum Chromodynamics (QCD), will be crucial to understanding data at the CERN-based Large Hadron Collider (LHC) and finding and measuring physics beyond the standard model. Precision calculations in the electroweak theory will also play a role in confronting later precision data with theoretical models. The related maximally (N=4) supersymmetric gauge theory has served both as an important theoretical laboratory for developing new calculational techniques, as well as a link to string theory via the AdS/CFT duality. It is also emerging as a fruitful meeting point for ideas and methods from three distinct areas of theoretical physics: perturbative gauge theories, integrable systems, and string theory. The Project covers three related areas of perturbative gauge theories: computation of one- and two-loop amplitudes in perturbative quantum chromodynamics; incorporation of these amplitudes and development of a fully-matched parton-shower formalism and numerical code; and higher-loop computations in the N=4 supersymmetric theory. It aims to develop a general-purpose numerical-analytic hybrid program for computing phenomenologically-relevant one- and two-loop amplitudes in perturbative QCD. It also aims to develop a new parton shower allowing complete matching to leading and next-to-leading order computations. It seeks to further develop on-shell computational methods, and apply them to the N=4 supersymmetric gauge theory, with the goal of connecting perturbative quantities to their strong-coupling counterparts computed using the dual string theory.
Max ERC Funding
961 080 €
Duration
Start date: 2009-01-01, End date: 2014-12-31
Project acronym MMDYNASYS
Project Molecular Motors, powering dynamic functional molecular systems
Researcher (PI) Benard Lucas FERINGA
Host Institution (HI) RIJKSUNIVERSITEIT GRONINGEN
Call Details Advanced Grant (AdG), PE5, ERC-2015-AdG
Summary In this proposal the unique properties of unidirectional light-driven molecular rotary motors will be built upon to achieve dynamic control of function and to develop responsive systems, with a particular focus on systems in water. Light-driven molecular rotary motors are distinct from the majority of molecular switches, as they allow sequential access to multiple functional states in a responsive system through non-invasive stimulation. Importantly, continuous irradiation induces continuous rotary motion, which provides a unique opportunity to design dynamic systems and responsive materials that can be driven out of equilibrium. The research program is divided into four work-packages: a) chemical and redox driven unidirectional motors: here we will develop processive unidirectional motors that can use (electro)chemical energy in a continuous manner; b) amplification of motion: here rotary motors operate in assemblies to amplify mechanical function over a wide range of length scales; specifically, we will use liquid crystal-water interfaces as a unique platform to control motion and organization; c) dissipative self-assembly: molecular motors offer fantastic opportunities to control self-assembly and drive such systems out of equilibrium; we aim at metastable aggregate formation (hydrogels) and the design of amphiphilic motors for responsive self-assembled nanostructures; d) triggering biomolecular function: the goal is to use rotary motors to regulate DNA transcription and ultimately as a genuine powering device to control cardiac cell function. In the emerging field of photopharmacology, we take advantage of the non-invasive, high spatio-temporal control that switching with light provides. The proposed research program is highly challenging, but it provides the comprehensive effort required to achieve control of complex nanomechanical systems and will open a bright future for applications ranging from stimuli-responsive materials to spatio-temporal control of biomolecular systems.
Max ERC Funding
2 499 524 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym MO-TRAYL
Project Mobility trajectories of young lives: Life chances of transnational youths in Global South and North
Researcher (PI) Valentina Mazzucato
Host Institution (HI) UNIVERSITEIT MAASTRICHT
Call Details Consolidator Grant (CoG), SH3, ERC-2015-CoG
Summary The objective of MO-TRAYL is to develop a better understanding of the relationship between migration and young people’s life chances by studying youths’ mobility trajectories. How the life chances of youths, defined as their educational performance, psychological well-being and transitions into adulthood, are impacted by migration is of relevance for European cities that are faced with a growing youth population with a migrant background. At the same time, cities in the Global South, where many migrants in Europe originate from, are faced with large portions of the population of minors who live without at least one of their parents due to parental migration. There is growing concern in both academia and policy about how these ‘stay-behind’ children are faring. Yet little is known about how migration impacts young people in the Global North and South in the medium term, in part because our conception of young people’s mobility patterns has to date been overly simplified (either they move once, or they do not). This results in a lack of data that specifically looks at the different mobility patterns of young people, and hardly any that has a longitudinal dimension. MO-TRAYL will break new ground by simultaneously studying youths in the Global South who have remained ‘at home’ and those who have migrated to Europe, making use of unique new longitudinal data collected in the Global South as well as collecting new data in the Global North that specifically traces mobility trajectories, the resulting different family compositions along the way, and how both affect life chances. Through a transnational perspective in which family members and events spanning home and host countries are brought to bear on life chances, MO-TRAYL aims to re-conceptualize youth mobility and families and to add a longitudinal dimension to the study of migration and life-chance outcomes. The project focuses on Ghanaian children in Ghana, The Netherlands, Belgium and Germany.
Max ERC Funding
1 937 500 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym ModRed
Project The geometry of modular representations of reductive algebraic groups
Researcher (PI) Simon Riche
Host Institution (HI) UNIVERSITE CLERMONT AUVERGNE
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary The main theme of this proposal is the Geometric Representation Theory of reductive algebraic groups over algebraically closed fields of positive characteristic. Our primary goal is to obtain character formulas for simple and for indecomposable tilting representations of such groups, by developing a geometric framework for their categories of representations.
Obtaining such formulas has been one of the main problems in this area since the 1980s. A program outlined by G. Lusztig in the 1990s has led to a formula for the characters of simple representations when the characteristic of the base field is larger than an explicit but huge bound. A recent breakthrough due to G. Williamson has shown, however, that this formula cannot hold for smaller characteristics. Nothing is known about characters of tilting modules in general (except for a conjectural formula for some characters, due to Andersen). Our main tools include a new perspective on Soergel bimodules offered by the study of parity sheaves (introduced by Juteau-Mautner-Williamson) and a diagrammatic presentation of their category (due to Elias-Williamson).
Max ERC Funding
882 844 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym MOLECULAR MOTORS
Project Molecular Motors - Controlling movement at the nanoscale
Researcher (PI) Bernard Feringa
Host Institution (HI) RIJKSUNIVERSITEIT GRONINGEN
Call Details Advanced Grant (AdG), PE4, ERC-2008-AdG
Summary The design of artificial molecular motors and machines is one of the major challenges in contemporary molecular sciences and bottom-up molecular nanotechnology. Whereas the protein-based molecular motors found in the living cell are amongst the most fascinating and complex structures found in nature and crucial to nearly every key biological process, the field of synthetic linear and rotary motors is still in its infancy. In a broader context, moving the molecular sciences from the current focus on static structures and operation under thermodynamic control to dynamic chemistries with systems under kinetic control will represent a major step beyond the current frontiers of the chemical sciences. Furthermore, a shift from control of structure to dynamic control of function, and from molecules to molecular systems in which several components act in concert, often at different hierarchical levels, makes it possible for fascinating and unique properties to be discovered. The goal of this program is to push significantly ahead the frontiers of the field of molecular motors and machines, both with respect to control of translational and rotary motion and in the exploration of dynamic functions of molecular systems governed by molecular motors. A further, extremely challenging goal is to explore synthetic systems that can undergo autonomous motion. This program builds on our recent discoveries of the first unidirectional light-driven rotary molecular motor, the chemically driven rotary motor that can complete a full rotary cycle in a repetitive manner, and the first molecularly defined autonomous translational motor powered by a chemical fuel. As the basic principles, rules and parameters that govern molecular motion at the nanoscale are largely not yet understood, the focus of this proposal is on a multidisciplinary program addressing some of the most challenging fundamental issues in this uncharted territory.
Max ERC Funding
2 175 970 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym MOLNANOSPIN
Project Molecular spintronics using single-molecule magnets
Researcher (PI) Wolfgang Wernsdorfer
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE3, ERC-2008-AdG
Summary A revolution in electronics is in view, with the contemporary evolution of two novel disciplines, spintronics and molecular electronics. A fundamental link between these two fields can be established using molecular magnetic materials and, in particular, single-molecule magnets, which combine the classic macroscale properties of a magnet with the quantum properties of a nanoscale entity. The resulting field, molecular spintronics, aims at manipulating spins and charges in electronic devices containing one or more molecules. The main advantage is that the weak spin-orbit and hyperfine interactions in organic molecules suggest that spin coherence may be preserved over times and distances much longer than in conventional metals or semiconductors. In addition, specific functions (e.g. switchability with light, electric field, etc.) could be directly integrated into the molecule. In this context, the project proposes to fabricate, characterize and study molecular devices (molecular spin-transistors, molecular spin-valves and spin filters, molecular double-dot devices, carbon nanotube nano-SQUIDs, etc.) in order to read and manipulate the spin states of the molecule and to perform basic quantum operations. MolNanoSpin is designed to play the role of pathfinder in this still largely unexplored field. The main target for the coming 5 years concerns fundamental science, but applications in quantum electronics are expected in the long run. The visionary concept of MolNanoSpin is underpinned by worldwide research on molecular magnetism and supramolecular chemistry, the PI's ten years of experience in molecular magnetism, his membership in the FP6 MAGMANet NoE, and collaboration with outstanding scientists in the close environment of the team. During the last year, the PI's recently founded team has already demonstrated the first important results in this new research area.
Max ERC Funding
2 096 703 €
Duration
Start date: 2008-11-01, End date: 2013-10-31
Project acronym MONACAT
Project Magnetism and Optics for Nanoparticle Catalysis
Researcher (PI) Bruno CHAUDRET
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE5, ERC-2015-AdG
Summary MONACAT proposes a novel approach to address the challenge of intermittent energy storage. Specifically, the purpose is to conceive and synthesize novel complex nano-objects displaying both the physical and chemical properties that enable catalytic transformations with fast and optimal energy conversion. It follows over 20 years of research on “organometallic nanoparticles”, an approach to nanoparticle (NP) synthesis whose first goal is to control the surface of the particles as in molecular organometallic species. Two families of NPs will be studied: 1) magnetic NPs that can be heated by excitation with an alternating magnetic field and 2) plasmonic NPs that absorb visible light and transform it into heat. In all cases, deposition of additional materials as islands or thin layers will improve the NPs' catalytic activity. Iron carbide NPs have recently been shown to heat efficiently upon magnetic excitation and to catalyse CO hydrogenation into hydrocarbons. In order to transform this observation into a viable process, MONACAT will address the following challenges: determination and control of surface temperature using fluorophores or quantum dots; optimization of heating capacity (size, anisotropy of the material, crystallinity, phases: FeCo, FeNi, chemical order); optimization of catalytic properties (islands vs core-shell structures; Ru or Ni for methane, Cu/Zn for methanol); and stability and optimization of energy efficiency. A similar approach will be used for direct light conversion, using Au or Ag NPs coated with Ru as first proofs of concept. Catalytic tests will be performed on two heterogeneous reactions after deposition of the NPs onto a support: CO2 hydrogenation into methane, and methanol synthesis. In addition, the potential of catalysis making use of self-heated and magnetically recoverable NPs will be studied in solution (reduction of arenes or oxygenated functions, hydrogenation and hydrogenolysis of biomass platform molecules, Fischer-Tropsch).
Max ERC Funding
2 472 223 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym MOPSA
Project Modular Open Platform for Static Analysis
Researcher (PI) Antoine Miné
Host Institution (HI) UNIVERSITE PIERRE ET MARIE CURIE - PARIS 6
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary The Mopsa project aims at creating methods and tools to make computer software more reliable.
Programming errors are pervasive, with consequences ranging from user frustration to huge economic or human losses. Traditional test-based methods are insufficient to eliminate all errors. The project will develop static analyses able to detect at compile time whole classes of program defects, leveraging the theory of abstract interpretation to design analyses that are approximate (to scale up to large programs) and sound (no defect is missed). Static analysis has enjoyed recent successes: Astrée, an industrial analyzer I have coauthored, was able to prove the absence of run-time errors in Airbus software. But such results are limited to the specific, well-controlled context of critical embedded systems. I wish to bring static analysis to the next level: target larger, more complex and heterogeneous software, and make it usable by engineers to improve general-purpose software.
We focus on analyzing open-source software, which is readily available, complex, widespread, and important from an economic standpoint (it is used in many infrastructures and companies) but also from societal and educational ones (promoting the development of verified software for and by citizens). A major target we consider is the set of technologies at the core of the Internet, to which static analysis could be applied to ensure a safer Internet. The scientific challenges we must overcome include designing scalable analyses that produce relevant information, supporting popular novel languages (such as Python), and analyzing properties better adapted to the continuous development common in open-source software. At the core of the project is the construction of an open-source static analysis platform. It will serve not only to implement and evaluate the results of the project, but also to create momentum encouraging research in static analysis and to hasten its adoption in open-source development communities.
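To illustrate what "approximate but sound" means in abstract interpretation, the sketch below analyzes a two-branch program in the classical interval domain. It is a minimal Python toy, not code from the Mopsa platform, and the variable ranges are invented for the example.

    # Toy interval domain: each variable is over-approximated by a range.
    class Interval:
        def __init__(self, lo, hi):
            self.lo, self.hi = lo, hi  # invariant: lo <= hi

        def __add__(self, other):
            # Abstract addition: the result contains every concrete sum.
            return Interval(self.lo + other.lo, self.hi + other.hi)

        def join(self, other):
            # Least upper bound: over-approximates the union of two branches.
            return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

        def __repr__(self):
            return f"[{self.lo}, {self.hi}]"

    # Analyze: x in [0, 10]; if cond: x = x + 1; else: x = x + 5
    x = Interval(0, 10)
    result = (x + Interval(1, 1)).join(x + Interval(5, 5))
    print(result)  # [1, 15]: sound (no concrete value missed), approximate

Soundness here means the computed range can never miss a real run-time value; approximation means it may include values that never occur, which is the price paid for scalability.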
Max ERC Funding
1 773 750 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym MULTISCALE
Project Precision Multi-Scale Predictions for the LHC: Higgs, Jets and Supersymmetry
Researcher (PI) Wouter - Jonathan Waalewijn
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Starting Grant (StG), PE2, ERC-2015-STG
Summary My project will boost the precision of theoretical predictions for collisions at the Large Hadron Collider. Precise predictions are crucial to further constrain the properties of the recently discovered Higgs boson and to uncover a faint signal of Beyond-the-Standard-Model physics. I will focus on the strong interactions, which dominate the theoretical uncertainty and play a role at multiple energy scales, including those related to the incoming protons, the hard scattering, the masses of (new) particles, and the transverse momentum and size of jets.
The critical progress of this proposal lies in taking this intrinsically multi-scale nature into account, moving beyond the current trade-off between precision and realism in the three dominant calculational paradigms. Fixed-order calculations are systematically improvable but assume that there is no hierarchy between perturbative scales. Monte Carlo event generators provide a fully exclusive description of the final state, but are currently limited to leading-logarithmic order and lack theoretical uncertainties. Resummed calculations can reach a higher logarithmic accuracy, but have been restricted to single observables.
In a recent breakthrough, I constructed a new effective field theory that simultaneously achieves higher logarithmic accuracy in two independent observables, by factorizing the physics at the corresponding scales. Moving beyond this prototypical study, I will develop the general effective field theory framework that accounts for the relevant scales in realistic measurements, which overcomes the limitations of all three paradigms. This research will be carried out in the context of several important LHC applications: precision Higgs measurements, jet substructure techniques for identifying boosted heavy particles and supersymmetry searches. My new field-theoretic insights and more precise predictions will be critical as the LHC starts Run 2, searching for new physics at even higher energies.
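For orientation, "leading-logarithmic accuracy" can be made concrete with the textbook fixed-coupling Sudakov factor, which exponentiates the double logarithm of a ratio of scales. The snippet below is a toy estimate only: normalization conventions vary, the numerical inputs are illustrative, and it is not the effective-field-theory machinery of the project.

    import math

    def sudakov_ll(Q, q, alpha_s=0.118, CF=4.0 / 3.0):
        """Toy fixed-coupling, leading-log Sudakov factor: the no-emission
        probability between a hard scale Q and a resolution scale q.
        Real resummations run the coupling and add subleading logarithms."""
        L = math.log(Q**2 / q**2)
        return math.exp(-alpha_s * CF / (2.0 * math.pi) * L**2)

    # Suppression of emissions across one decade of scales (illustrative):
    print(sudakov_ll(Q=100.0, q=10.0))  # ~0.59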
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym NANO-INSITU
Project Nanoscale Chemical Reactions Studied with In-Situ Transmission Electron Microscopy
Researcher (PI) Marijn Arnout Van Huis
Host Institution (HI) UNIVERSITEIT UTRECHT
Call Details Consolidator Grant (CoG), PE5, ERC-2015-CoG
Summary Great successes have been achieved in nanoscience, where the development of functional properties and the assembly of nanostructures into nanomaterials have become increasingly important. In general, both the tuning of the chemical and physical properties and the self-assembly of nanocrystals into 2D or 3D superstructures take place in a liquid environment. When analysing the structural properties of nanocrystals using Transmission Electron Microscopy (TEM), this liquid environment is contained between membranes so that it can be kept within the high vacuum. At present, the thickness of the liquid is not controlled, which renders standard imaging at atomic resolution impossible. Here I propose to integrate micro-electromechanical actuator functionalities into the Liquid Cell chips to overcome this problem, so that real-time atomic-resolution imaging and chemical analysis of nanoparticles in solution become a reality.
This new in-situ technology will elucidate what really happens during chemical reactions, and will thereby enable the development of new nanomaterials for optoelectronics, lighting, and catalysis. Oriented attachment processes and self-assembly of nanoparticles, which are key to the large-scale production of 2D and 3D nanomaterials, can also be followed in the Liquid Cell. Furthermore, the hydration of nanoscale model systems of earth materials such as magnesia, alumina, and calcium oxide is of major importance in the geosciences. In the field of enhanced oil recovery, for example, the huge volumetric expansion that comes with the hydration of these minerals could facilitate access to reservoirs.
My research group has extensive experience in in-situ TEM and recently has achieved significant successes in Liquid Cell studies. We are in an ideal position to develop this new technology and open up these new research areas, which will have a major impact on science, industry, and society.
Max ERC Funding
1 996 250 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym NanoPacks
Project NanoPacks: Assembling nanoparticles via evaporation-driven droplet collapse for ultrasensitive detection techniques
Researcher (PI) Alvaro Marin
Host Institution (HI) UNIVERSITEIT TWENTE
Call Details Starting Grant (StG), PE8, ERC-2015-STG
Summary The foundations of nanophotonics and nanoplasmonics have boosted the development of ultrasensitive detection techniques. Some of these techniques, such as Surface Enhanced Raman Spectroscopy or Surface Enhanced Fluorescence, are able to detect femtomolar concentrations of analytes or even single molecules, relying only on the adsorption of the analytes on a nanostructured surface.
The development of nanotechnology requires tight control over the building blocks of the structures. The concept of self-assembly has been introduced and successfully applied in recent years to build all sorts of nanostructures. However, self-assembly generally involves an attractive interaction between the elements, which requires the use of specially designed nanoparticles and thus imposes severe limitations on the applicability of self-assembly.
The approach I want to explore in this project is a complete change of paradigm, which consists in assembling nanostructures through the collapse of evaporating drops: a droplet containing both metallic nanoparticles and a tiny amount of analyte molecules evaporates until the whole solvent vanishes and only the solutes are left. By manipulating the way the droplet evaporates, we can control the shape and properties of the remains, and therefore assemble metallic nanoparticles together with the molecules of interest in a passive way.
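As a back-of-the-envelope reference for the evaporation stage, the classical d²-law says the squared diameter of a slowly evaporating droplet decreases linearly in time. The sketch below uses this textbook estimate with an assumed, illustrative evaporation constant; real sessile or particle-laden drops deviate from it.

    def droplet_diameter(d0, k, t):
        """Classical d^2-law: d(t)^2 = d0^2 - k*t, with k an evaporation
        constant set by vapour diffusion. Textbook estimate only."""
        return max(d0**2 - k * t, 0.0) ** 0.5

    d0, k = 50e-6, 1e-9  # 50-micron droplet; k is an assumed value in m^2/s
    print(droplet_diameter(d0, k, 1.0))        # diameter after 1 s (~39 um)
    print(d0**2 / k, "s until complete dry-out")  # here: 2.5 s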
The project will increase the reach of plasmonics-based techniques for the early detection of diseases: first, the approach does not rely on expensive fabrication techniques, but only on the thermodynamics and statistical physics of the particle packings; second, by using a physical approach to form nanoparticle-analyte aggregates, we avoid adverse interactions with the analyte’s chemistry.
The packing of metallic nanoparticles presents new challenges and brings several scientific questions that I will address experimentally through microfluidics, but also via simulations and modeling.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-08-01, End date: 2021-07-31
Project acronym NERVI
Project From single neurons to visual perception
Researcher (PI) Olivier Dominique Faugeras
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary We propose to develop a formal model of information representation and processing in the part of the neocortex that is mostly concerned with visual information. This model will open new horizons in a well-principled way in the fields of artificial and biological vision as well as in computational neuroscience. Specifically, the goal is to develop a universally accepted formal framework for describing complex, distributed and hierarchical processes capable of seamlessly processing a continuous flow of images. This framework notably features computational units operating at several spatiotemporal scales on stochastic data arising from natural images. Mean-field theory and stochastic calculus are used to harness the fundamental stochastic nature of the data, and functional analysis and bifurcation theory to map the complexity of the behaviours of these assemblies of units. In the absence of such foundations, the development of an understanding of visual information processing in man and machines could be greatly hindered. Although the proposal addresses fundamental problems, its goal is to serve as the basis for ground-breaking future computational developments for managing visual data and as a theoretical framework for a scientific understanding of biological vision.
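In the same spirit as the proposal's mean-field computational units, the classical Wilson-Cowan rate equations couple an excitatory and an inhibitory population. The sketch below is a deterministic toy with invented weights, not the stochastic, multi-scale formalism the project will develop.

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def wilson_cowan(steps=2000, dt=1e-4, tau=0.01, drive=1.5):
        """Minimal excitatory/inhibitory mean-field rate model:
            tau dE/dt = -E + S(w_ee*E - w_ei*I + drive)
            tau dI/dt = -I + S(w_ie*E - w_ii*I)
        The weights below are illustrative, not fitted to cortex."""
        w_ee, w_ei, w_ie, w_ii = 10.0, 8.0, 9.0, 3.0
        E, I = 0.1, 0.1
        for _ in range(steps):  # forward Euler integration
            dE = -E + sigmoid(w_ee * E - w_ei * I + drive)
            dI = -I + sigmoid(w_ie * E - w_ii * I)
            E, I = E + dt / tau * dE, I + dt / tau * dI
        return E, I

    print(wilson_cowan())  # population rates after the initial transient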
Max ERC Funding
1 706 839 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym NUHGD
Project Non Uniform Hyperbolicity in Global Dynamics
Researcher (PI) Sylvain CROVISIER
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE1, ERC-2015-AdG
Summary An important part of differentiable dynamics has been developed from the uniformly hyperbolic systems. These systems were introduced by Smale in the 1960s in order to address chaotic behavior, and are now deeply understood from the qualitative, symbolic and statistical viewpoints. They correspond to the structurally stable dynamics. It appeared that large classes of non-hyperbolic systems also exist. Since the 1980s, different notions of relaxed hyperbolicity have been introduced: non-uniformly hyperbolic measures, partial hyperbolicity, and so on. These allowed the previous approach to be extended to other families of systems and new examples of dynamics to be handled: for instance, the fine description of the dynamics of Hénon maps.
The development of local perturbative techniques has brought a rebirth of the qualitative description of generic systems. It has also opened the door to a more global description of the spaces of differentiable dynamics. For instance, it has allowed recent progress towards the Palis conjecture, which characterizes the absence of uniform hyperbolicity by the homoclinic bifurcations (homoclinic tangencies or heterodimensional cycles). In the present project we propose to develop techniques for realizing more global perturbations, yielding a breakthrough in the subject. This would settle the conjecture for C1 diffeomorphisms and imply other classification results.
These past years we have understood how the qualitative dynamics of generic systems decompose into invariant pieces. We are now ready to describe more precisely the dynamics inside the pieces. We propose to combine these new geometrical ideas with the ergodic theory of non-uniformly hyperbolic systems. This will significantly improve our understanding of general smooth systems (for instance, by providing existence and finiteness of physical measures and of measures of maximal entropy for new classes of systems beyond uniform hyperbolicity).
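For contrast with the non-uniform setting of the project, the prototypical uniformly hyperbolic system is Arnold's cat map on the 2-torus: a single linear map whose two eigendirections are uniformly expanded and contracted, at every point and at a fixed rate. The sketch below iterates it and prints its exactly known positive Lyapunov exponent; it is a toy illustration only.

    import math

    # Arnold's cat map, the standard Anosov diffeomorphism of the torus.
    # The matrix [[2, 1], [1, 1]] has eigenvalues (3 +/- sqrt(5))/2, so
    # every orbit sees uniform expansion by ~2.618 along one direction
    # and uniform contraction by ~0.382 along the other.
    def cat_map(x, y):
        return (2 * x + y) % 1.0, (x + y) % 1.0

    x, y = 0.1, 0.2
    for _ in range(5):
        x, y = cat_map(x, y)
    print((x, y))
    print("Lyapunov exponent:", math.log((3 + math.sqrt(5)) / 2))  # ~0.962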
Max ERC Funding
1 229 255 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym ONE
Project Unified Principles of Interaction
Researcher (PI) Michel Beaudouin-Lafon
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Advanced Grant (AdG), PE6, ERC-2015-AdG
Summary Most of today’s computer interfaces are based on principles and conceptual models created in the late seventies. They are designed for a single user interacting with a closed application on a single device with a predefined set of tools to manipulate a single type of content. But one is not enough! We need flexible and extensible environments where multiple users can truly share content and manipulate it simultaneously, where applications can be distributed across multiple devices, where content and tools can migrate from one device to the next, and where users can freely choose, combine and even create tools to make their own digital workbench.
The goal of ONE is to fundamentally re-think the basic principles and conceptual model of interactive systems to empower users by letting them appropriate their digital environment. The project will address this challenge through three interleaved strands: empirical studies to better understand interaction in both the physical and digital worlds, theoretical work to create a conceptual model of interaction and interactive systems, and prototype development to test these principles and concepts in the lab and in the field. Drawing inspiration from physics, biology and psychology, the conceptual model will combine substrates to manage digital information at various levels of abstraction and representation, instruments to manipulate substrates, and environments to organize substrates and instruments into digital workspaces.
By identifying first principles of interaction, ONE will unify a wide variety of interaction styles and create more open and flexible interactive environments.
Max ERC Funding
2 456 028 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym PACEMAKER
Project Past Continental Climate Change: Temperatures from marine and lacustrine archives
Researcher (PI) Jaap Sinninghe Damste
Host Institution (HI) STICHTING NIOZ, KONINKLIJK NEDERLANDS INSTITUUT VOOR ONDERZOEK DER ZEE
Call Details Advanced Grant (AdG), PE10, ERC-2008-AdG
Summary Global climate change is a topic of major interest as it has a large impact on human societies. Computer models used to predict directions of future climate change are validated by means of retrospective analysis of past climate changes. Detailed reconstruction of past climates, especially temperature, is therefore of considerable importance. Several tools (proxies) are available to reconstruct absolute sea surface temperatures. Continental temperature reconstructions, however, are hampered by a lack of quantitative temperature proxies and are consequently often qualitative rather than quantitative. Recently, my group discovered a new quantitative continental temperature proxy, the MBT index, which is based on the distribution of membrane lipids of soil bacteria; their composition is a function of annual mean air temperature (MAT). These lipids are transported by rivers to the ocean and deposited in marine sediments. Determination of the MBT index in cores from river fans can thus potentially be used to reconstruct continental, river-basin-integrated temperatures from a marine record in front of large river outflows. We will study the mechanisms of transport of the soil bacterial membrane lipids to the ocean in many river systems and compare the down-core changes in their composition with conventional MAT proxies. We will also investigate the potential of lake sediments as archives of continental climate change using our new MBT palaeothermometer, and apply this thermometer to the assessment of continental climate change during the transition from a hothouse to an icehouse Earth over the last 100 million years. This project, which combines aspects of microbiology, molecular ecology, lipid biogeochemistry and palaeoclimatology, will bring this novel continental palaeothermometer to maturity. If we can ground-truth the use of the MBT proxy, it will open up new windows in palaeoclimatological research and thus contribute to the improvement of current climate models.
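The proxy rests on an empirical linear calibration between lipid distributions, soil pH and temperature. A sketch of its published form (Weijers et al., 2007) follows, inverted to estimate MAT from the measured MBT and CBT indices; the coefficients shown are indicative only and should be checked against the literature, as the calibration has since been refined.

    def mat_from_mbt_cbt(mbt, cbt):
        """Invert a soil calibration of the linear form
        MBT = a + b*CBT + c*MAT to estimate annual mean air temperature.
        Coefficients as in Weijers et al. (2007); indicative values only."""
        a, b, c = 0.122, 0.187, 0.020
        return (mbt - a - b * cbt) / c

    # Example inputs (invented for illustration):
    print(mat_from_mbt_cbt(mbt=0.55, cbt=0.8))  # ~13.9 deg C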
Max ERC Funding
2 498 040 €
Duration
Start date: 2009-01-01, End date: 2014-12-31
Project acronym PaDyFlow
Project Particle dynamics in the flow of complex suspensions
Researcher (PI) Anke Lindner
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2015-CoG
Summary Particle-laden flows are ubiquitous in nature and industrial applications. Particle trajectories determine transport in porous media or biomedical conduits, and effective suspension properties dictate flow behavior in food processing or biofluid flows. For better control, it is necessary to know how to predict these processes from the particle and flow properties involved. However, current theory is not able to capture the complexity of the applications, and experiments have been carried out on systems too diverse for a unifying picture to emerge. A systematic experimental approach is now needed to improve the present understanding.
In this experimental project, we will use novel microfabrication and characterization methods to obtain a set of complex anisotropic microscopic particles (complemented by selected bioparticles) with tunable properties covering size, shape, deformability and activity. The transport of these particles, isolated or at small concentrations, will be studied in chosen microfluidic model flows of simple fluids or polymer solutions. The many degrees of freedom of this problem will be addressed by systematically combining the relevant particle and flow properties. The macroscopic properties of dilute suspensions are particularly interesting from a fundamental point of view, as they are a direct consequence of the individual particle-flow interaction, and will be measured using original microfluidic rheometers of outstanding resolution.
This project will lead to a comprehensive understanding of fluid-structure interactions at small Reynolds numbers. Our findings will constitute the basis for novel numerical approaches built on experimentally validated hypotheses. Using this knowledge, local flow sensors, targeted-delivery systems and novel microfluidic filtration or separation devices can be designed. Combining particles of chosen properties with selected suspending fluids allows the fabrication of suspensions with unprecedented, tailored properties.
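As a flavor of the single-particle dynamics at stake, Jeffery's (1922) classical solution for a rigid spheroid tumbling in simple shear can be integrated in a few lines. This sketch covers only the idealized rigid case, far simpler than the deformable, active or Brownian particles the project targets; the aspect ratio and shear rate are illustrative.

    import math

    def jeffery_angle(r, gamma_dot, t_end, dt=1e-3):
        """Integrate Jeffery's equation for the in-plane orientation angle
        of a rigid spheroid (aspect ratio r) in simple shear gamma_dot:
            dphi/dt = gamma_dot/(r^2 + 1) * (r^2 cos^2(phi) + sin^2(phi))."""
        phi, t = 0.0, 0.0
        while t < t_end:  # forward Euler integration
            phi += dt * gamma_dot / (r**2 + 1) * (
                r**2 * math.cos(phi)**2 + math.sin(phi)**2)
            t += dt
        return phi

    r = 5.0  # illustrative aspect ratio
    # Tumbling period T = 2*pi*(r + 1/r)/gamma_dot: slender particles spend
    # most of each period nearly aligned with the flow.
    print("Jeffery period:", 2 * math.pi * (r + 1 / r))  # ~32.7 / gamma_dot
    print("phi(t=10):", jeffery_angle(r, gamma_dot=1.0, t_end=10.0))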
Max ERC Funding
1 971 750 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym PARIS
Project PARticle accelerators with Intense lasers for Science (PARIS)
Researcher (PI) Victor Malka
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE2, ERC-2008-AdG
Summary Particle and radiation beams are commonly used in our daily life. For example, accelerated electrons are deflected in the cathode-ray tubes of televisions or computer screens. X-rays are routinely used for non-destructive material or body inspection, for example to examine human bodies (to visualize tumour cells, dental caries and bone fractures) or to increase the safety of travellers by inspecting their luggage. Ionizing radiation is efficiently used in radiotherapy to cure cancer by irreversibly damaging the DNA of cells. From the fundamental point of view, the development of ultra-short bunches of energetic particles and X-ray photons is of crucial importance in biology, chemistry and solid-state physics, where these beams could be used to diagnose electronic, atomic or molecular dynamics with unprecedented, simultaneous time and space resolution. The interaction of laser beams with matter in the relativistic regime has made it possible to demonstrate new approaches for producing energetic particle beams, thanks to the tremendous electric fields that plasmas can support. The remarkable progress of laser plasma accelerators has allowed physicists to produce high-quality beams of energetic radiation and particles. These beams could lend themselves to applications in many fields, including medicine (radiotherapy and imaging), radiation biology, chemistry (radiolysis), physics and materials science (radiography, electron and photon diffraction), security (material inspection), and of course accelerator science. Stimulated by the advent of compact and powerful lasers with moderate costs and high repetition rates, this research field has witnessed considerable growth in the past few years, and laser plasma accelerators are making tremendous progress towards fulfilling their promise. The PARIS ERC/AdG proposal aims at actively developing this new field of research, which is of major interest to a broad scientific community and has the potential to provide new societal applications.
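To make the "tremendous electric fields" concrete, a standard estimate for the cold, nonrelativistic wave-breaking field of a plasma scales with the square root of the electron density. The few lines below evaluate it at a typical laser-plasma density; the formula is standard in the laser-plasma literature, and the chosen density is illustrative.

    import math

    def wave_breaking_field(n_e_cm3):
        """Cold nonrelativistic wave-breaking field E0 = m_e c w_p / e,
        commonly quoted as E0 [V/m] ~ 96 * sqrt(n_e [cm^-3])."""
        return 96.0 * math.sqrt(n_e_cm3)

    # At n_e = 1e18 cm^-3: ~96 GV/m, roughly a thousand times the gradient
    # of conventional radio-frequency accelerating cavities.
    print(wave_breaking_field(1e18) / 1e9, "GV/m")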
Max ERC Funding
2 250 000 €
Duration
Start date: 2009-04-01, End date: 2014-03-31
Project acronym PEP-PRO-RNA
Project Peptide-derived bioavailable macrocycles as inhibitors of protein-RNA and protein-protein interactions
Researcher (PI) Tom Grossmann
Host Institution (HI) STICHTING VU
Call Details Starting Grant (StG), PE5, ERC-2015-STG
Summary The objective of this proposal is the elucidation of general principles for the design of bioavailable peptide-derived macrocyclic compounds and their use for the development of inhibitors of protein‒protein (PPI) and protein‒RNA interactions (PRI). Over the last decade, drug discovery has faced the problem of decreasing success rates, mainly because numerous novel biological targets are resistant to classic small-molecule modulation. This holds true in particular for PPIs and PRIs. Approaches that allow the modulation of these interactions provide access to therapeutic agents targeting crucial biological processes that have so far been considered undruggable. Herein, I propose the use of irregularly structured peptide binding epitopes as a starting point for the design of bioactive macrocycles. In a two-step process, high target affinity and bioavailability are installed:
1) Peptide macrocyclization for the stabilization of the irregular bioactive secondary structure
2) Evolution of the cyclic peptide into a bioavailable macrocyclic compound
Using a well-characterized model system developed in my lab, initial design principles will be elucidated. These principles will subsequently be used and refined for the development of macrocyclic PPI and PRI inhibitors. The protein‒protein and protein‒RNA complexes selected as targets are of therapeutic interest, and the corresponding inhibitors hold the potential to be pursued in subsequent drug discovery campaigns.
Max ERC Funding
1 499 269 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym PhotoMedMet
Project Towards Novel Inert (Photo-)toxic Ru(II) Polypyridyl Complexes
Researcher (PI) Gilles Albert Gasser
Host Institution (HI) ECOLE NATIONALE SUPERIEURE DE CHIMIE DE PARIS
Call Details Consolidator Grant (CoG), PE5, ERC-2015-CoG
Summary In this grant application, I propose to investigate in depth the potential of novel inert Ru(II) polypyridyl complexes as anticancer drug candidates. Such compounds were investigated by Dwyer and Shulman in the 1950s and 1960s, both in vitro and in vivo, with relatively promising results. This impressive seminal work was unfortunately not followed up. The lack of additional studies was recently attributed, at least in part, to the observed neurotoxicity of the complexes. Nonetheless, in recent years there has been a revival of important in vitro studies of such inert Ru(II) polypyridyl complexes for anticancer purposes. However, without further in vivo studies, it is reasonable to expect that neurotoxicity similar to that observed by Dwyer and Shulman could be encountered. To tackle these (potential) drawbacks, I propose to use a prodrug approach.
Furthermore, I intend to investigate the potential of inert Ru(II) polypyridyl complexes as photosensitizers (PSs) in photodynamic therapy (PDT). In the search for an alternative to chemotherapy, PDT has proven to be a promising, effective and non-invasive treatment modality. To increase the potential of the PSs developed in this project even further, I propose to also excite them via simultaneous two-photon absorption (TPA) in so-called two-photon excitation PDT (2PE-PDT). Importantly, the new Ru(II)-based PSs will be coupled to cancer cell-specific peptides or antibodies. This double selectivity (targeting vector and photo-activation) should limit the frequently encountered side effects of (metal-based) anticancer drugs. Another important aim of this second part of the project is to use the Ru(II)-based PSs to kill bacteria; PDT has recently been shown to be a promising alternative for fighting bacteria. I therefore intend to couple Ru(II)-based (2PE-)PSs to bacteria-specific peptides to confer bacterial specificity.
Max ERC Funding
662 015 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym Photonis
Project Isotope Fractionation of Light Elements Upon Ionization: Cosmochemical and Geochemical Implications
Researcher (PI) Bernard MARTY
Host Institution (HI) UNIVERSITE DE LORRAINE
Call Details Advanced Grant (AdG), PE10, ERC-2015-AdG
Summary Light elements such as hydrogen and nitrogen present large isotope variations among solar system objects and reservoirs (including planetary atmospheres) that remain unexplained at present. Theoretical studies are model-dependent and have not reached a consensus. Laboratory experiments are required to identify the underlying physical mechanisms. The aim of the project is to investigate the origins of, and processes responsible for, isotope variations of the light elements and noble gases in the Solar System through an experimental approach involving ionization of gaseous species. We will also investigate mechanisms and processes of isotope fractionation of atmophile elements in planetary atmospheres irradiated by solar UV photons, with particular reference to Mars and the early Earth. Three pathways will be considered: (i) plasma ionisation of gas mixtures (H2-CO-N2-noble gases) in a custom-built reactor; (ii) photo-ionisation and photo-dissociation of the relevant gas species and mixtures using synchrotron light; and (iii) UV irradiation of ices containing the species of interest. The results of this study will shed light on early Solar System evolution and on processes of planetary formation.
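As a point of reference for the kind of isotope-ratio arithmetic involved, the sketch below evaluates the classical Rayleigh distillation law R/R0 = f^(α−1) in the usual δ-notation; the choice of a nitrogen-like isotope pair, the fractionation factor and the residual fraction are illustrative assumptions, not values from the proposal.

```python
def rayleigh_delta(f, alpha, delta0=0.0):
    """Isotope ratio evolution of a reservoir losing mass by Rayleigh
    distillation. f: fraction of the element remaining; alpha: fractionation
    factor. Returns the delta value in permil relative to the initial ratio."""
    r_ratio = f ** (alpha - 1.0)  # R/R0 = f^(alpha - 1)
    return (1.0 + delta0 / 1000.0) * r_ratio * 1000.0 - 1000.0

# Illustrative numbers (assumptions): alpha = 0.995 for a 15N/14N-like pair,
# after 90% of the element has been lost from the reservoir.
print(f"delta ≈ {rayleigh_delta(f=0.1, alpha=0.995):+.1f} permil")  # ≈ +11.6
```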
Max ERC Funding
2 810 229 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym PLASMONICS
Project Frontiers in Surface Plasmon Photonics - Fundamentals and Applications
Researcher (PI) Thomas Ebbesen
Host Institution (HI) CENTRE INTERNATIONAL DE RECHERCHE AUX FRONTIERES DE LA CHIMIE FONDATION
Call Details Advanced Grant (AdG), PE2, ERC-2008-AdG
Summary Surface plasmons have generated considerable renewed interest through a combination of scientific and technological advances. In particular, with progress in nanofabrication techniques, the properties of surface plasmons (SPs) can now be controlled by structuring metals at the nanometer scale. The overall objective of this proposal is to manipulate and control the properties of SPs to analyze fundamental phenomena through which new capacities can emerge. The project is divided into four strongly overlapping parts: 1) SP-enhanced devices: We plan to use the benefits provided by SPs to enhance devices or create new device architectures. Textured metal surfaces, and the associated SP modes, can be used as antennas to extract, capture and control light in a variety of applications that include imaging and polarization sensing, nano-optical elements and detectors. 2) SP circuitry: To achieve complete miniature SP photonic circuits, a number of components to launch SPs, control their propagation and finally couple SPs back out to light are necessary. Much progress has been made in this direction, but many challenges remain at the level of individual components and complete circuits that will be explored. 3) Molecule-SP interactions: Strongly coupled molecule-SP interactions are expected to extensively modify photophysical and photochemical processes, which will be studied by time-resolved techniques. This issue also has implications for generating the all-optical control needed in SP circuitry. 4) Casimir effect and SPs: The tailoring of the Casimir force by enhancing the contribution of SP modes has been proposed by theoretical studies. Experiments will be undertaken to test the relationship between Casimir physics and plasmonics using nanostructured metal surfaces, which could have significant consequences for nano-electro-mechanical systems. For each of these subjects, the objectives are at the cutting edge of surface plasmon science and technology.
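For orientation, the sketch below evaluates the textbook surface-plasmon-polariton dispersion relation at a flat metal/dielectric interface, k_sp = (ω/c)·sqrt(ε_m·ε_d/(ε_m+ε_d)), yielding the SP wavelength and propagation length; the gold permittivity near 633 nm is a literature-style value used as an assumption, not data from this project.

```python
import cmath
import math

def spp_dispersion(lambda0_nm, eps_metal, eps_dielectric=1.0):
    """Surface-plasmon-polariton wavevector at a metal/dielectric interface:
    k_sp = (omega/c) * sqrt(eps_m * eps_d / (eps_m + eps_d))."""
    k0 = 2 * math.pi / (lambda0_nm * 1e-9)  # free-space wavevector [1/m]
    k_sp = k0 * cmath.sqrt(eps_metal * eps_dielectric /
                           (eps_metal + eps_dielectric))
    lambda_sp = 2 * math.pi / k_sp.real     # SPP wavelength [m]
    l_prop = 1.0 / (2 * k_sp.imag)          # 1/e intensity decay length [m]
    return lambda_sp, l_prop

# Illustrative gold permittivity near 633 nm in air (assumption)
lam_sp, l_prop = spp_dispersion(633, eps_metal=-11.6 + 1.2j)
print(f"SPP wavelength ≈ {lam_sp*1e9:.0f} nm, "
      f"propagation length ≈ {l_prop*1e6:.0f} µm")  # ≈ 605 nm, ≈ 10 µm
```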
Max ERC Funding
2 200 000 €
Duration
Start date: 2009-01-01, End date: 2014-12-31
Project acronym PROWAT
Project Proton conduction in structured water
Researcher (PI) Huib BAKKER
Host Institution (HI) STICHTING NEDERLANDSE WETENSCHAPPELIJK ONDERZOEK INSTITUTEN
Call Details Advanced Grant (AdG), PE4, ERC-2015-AdG
Summary In recent years, water near surfaces and solutes has been observed to be structured differently and to show slower reorientation and hydrogen-bond dynamics than bulk water. Aqueous proton transfer is a process that strongly relies on the structure and dynamics of the hydrogen-bond network of liquid water and that often occurs near surfaces. Examples are thylakoid and mitochondrial membranes and the nanochannels of transmembrane proteins and fuel cells. An important but experimentally largely unexplored question is how the rate and mechanism of aqueous proton transfer change due to the surface-induced structuring of the water medium. Theoretical work has shown that the structuring and nano-confinement of water can have a strong effect on proton mobility. Recently, experimental techniques have been developed that are capable of probing the structural dynamics of water molecules and proton-hydration structures near surfaces. These techniques include heterodyne-detected sum-frequency generation (HD-SFG) and two-dimensional HD-SFG (2D-HD-VSFG).
I propose to use these and other advanced spectroscopic techniques to study the rate and molecular mechanisms of proton transfer through structured aqueous media. These systems include aqueous solutions of different solutes, water near extended surfaces such as graphene and electrically switchable monolayers, and the aqueous nanochannels of metal-organic frameworks. These studies will provide a fundamental understanding of the molecular mechanisms of aqueous proton transfer in natural and man-made (bio)molecular systems, and can lead to the development of new proton-conducting membranes and nanochannels with applications in fuel cells. The knowledge obtained can also lead to new strategies for controlling proton mobility, e.g. by electrically switching the properties of the water network at surfaces and in nanochannels, leading to field-effect proton transistors.
Max ERC Funding
2 495 000 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym PULSAR
Project Pushing ultrafast laser material processing into a new regime of plasma-controlled ablation
Researcher (PI) Francois Courvoisier
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2015-CoG
Summary Ultra-intense femtosecond laser pulses promise to become a fast, universal, predictable and green tool for material processing at the micro- and nanometre scale. The recent tremendous increase in commercially available femtosecond laser energy at high repetition rate opens a wealth of novel perspectives for mass production. But even at high energy, laser processing remains limited to high-speed, point-by-point scanning removal of ultra-thin nanometric layers from the material surface. This is because the uncontrolled laser-generated free-electron plasma shields the material from light and prevents extreme internal temperatures from being reached at a precisely defined nanometric scale.
PULSAR aims to break this barrier and develop a radically different concept of laser material modification based on free-electron plasma control. PULSAR's unconventional concept is to control plasma generation, confinement, excitation and stability. An ambitious experimental and numerical research program will push the frontiers of laser processing to unprecedented precision, speed and predictability. PULSAR's key concept is highly generic, and the results will initiate new research across laser and plasma material processing, plasma physics and ultrafast optics.
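The plasma shielding mentioned above sets in once the free-electron density exceeds the critical density n_c = ε0·m_e·ω²/e², at which the plasma frequency matches the laser frequency. The sketch below evaluates n_c; the 800 nm Ti:sapphire wavelength is an illustrative assumption, not a parameter stated in the proposal.

```python
import math

e = 1.602176634e-19      # elementary charge [C]
m_e = 9.1093837015e-31   # electron mass [kg]
c = 2.99792458e8         # speed of light [m/s]
eps0 = 8.8541878128e-12  # vacuum permittivity [F/m]

def critical_density(lambda0_nm):
    """Electron density above which a plasma becomes opaque to light of
    wavelength lambda0 (plasma frequency equals the laser frequency)."""
    omega = 2 * math.pi * c / (lambda0_nm * 1e-9)  # laser angular frequency
    return eps0 * m_e * omega**2 / e**2            # [m^-3]

# Ti:sapphire wavelength, common in femtosecond machining (assumption)
print(f"n_c ≈ {critical_density(800)/1e6:.2e} cm^-3")  # ≈ 1.7e21 cm^-3
```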
Max ERC Funding
1 996 581 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym QED-PROTONSIZE
Project The Proton Size Puzzle: Testing QED at Extreme Wavelengths
Researcher (PI) Kjeld Sijbrand Eduard EIKEMA
Host Institution (HI) STICHTING VU
Call Details Advanced Grant (AdG), PE2, ERC-2015-AdG
Summary A key component of the Standard Model is Quantum Electrodynamics (QED). QED explains, for example, the anomalous magnetic moment of the electron and small shifts in the energy-level structure of atoms and molecules due to vacuum fluctuations. After decades of precision measurements, especially laser spectroscopy in atomic hydrogen, QED is considered the most successful and best-tested theory in physics. However, in 2010 precision spectroscopy of muonic hydrogen (where the electron is replaced with a muon) led to discrepancies in the energy-level structure that cannot be accounted for. If QED is taken to be correct, one way of interpreting the results is that the proton size extracted from normal (electronic) hydrogen differs by as much as 4% (a 7-sigma effect) from that obtained from muonic hydrogen. Despite great theoretical and experimental efforts, this 'proton size puzzle' is still unsolved.
I propose to perform precision spectroscopy in the extreme ultraviolet near 30 nm in the He+ ion, to establish an exciting new platform for QED tests and thereby shed light on the proton-size puzzle. The advantages of helium ions over hydrogen atoms are that they can be trapped (and therefore observed longer), QED effects are more than an order of magnitude larger, and the nuclear size of the alpha particle is better known than that of the proton. Moreover, the CREMA collaboration has recently measured the 2S-2P transition in muonic He+ (both 3He and 4He isotopes) at the Paul Scherrer Institute. Evaluation of the measurements is ongoing, but could lead to an 8-fold (or better) improved alpha-particle radius, so that it no longer limits QED theory in normal He+. I will use several ground-breaking methods, such as Ramsey-comb spectroscopy in the extreme ultraviolet, to measure the 1S-2S transition in trapped normal electronic He+ with (sub-)kHz spectroscopic accuracy. This will provide a unique and timely opportunity for a direct comparison of QED in electronic and muonic systems at an unprecedented level.
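A quick Bohr-model estimate shows why the 1S-2S interval in He+ corresponds to light "near 30 nm": hydrogen-like levels scale as Z², so the sketch below simply evaluates E_n = -Z²·Ry/n² and converts the 1S-2S energy to its equivalent wavelength. This back-of-envelope value deliberately ignores the QED, reduced-mass and nuclear-size corrections whose precise measurement is the whole point of the project.

```python
# Hydrogen-like energy levels: E_n = -Z^2 * Ry / n^2 (Bohr model only;
# QED, reduced-mass and nuclear-size corrections are neglected here)
RY_EV = 13.605693      # Rydberg energy [eV]
HC_EV_NM = 1239.842    # h*c [eV·nm]

def transition_wavelength_nm(Z, n1, n2):
    """Wavelength equivalent of the n1 -> n2 energy interval in a
    hydrogen-like ion of nuclear charge Z."""
    delta_e = Z**2 * RY_EV * (1.0 / n1**2 - 1.0 / n2**2)  # [eV]
    return HC_EV_NM / delta_e

# 1S-2S in He+ (Z = 2): 4 * 13.6 eV * 3/4 ≈ 40.8 eV, i.e. ~30 nm
print(f"lambda(1S-2S, He+) ≈ {transition_wavelength_nm(2, 1, 2):.1f} nm")
```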
Max ERC Funding
2 497 664 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym QINTERNET
Project Quantum communication networks
Researcher (PI) Stephanie Wehner
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary My goal is to overcome the two most pressing theoretical challenges on the way to large-scale quantum communication networks: routing, and designing protocols that use such networks to solve useful tasks. In two interconnected projects, I will devise entirely new concepts, models and mathematical methods that take into account the intricacies of real-world quantum devices, which can operate on only very few quantum bits at a time.
(1) Security: I will prove the security of quantum cryptographic protocols under realistic conditions, and implement them in collaboration with experimentalists. I will develop a general theory and practical tests for the security of multi-party cryptographic primitives using untrusted quantum devices. This is mathematically challenging due to the possibility of entanglement between the devices.
(2) Routing: I will initiate the systematic study of effective routing in a quantum communication network. This is necessary for quantum networks to grow in scale. Quantum entanglement offers very different means of routing messages than is possible in classical networks, and poses genuinely new challenges to computer science. I will design routing protocols in a multi-node quantum network of potentially different physical implementations, i.e., hybrid networks, that will establish a new line of research in my field.
Quantum networks are still in their infancy, even though quantum communication offers unparalleled advantages that are provably impossible using classical communication. Building a quantum network is an interdisciplinary effort bringing together computer science, physics, and engineering. I am in a unique position in computer science, since I have recently joined QuTech where I have direct access to small quantum devices - bringing me tantalizingly close to seeing such networks realized. As with early classical networks, it is difficult to predict where our journey will end, but my research will join theory and experiment to move forward.
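As a toy illustration of the routing objective (and emphatically not this project's method), the sketch below picks the path that maximises the end-to-end entanglement-success probability across a hypothetical hybrid network, by running Dijkstra's algorithm on -log(p) edge costs so that the product of link probabilities becomes an additive cost. The topology and per-link probabilities are invented for the example; a realistic protocol would additionally track fidelities, memory lifetimes and timing.

```python
import heapq
import math

def best_entanglement_path(graph, src, dst):
    """Return the path maximising the product of per-link success
    probabilities, via Dijkstra on -log(p) edge costs.
    `graph` maps node -> {neighbour: success probability in (0, 1]}."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, math.inf):
            continue  # stale heap entry
        for v, p in graph[u].items():
            nd = d - math.log(p)
            if nd < dist.get(v, math.inf):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return path[::-1], math.exp(-dist[dst])

# Hypothetical 4-node hybrid network with per-link success probabilities
net = {"A": {"B": 0.9, "C": 0.5}, "B": {"A": 0.9, "D": 0.8},
       "C": {"A": 0.5, "D": 0.95}, "D": {"B": 0.8, "C": 0.95}}
print(best_entanglement_path(net, "A", "D"))  # (['A', 'B', 'D'], ~0.72)
```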
Max ERC Funding
1 498 725 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym QOM3D
Project Quantum Optomechanics in 3D
Researcher (PI) Gary Alexander Steele
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Consolidator Grant (CoG), PE2, ERC-2015-CoG
Summary Optomechanics is a field that aims to detect and control mechanical motion with light, ultimately at the quantum level. Experiments reaching the mechanical quantum ground state established optomechanics as a rapidly growing new field. Now that the quantum ground state has been reached, what is the next step?
The current goal of the field is quantum superposition states of motion. An example is a mechanical “Schrödinger cat” state, in which a drum is in a quantum superposition of vibrating up and vibrating down at the same time. From a technological perspective, such states could be used as a memory for storing quantum information, or as a quantum bit itself, performing quantum calculations with a mechanical object. From a fundamental perspective, Schrödinger cat states could be used to explore the limits of macroscopic quantum mechanics and to look for the boundary between the quantum and classical worlds. Despite recent successes, the coupling between light and motion in current implementations is too weak to achieve non-classical motion.
Here, I propose a new optomechanical system coupling the motion of a millimeter-sized membrane to quantum microwave “light” in a three-dimensional superconducting cavity. In this new system, I will use the exceptional coherence of photons in 3D cavities to strongly enhance the coupling of light and motion. To demonstrate the feasibility of this idea, I present preliminary data from a proof-of-concept device with coupling that is already close to state of the art, with an outlook to scaling significantly beyond implementations shown to date.
With the team funded by this project, I will implement these feasible but challenging steps, creating a system with optomechanical coupling that can potentially reach the strong coupling regime for a single photon. Using this new strong coupling, I will bring optomechanics to a new regime where one can create and explore quantum superpositions of massive, macroscopic objects.
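For scale, the sketch below evaluates the zero-point motion x_zpf = sqrt(ħ/2mω) of a membrane mode and the resulting single-photon coupling g0 = (dω_c/dx)·x_zpf; the effective mass, mechanical frequency and cavity frequency pull are round-number assumptions invented for illustration, not parameters of the proposed device.

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant [J·s]

def x_zpf(mass_kg, omega_m):
    """Zero-point motion of a mechanical mode: sqrt(hbar / (2 m omega))."""
    return math.sqrt(HBAR / (2.0 * mass_kg * omega_m))

# Illustrative parameters for a millimetre-scale membrane (assumptions):
m = 1e-10                      # effective mass, 100 ng
omega_m = 2 * math.pi * 1e5    # 100 kHz mechanical frequency
G = 2 * math.pi * 1e7 / 1e-9   # cavity pull dw_c/dx of 10 MHz/nm [rad/s per m]

xz = x_zpf(m, omega_m)
g0 = G * xz                    # single-photon optomechanical coupling [rad/s]
print(f"x_zpf ≈ {xz:.2e} m, g0/2pi ≈ {g0/(2*math.pi):.1f} Hz")  # ~1 fm, ~9 Hz
```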
Max ERC Funding
1 999 594 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym QUANTUMOPTOELECTR
Project Quantum Opto-Electronics
Researcher (PI) Leo Kouwenhoven
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Advanced Grant (AdG), PE3, ERC-2008-AdG
Summary We propose to develop an opto-electronics interface between single-electron devices and single-photon optics. The ultimate limit in the miniaturization of electronics and photonics is at the nanometer scale. Here, the signal level can be controlled at the fundamental level of a single electron for electricity and a single photon for light. These limits are actively being pursued for scientific interest, with possible applications in the new area of quantum information science. Yet these efforts occur separately, in the distinct communities of solid-state electronics and quantum optics. Here we propose to develop a toolbox for interfacing electronics and optics at the level of single electrons and photons. The basic building block is a nanoscale pn-junction defined in a semiconductor nanowire, which is the most versatile material system for single-electron to single-photon conversion. We will develop the following technology: (1) growth of complex semiconductor nanowires; (2) quantum state transfer for copying the information stored in an electron quantum state onto a photon state; and (3) a single-photon optical chip with on-chip guiding via single plasmons and on-chip detection with a superconducting detector. Besides being fundamentally interesting in itself, this new toolbox opens a new area of experiments in which qubits processed in solid-state nano-devices are coupled quantum mechanically over long distances, via photons as signal carriers, to various other interesting quantum systems (e.g. solid-state quantum dots, confined nuclear spins and atomic vapours).
Max ERC Funding
1 800 000 €
Duration
Start date: 2009-01-01, End date: 2013-10-31
Project acronym QUASIFT
Project Quantum Algebraic Structures In Field Theories
Researcher (PI) Vasily PESTUN
Host Institution (HI) INSTITUT DES HAUTES ETUDES SCIENTIFIQUES
Call Details Starting Grant (StG), PE2, ERC-2015-STG
Summary Quantum Field Theory is a universal framework to address quantum physical systems with infinitely many interacting degrees of freedom, applicable both at the level of fundamental interactions, such as the subnuclear physics of quarks and gluons and at the phenomenological level such as the physics of quantum fluids and superconductivity.
Traditionally, weakly interacting quantum field theory is formulated as a perturbative deformation of the linear theory of freely propagating quantum waves or particles with interactions described by Feynman diagrams. For strongly non-linear quantum field theories the method of Feynman diagrams is not adequate.
The main goal of this proposal is to develop novel tools and techniques to address strongly non-linear quantum field theories.
To achieve this goal, we will search for hidden algebraic structures in quantum field theories that will lead to efficient algorithms to compute physical observables of interest. In particular, we will identify non-linear quantum field theories with exactly solvable sectors of physical observables.
In this project we will focus on three objectives:
- build a general theory of localization in supersymmetric Yang-Mills theory for arbitrary geometrical backgrounds
- find all realizations of symplectic and supersymplectic completely integrable systems in gauge theories
- construct finite supersymmetric Yang-Mills theory in terms of the algebra of locally supersymmetric loop observables for maximally supersymmetric gauge theory
The realization of the above objectives will uncover hidden quantum algebraic structures and consequently will bring ground-breaking results in our knowledge of quantum field theories and the fundamental interactions.
Max ERC Funding
1 498 750 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym R3S3
Project Research on Really Reliable and Secure Systems Software
Researcher (PI) Andrew Stuart Tanenbaum
Host Institution (HI) STICHTING VU
Call Details Advanced Grant (AdG), PE6, ERC-2008-AdG
Summary Current operating systems have poor reliability and security. Computers crash regularly, whereas other electronic devices such as televisions and mobile phones never crash. Furthermore, practically every week one reads about another security hole in Windows. As computers become more essential to all aspects of society, this situation is unacceptable. The goal of my proposed research is to conceive, design, implement, and test an operating system that is as reliable and secure as is humanly possible. The job will be finished when the average user has never experienced a crash in his lifetime and RESET buttons on computers have passed into history, like 5¼-inch floppy disks. The basic concept I want to use to achieve a reliable, secure operating system is POLA, the Principle of Least Authority. The operating system will be moved out of the kernel (where it has unrestricted access to all of memory, critical machine instructions, and the I/O devices) into a set of multiple, tightly constrained user processes. Each process (e.g., a file server) will be given exactly the authority it needs to do its job and no more. This mechanism ensures that problems in one process cannot spill over into other ones. While this goal has floated around for years, no one really knows how to do it, so research is needed. Furthermore, I also want to make the system fault-tolerant and self-healing, so it can continue to run even in the presence of hardware and software errors. Recovery should be done automatically, without affecting running programs. Designing and building a new operating system that runs counter to 50 years of experience is extremely ground-breaking and ambitious. But the current road we are on, with millions of lines of code in the kernel and growing all the time, cannot be sustained. We need research that will lead to much better reliability and security. I have 30 years of experience in the field and think I have a chance to pull it off.
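A minimal sketch of the POLA idea described above, in Python for brevity: every server process holds an explicit capability set, and each request is checked against it, so a fault or compromise in one component cannot exceed its grant. The process names and operations are invented for illustration; this sketches the concept, not the proposed kernel's actual mechanism.

```python
# Toy illustration of the Principle of Least Authority (POLA): each server
# process is granted an explicit capability set, and every request is checked
# against it. Names and operations below are hypothetical.
CAPABILITIES = {
    "file_server":    {"disk_read", "disk_write"},
    "network_server": {"nic_send", "nic_receive"},
}

def request(process: str, operation: str) -> bool:
    """Grant an operation only if it is in the process's capability set."""
    allowed = operation in CAPABILITIES.get(process, set())
    print(f"{process} -> {operation}: {'granted' if allowed else 'DENIED'}")
    return allowed

request("file_server", "disk_read")  # granted: within the process's grant
request("file_server", "nic_send")   # DENIED: the fault stays contained
```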
Max ERC Funding
2 448 420 €
Duration
Start date: 2008-11-01, End date: 2014-04-30
Project acronym REALISM
Project Reproducing EArthquakes in the Laboratory: Imaging, Speed and Mineralogy
Researcher (PI) Alexandre Jean-Marie Schubnel
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE10, ERC-2015-CoG
Summary We propose a simple idea: to reproduce earthquakes in the laboratory. Because earthquakes are spectacular examples of uncontrollable catastrophes, the opportunity to study them under controlled conditions in the laboratory is unique and is, in fact, the only way to understand the details of earthquake source physics.
The aim of the project is interdisciplinary, at the frontiers between Rock Fracture Mechanics, Seismology, and Mineralogy. Its ultimate goal is to improve, on the basis of integrated experimental data, our understanding of earthquake source physics. We have already shown that both deep and shallow laboratory earthquakes are not mere 'analogs' of earthquakes, but real events – though very small [Passelègue et al. 2013, Schubnel et al. 2013]. During laboratory earthquakes, by measuring all of the physical quantities related to the rupture process, we will unravel what controls the rupture speed, rupture arrest, the earthquake rupture energy budget, and the common role played by mineralogy in both shallow and deep earthquakes. We will also perform experiments on rock samples drilled from actual active fault zones. Our work will provide insights for earthquake hazard mitigation, constrain ubiquitously observed seismological statistical laws (Omori, Gutenberg-Richter), and produce unprecedented data sets on rock fracture dynamics at in-situ conditions with which to test seismic slip inversion and dynamic rupture modelling techniques.
The new infrastructure we plan to install will reproduce the temperatures and pressures at the depths where earthquakes occur in the crust and upper mantle of the Earth, with a spatio-temporal imaging resolution never achieved to date. This will be a valuable research asset for the European community, as it will eventually open the door to a better understanding of all the processes happening under stress within the first hundreds of kilometres of the Earth.
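For readers unfamiliar with one of the statistical laws mentioned above, the sketch below evaluates the Gutenberg-Richter relation log10 N(≥M) = a − bM, which data sets like those this project will produce help constrain. The a and b values are illustrative round numbers (b ≈ 1 is typical of tectonic seismicity), not results of this project.

```python
def gutenberg_richter_count(magnitude, a=6.0, b=1.0):
    """Expected number of events with magnitude >= M per unit time,
    from log10 N = a - b * M. The a and b values here are illustrative."""
    return 10 ** (a - b * magnitude)

# With b = 1, each unit of magnitude is ten times rarer than the last
for m in (3, 4, 5, 6):
    print(f"M >= {m}: ~{gutenberg_richter_count(m):,.0f} events")
```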
Max ERC Funding
2 748 188 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym Scale-FreeBack
Project Scale-Free Control for Complex Physical Network Systems
Researcher (PI) Carlos Canudas de Wit
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE7, ERC-2015-AdG
Summary Technological achievements have typically been built upon fundamental theoretical findings, but nowadays technology seems to be evolving faster than our ability to develop new concepts and theories. Intelligent traffic systems, for example, benefit from many technical innovations. Mobile phones, radars, cameras and magnetometers can be used to measure traffic evolution and provide large sets of valuable data. Vehicles can communicate with the network infrastructure, as well as with each other. However, these huge technological advances have not been used to the full so far. Traffic lights are far from functioning optimally, and traffic management systems do not always prevent congestion.
So what is missing? Such systems affect our daily life; why aren't they keeping pace with technological advances? Possibly because they have become far more complex than the analytical tools available for managing them. Systems have many components, communicate with each other, have self-decision-making mechanisms, share an enormous amount of information, and form networks. Research in control systems has tackled some of these features, but not in a very concerted way. There is a lack of “glue” relating the solutions to each other.
In the Scale-FreeBack project, it is proposed to approach this problem with a new holistic vision. Scale-FreeBack will first investigate appropriate scale-free dynamic modeling approaches that break down the system's complexity, and then develop control and observation algorithms specifically tailored to such models. Scale-FreeBack will also investigate resilience in control, which is urgently needed because of the increasing connectivity between systems and the external world. Road traffic networks will be used in proof-of-concept studies based on field tests performed at our Grenoble Traffic Lab (GTL) and in a large-scale microscopic simulator.
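To make the traffic-modeling setting concrete, here is a minimal cell-transmission-style freeway model; it is an illustrative toy for the kind of aggregated dynamics a traffic controller acts on, not the Scale-FreeBack modeling approach, and all parameter values are assumptions.

```python
# Minimal cell-transmission-style freeway model (illustrative toy).
# Triangular fundamental diagram with free-flow speed v, congestion wave
# speed w and jam density rho_j (assumed values).
v, w, rho_j = 30.0, 6.0, 0.2      # [m/s], [m/s], [veh/m]
dx, dt = 500.0, 10.0              # cell length [m], step [s]; v*dt/dx = 0.6 (CFL-safe)
q_max = v * w * rho_j / (v + w)   # capacity of the triangular diagram [veh/s]

def step(rho, inflow):
    """One conservation update: each cell interface carries min(demand, supply)."""
    n = len(rho)
    demand = [min(v * r, q_max) for r in rho]            # what each cell can send
    supply = [min(w * (rho_j - r), q_max) for r in rho]  # what each cell can receive
    flow = [min(inflow, supply[0])]                      # upstream boundary inflow
    flow += [min(demand[i], supply[i + 1]) for i in range(n - 1)]
    flow.append(demand[-1])                              # free downstream exit
    return [r + dt / dx * (flow[i] - flow[i + 1]) for i, r in enumerate(rho)]

rho = [0.03, 0.05, 0.15, 0.04]    # initial densities [veh/m], one congested cell
for _ in range(5):
    rho = step(rho, inflow=0.3)
print([round(r, 3) for r in rho])  # the congested cell relaxes over time
```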
Max ERC Funding
2 873 601 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym SCEON
Project Scanning Electron Optical Nanoscopy
Researcher (PI) Albert POLMAN
Host Institution (HI) STICHTING NEDERLANDSE WETENSCHAPPELIJK ONDERZOEK INSTITUTEN
Call Details Advanced Grant (AdG), PE3, ERC-2015-AdG
Summary Novel developments in optical technology increasingly depend on control of light at the nanoscale. To study light at this small length scale it is essential to employ techniques that can excite and image light at the nanoscale. In recent years, my group has explored cathodoluminescence (CL) spectroscopy for this purpose. Based on the exciting potential of this technique, I propose to design and construct a new time- and angle-resolved CL microscope that exploits the primary electron beam as a coherent optical excitation source with deep-subwavelength spatial resolution. We will use the new instrument to address four challenges that will provide new insight in the behaviour of light at the nanoscale. Specifically, we will:
(1) use CL microscopy to excite and characterize ultra-short-wavelength plasmons on graphene. We will create 3D tomographic reconstructions of the local optical density of states in resonant plasmonic and dielectric nanostructures.
(2) determine 2D and 3D spatially-resolved ultrafast carrier recombination processes in resonant semiconductor photovoltaic nanostructures and reveal the radiative properties of single quantum emitters.
(3) develop CL momentum spectroscopy to reveal embedded eigenstates in dielectric photonic crystals and topological photonic protection in complex three-dimensional architectures.
(4) develop CL polarimetry in combination with phase-resolved CL detection to study electric and magnetic polarizabilities in nanoscale light emitters and to control the orbital angular momentum of light.
The proposed program will firmly establish time- and angle-resolved CL imaging spectroscopy as a key deep-subwavelength nanoscopy tool to investigate the interplay of the electric and magnetic fields that constitute light at the nanoscale, and will enable applications in photovoltaics, solid-state lighting, photonic and optoelectronic integrated circuits, quantum communication, sensing and metrology.
Max ERC Funding
2 495 625 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym SEAQUEL
Project Structured Ensembles of Atoms for Quantum Engineering of Light
Researcher (PI) Alexei Ourjoumtsev
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2015-STG
Summary This project aims at building a new versatile platform for quantum engineering of light, with the unique ability to create deterministic coherent photon-photon interactions tunable in range, strength and dimensionality. It will explore a new avenue towards this goal, combining cutting-edge advances of atomic physics with ideas inspired by nanophotonics: a cold micro-structured gas of interacting atoms will act as a Bragg mirror saturable by a single photon, strongly coupling a controlled number of spatial modes in an optical resonator. This flexible, efficient, dynamically-controlled system will be used to test the limits of fundamental no-go theorems in quantum logic, measure physical quantities inaccessible to standard detectors, and deterministically engineer massively entangled light beams for Heisenberg-limited sensing. Ultimately, it will give access to a yet unexplored regime where intracavity photons form a strongly correlated quantum fluid, with spatial and temporal dynamics ideally suited to perform real-time, single-particle-resolved simulations of non-trivial topological effects appearing in condensed-matter systems.
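A minimal sketch of the underlying Bragg-mirror physics: a periodic stack of weak, lossless atomic sheets, treated with standard transfer matrices, reflects strongly only near the Bragg spacing. The per-sheet coupling BETA and the number of sheets are illustrative assumptions, not project parameters.

```python
import numpy as np

# Illustrative transfer-matrix sketch: N thin, lossless atomic sheets,
# each with a small real coupling BETA (single-sheet reflectivity
# beta**2/(1+beta**2)), become a near-unity mirror only near Bragg.

BETA = 0.05       # per-sheet coupling (assumption)
N_SHEETS = 100    # number of atomic planes (assumption)

def sheet_matrix(beta):
    """Transfer matrix of one thin lossless scattering sheet
    (r = i*beta/(1 - i*beta), t = 1/(1 - i*beta))."""
    return np.array([[1 + 1j*beta, 1j*beta],
                     [-1j*beta,    1 - 1j*beta]])

def stack_reflectivity(phase, beta=BETA, n=N_SHEETS):
    """Power reflectivity of n sheets separated by one-way phase `phase`."""
    prop = np.diag([np.exp(1j*phase), np.exp(-1j*phase)])
    m = np.eye(2, dtype=complex)
    for _ in range(n):
        m = sheet_matrix(beta) @ prop @ m
    # No field incident from the right: reflected amplitude is -M21/M22.
    return abs(m[1, 0] / m[1, 1])**2

print('off Bragg (phase pi/2):  ', round(stack_reflectivity(np.pi / 2), 3))
print('near Bragg (phase .99pi):', round(stack_reflectivity(0.99 * np.pi), 3))
```

The collective enhancement near the half-wavelength spacing is what lets a dilute, micro-structured atomic gas act as a mirror saturable at the single-photon level.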
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym SIMOSOMA
Project Single molecules in soft matter: dynamical heterogeneity in supercooled liquids and glasses
Researcher (PI) Michel Orrit
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Advanced Grant (AdG), PE4, ERC-2008-AdG
Summary Single-molecule optical microscopy provides average-free, dynamical and structural information about condensed matter at molecular scales. Single fluorescent molecules can now be located and tracked with a spatial resolution as high as a few tens of nanometers, even at depths as large as several microns. These capabilities are ideal to link the macroscopic physical properties of soft condensed matter with the structure, organization and dynamics of the constituent molecules. Perhaps the most surprising conclusion drawn from single-molecule observations is the unsuspected heterogeneity of molecular assemblies, both in time and space, which had remained largely hidden in conventional ensemble experiments. The structural glass transition is said to be one of the hardest open problems in condensed matter science. Although most agree on the crucial part played by heterogeneity in this process, estimates vary wildly as to the scale and relaxation times of the inhomogeneities. Our recent discovery of glassy rheology in supercooled glass formers, following earlier observations of heterogeneity, has been received with much interest in the complex liquids community. I am convinced that single-molecule studies have the potential to radically change our view of supercooled liquids and glasses. In a broader sense, molecular insight from chemical physics complements the general ideas developed by statistical physicists. I believe it is the missing link toward a molecular control of the physical properties of soft materials. I propose to perform a broad range of novel single-molecule experiments using a micro-rheological cell to apply mechanical stresses, strains and/or temperature jumps. In particular, we will perform mechanical studies of solid-solid friction, and temperature-jump studies of single proteins and single protein complexes.
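A minimal sketch of how dynamical heterogeneity is quantified from single-molecule tracks: the mean-squared displacement (MSD) together with the non-Gaussian parameter alpha_2, which vanishes for homogeneous diffusion and grows when fast and slow sub-populations coexist. The synthetic two-population random walk below is an assumption standing in for real trajectories, not data from the project.

```python
import numpy as np

# Illustrative heterogeneity analysis on synthetic 2D single-molecule
# tracks: half the "molecules" diffuse 10x faster than the other half.

rng = np.random.default_rng(0)
n_tracks, n_steps = 200, 500
D = np.where(np.arange(n_tracks) < n_tracks // 2, 1.0, 0.1)  # two populations
# Per-step displacements with component variance 2*D (unit time step).
steps = rng.normal(0.0, 1.0, (n_tracks, n_steps, 2)) * np.sqrt(2 * D)[:, None, None]
tracks = np.cumsum(steps, axis=1)

def msd_and_alpha2(tracks, lag):
    """MSD and non-Gaussian parameter alpha_2 at a given lag (2D)."""
    disp = tracks[:, lag:] - tracks[:, :-lag]   # all (overlapping) displacements
    r2 = (disp**2).sum(axis=-1).ravel()         # squared displacement lengths
    msd = r2.mean()
    # In 2D, alpha_2 = <r^4>/(2<r^2>^2) - 1 is zero for homogeneous diffusion.
    alpha2 = (r2**2).mean() / (2 * msd**2) - 1
    return msd, alpha2

for lag in (1, 10, 100):
    msd, a2 = msd_and_alpha2(tracks, lag)
    print(f'lag {lag:3d}: MSD = {msd:8.2f}, alpha_2 = {a2:5.2f}')
```

For this mixture, alpha_2 stays well above zero at all lags, the signature of coexisting fast and slow molecules that ensemble-averaged measurements wash out.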
Max ERC Funding
1 836 000 €
Duration
Start date: 2009-04-01, End date: 2014-03-31