Project acronym 3DWATERWAVES
Project Mathematical aspects of three-dimensional water waves with vorticity
Researcher (PI) Erik Torsten Wahlén
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary The goal of this project is to develop a mathematical theory for steady three-dimensional water waves with vorticity. The mathematical model consists of the incompressible Euler equations with a free surface, and vorticity is important for modelling the interaction of surface waves with non-uniform currents. In the two-dimensional case, there has been substantial progress on water waves with vorticity in the last decade, based mainly on the stream function formulation, in which the problem is recast as a nonlinear elliptic free boundary problem. No analogue of this formulation is available in three dimensions, and the theory has therefore so far been restricted to irrotational flow. In this project we seek to go beyond this restriction using two different approaches. In the first approach, we will adapt methods that have been used to construct three-dimensional ideal flows with vorticity (for example Beltrami flows) in fixed-boundary domains to the free-boundary context. In the second approach, we will develop methods that are new even in the case of a fixed boundary, by performing a detailed study of the structure of the equations close to a given shear flow using ideas from infinite-dimensional bifurcation theory; this involves handling infinitely many resonances.
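For reference, a minimal sketch in standard notation of the two formulations mentioned above (nothing here is taken from the proposal itself, and signs and normalizations vary between references):

```latex
% Minimal sketch, standard notation. In two dimensions a steady flow (u, v)
% has a stream function \psi with u = \psi_y, v = -\psi_x, and for waves
% with vorticity the problem becomes the elliptic free boundary problem
\[
  \Delta \psi = \gamma(\psi) \quad \text{in the fluid domain,}
\]
\[
  \psi = 0, \qquad \tfrac{1}{2}\,|\nabla\psi|^{2} + g\,(y - d) = \text{const}
  \quad \text{on the free surface } y = \eta(x),
\]
% where \gamma is the vorticity function. A three-dimensional Beltrami flow
% instead satisfies
\[
  \nabla \times \mathbf{u} = \alpha\, \mathbf{u}, \qquad \nabla \cdot \mathbf{u} = 0,
\]
% i.e. its vorticity is everywhere parallel to the velocity.
```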
Max ERC Funding
1 203 627 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym bioSPINspired
Project Bio-inspired Spin-Torque Computing Architectures
Researcher (PI) Julie Grollier
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2015-CoG
Summary In the bioSPINspired project, I propose to use my experience and skills in spintronics, non-linear dynamics and neuromorphic nanodevices to realize bio-inspired spin torque computing architectures. I will develop a bottom-up approach to build spintronic data processing systems that perform low-power ‘cognitive’ tasks on-chip and could ultimately complement our traditional microprocessors. I will start by showing that spin torque nanodevices, which are multi-functional and tunable nonlinear dynamical nano-components, are capable of emulating both neurons and synapses. Then I will assemble these spin-torque nano-synapses and nano-neurons into modules that implement brain-inspired algorithms in hardware. The brain displays many features typical of non-linear dynamical networks, such as synchronization or chaotic behaviour. These observations have inspired a whole class of models that harness the power of complex non-linear dynamical networks for computing. Following such schemes, I will interconnect the spin torque nanodevices by electrical and magnetic interactions so that they can couple to each other, synchronize and display complex dynamics. Then I will demonstrate that when perturbed by external inputs, these spin torque networks can perform recognition tasks by converging to an attractor state, or use the separation properties at the edge of chaos to classify data. In the process, I will revisit these brain-inspired abstract models to adapt them to the constraints of hardware implementations. Finally, I will investigate how the spin torque modules can be efficiently connected together with CMOS buffers to perform higher level computing tasks. The table-top prototypes, hardware-adapted computing models and large-scale simulations developed in bioSPINspired will lay the foundations of spin torque bio-inspired computing and open the path to the fabrication of fully integrated, ultra-dense and efficient CMOS/spin-torque nanodevice chips.
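To illustrate the kind of collective dynamics invoked here (synchronization of a network of coupled non-linear oscillators), the following toy sketch uses a Kuramoto-type phase model, a common minimal stand-in for interacting spin-torque nano-oscillators; it is not the project's device model, and all parameter values are illustrative:

```python
# Toy sketch: a Kuramoto-type network of coupled phase oscillators, a minimal
# stand-in for synchronizing spin-torque nano-oscillators. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n = 32                                 # number of oscillators
omega = rng.normal(1.0, 0.05, n)       # natural frequencies (arbitrary units)
K = 0.5                                # coupling strength
theta = rng.uniform(0, 2 * np.pi, n)   # initial phases
dt, steps = 0.01, 5000

for _ in range(steps):
    # each oscillator feels the mean field of all the others
    coupling = (K / n) * np.sum(np.sin(theta[None, :] - theta[:, None]), axis=1)
    theta += dt * (omega + coupling)

# Kuramoto order parameter: r -> 1 means the array has synchronized
r = np.abs(np.mean(np.exp(1j * theta)))
print(f"order parameter r = {r:.3f}")
```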
Max ERC Funding
1 907 767 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym BoneImplant
Project Monitoring bone healing around endosseous implants: from multiscale modeling to the patient’s bed
Researcher (PI) Guillaume Loïc Haiat
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2015-CoG
Summary Implants are often employed in orthopaedic and dental surgeries. However, risks of failure, which are difficult to anticipate, are still experienced and may have dramatic consequences. Failures are due to degraded bone remodeling at the bone-implant interface, a multiscale phenomenon of an interdisciplinary nature which remains poorly understood. The objective of BoneImplant is to provide a better understanding of the multiscale and multi-time mechanisms at work at the bone-implant interface. To do so, BoneImplant aims at studying the evolution of the biomechanical properties of bone tissue around an implant during the remodeling process. A methodology combining in vivo, in vitro and in silico approaches is proposed.
New modeling approaches will be developed in close synergy with the experiments. Molecular dynamics computations will be used to understand fluid flow in nanoscopic cavities, a phenomenon that determines the bone healing process. Generalized continuum theories will be necessary to model bone tissue due to the large strain field around implants. An isogeometric mortar formulation will make it possible to simulate the bone-implant interface in a stable and efficient manner.
In vivo experiments under standardized conditions will be carried out on the basis of feasibility studies. A multimodality and multi-physics experimental approach will be used to assess the biomechanical properties of newly formed bone tissue as a function of the implant environment. The experimental approach aims at estimating the effective adhesion energy and the potential of quantitative ultrasound imaging to assess different biomechanical properties of the interface.
Results will be used to design effective clinical loading procedures for implants and to optimize implant design, leading to the development of therapeutic and diagnostic techniques. The development of quantitative ultrasonic techniques to monitor implant stability has potential for industrial transfer.
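As a minimal illustration of why quantitative ultrasound can monitor the interface (background physics, not a method from the proposal): at normal incidence the echo strength is set by the acoustic impedance mismatch, so bone maturation at a titanium surface changes the reflection coefficient. The impedance values below are rough order-of-magnitude numbers, used only as examples:

```python
# Illustrative sketch: the normal-incidence pressure reflection coefficient
# R = (Z2 - Z1) / (Z2 + Z1) is what makes the bone-implant interface visible
# to quantitative ultrasound. Impedance values are approximate examples.

def reflection_coefficient(z1: float, z2: float) -> float:
    """Pressure reflection coefficient between media of impedance z1, z2 (MRayl)."""
    return (z2 - z1) / (z2 + z1)

Z_TITANIUM = 27.0    # MRayl, approximate
Z_BONE = 6.0         # MRayl, approximate (cortical bone)
Z_SOFT_TISSUE = 1.6  # MRayl, approximate

# As bone matures at the implant surface, the impedance mismatch with
# titanium decreases, so the interface echo weakens:
for name, z in [("soft tissue", Z_SOFT_TISSUE), ("mature bone", Z_BONE)]:
    print(f"titanium/{name}: R = {reflection_coefficient(Z_TITANIUM, z):+.2f}")
```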
Max ERC Funding
1 992 154 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym BOPNIE
Project Boundary value problems for nonlinear integrable equations
Researcher (PI) Jonatan Carl Anders Lenells
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary The purpose of this project is to develop new methods for solving boundary value problems (BVPs) for nonlinear integrable partial differential equations (PDEs). Integrable PDEs can be analyzed by means of the Inverse Scattering Transform, whose introduction was one of the most important developments in the theory of nonlinear PDEs in the 20th century. Until the 1990s the inverse scattering methodology was pursued almost entirely for pure initial-value problems. However, in many laboratory and field situations, the solution is generated by what corresponds to the imposition of boundary conditions rather than initial conditions. Thus, an understanding of BVPs is crucial.
In an exciting sequence of events taking place in the last two decades, new tools have become available to deal with BVPs for integrable PDEs. Although some important issues have already been resolved, several major problems remain open.
The aim of this project is to solve a number of these open problems and to find solutions of BVPs which were heretofore not solvable. More precisely, the proposal has eight objectives:
1. Develop methods for solving problems with time-periodic boundary conditions.
2. Answer some long-standing open questions raised by a series of wave-tank experiments 35 years ago.
3. Develop a new approach for the study of space-periodic solutions.
4. Develop new approaches for the analysis of BVPs for equations with 3 x 3-matrix Lax pairs.
5. Derive new asymptotic formulas by using a nonlinear version of the steepest descent method.
6. Construct disk and disk/black-hole solutions of the stationary axisymmetric Einstein equations.
7. Solve a BVP in Einstein's theory of relativity describing two colliding gravitational waves.
8. Extend the above methods to BVPs in higher dimensions.
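For background on objectives 4 and 5, a standard sketch of the Lax-pair framework, not specific to this proposal:

```latex
% Standard background: an integrable PDE is the compatibility condition of
% an overdetermined linear system (a Lax pair)
\[
  \psi_x = X(x,t,\lambda)\,\psi, \qquad \psi_t = T(x,t,\lambda)\,\psi,
\]
% with spectral parameter \lambda. Demanding \psi_{xt} = \psi_{tx} for all
% \lambda gives the zero-curvature equation
\[
  X_t - T_x + [X, T] = 0,
\]
% which is equivalent to the nonlinear PDE. For NLS- and KdV-type equations
% X and T are 2x2 matrices; objective 4 above concerns the substantially
% harder 3x3 case.
```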
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym ByoPiC
Project The Baryon Picture of the Cosmos
Researcher (PI) Nabila AGHANIM
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE9, ERC-2015-AdG
Summary The cosmological paradigm of structure formation is both extremely successful and plagued by many enigmas. Not only is the nature of the main matter component, dark matter, which shapes the skeleton of structure in the form of a cosmic web, mysterious; half of the ordinary matter (i.e. baryons) also remains unobserved, or hidden, at late times in cosmic history! ByoPiC focuses on this key and currently unresolved issue in astrophysics and cosmology: where and how are half of the baryons hidden at late times? ByoPiC will answer that central question by detecting, mapping, and assessing the physical properties of hot ionised baryons at large cosmic scales and at late times. This will give a completely new picture of the cosmic web, beyond its standard tracers, i.e. galaxies made of cold and dense baryons. To this end, ByoPiC will perform the first statistically consistent, joint analysis of complementary multiwavelength data: Planck observations tracing hot, ionised baryons via the Sunyaev-Zeldovich effect, optimally combined with optical and near-infrared galaxy surveys as tracers of cold baryons. This joint analysis will rely on innovative statistical tools to recover all the (cross-)information contained in these data in order to detect most of the hidden baryons in cosmic web elements such as (super)clusters and filaments. These newly detected elements will then be assembled to reconstruct the cosmic web as traced by both hot ionised baryons and galaxies. On this basis, ByoPiC will perform the most complete and detailed assessment of the census and contribution of hot ionised baryons to the total baryon budget, and identify the main physical processes driving their evolution in the cosmic web. Catalogues of new (super)clusters and filaments, together with innovative tools, will be key deliverables, allowing for an optimal preparation of future surveys.
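For reference, the standard (non-relativistic) thermal Sunyaev-Zeldovich formulas behind the Planck tracer mentioned above; these are textbook expressions, not results of the project:

```latex
% Textbook thermal Sunyaev-Zeldovich formulas. The Compton parameter
% integrates the electron pressure along the line of sight,
\[
  y = \frac{\sigma_T}{m_e c^2} \int n_e\, k_B T_e \,\mathrm{d}l,
\]
% and distorts the CMB temperature with the characteristic spectral shape
\[
  \frac{\Delta T}{T_{\mathrm{CMB}}} = f(x)\, y, \qquad
  f(x) = x \coth(x/2) - 4, \qquad x = \frac{h \nu}{k_B T_{\mathrm{CMB}}},
\]
% which is how Planck maps the hot ionised baryons that ByoPiC will
% cross-correlate with galaxy surveys.
```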
Max ERC Funding
2 488 350 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym CC-TOP
Project Cryosphere-Carbon on Top of the Earth (CC-Top): Decreasing Uncertainties of Thawing Permafrost and Collapsing Methane Hydrates in the Arctic
Researcher (PI) Örjan GUSTAFSSON
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Advanced Grant (AdG), PE10, ERC-2015-AdG
Summary The enormous quantities of frozen carbon in the Arctic, held in shallow soils and sediments, act as “capacitors” of the global carbon system. Thawing permafrost (PF) and collapsing methane hydrates are top candidates to cause a net transfer of carbon from land/ocean to the atmosphere this century, yet uncertainties abound.
Our program targets the East Siberian Arctic Ocean (ESAO), the world's largest shelf sea, as it holds 80% of coastal PF, 80% of subsea PF and 75% of shallow hydrates. Our initial findings (e.g., Science, 2010; Nature, 2012; PNAS, 2013; Nature Geoscience, 2013, 2014) are challenging earlier notions by showing complexities in terrestrial PF-carbon remobilization and extensive venting of methane from subsea PF/hydrates. The objective of the CC-Top Program is to transform descriptive and data-lean pictures into a quantitative understanding of the CC system, to pin down present releases and predict future releases from these “Sleeping Giants” of the global carbon system.
The CC-Top program combines unique Arctic field capacities with powerful molecular-isotopic characterization of PF-carbon/methane to break through on:
The “awakening” of terrestrial PF-C pools: CC-Top will employ the great pan-arctic rivers as natural integrators and, by probing the δ13C/Δ14C and molecular fingerprints, apportion the release fluxes of different PF-C pools.
The ESAO subsea cryosphere/methane: CC-Top will use recent spatially extensive observations, deep sediment cores and gap-filling expeditions to (i) estimate the distribution of subsea PF and hydrates; (ii) establish the thermal state (thawing rate) of subsea PF-C; and (iii) apportion the sources of released methane between subsea PF, shallow hydrates and seepage from the deep petroleum megapool, using source-diagnostic triple-isotope fingerprinting.
Arctic Ocean slope hydrates: CC-Top will investigate sites (discovered by us 2008-2014) of collapsed hydrates venting methane, to characterize geospatial distribution and causes of destabilization.
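To make the apportionment idea concrete, here is a minimal sketch of the linear-algebra core of an isotope mass-balance: a measured sample is decomposed over source endmembers using two isotope ratios plus mass conservation. All endmember and sample values below are hypothetical, chosen only to show the algebra:

```python
# Illustrative sketch: three-source isotope mass-balance. All endmember and
# sample values are hypothetical, chosen so the system has a clean solution.
import numpy as np

# Columns: subsea permafrost, shallow hydrates, deep thermogenic seepage.
# Rows: mass balance, d13C (permil), D14C (permil).
A = np.array([
    [   1.0,    1.0,     1.0],  # f1 + f2 + f3 = 1
    [ -65.0,  -60.0,   -45.0],  # hypothetical d13C endmembers
    [ -50.0, -400.0, -1000.0],  # hypothetical D14C endmembers
])
b = np.array([1.0, -59.5, -345.0])  # hypothetical measured sample

fractions = np.linalg.solve(A, b)   # -> [0.5, 0.3, 0.2]
for name, f in zip(["subsea PF", "shallow hydrates", "deep seepage"], fractions):
    print(f"{name}: {f:.2f}")
```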
Max ERC Funding
2 499 756 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym CHAMPAGNE
Project Charge orders, Magnetism and Pairings in High Temperature Superconductors
Researcher (PI) Catherine Marie Elisabeth PEPIN
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE3, ERC-2015-AdG
Summary For nearly thirty years, the search for a room-temperature superconductor has focused on exotic materials known as cuprates, obtained by doping a parent Mott insulator, which can carry currents without losing energy as heat at temperatures up to 164 Kelvin. Conventionally, three main players were identified as crucial: i) the Mott insulating phase, ii) the anti-ferromagnetic order and iii) the superconducting (SC) phase. Recently, a body of experimental probes has suggested the presence of a fourth, forgotten player, charge ordering, as a direct competitor for superconductivity. In this project we propose that the relationship between charge ordering and superconductivity is more intimate than previously thought and is protected by an emergent SU(2) symmetry relating the two. The beauty of our theory lies in the fact that it can be encapsulated in one simple and universal “gap equation”, which, in contrast to the strong coupling approaches used up to now, can easily be connected to experiments. In the first part of this work, we will refine the theoretical model in order to shape it for comparison with experiments and consistently test the SU(2) symmetry. In the second part, we will search for the experimental signatures of our theory through back-and-forth interaction with experimental groups. We expect our theory to generate new insights and experimental developments, and to lead to a major breakthrough if it correctly explains the origin of anomalous superconductivity in these materials.
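As a point of comparison only (the project's SU(2)-constrained gap equation is not reproduced here), a weak-coupling gap equation of the textbook BCS form reads:

```latex
% Textbook BCS reference point, not the project's equation:
\[
  \Delta_{\mathbf{k}} = -\sum_{\mathbf{k}'} V_{\mathbf{k}\mathbf{k}'}\,
  \frac{\Delta_{\mathbf{k}'}}{2 E_{\mathbf{k}'}}
  \tanh\!\left( \frac{E_{\mathbf{k}'}}{2 k_B T} \right),
  \qquad
  E_{\mathbf{k}} = \sqrt{ \xi_{\mathbf{k}}^{2} + |\Delta_{\mathbf{k}}|^{2} },
\]
% with \xi_k the band dispersion measured from the chemical potential and
% V_{kk'} the pairing interaction. The claim above is that a single equation
% of this general type, constrained by the emergent SU(2) symmetry, captures
% both charge order and superconductivity.
```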
Max ERC Funding
1 318 145 €
Duration
Start date: 2016-08-01, End date: 2021-07-31
Project acronym chemech
Project From Chemical Bond Forces and Breakage to Macroscopic Fracture of Soft Materials
Researcher (PI) Costantino CRETON
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE5, ERC-2015-AdG
Summary Soft materials are irreplaceable in engineering applications where large reversible deformations are needed, and in life sciences to mimic ever more closely or replace a variety of living tissues. While mechanical strength may not be essential for all applications, excessive brittleness is a strong limitation. Yet predicting if a soft material will be tough or brittle from its molecular composition or structure relies on empirical concepts due to the lack of proper tools to detect the damage occurring to the material before it breaks. Taking advantage of the recent advances in materials science and mechanochemistry, we propose a ground-breaking method to investigate the mechanisms of fracture of tough soft materials. To achieve this objective we will use a series of model materials containing a variable population of internal sacrificial bonds that break before the material fails macroscopically, and use a combination of advanced characterization techniques and molecular probes to map stress, strain, bond breakage and structure in a region ~100 µm in size ahead of the propagating crack. By using mechanoluminescent and mechanophore molecules incorporated in the model material in selected positions, confocal laser microscopy, digital image correlation and small-angle X-ray scattering, we will gain an unprecedented molecular understanding of where and when bonds break as the material fails and the crack propagates, and will then be able to establish a direct relation between the architecture of soft polymer networks and their fracture energy, leading to a new molecular and multi-scale vision of macroscopic fracture of soft materials. Such advances will be invaluable to guide materials chemists to design and develop better and more finely tuned soft but tough and sometimes self-healing materials to replace living tissues (in bioengineering) and make lightweight tough and flexible parts for energy efficient transport.
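Two standard relations behind this programme, included as background rather than as results of the proposal, are Bell's rule for force-accelerated bond scission and the Lake-Thomas estimate of network fracture energy:

```latex
% Background mechanochemistry. Bell's rule: a bond loaded by force f
% ruptures at the accelerated rate
\[
  k(f) = k_0 \exp\!\left( \frac{f\, x^{\ddagger}}{k_B T} \right),
\]
% with k_0 the zero-force rate and x^\ddagger an activation length. In the
% Lake-Thomas picture, the fracture energy of an elastomer network scales as
\[
  \Gamma \sim \Sigma \, N \, U_b,
\]
% where \Sigma is the areal density of strands crossing the crack plane, N
% the number of bonds per strand, and U_b the energy per bond: breaking one
% bond wastes the energy stored in the whole strand. Mechanophores make
% k(f) observable locally, which is what the mapping proposed above exploits.
```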
Max ERC Funding
2 251 026 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym CHRiSHarMa
Project Commutators, Hilbert and Riesz transforms,Shifts, Harmonic extensions and Martingales
Researcher (PI) Stefanie Petermichl
Host Institution (HI) UNIVERSITE PAUL SABATIER TOULOUSE III
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary This project aims to develop two arrays of questions at the heart of harmonic analysis, probability and operator theory:
Multi-parameter harmonic analysis.
Through the use of wavelet methods in harmonic analysis, we plan to shed new light on characterizations of boundedness for multi-parameter versions of classical Hankel operators in a variety of settings. The classical Nehari theorem on the disk (1957) has found an important generalization to Hilbert-space-valued functions, known as Page's theorem. A relevant extension of Nehari's theorem to the bi-disk had been a long-standing problem, finally solved in 2000 through novel harmonic analysis methods. Its operator analogue remains unknown and constitutes part of this proposal.
Sharp estimates for Calderón-Zygmund operators and martingale inequalities.
We make use of the interplay between objects central to harmonic analysis, such as the Hilbert transform, and objects central to probability theory, namely martingales. This connection has seen many faces, such as the UMD space classification by Bourgain and Burkholder, or the formula of Gundy-Varopoulos, which uses orthogonal martingales to model the behavior of the Hilbert transform. Martingale methods in combination with optimal control have advanced an array of questions in harmonic analysis in recent years. In this proposal we wish to continue in this direction, as well as to exploit advances in dyadic harmonic analysis for use in questions central to probability. There is some focus on weighted estimates in non-commutative and scalar settings, and on understanding discretizations of classical operators, such as the Hilbert transform, and the role they play when acting on functions defined on discrete groups. From a martingale standpoint, jump processes come into play. Another direction is the use of numerical methods, in combination with achievements of harmonic analysis, for martingale estimates.
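As background for the first array of questions, here are the classical statements in one common normalization (a sketch, not the proposal's notation):

```latex
% Background sketch. Nehari's theorem characterizes bounded Hankel
% operators: for a symbol \varphi on the circle, the Hankel matrix
\[
  \Gamma_\varphi = \big( \hat{\varphi}(j+k) \big)_{j,k \ge 0}
\]
% is bounded on \ell^2 if and only if there exists \psi \in L^\infty(\mathbb{T})
% with \hat{\psi}(n) = \hat{\varphi}(n) for all n \ge 0; equivalently, the
% analytic projection of the symbol lies in BMO. On the probabilistic side,
% the Hilbert transform
\[
  Hf(x) = \mathrm{p.v.}\,\frac{1}{\pi} \int_{\mathbb{R}} \frac{f(y)}{x - y}\,\mathrm{d}y
\]
% can be modeled by orthogonal martingale transforms, as in the
% Gundy-Varopoulos representation mentioned above.
```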
Max ERC Funding
1 523 963 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym CIRCUS
Project An end-to-end verification architecture for building Certified Implementations of Robust, Cryptographically Secure web applications
Researcher (PI) Karthikeyan Bhargavan
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary The security of modern web applications depends on a variety of critical components including cryptographic libraries, Transport Layer Security (TLS), browser security mechanisms, and single sign-on protocols. Although these components are widely used, their security guarantees remain poorly understood, leading to subtle bugs and frequent attacks.
Rather than fixing one attack at a time, we advocate the use of formal security verification to identify and eliminate entire classes of vulnerabilities in one go. With the aid of my ERC starting grant, I have built a team that has already achieved landmark results in this direction. We built the first TLS implementation with a cryptographic proof of security. We discovered high-profile vulnerabilities such as the recent Triple Handshake and FREAK attacks, both of which triggered critical security updates to all major web browsers and TLS libraries.
So far, our security theorems only apply to carefully-written standalone reference implementations. CIRCUS proposes to take on the next great challenge: verifying the end-to-end security of web applications running in mainstream software. The key idea is to identify the core security components of web browsers and servers and replace them by rigorously verified components that offer the same functionality but with robust security guarantees.
Our goal is ambitious and there are many challenges to overcome, but we believe this is an opportune time for this proposal. In response to the Snowden reports, many cryptographic libraries and protocols are currently being audited and redesigned. Standards bodies and software developers are inviting researchers to help analyse their designs and code. Responding to their call requires a team of researchers who are willing to deal with the messy details of nascent standards and legacy code, and at the same time prove strong security theorems based on precise cryptographic assumptions. We are able, we are willing, and the time is now.
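To illustrate the class of bugs at stake, here is a toy sketch in Python (emphatically not miTLS or any real TLS code) of a strict handshake state machine; accepting messages out of order is the kind of state-machine flaw behind several practical TLS attacks, and a verified implementation proves its absence for all traces rather than testing a few:

```python
# Toy sketch, illustrative only: a strict handshake state machine that
# rejects out-of-order messages. Encoding the allowed transitions explicitly
# is the property a verified implementation proves rather than tests.
from enum import Enum, auto

class State(Enum):
    START = auto()
    HELLO_SENT = auto()
    KEY_EXCHANGED = auto()
    FINISHED = auto()

# allowed (state, message) -> next state; anything absent is a protocol error
TRANSITIONS = {
    (State.START, "client_hello"): State.HELLO_SENT,
    (State.HELLO_SENT, "key_exchange"): State.KEY_EXCHANGED,
    (State.KEY_EXCHANGED, "finished"): State.FINISHED,
}

def step(state: State, message: str) -> State:
    try:
        return TRANSITIONS[(state, message)]
    except KeyError:
        raise ValueError(f"unexpected {message!r} in state {state.name}")

s = State.START
for msg in ["client_hello", "key_exchange", "finished"]:
    s = step(s, msg)
print("handshake complete:", s is State.FINISHED)

# skipping the key exchange is rejected:
try:
    step(State.HELLO_SENT, "finished")
except ValueError as e:
    print("rejected:", e)
```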
Max ERC Funding
1 885 248 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym CoBCoM
Project Computational Brain Connectivity Mapping
Researcher (PI) Rachid DERICHE
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE6, ERC-2015-AdG
Summary One third of the burden of all diseases in Europe is due to problems caused by diseases affecting the brain. Although exceptional progress has been made in exploring it during the past decades, the brain is still terra incognita and calls for specific research efforts to better understand its architecture and functioning.
CoBCoM is our response to this great challenge of modern science, with the overall goal of developing a joint Dynamical Structural-Functional Brain Connectivity Network (DSF-BCN) solidly grounded on advanced and integrated methods for diffusion Magnetic Resonance Imaging (dMRI) and Electro- & Magneto-Encephalography (EEG & MEG).
To take up this grand challenge and achieve new frontiers for brain connectivity mapping, we will develop a new generation of computational models and methods for identifying and characterizing the structural and functional connectivities that will be at the heart of the DSF-BCN. Our strategy is to break with the tradition of contributing incrementally and separately to structure or function, and to develop a global approach involving strong interactions between structural and functional connectivities. To overcome the limited view of the brain provided by any single imaging modality, our models will be developed within a rigorous computational framework integrating complementary non-invasive imaging modalities: dMRI, EEG and MEG.
CoBCoM will push the state of the art in these modalities far forward, developing innovative models and ground-breaking processing tools to provide, in fine, a joint DSF-BCN solidly grounded on a detailed mapping of brain connectivity, both in space and time.
Capitalizing on the strengths of the dMRI, MEG and EEG methodologies and building on the biophysical and mathematical foundations of our new generation of computational models, CoBCoM will be applied to high-impact diseases, and its ground-breaking computational nature and added clinical value will open new perspectives in neuroimaging.
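For concreteness, a minimal sketch of the standard diffusion-tensor signal model used in dMRI (the simplest of the modality's models, not CoBCoM's own): the signal S(b, g) = S0 exp(-b g^T D g) becomes linear in the six tensor entries after taking logs. All data below are synthetic:

```python
# Illustrative sketch: estimating a diffusion tensor by linear least squares
# from the standard model S(b, g) = S0 * exp(-b g^T D g). Synthetic data.
import numpy as np

rng = np.random.default_rng(1)
b = 1000.0                                      # b-value, s/mm^2
g = rng.normal(size=(30, 3))
g /= np.linalg.norm(g, axis=1, keepdims=True)   # unit gradient directions

D_true = np.diag([1.7e-3, 0.3e-3, 0.3e-3])      # anisotropic tensor, mm^2/s
S0 = 1.0
S = S0 * np.exp(-b * np.einsum("ij,jk,ik->i", g, D_true, g))

# design matrix for the 6 unknowns (Dxx, Dyy, Dzz, Dxy, Dxz, Dyz)
X = np.column_stack([
    g[:, 0]**2, g[:, 1]**2, g[:, 2]**2,
    2 * g[:, 0] * g[:, 1], 2 * g[:, 0] * g[:, 2], 2 * g[:, 1] * g[:, 2],
])
d, *_ = np.linalg.lstsq(-b * X, np.log(S / S0), rcond=None)
print("recovered (Dxx, Dyy, Dzz):", d[:3])      # ~ (1.7e-3, 3e-4, 3e-4)
```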
Max ERC Funding
2 469 123 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym COHEGRAPH
Project Electron quantum optics in Graphene
Researcher (PI) Séverin Preden Roulleau
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE3, ERC-2015-STG
Summary Quantum computing is based on the manipulation of quantum bits (qubits) to enhance the efficiency of information processing. In solid-state systems, two approaches have been explored:
• static qubits, coupled to quantum buses used for manipulation and information transmission,
• flying qubits which are mobile qubits propagating in quantum circuits for further manipulation.
Flying qubits research led to the recent emergence of the field of electron quantum optics, where electrons play the role of photons in quantum-optics-like experiments. This has recently led to the development of electronic quantum interferometry as well as single-electron sources. As of yet, such experiments have only been successfully implemented in semiconductor heterostructures cooled to extremely low temperatures. Realizing electron quantum optics experiments in graphene, an inexpensive material showing a high degree of quantum coherence even at moderately low temperatures, would be strong evidence that quantum computing in graphene is within reach.
One of the most elementary building blocks necessary to perform electron quantum optics experiments is the electron beam splitter, the electronic analog of a beam splitter for light. However, the usual scheme for electron beam splitters in semiconductor heterostructures is not available in graphene because of its gapless band structure. I propose a breakthrough in this direction in which a pn junction plays the role of the electron beam splitter. This will lead to the following achievements, considered important steps towards quantum computing:
• electronic Mach-Zehnder interferometry used to study the quantum coherence properties of graphene,
• two-electron Aharonov-Bohm interferometry used to generate entangled states as an elementary quantum gate,
• the implementation of on-demand electronic sources in the GHz range for graphene flying qubits.
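For background, the textbook transmission of a two-path electronic interferometer of this kind (a standard result, not graphene-specific):

```latex
% Textbook sketch: a two-path interferometer with two 50/50 beam splitters
% and enclosed Aharonov-Bohm flux \Phi transmits with probability
\[
  P = \tfrac{1}{2}\Big[ 1 + \cos\big( \varphi_{\mathrm{kin}} + 2\pi \Phi / \Phi_0 \big) \Big],
  \qquad \Phi_0 = h/e,
\]
% where \varphi_kin is the kinetic phase difference between the two arms.
% The visibility of these flux-periodic oscillations directly measures the
% electron's quantum coherence; here the pn junctions would provide the
% beam splitting.
```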
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym COLORAMAP
Project Constrained Low-Rank Matrix Approximations: Theoretical and Algorithmic Developments for Practitioners
Researcher (PI) Nicolas Benoit P Gillis
Host Institution (HI) UNIVERSITE DE MONS
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Low-rank matrix approximation (LRA) techniques such as principal component analysis (PCA) are powerful tools for the representation and analysis of high dimensional data, and are used in a wide variety of areas such as machine learning, signal and image processing, data mining, and optimization. Without any constraints and using the least squares error, LRA can be solved via the singular value decomposition. However, in practice, this model is often not suitable mainly because (i) the data might be contaminated with outliers, missing data and non-Gaussian noise, and (ii) the low-rank factors of the decomposition might have to satisfy some specific constraints. Hence, in recent years, many variants of LRA have been introduced, using different constraints on the factors and using different objective functions to assess the quality of the approximation; e.g., sparse PCA, PCA with missing data, independent component analysis and nonnegative matrix factorization. Although these new constrained LRA models have become very popular and standard in some fields, there is still a significant gap between theory and practice. In this project, our goal is to reduce this gap by attacking the problem in an integrated way, making connections between LRA variants, and by using four very different but complementary perspectives: (1) computational complexity issues, (2) provably correct algorithms, (3) heuristics for difficult instances, and (4) application-oriented aspects. This unified and multi-disciplinary approach will enable us to understand these problems better, to develop and analyze new and existing algorithms and to then use them for applications. Our ultimate goal is to provide practitioners with new tools and to allow them to decide which method to use in which situation and to know what to expect from it.
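As one concrete instance of the constrained LRA models discussed above, the classical Lee-Seung multiplicative updates for nonnegative matrix factorization under the Frobenius norm (a standard baseline algorithm, shown here on random data):

```python
# Illustrative sketch: Lee-Seung multiplicative updates for NMF,
# min ||V - W H||_F with W, H >= 0. Random data, standard baseline.
import numpy as np

rng = np.random.default_rng(2)
V = rng.random((50, 40))          # nonnegative data matrix
r = 5                             # target rank
W = rng.random((50, r)) + 1e-3
H = rng.random((r, 40)) + 1e-3

for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + 1e-12)   # update H, preserving H >= 0
    W *= (V @ H.T) / (W @ H @ H.T + 1e-12)   # update W, preserving W >= 0

err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(f"relative approximation error: {err:.3f}")
```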
Max ERC Funding
1 291 750 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym ComplexSwimmers
Project Biocompatible and Interactive Artificial Micro- and Nanoswimmers and Their Applications
Researcher (PI) Giovanni Volpe
Host Institution (HI) GOETEBORGS UNIVERSITET
Call Details Starting Grant (StG), PE4, ERC-2015-STG
Summary Microswimmers, i.e., biological and artificial microscopic objects capable of self-propulsion, have been attracting growing interest from the biological and physical communities. From the fundamental side, their study can shed light on the far-from-equilibrium physics underlying the adaptive and collective behavior of biological entities such as chemotactic bacteria and eukaryotic cells. From the more applied side, they provide tantalizing options to perform tasks not easily achievable with other available techniques, such as the targeted localization, pick-up and delivery of microscopic and nanoscopic cargoes, e.g., in drug delivery, bioremediation and chemical sensing.
However, there are still several open challenges that need to be tackled in order to achieve the full scientific and technological potential of microswimmers in real-life settings. The main challenges are: (1) to identify a biocompatible propulsion mechanism and energy supply capable of lasting for the whole particle life-cycle; (2) to understand their behavior in complex and crowded environments; (3) to learn how to engineer emergent behaviors; and (4) to scale down their dimensions towards the nanoscale.
This project aims at tackling these challenges by developing biocompatible microswimmers capable of elaborate behaviors, by engineering their performance when interacting with other particles and with a complex environment, and by developing working nanoswimmers.
To achieve these goals, we have laid out a roadmap that will lead us to push the frontiers of the current understanding of active matter both at the mesoscopic and at the nanoscopic scale, and will permit us to develop some technologically disruptive techniques, namely, targeted delivery of cargoes within complex environments, which is of interest for drug delivery and bioremediation, and efficient sorting of chiral nanoparticles, which is of interest for biomedical and pharmaceutical applications.
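For concreteness, the standard two-dimensional active Brownian particle model, a common minimal description of a self-propelled microswimmer (background, not the project's model; parameters are arbitrary):

```python
# Illustrative sketch: a 2D active Brownian particle, the standard minimal
# microswimmer model. Parameter values are arbitrary.
import numpy as np

rng = np.random.default_rng(3)
v = 3.0          # self-propulsion speed (um/s)
D_t = 0.2        # translational diffusion (um^2/s)
D_r = 1.0        # rotational diffusion (1/s)
dt, steps = 1e-3, 100_000

x = np.zeros(2)
theta = 0.0
positions = np.empty((steps, 2))
for i in range(steps):
    # propulsion along the current orientation plus translational noise
    x = x + v * np.array([np.cos(theta), np.sin(theta)]) * dt \
          + np.sqrt(2 * D_t * dt) * rng.normal(size=2)
    # the orientation decorrelates by rotational diffusion
    theta += np.sqrt(2 * D_r * dt) * rng.normal()
    positions[i] = x

# the persistence length ~ v / D_r sets the ballistic-to-diffusive crossover
print("net displacement:", np.linalg.norm(positions[-1]))
```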
Max ERC Funding
1 497 500 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym COSMIC-DANCE
Project Unraveling the origin of the Initial Mass Function
Researcher (PI) Herve Bouy
Host Institution (HI) UNIVERSITE DE BORDEAUX
Call Details Consolidator Grant (CoG), PE9, ERC-2015-CoG
Summary Despite the tremendous progress achieved over the past decade, the study of stellar formation is far from complete. We have not yet measured the minimum mass for star formation or the shape of the IMF down to the least massive free-floating planets, nor do we know how universal this shape is. Although clusters are the building blocks of galaxies, little is known about their early dynamical evolution and dispersal into the field. The main culprit for this state of affairs is the high level of contamination and incompleteness in the sub-stellar regime, even for the best photometric and astrometric surveys.
COSMIC-DANCE aims at overcoming these drawbacks and revealing the shape of the IMF with a precision and completeness surpassing current and foreseeable surveys of the next 15 years. We will:
1) Measure: using a groundbreaking, proven and so far unique method I designed, we will measure proper motions with an accuracy comparable to Gaia but 5 magnitudes deeper, reaching the planetary mass domain, and, critically, piercing through the dust-obscured young clusters inaccessible to Gaia’s optical sensors.
2) Discover: feeding these proper motions and the multi-wavelength photometry to innovative hyper-dimensional data mining techniques, we will securely identify cluster members within the millions of sources of the COSMIC-DANCE database, complemented by Gaia at the bright end, to obtain the final census over the entire mass spectrum for 20 young nearby clusters, the end of a 60-year quest (a toy illustration of this membership step is sketched after this list).
3) Understand: we will provide conclusive empirical constraints, over a broad parameter space inaccessible to current state-of-the-art surveys, on the much debated respective contributions of evolutionary effects (dynamics, feedback and competitive accretion) and initial conditions (core properties) to the shape and bottom of the IMF, the most fundamental and informative product of star formation, with essential bearings on many areas of general astrophysics.
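As promised in item 2, here is a toy illustration of the membership idea (not the project's actual hyper-dimensional pipeline): cluster members share a common tangential motion, so they stand out as a compact component in proper-motion space, which even a simple two-component Gaussian mixture can isolate. All numbers below are invented for illustration:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
# Synthetic proper motions (mas/yr): a tight cluster on top of a broad field population.
cluster = rng.normal(loc=[4.0, -2.0], scale=0.3, size=(200, 2))
field = rng.normal(loc=[0.0, 0.0], scale=5.0, size=(2000, 2))
pm = np.vstack([cluster, field])

# Fit a two-component mixture; the component with the smallest covariance is the cluster.
gmm = GaussianMixture(n_components=2, random_state=0).fit(pm)
compact = int(np.argmin([np.trace(c) for c in gmm.covariances_]))
p_member = gmm.predict_proba(pm)[:, compact]
print("stars with P(member) > 0.9:", int((p_member > 0.9).sum()))
```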
Max ERC Funding
1 859 413 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym COSMO_SIMS
Project Astrophysics for the Dark Universe: Cosmological simulations in the context of dark matter and dark energy research
Researcher (PI) Oliver Jens Hahn
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2015-STG
Summary The objective of this ambitious research proposal is to push forward the frontier of computational cosmology by significantly improving the precision of numerical models on par with the increasing richness and depth of surveys that aim to shed light on the nature of dark matter and dark energy.
Using new phase-space techniques for the simulation and analysis of dark matter, completely new insights into its dynamics are possible. They allow, for the first time, the accurate simulation of dark matter cosmologies with suppressed small-scale power without artificial fragmentation. Using such techniques, I will establish highly accurate predictions for the properties of dark matter and baryons on small scales and investigate the formation of the first galaxies in non-CDM cosmologies.
Baryonic effects on cosmological observables are a severe limiting factor in interpreting cosmological measurements. I will investigate their impact by identifying the relevant astrophysical processes in relation to the multi-wavelength properties of galaxy clusters and the galaxies they host. This will be enabled by a statistical set of zoom simulations where it is possible to study how these properties correlate with one another, with the assembly history, and how we can derive better models for unresolved baryonic processes in cosmological simulations and thus, ultimately, how we can improve the power of cosmological surveys.
Finally, I will develop a completely unified framework for precision cosmological initial conditions (ICs) that is scalable to both the largest simulations and the highest resolution zoom simulations. Bringing ICs into the ‘cloud’ will enable new statistical studies using zoom simulations and increase the reproducibility of simulations within the community.
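For context, the textbook core of any IC generator is colouring white noise with the target power spectrum in Fourier space; the sketch below does this for a toy 2D power-law spectrum (illustrative only, far from the precision framework proposed here):

```python
import numpy as np

def gaussian_field(n=128, boxsize=100.0, n_s=-2.0, seed=0):
    """Realise a 2D Gaussian random field with power spectrum P(k) ~ k**n_s
    by scaling white noise in Fourier space (toy power law, arbitrary units)."""
    rng = np.random.default_rng(seed)
    k1d = 2 * np.pi * np.fft.fftfreq(n, d=boxsize / n)
    kx, ky = np.meshgrid(k1d, k1d, indexing="ij")
    k = np.hypot(kx, ky)
    amp = np.zeros_like(k)
    amp[k > 0] = k[k > 0] ** (n_s / 2.0)   # sqrt(P(k)) up to normalisation
    white = np.fft.fft2(rng.normal(size=(n, n)))
    delta = np.fft.ifft2(white * amp).real
    return delta - delta.mean()

print("field rms:", gaussian_field().std())
```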
My previous work in developing most of the underlying techniques puts me in an excellent position to lead a research group that is able to successfully approach such a wide-ranging and ambitious project.
Max ERC Funding
1 471 382 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym COSMOKEMS
Project EXPERIMENTAL CONSTRAINTS ON THE ISOTOPE SIGNATURES OF THE EARLY SOLAR SYSTEM
Researcher (PI) bernard BOURDON
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2015-AdG
Summary This project aims at simulating the processes that took place in the early Solar System to determine how these processes shaped the chemical and isotope compositions of solids that accreted to ultimately form terrestrial planets. Planetary materials exhibit mass-dependent and mass-independent isotope signatures, and their origin and relationships are not fully understood. This proposal will be based on new experiments reproducing the conditions of the solar nebula in its first few million years and on a newly designed Knudsen Effusion Mass Spectrometer (KEMS) that will be built for the purpose of this project. This project consists of three main subprojects: (1) we will simulate the effect of particle irradiation on solids to examine how isotopes can be fractionated by these processes and to identify whether this can explain chemical variations in meteorites; we will also examine whether particle irradiation can cause mass-independent fractionation; (2) the novel KEMS instrument will be used to determine the equilibrium isotope fractionation associated with reactions between gas and condensed phases at high temperature. It will also be used to determine the kinetic isotope fractionation associated with evaporation and condensation of solids. This will provide new constraints on the thermodynamic conditions, T, P and fO2 during heating events that have modified the chemical composition of planetary materials. These constraints will also help identify the processes that cause the depletion in volatile elements and the fractionation in refractory elements observed in planetesimals and planets; (3) we will examine the effect of UV irradiation on chemical species in the vapour phase in an attempt to reproduce observed isotope compositions found in meteorites or their components. These results may radically change our view on how the protoplanetary disk evolved and how solids were transported and mixed.
Max ERC Funding
3 106 625 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym COSMOS
Project Semiparametric Inference for Complex and Structural Models in Survival Analysis
Researcher (PI) Ingrid VAN KEILEGOM
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE1, ERC-2015-AdG
Summary In survival analysis, investigators are interested in modelling and analysing the time until an event occurs. It often happens that the available data are right censored, which means that only a lower bound of the time of interest is observed. This feature substantially complicates the statistical analysis of this kind of data. The aim of this project is to solve a number of open problems related to time-to-event data that would represent a major step forward in the area of survival analysis.
The project has three objectives:
[1] Cure models take into account that a certain fraction of the subjects under study will never experience the event of interest. Because of the complex nature of these models, many problems are still open and rigorous theory is rather scarce in this area. Our goal is to fill this gap, which will be a challenging but important task (the standard mixture formulation is recalled after this list).
[2] Copulas are nowadays widespread in many areas in statistics. However, they can contribute more substantially to resolving a number of the outstanding issues in survival analysis, such as quantile regression and dependent censoring. Finding answers to these open questions would open up new horizons for a wide variety of problems.
[3] We wish to develop new methods for doing correct inference in some of the common models in survival analysis in the presence of endogeneity or measurement errors. The present methodology has serious shortcomings, and we would like to propose, develop and validate new methods, that would be a major breakthrough if successful.
The above objectives will be achieved by using mostly semiparametric models. The development of mathematical properties under these models is often a challenging task, as complex tools from the theory on empirical processes and semiparametric efficiency are required. The project will therefore require an innovative combination of highly complex mathematical skills and cutting edge results from modern theory for semiparametric models.
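For orientation, objective [1] can be anchored in the standard mixture cure model, recalled here only to fix ideas (the project targets much more general settings and the missing rigorous theory). Writing $\pi$ for the probability of being susceptible (uncured), the population survival function is

```latex
S_{\mathrm{pop}}(t) \;=\; (1-\pi) \;+\; \pi\, S_u(t),
```

where $S_u$ is the survival function of the susceptible subjects. Its defining signature is the plateau $\lim_{t\to\infty} S_{\mathrm{pop}}(t) = 1-\pi > 0$: a fraction $1-\pi$ of subjects never experiences the event, which is precisely what ordinary survival models cannot accommodate.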
Max ERC Funding
2 318 750 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym CoVeCe
Project Coinduction for Verification and Certification
Researcher (PI) Damien Gabriel Jacques Pous
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Software and hardware bugs cost hundreds of millions of euros every year to companies and administrations. Formal methods like verification provide automatic means of finding some of these bugs. Certification, using proof assistants like Coq or Isabelle/HOL, makes it possible to guarantee the absence of bugs (up to a certain point).
These two kinds of tools are crucial in order to design safer programs and machines. Unfortunately, state-of-the-art tools are not yet satisfactory. Verification tools often face state-explosion problems and require more efficient algorithms; certification tools need more automation: they currently require too much time and expertise, even for basic tasks that could be handled easily through verification.
In recent work with Bonchi, we have shown that an extremely simple idea from concurrency theory could give rise to algorithms that are often exponentially faster than the algorithms currently used in verification tools.
My claim is that this idea could scale to richer models, revolutionising existing verification tools and providing algorithms for problems whose decidability is still open.
Moreover, the expected simplicity of those algorithms will make it possible to implement them inside certification tools such as Coq, to provide powerful automation techniques based on verification techniques. In the end, we will thus provide efficient and certified verification tools going beyond the state-of-the-art, but also the ability to use such tools inside the Coq proof assistant, to alleviate the cost of certification tasks.
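The flavour of the algorithmic idea can be conveyed with a compact sketch: checking language equivalence of NFAs by building a bisimulation between the determinized automata on the fly. The version below is only the naive core; the up-to-congruence technique from the work with Bonchi prunes this relation and is what yields the exponential speed-ups (illustrative code, not the published algorithm):

```python
from collections import deque

def nfa_step(states, symbol, delta):
    """Image of a set of NFA states under one symbol (on-the-fly subset construction)."""
    return frozenset(s for q in states for s in delta.get((q, symbol), ()))

def language_equivalent(init1, init2, accept, delta, alphabet):
    """Check language equivalence of two NFA state sets by building a
    bisimulation between the determinized automata on the fly."""
    relation = set()
    todo = deque([(frozenset(init1), frozenset(init2))])
    while todo:
        X, Y = todo.popleft()
        if (X, Y) in relation:
            continue
        if bool(X & accept) != bool(Y & accept):
            return False  # one side accepts here, the other does not
        relation.add((X, Y))
        for a in alphabet:
            todo.append((nfa_step(X, a, delta), nfa_step(Y, a, delta)))
    return True

# Toy NFA: two differently-rooted recognisers of "strings ending in b".
delta = {(0, 'a'): {0}, (0, 'b'): {0, 1},
         (2, 'a'): {2}, (2, 'b'): {2, 1}}
print(language_equivalent({0}, {2}, accept={1}, alphabet='ab', delta=delta))
```

Termination is guaranteed because there are finitely many pairs of subsets; the point of the up-to techniques is that most of those pairs never need to be visited.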
Max ERC Funding
1 407 413 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym CRESUCHIRP
Project Ultrasensitive Chirped-Pulse Fourier Transform mm-Wave Detection of Transient Species in Uniform Supersonic Flows for Reaction Kinetics Studies under Extreme Conditions
Researcher (PI) Ian SIMS
Host Institution (HI) UNIVERSITE DE RENNES I
Call Details Advanced Grant (AdG), PE4, ERC-2015-AdG
Summary This proposal aims to develop a combination of a chirped-pulse (sub)mm-wave rotational spectrometer with uniform supersonic flows generated by expansion of gases through Laval nozzles and apply it to problems at the frontiers of reaction kinetics.
The CRESU (Reaction Kinetics in Uniform Supersonic Flow) technique, combined with laser photochemical methods, has been applied with great success to perform research in gas-phase chemical kinetics at low temperatures, of particular interest for astrochemistry and cold planetary atmospheres. Recently, the PI has been involved in the development of a new combination of the revolutionary chirped pulse broadband rotational spectroscopy technique invented by B. Pate and co-workers with a novel pulsed CRESU, which we have called Chirped Pulse in Uniform Flow (CPUF). Rotational cooling by frequent collisions with cold buffer gas in the CRESU flow at ca. 20 K drastically increases the sensitivity of the technique, making broadband rotational spectroscopy suitable for detecting a wide range of transient species, such as photodissociation or reaction products.
We propose to exploit the exceptional quality of the Rennes CRESU flows to build an improved CPUF instrument (only the second worldwide), and use it for the quantitative determination of product branching ratios in elementary chemical reactions over a wide temperature range (data which are sorely lacking as input to models of gas-phase chemical environments), as well as the detection of reactive intermediates and the testing of modern reaction kinetics theory. Low-temperature reactions will be targeted initially, as it is here that the need for data is greatest. A challenging development of the technique towards the study of high-temperature reactions is also proposed, exploiting existing expertise in high enthalpy sources.
Max ERC Funding
2 100 230 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym DARKJETS
Project Discovery strategies for Dark Matter and new phenomena in hadronic signatures with the ATLAS detector at the Large Hadron Collider
Researcher (PI) Caterina Doglioni
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE2, ERC-2015-STG
Summary The Standard Model of Particle Physics describes the fundamental components of ordinary matter and their interactions. Despite its success in predicting many experimental results, the Standard Model fails to account for a number of interesting phenomena. One phenomenon of particular interest is the large excess of unobservable (Dark) matter in the Universe. This excess cannot be explained by Standard Model particles. A compelling hypothesis is that Dark Matter is composed of particles that can be produced in proton-proton collisions at the Large Hadron Collider (LHC) at CERN.
Within this project, I will build a team of researchers at Lund University dedicated to searches for signals of the presence of Dark Matter particles. The discovery strategies employed seek the decays of particles that either mediate the interactions between Dark and Standard Model particles or are produced in association with Dark Matter. These new particles manifest in detectors as two, three, or four collimated jets of particles (hadronic jets).
The LHC will resume delivery of proton-proton collisions to the ATLAS detector in 2015. Searches for new, rare, low mass particles such as Dark Matter mediators have so far been hindered by constraints on the rates of data that can be stored. These constraints will be overcome through the implementation of a novel real-time data analysis technique and a new search signature, both introduced to ATLAS by this project. The coincidence of this project with the upcoming LHC runs and the software and hardware improvements within the ATLAS detector is a unique opportunity to increase the sensitivity to hadronically decaying new particles by a large margin with respect to any previous searches. The results of these searches will be interpreted within a comprehensive and coherent set of theoretical benchmarks, highlighting the strengths of collider experiments in the global quest for Dark Matter.
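As a kinematic aside (a generic collider formula, nothing ATLAS-specific): a new particle decaying to two jets would appear as a localized excess in the dijet invariant mass spectrum, reconstructed from each jet's transverse momentum, pseudorapidity and azimuth. A minimal sketch with illustrative values:

```python
import math

def dijet_mass(pt1, eta1, phi1, pt2, eta2, phi2):
    """Invariant mass of two (approximately massless) jets:
    m^2 = 2 pT1 pT2 (cosh(eta1 - eta2) - cos(phi1 - phi2))."""
    m2 = 2.0 * pt1 * pt2 * (math.cosh(eta1 - eta2) - math.cos(phi1 - phi2))
    return math.sqrt(max(m2, 0.0))

# Two back-to-back 500 GeV jets at central rapidity: a ~1 TeV resonance candidate.
print(f"m_jj = {dijet_mass(500, 0.2, 0.0, 500, -0.2, math.pi):.1f} GeV")
```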
Max ERC Funding
1 268 076 €
Duration
Start date: 2016-02-01, End date: 2021-01-31
Project acronym DEMIURGE
Project Automatic Design of Robot Swarms
Researcher (PI) Mauro Birattari
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary The scope of this project is the automatic design of robot swarms. Swarm robotics is an appealing approach to the coordination of large groups of robots. Up to now, robot swarms have been designed via some labor-intensive process.
My goal is to advance the state of the art in swarm robotics by developing the DEMIURGE: an intelligent system that is able to design and realize robot swarms in a totally integrated and automatic way.
The DEMIURGE is a novel concept. Starting from requirements expressed in a specification language that I will define, the DEMIURGE will design all aspects of a robot swarm - hardware and control software.
The DEMIURGE will cast each design problem as an optimization problem and tackle it in a computation-intensive way. In this project, I will study different control software structures, optimization algorithms, ways to specify requirements, validation protocols, on-line adaptation mechanisms and techniques for re-design at run time.
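The design-as-optimization idea can be sketched generically: repeatedly propose a candidate design, score it against the mission specification in simulation, and keep the best. The sketch below uses plain random search over a dummy two-parameter controller; the actual DEMIURGE will of course use far richer design spaces and optimizers:

```python
import random

def automatic_design(evaluate, sample_candidate, budget=1000, seed=0):
    """Generic design-as-optimization loop: sample candidate designs,
    score each against the mission specification, keep the best."""
    rng = random.Random(seed)
    best, best_score = None, float("-inf")
    for _ in range(budget):
        candidate = sample_candidate(rng)
        score = evaluate(candidate)   # e.g. swarm performance in simulation
        if score > best_score:
            best, best_score = candidate, score
    return best, best_score

# Toy stand-in: tune two controller gains against a dummy objective.
sample = lambda rng: (rng.uniform(0, 1), rng.uniform(0, 1))
objective = lambda c: -((c[0] - 0.7) ** 2 + (c[1] - 0.3) ** 2)
print(automatic_design(objective, sample))
```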
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym DYNAMIQS
Project Relaxation dynamics in closed quantum systems
Researcher (PI) Marc Cheneau
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2015-STG
Summary Statistical mechanics, a century-old theory, is probably one of the most powerful constructions of physics. It predicts that the equilibrium properties of any system composed of a large number of particles depend only on a handful of macroscopic parameters, no matter how the particles interact with each other. But the question of how many-body systems relax towards such equilibrium states remains largely unsolved. This problem is especially acute for quantum systems, which evolve in a much larger mathematical space than the classical space-time and obey non-local equations of motion. Despite the formidable complexity of quantum dynamics, recent theoretical advances have put forward a very simple picture: the dynamics of closed quantum many-body systems would be essentially local, meaning that it would take a finite time for correlations between two distant regions of space to reach their equilibrium value. This locality would be an emergent collective property, similar to spontaneous symmetry breaking, and have its origin in the propagation of quasiparticle excitations. The fact is, however, that only a few observations directly confirm this scenario. In particular, the role played by the dimensionality and the interaction range is largely unknown. The concept of this project is to take advantage of the great versatility offered by ultracold atom systems to investigate experimentally the relaxation dynamics in regimes well beyond the boundaries of our current knowledge. We will focus our attention on two-dimensional systems with both short- and long-range interactions, whereas all previous experiments were bound to one-dimensional systems. The realisation of the project will hinge on the construction of a new-generation quantum gas microscope experiment for strontium gases. Amongst the innovative techniques that we will implement is the electronic state hybridisation with Rydberg states, called Rydberg dressing.
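The locality scenario described above has a standard quantitative form, recalled here for context (constants are model-dependent): Lieb-Robinson-type bounds imply that connected correlations between regions separated by a distance $d$ stay exponentially small outside an effective light cone,

```latex
\big|\langle O_A(t)\, O_B(t)\rangle_c\big| \;\lesssim\; C\, e^{-(d - 2vt)/\xi},
```

so correlations become appreciable only after $t \approx d/(2v)$, the factor of two reflecting pairs of quasiparticles emitted in opposite directions at the maximal velocity $v$. Where this light cone lies, and whether it survives in two dimensions and with long-range interactions, is exactly the kind of question the experiment is designed to answer.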
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym ECOHERB
Project Drivers and impacts of invertebrate herbivores across forest ecosystems globally.
Researcher (PI) Daniel Metcalfe
Host Institution (HI) LUNDS UNIVERSITET
Call Details Consolidator Grant (CoG), PE10, ERC-2015-CoG
Summary Forests slow global climate change by absorbing atmospheric carbon dioxide but this ecosystem service is limited by soil nutrients. Herbivores potentially alter soil nutrients in a range of ways, but these have mostly only been recorded for large mammals. By comparison, the impacts of the abundant invertebrates in forests have largely been ignored and are not included in current models used to generate the climate predictions so vital for designing governmental policies.
The proposed project will use a pioneering new interdisciplinary approach to provide the most complete picture yet available of the rates, underlying drivers and ultimate impacts of key nutrient inputs from invertebrate herbivores across forest ecosystems worldwide. Specifically, we will:
(1) Establish a network of herbivory monitoring stations across all major forest types, and across key environmental gradients (temperature, rainfall, ecosystem development).
(2) Perform laboratory experiments to examine the effects of herbivore excreta on soil processes under different temperature and moisture conditions.
(3) Integrate this information into a cutting-edge ecosystem model, to generate more accurate predictions of forest carbon sequestration under future climate change.
The network established will form the foundation for a unique long-term global monitoring effort which we intend to continue long after the current funding period. This work represents a powerful blend of several disciplines harnessing an array of cutting-edge tools to provide fundamentally novel insights into an area of direct and urgent importance for society.
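As a back-of-the-envelope illustration of the flux the monitoring network will quantify (all names and numbers below are invented for illustration), the nitrogen returned to soil by invertebrate herbivores can be scaled from leaf-area loss and foliar nitrogen content:

```python
def herbivore_n_flux(leaf_area_index, frac_consumed, leaf_n_per_m2, frac_excreted):
    """Toy estimate of herbivore-mediated N input to soil (g N per m2 ground per year):
    foliage N pool * fraction eaten * fraction returned as frass/excreta."""
    return leaf_area_index * leaf_n_per_m2 * frac_consumed * frac_excreted

# Illustrative values only: LAI 5, 2 g N per m2 of leaf, 8% herbivory, 70% returned.
print(f"{herbivore_n_flux(5, 0.08, 2.0, 0.7):.2f} g N m^-2 yr^-1")
```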
Max ERC Funding
1 750 000 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym Emergent-BH
Project Emergent spacetime and maximally spinning black holes
Researcher (PI) Monica Guica
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE2, ERC-2015-STG
Summary One of the greatest challenges of theoretical physics is to understand the fundamental nature of gravity and how it is reconciled with quantum mechanics. Black holes indicate that gravity is holographic, i.e. it is emergent, together with some of the spacetime dimensions, from a lower-dimensional field theory. The emergence mechanism has just started to be understood in certain special contexts, such as AdS/CFT. However, very little is known about it for the spacetime backgrounds relevant to the real world, due mainly to our lack of knowledge of the underlying field theories.
My goal is to uncover the fundamental nature of spacetime and gravity in our universe by: i) formulating and working out the properties of the relevant lower-dimensional field theories and ii) studying the mechanism by which spacetime and gravity emerge from them. I will address the first problem by concentrating on the near-horizon regions of maximally spinning black holes, for which the dual field theories greatly simplify and can be studied using a combination of conformal field theory and string theory methods. To study the emergence mechanism, I plan to adapt the tools that were successfully used to understand emergent gravity in anti-de Sitter (AdS) spacetimes - such as holographic quantum entanglement and conformal bootstrap - to non-AdS, more realistic spacetimes.
Max ERC Funding
1 495 476 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym EUREC4A
Project Elucidating the Role of Clouds-Circulation Coupling in Climate
Researcher (PI) Sandrine Bony
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2015-AdG
Summary This proposal focuses on two of climate science’s most fundamental questions: How sensitive is Earth's surface temperature to radiative forcing? and What governs the organization of the atmosphere into rain bands, cloud clusters and storms? These seemingly different questions are central to our ability to assess climate change on regional and global scales, and are in large part tied to a single and critical gap in our knowledge: a poor understanding of how clouds and atmospheric circulations interact.
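The first question has a compact standard formulation, the forcing-feedback framework, recalled here for context: perturbing the top-of-atmosphere energy budget by a radiative forcing $F$ gives

```latex
N \;=\; F \;-\; \lambda\, \Delta T,
\qquad
\mathrm{ECS} \;=\; \frac{F_{2\times \mathrm{CO_2}}}{\lambda},
```

where $N$ is the radiative imbalance, $\Delta T$ the surface temperature response and $\lambda$ the net feedback parameter; setting $N = 0$ yields the equilibrium climate sensitivity (ECS). The cloud contribution to $\lambda$ is the dominant source of the long-standing uncertainty referred to below.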
To fill this gap, my goal is to answer three questions, which are critical to an understanding of cloud-circulation coupling and its role in climate: (i) How strongly is the low-cloud response to global warming controlled by atmospheric circulations within the first few kilometres of the atmosphere? (ii) What controls the propensity of the atmosphere to aggregate into clusters or rain bands, and what role does it play in the large-scale atmospheric circulation and in climate sensitivity? (iii) How much do cloud-radiative effects influence the frequency and strength of extreme events?
I will address these questions by organising the first airborne field campaign focused on elucidating the interplay between low-level clouds and the small-scale and large-scale circulations in which they are embedded, as this is key for questions (i) and (ii), by analysing data from other field campaigns and satellite observations, and by conducting targeted numerical experiments with a hierarchy of models and configurations.
This research stands a very good chance to reduce the primary source of the forty-year uncertainty in climate sensitivity, to demystify long-standing questions of tropical meteorology, and to advance the physical understanding and prediction of extreme events. EUREC4A will also support, motivate and train a team of young scientists to exploit the synergy between observational and modelling approaches to answer pressing questions of atmospheric and climate science.
Max ERC Funding
3 013 334 €
Duration
Start date: 2016-08-01, End date: 2021-07-31
Project acronym EVODIS
Project Exploiting vortices to suppress dispersion and reach new separation power boundaries
Researcher (PI) Wim De Malsche
Host Institution (HI) VRIJE UNIVERSITEIT BRUSSEL
Call Details Starting Grant (StG), PE4, ERC-2015-STG
Summary The 21st century is expected to develop towards a society depending ever more on (bio-)chemical measurements of fluids and matrices that are so complex they are well beyond the current analytical capabilities. Incremental improvements can no longer satisfy the current needs of e.g. the proteomics field, requiring the separation of tens of thousands of components. The pace of progress in these fields is therefore predominantly determined by that of analytical tools, whereby liquid chromatography is the most prominent technique to separate small molecules as well as macromolecules, based on differential interaction of each analyte with support structures giving it a unique migration velocity. To improve its performance, faster transport between these structures needs to be generated. Unfortunately the commonly pursued strategy, relying on diffusion and reducing the structure size, has come to its limits due to practical limitations related to packing and fabrication of sub-micron support structures, pressure tolerance and viscous heating.
A ground-breaking step to advance chromatographic performance to another level would be to accelerate mass transport in the lateral direction, beyond the rate of diffusion only. To meet this requirement, an array of microstructures and local electrodes can be defined to create lateral electroosmotic vortices in a pressure-driven column, aiming to accelerate the local mass transfer in an anisotropic fashion. The achievement of ordered arrays of vortices is intimately linked to this requirement, which is also of broader importance for mixing, anti-fouling of membrane and reactor surfaces, enhanced mass transfer in reactor channels, emulsification, etc. Understanding and implementing anisotropic vortex flows will therefore not only revolutionize analytical and preparative separation procedures, but will also be highly relevant in all flow systems that benefit from enhanced mass transfer.
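The dispersion limit discussed above is conventionally quantified by the van Deemter equation for the plate height, quoted here in its generic textbook form for context:

```latex
H \;=\; A \;+\; \frac{B}{u} \;+\; C\,u,
```

where $u$ is the mobile-phase velocity, $A$ accounts for eddy dispersion, $B/u$ for longitudinal diffusion and $C\,u$ for the resistance to mass transfer between the support structures. Shrinking the structures attacks $C$ through shorter diffusion distances; the vortex approach proposed here instead attacks $C$ by making lateral transport faster than diffusion at a fixed structure size.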
Max ERC Funding
1 460 688 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym EXCITERS
Project Extreme Ultraviolet Circular Time-Resolved Spectroscopy
Researcher (PI) Yann Mairesse
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE2, ERC-2015-CoG
Summary Chiral molecules exist as two forms, so-called enantiomers, which have essentially the same physical and chemical properties and can only be distinguished via their interaction with a chiral system, such as circularly polarized light. Many biological processes are chiral-sensitive and unraveling the dynamical aspects of chirality is of prime importance for chemistry, biology and pharmacology. Studying the ultrafast electron dynamics of chiral processes requires characterization techniques at the attosecond (10⁻¹⁸ s) time-scale.
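A common figure of merit for such light-based chiral discrimination in the gas phase, quoted here for context (e.g. in photoelectron circular dichroism), is the normalized asymmetry between signals obtained with left- and right-circularly polarized light:

```latex
g \;=\; \frac{2\,\big(I_{\mathrm{LCP}} - I_{\mathrm{RCP}}\big)}{I_{\mathrm{LCP}} + I_{\mathrm{RCP}}},
```

which changes sign between enantiomers and, for photoelectron observables, can reach the percent to tens-of-percent level, orders of magnitude above conventional absorption circular dichroism.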
Molecular attosecond spectroscopy has the potential to resolve the couplings between electronic and nuclear degrees of freedom in such chiral chemical processes. There are, however, two major challenges: the generation of chiral attosecond light pulses, and the development of highly sensitive chiral discrimination techniques for time-resolved spectroscopy in the gas phase.
This ERC research project aims at developing vectorial attosecond spectroscopy using elliptical strong fields and circular attosecond pulses, and to apply it for the investigation of chiral molecules. To achieve this, I will (1) establish a new type of highly sensitive chiroptical spectroscopy using high-order harmonic generation by elliptical laser fields; (2) create and characterize sources of circular attosecond pulses; (3) use trains of circularly polarized attosecond pulses to probe the dynamics of photoionization of chiral molecules and (4) deploy ultrafast dynamical measurements to address the link between nuclear geometry and electronic chirality.
The developments from this project will set a landmark in the field of chiral recognition. They will also completely change the way ellipticity is considered in attosecond science and have an impact far beyond the study of chiral compounds, opening new perspectives for the resolution of the fastest dynamics occurring in polyatomic molecules and solid state physics.
Max ERC Funding
1 691 865 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym ExCoMet
Project CONTROLLING AND MEASURING RELATIVISTIC MOTION OF MATTER WITH ULTRAINTENSE STRUCTURED LIGHT
Researcher (PI) Fabien Hervé Jean QUERE
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE2, ERC-2015-AdG
Summary Femtosecond lasers can now provide intensities such that the light field induces relativistic motion of large ensembles of electrons. The ultimate goal of this Ultra-High Intensity (UHI) Physics is the control of relativistic motion of matter with light, which requires a deep understanding of this extreme regime of laser-matter interaction. Such control holds the promise of major scientific and societal applications, by providing ultra-compact laser-driven particle accelerators and attosecond X-ray sources. Until now, advances in UHI Physics have relied on a quest for the highest laser intensities, pursued by focusing optimally-compressed laser pulses to their diffraction limit. In contrast, the goal of the ExCoMet project is to establish a new paradigm, by demonstrating the potential of driving UHI laser-plasma interactions with sophisticated structured laser beams, i.e. beams whose amplitude, phase or polarization are shaped in space-time.
Based on this new paradigm, we will show that unprecedented experimental insight can be gained on UHI laser-matter interactions. For instance, by using laser fields whose propagation direction rotates on a femtosecond time scale, we will temporally resolve the synchrotron emission of laser-driven relativistic electrons in plasmas, and thus gather direct information on their dynamics. We will also show that such structured laser fields can be exploited to introduce new physics into UHI experiments, and can provide advanced degrees of control that will be essential for future light and particle sources based on these interactions. Using Laguerre-Gauss beams, we will in particular investigate the transfer of orbital angular momentum from UHI lasers to plasmas, and its consequences on the physics and performance of laser-plasma accelerators. This project thus aims at bringing conceptual breakthroughs in UHI physics, at a time when major projects relying on this physics are being launched, in particular in Europe.
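For reference, the textbook property underlying the Laguerre-Gauss experiments mentioned above: a beam of azimuthal index $\ell$ carries a helical phase front,

```latex
E_\ell(r,\varphi,z) \;\propto\; u_\ell(r,z)\, e^{i\ell\varphi},
\qquad
L_z \;=\; \ell\hbar \ \text{per photon},
```

so each photon carries a quantized orbital angular momentum $\ell\hbar$; it is the transfer of this quantity to the plasma, and its imprint on laser-plasma accelerators, that the project will investigate.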
Max ERC Funding
2 250 000 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym FACTORY
Project New paradigms for latent factor estimation
Researcher (PI) Cédric Févotte
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary Data is often available in matrix form, in which columns are samples, and processing of such data often entails finding an approximate factorisation of the matrix into two factors. The first factor yields recurring patterns characteristic of the data. The second factor describes in which proportions each data sample is made of these patterns. Latent factor estimation (LFE) is the problem of finding such a factorisation, usually under given constraints. LFE appears under other domain-specific names such as dictionary learning, low-rank approximation, factor analysis or latent semantic analysis. It is used for tasks such as dimensionality reduction, unmixing, soft clustering, coding or matrix completion in very diverse fields.
In this project, I propose to explore three new paradigms that push the frontiers of traditional LFE. First, I want to break beyond the ubiquitous Gaussian assumption, a practical choice that too rarely complies with the nature and geometry of the data. Estimation in non-Gaussian models is more difficult, but recent work in audio and text processing has shown that it pays off in practice. Second, in traditional settings the data matrix is often a collection of features computed from raw data. These features are computed with generic off-the-shelf transforms that loosely preprocess the data, setting a limit to performance. I propose a new paradigm in which an optimal low-rank-inducing transform is learnt together with the factors in a single step. Third, I show that the dominant deterministic approach to LFE should be reconsidered, and I propose a novel statistical estimation paradigm, based on the marginal likelihood, with enhanced capabilities. The new methodology is applied to real-world problems with societal impact in audio signal processing (speech enhancement, music remastering), remote sensing (Earth observation, cosmic object discovery) and data mining (multimodal information retrieval, user recommendation).
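To make the factorisation concrete, the sketch below (our illustration, not the project's method) computes a nonnegative factorisation V ≈ WH with the classical Lee-Seung multiplicative updates, where the columns of W hold the recurring patterns and H gives the proportions in which each sample combines them:

    import numpy as np

    def nmf(V, rank, n_iter=200, eps=1e-9):
        # Nonnegative matrix factorisation V ~ W @ H under the Frobenius
        # loss, via Lee-Seung multiplicative updates (a Gaussian-noise
        # model, i.e. precisely the assumption FACTORY seeks to go beyond).
        m, n = V.shape
        rng = np.random.default_rng(0)
        W = rng.random((m, rank)) + eps   # recurring patterns (columns)
        H = rng.random((rank, n)) + eps   # per-sample proportions
        for _ in range(n_iter):
            H *= (W.T @ V) / (W.T @ W @ H + eps)
            W *= (V @ H.T) / (W @ H @ H.T + eps)
        return W, H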
Max ERC Funding
1 931 776 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym FLUDYCO
Project Fluid dynamics of planetary cores: formation, heterogeneous convection and rotational dynamics
Researcher (PI) Michael Le Bars
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2015-CoG
Summary Understanding the flows in planetary cores from their formation to their current dynamics is a tremendous interdisciplinary challenge. Beyond the challenge in fundamental fluid dynamics of understanding these extraordinary flows, involving turbulence, rotation and buoyancy at typical scales well beyond our day-to-day experience, a global knowledge of the involved processes is fundamental to a better understanding of the initial state of planets, of their thermal and orbital evolution, and of magnetic field generation, all key ingredients for habitability. The purpose of the present project is to go beyond the state of the art in tackling three barriers at the current frontier of knowledge. It combines groundbreaking laboratory experiments, complementary pioneering numerical simulations, and fruitful collaborations with leaders in various fields of planetary sciences. Improving on the latest advances in the field, I will address the fluid dynamics of iron fragmentation during the later stages of planetary accretion, in order to produce innovative, dynamically reliable models of planet formation. Considering the latest published data for Earth, I will investigate the flows driven in a stratified layer at the top of a liquid core and their influence on the global convective dynamics and related dynamo. Finally, building upon the recent emergence of alternative models for core dynamics, I will quantitatively examine the non-linear saturation and turbulent state of the flows driven by libration, as well as the shape and intensity of the corresponding dynamo. In the context of an international competition, the originality of my work comes from its multi-method and interdisciplinary character, building upon my successful past research. Beyond scientific advances, this high-risk/high-gain project will benefit a larger community through the dissemination of experimental and numerical improvements, and will promote science through an original outreach program.
Max ERC Funding
1 992 602 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym FOVEDIS
Project Formal specification and verification of distributed data structures
Researcher (PI) Constantin Enea
Host Institution (HI) UNIVERSITE PARIS DIDEROT - PARIS 7
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary The future of computing technology relies on fast access, transformation, and exchange of data across large-scale networks such as the Internet. The design of software systems that support high-frequency parallel accesses to high-quantity data is a fundamental challenge. As more scalable alternatives to traditional relational databases, distributed data structures (DDSs) are at the basis of a wide range of automated services, for now and for the foreseeable future.
This proposal aims to improve our understanding of the theoretical foundations of DDSs. The design and the usage of DDSs are based on new principles, for which we currently lack rigorous engineering methodologies. Specifically, we lack design procedures based on precise specifications, and automated reasoning techniques for enhancing the reliability of the engineering process.
The targeted breakthrough of this proposal is developing automated formal methods for rigorous engineering of DDSs. A first objective is to define coherent formal specifications that provide precise requirements at design time and explicit guarantees during their usage. Then, we will investigate practical programming principles, compatible with these specifications, for building applications that use DDSs. Finally, we will develop efficient automated reasoning techniques for debugging or validating DDS implementations against their specifications. The principles underlying automated reasoning are also important for identifying best practices in the design of these complex systems to increase confidence in their correctness. The developed methodologies based on formal specifications will thus benefit both the conception and automated validation of DDS implementations and the applications that use them.
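To illustrate why DDS specifications differ from those of sequential data structures, the sketch below (ours, purely illustrative) implements a grow-only counter, a simple replicated structure whose merge operation is commutative, associative and idempotent; the property a formal specification would capture here is that all replicas converge to the same value whatever the order in which updates are delivered:

    class GCounter:
        # Grow-only replicated counter: one slot per replica.
        def __init__(self, replica_id, n_replicas):
            self.i = replica_id
            self.counts = [0] * n_replicas

        def increment(self):
            self.counts[self.i] += 1

        def value(self):
            return sum(self.counts)

        def merge(self, other):
            # Join = element-wise max: commutative, associative and
            # idempotent, hence convergence under any delivery order.
            self.counts = [max(a, b) for a, b in zip(self.counts, other.counts)]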
Max ERC Funding
1 300 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym HEXTREME
Project Hexahedral mesh generation in real time
Researcher (PI) Jean-François REMACLE
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE8, ERC-2015-AdG
Summary Over one million finite element analyses are performed in engineering offices every day, and finite elements come with the price of mesh generation. This proposal aims at creating two breakthroughs in the art of mesh generation that will be directly beneficial to the finite element community at large. The first challenge of HEXTREME is to take advantage of the massively multi-threaded nature of modern computers and to parallelize all aspects of the mesh generation process at a fine-grain level. Reducing the meshing time by more than one order of magnitude is an ambitious objective: if minutes can become seconds, then success in this research would radically change the way in which engineers deal with mesh generation. This project then proposes an innovative approach to overcoming the major difficulty associated with mesh generation: it aims at providing a fast and reliable solution to the problem of conforming hexahedral mesh generation. Quadrilateral meshes in 2D and hexahedral meshes in 3D are usually considered to be superior to triangular/tetrahedral meshes. Even though direct tetrahedral meshing techniques have reached a level of robustness that allows us to treat general 3D domains, there may never exist a direct algorithm for building unstructured hex-meshes in general 3D domains. In HEXTREME, an indirect approach is envisaged that relies on recent developments in various domains of applied mathematics and computer science such as graph theory, combinatorial optimization and computational geometry. The methodology proposed for hex meshing is finally extended to the difficult problem of boundary-layer meshing. Mesh generation is one important step of the engineering analysis process. Yet a mesh is a tool, not an aim. A specific task of the project is dedicated to the interaction with research partners who are committed to beta-testing the results of HEXTREME. All the results of HEXTREME will be provided as open source in Gmsh.
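For readers unfamiliar with Gmsh, the snippet below sketches the indirect route in miniature using Gmsh's public Python API: a simplicial mesh is generated first and then recombined/subdivided into hexahedra (the option names follow recent Gmsh releases; this is our illustration, not a deliverable of the project):

    import gmsh

    gmsh.initialize()
    gmsh.model.add("box")
    gmsh.model.occ.addBox(0, 0, 0, 1, 1, 1)  # unit cube test domain
    gmsh.model.occ.synchronize()
    # Indirect hex meshing: recombine the simplicial mesh, then apply
    # the all-hexahedra subdivision so that only hex elements remain.
    gmsh.option.setNumber("Mesh.RecombineAll", 1)
    gmsh.option.setNumber("Mesh.SubdivisionAlgorithm", 2)
    gmsh.model.mesh.generate(3)
    gmsh.write("box_hex.msh")
    gmsh.finalize()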
Max ERC Funding
2 244 238 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym High-Spin-Grav
Project Higher Spin Gravity and Generalized Spacetime Geometry
Researcher (PI) Marc HENNEAUX
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Advanced Grant (AdG), PE2, ERC-2015-AdG
Summary Extensions of Einstein’s gravity containing higher spin gauge fields (massless fields with spin greater than two) constitute a very active and challenging field of research, raising many fascinating issues and questions in different areas of physics. However, in spite of the impressive achievements already obtained, it is fair to say that higher spin gravity has not yet delivered its full potential and still faces many challenges, both conceptual and technical. The objective of this proposal is to deepen our understanding of higher spin gravity, following five interconnected central themes that will constitute the backbone of the project: (i) how to construct an action principle; (ii) how to understand the generalized space-time geometry invariant under the higher-spin gauge symmetry – a key fundamental issue in the project; (iii) what is the precise asymptotic structure of the theory at infinity; (iv) what is the connection of the higher spin algebras with the hidden symmetries of gravitational theories; (v) what are the implications of hypersymmetry, which is the higher-spin version of supersymmetry. Holography in three and higher dimensions will constitute an essential tool.
One of the motivations of the project is the connection of higher spin gravity with tensionless string theory and consistent theories of quantum gravity.
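For orientation, the free theory behind these questions can be stated in one line: a massless spin-$s$ field is a symmetric tensor $\varphi_{\mu_1\cdots\mu_s}$ with the Fronsdal gauge symmetry

$$\delta \varphi_{\mu_1\cdots\mu_s} = \partial_{(\mu_1} \xi_{\mu_2\cdots\mu_s)},$$

where the gauge parameter $\xi$ is traceless. The open problems listed above concern what happens to this symmetry, and to the geometry it generates, once interactions are switched on.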
Max ERC Funding
1 841 868 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym ICARUS
Project Towards Innovative cost-effective astronomical instrumentation
Researcher (PI) Emmanuel Hugot
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2015-STG
Summary Enabling disruptive technologies has always been crucial to trigger revolutionary science discoveries. The daring challenges in astronomy and astrophysics are extremely demanding in terms of high angular resolution and high-contrast imaging, and require extreme stability and image quality. Instruments based on current classical designs tend to get bigger and more complex, and face ever-increasing difficulties in meeting science requirements.
This proposal has the clear objective of proposing breakthrough compact optical architectures for the next generation of giant observatories. The project focuses on the niche of active components and is structured in two main research pillars to (I) enable the use of additive manufacturing (3D printing) to produce affordable deformable mirrors for VIS or NIR observations, and (II) pave the way for a common use of curved and deformable detectors. Extensive finite element analysis will allow us to cover the parameter space, and broad prototyping will demonstrate and characterize the performance of such devices.
Both pillars are extremely challenging, the fields of detectors and optical fabrication being driven by the market. We will therefore orient the activities towards a mass-production method.
To maximize the impact of this high gain R&D, the pillars are surrounded by two transverse activities: (i) design and optimization of a new zoo of optical systems using active mirrors and flexible detectors, and (ii) build a solid plan of technology transfer to end-user industrial companies, through a patenting and licensing strategy, to maximize the financial return and then perpetuate the activities.
The pathway proposed here is essential for developing affordable components in the near future, and will enable compact and high-performance instrumentation. These high-potential activities will dramatically reduce the complexity of instruments in the era of giant observatories, simplify the operability of systems and offer increased performance.
Max ERC Funding
1 747 667 €
Duration
Start date: 2016-08-01, End date: 2021-07-31
Project acronym IMPaCT
Project Implementing Multi-Party Computation Technology
Researcher (PI) Nigel Paul Smart
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE6, ERC-2015-AdG
Summary The goal of IMPaCT is to turn Multi-Party Computation (MPC) from a stage in which we are beginning to obtain practical feasibility results, to a stage in which we have fully practical systems. It has long been acknowledged that MPC has the potential to provide a transformative change in the way security solutions are enabled. At present this is only possible in limited applications; deployments in restricted scenarios are beginning to emerge. However, in turning MPC into a fully practical technology a number of key scientific challenges need to be solved, many of which have not yet even been considered in the theoretical literature. The IMPaCT project aims to address this scientific gap, bridge it, and so provide the tools for a future road-map in which MPC can be deployed as a widespread tool, as ubiquitous as encryption and digital signatures are today.
Our scientific approach will be to investigate new MPC protocols and techniques which take into account practical constraints and issues that would arise in future application scenarios. Our work, despite being scientifically rigorous and driven by deep theoretical insight, will be grounded in practical considerations. All systems and protocols proposed will be prototyped so as to ensure that practical real-world issues are taken into account. In addition we will use our extensive industrial linkages to ensure a two-way dialogue between potential users and the developers of MPC technology, thus helping to embed the future impact of the work in IMPaCT.
Our workplan is focused on key scientific challenges which we have identified on the road to fully practical MPC applications. These include the design of methodologies to cope with the asynchronicity of networks, how to realistically measure and model MPC protocol performance, how to utilize low-round-complexity protocols in practice, how to deal with problems with large input sizes (e.g. streaming data), and many more.
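As background for readers new to MPC, the sketch below (ours, purely illustrative) shows the additive secret sharing on which many practical MPC protocols are built: no single share reveals anything about the secret, yet parties can add two secrets by adding their shares locally, without any communication:

    import secrets

    P = 2**61 - 1  # public prime modulus

    def share(x, n):
        # Split secret x into n additive shares modulo P.
        parts = [secrets.randbelow(P) for _ in range(n - 1)]
        parts.append((x - sum(parts)) % P)
        return parts

    def reconstruct(parts):
        return sum(parts) % P

    # Addition of secrets = share-wise addition, done locally by each party.
    sa, sb = share(12, 3), share(30, 3)
    s_sum = [(u + v) % P for u, v in zip(sa, sb)]
    assert reconstruct(s_sum) == 42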
Max ERC Funding
2 499 938 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym IMPACT
Project The giant impact and the Earth and Moon formation
Researcher (PI) Razvan Caracas
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE10, ERC-2015-CoG
Summary Very little is understood of the physics governing the Giant Impact and the subsequent formation of the Moon. According to this model an impactor hit the proto-Earth; the resulting energy was enough to melt and partially vaporize the two bodies, generating a large protolunar disk from which the Earth-Moon couple formed. Hydrodynamic simulations of the impact and the subsequent evolution of the protolunar disk are currently based on models of equations of state and phase diagrams that are unconstrained by experiments or calculations. Estimates of the positions of critical points, when available at all, vary by one order of magnitude in both temperature and density. Here we propose to compute the thermodynamics of the major rock-forming minerals and rock aggregates, and to use it to study the formation and evolution of the protolunar disk. For this we employ a unique combination of state-of-the-art atomistic ab initio simulations. We use large-scale density-functional theory (DFT) molecular dynamics to study bulk fluids, coupled with Green's function (GW) and time-dependent DFT techniques to analyze atomic clusters and molecular species. We compute the vaporization curves, position the supercritical points, and characterize the sub-critical and supercritical regimes. We construct equations of state of the rocks at the conditions of the giant impact, which are beyond current experimental capabilities. We employ a multiscale approach to bridge the gap between atomic, geological-sample, and planetary scales via thermodynamics; we simulate the thermal profile through the disk, the ratio between liquid and vapor, and the speciation. From speciation we predict elemental and isotopic partitioning during condensation. Plausible impact scenarios, and features of the impactor and of the proto-Earth, will be constrained with a feedback loop, until convergence between predictions of final Earth-Moon compositions and observations is reached.
Max ERC Funding
1 900 000 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym ISOREE
Project New insight into the origin of the Earth, its bulk composition and its early evolution
Researcher (PI) Maud Boyet
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE10, ERC-2015-CoG
Summary The main geochemical features of the mantles of terrestrial planets and asteroids can be attributed to differentiation events that occurred during or shortly after the formation of the Solar System. Numerous questions remain regarding the Earth’s bulk composition and the most likely scenario for its evolution prior to the last major differentiation event caused by a giant impact leading to the formation of the Moon. The aim of this five-year project is to evaluate the state-of-the-art models of the Earth’s early evolution with the following main objectives: (i) Defining precisely the age of the Moon’s formation, (ii) Refining the giant impact model and the Earth-Moon relationship, (iii) Dating the successive magmatic ocean stages on Earth, and (iv) Constraining the Earth mantle’s composition in terms of rare earth element concentrations. These different questions will be addressed using trace elements, radiogenic isotopic systematics (146Sm-142Nd, 147Sm-143Nd, 138La-138Ce) and stable isotopes. ISOREE is a multi-disciplinary project that combines isotope and trace element geochemistry, experimental geochemistry and spectroscopy. A large number of samples will be analysed, including terrestrial rocks with ages up to 3.8 Ga, chondrites, achondrites and lunar samples.
This proposal will provide the tools to tackle a vast topic from various angles, using new methodologies and instrumentation and promoting innovation and creativity in European research. This research program is essential to further constrain the major events that occurred very early on in the Earth’s history, such as the Earth’s cooling, its crustal growth, the surface conditions and development of potential habitats for life.
Max ERC Funding
2 200 000 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym KERNEL
Project Ultimate Angular Resolution Astrophysics with kernel-phase and full-aperture interferometry
Researcher (PI) Frantz Martinache
Host Institution (HI) OBSERVATOIRE DE LA COTE D'AZUR (OCA)
Call Details Consolidator Grant (CoG), PE9, ERC-2015-CoG
Summary Astronomy requires large telescopes to improve the sensitivity and the angular resolution of its observations. Of these qualities, angular resolution is the most difficult to maintain in the optical and near-infrared, since the atmosphere reduces it to that of a 10 cm aperture, regardless of the telescope size. On the one hand, Adaptive Optics (AO) actively compensates for this effect, but the improvement is often only partial. On the other hand, interferometric techniques (most notably sparse aperture masking interferometry) passively allow the extraction of self-calibrating observables that boost the angular resolution, but severely affect the sensitivity of observations. A framework newly established by the PI of the proposal now makes it possible to extract generalized self-calibrating observables called kernel-phases from conventional AO-corrected images. The work outlined in this proposal will make it possible to scientifically exploit the high angular resolution imaging capability of this technique, to improve its robustness and to expand its capabilities. The framework offers a very general-purpose high angular resolution imaging tool for astronomers as well as wavefront control experts. This proposal is organized in five work-packages of increasing challenge that include: the reinterpretation of existing archival data with a super-resolution capability, the expansion of its robustness to open up new, more challenging use-cases, a special focus on the development of a very high dynamic range mode, the adaptation of interferometric image reconstruction techniques, and the development of new advanced AO concepts. The consequences of this project will have a major impact on the design and scientific exploitation of future high angular resolution instrumentation on the existing generation of 8-10 meter class telescopes as well as on the upcoming generation of 30-40 meter giants, championed by Europe and its E-ELT.
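Schematically, the kernel-phase idea is linear algebra: the measured Fourier phases Phi of an AO-corrected image relate to residual pupil-plane phase errors phi through a linear transfer matrix A, Phi ≈ Phi_0 + A phi, so any operator K with K A = 0 yields observables K Phi that are self-calibrating to first order. The sketch below (our illustration, with a random matrix standing in for A) extracts such a kernel operator by singular value decomposition:

    import numpy as np

    rng = np.random.default_rng(1)
    A = rng.normal(size=(10, 6))   # stand-in phase-transfer matrix

    # Rows of K span the left null space of A, so K @ A = 0 and
    # K @ Phi is immune (to first order) to pupil-plane phase errors.
    U, s, Vt = np.linalg.svd(A)
    rank = int(np.sum(s > 1e-10))
    K = U[:, rank:].T

    assert np.allclose(K @ A, 0)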
Max ERC Funding
1 717 811 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym LEASP
Project Learning spatiotemporal patterns in longitudinal image data sets of the aging brain
Researcher (PI) Stanley Durrleman
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Time-series of multimodal medical images offer a unique opportunity to track anatomical and functional alterations of the brain in aging individuals. A collection of such time series for several individuals forms a longitudinal data set, each data point being a rich iconic-geometric representation of brain anatomy and function. These data are already extraordinarily complex and variable across individuals. Taking the temporal component into account adds further difficulty, in that each individual follows a different trajectory of changes, and at a different pace. Furthermore, a disease is here a progressive departure from an otherwise normal scenario of aging, so that one cannot think of normal and pathologic brain aging as distinct categories, as in the standard case-control paradigm.
Bio-statisticians lack a suitable methodological framework to extract from these data the typical trajectories and dynamics of brain alterations, and the effects of a disease on these trajectories, thus limiting the investigation of essential clinical questions. To change this situation, we propose to construct virtual dynamical models of brain aging by learning typical spatiotemporal patterns of alteration propagation from longitudinal iconic-geometric data sets.
By incorporating concepts from Riemannian geometry into Bayesian mixed-effects models, the project will introduce general principles to average complex individual trajectories of iconic-geometric changes and to align the pace at which these trajectories are followed. It will estimate a set of elementary spatiotemporal patterns, which combine to yield a personal aging scenario for each individual. Disease-specific patterns will be detected with an increasing likelihood.
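One schematic instance of this model class (our notation, deliberately simplified) is a nonlinear mixed-effects model in which every individual $i$ follows a common trajectory $\gamma_0$ at their own pace and onset,

$$y_{ij} = \gamma_0\big(\alpha_i\,(t_{ij} - t_0 - \tau_i) + t_0\big) + \varepsilon_{ij},$$

where the random effects $\alpha_i$ (acceleration) and $\tau_i$ (time shift) temporally realign the individual observations $y_{ij}$, and $\gamma_0$ takes values in a Riemannian manifold of iconic-geometric configurations.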
This new generation of statistical and computational tools will unveil clusters of patients sharing similar lesion propagation profiles, paving the way to the design of more specific treatments, and to caring for patients when treatments have the highest chance of success.
Max ERC Funding
1 499 894 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym LEGA-C
Project The Physics of Galaxies 7 Gyr Ago
Researcher (PI) Arjen Van der wel
Host Institution (HI) UNIVERSITEIT GENT
Call Details Consolidator Grant (CoG), PE9, ERC-2015-CoG
Summary Over the past decade, redshift surveys and multi-wavelength imaging campaigns have drawn up an empirical picture of how many stars had formed in which types of galaxies over the history of the universe. However, we have yet to unravel the individual pathways along which galaxies evolve, and the physical processes that drive them. Continuing with the previous approach -- larger and deeper photometric samples -- is not adequate to achieve this goal. A change of focus is required.
In this ERC project I will embark on a new way to address the question of galaxy evolution. I will do so as Principal Investigator of the recently approved LEGA-C observing program, which has been allocated 128 nights of observation time over the next 4 years with ESO's flagship facility, the Very Large Telescope. This new survey will produce spectra for 2500 distant (at z~1) galaxies with, for the first time, sufficient resolution and S/N to measure the ages and chemical compositions of their stellar populations as well as internal velocity dispersions and dynamical masses. This will provide an entirely new physical description of the galaxy population 7 Gyr ago, with which I will finally be able to solve long-standing questions in galaxy formation that were out of reach before: what is the star-formation history of individual galaxies, why and how is star formation "quenched" in many galaxies, and to what extent do galaxies subsequently grow through merging?
LEGA-C is worldwide the largest spectroscopic survey of distant galaxies to date, and ERC funding will be absolutely critical in harvesting this unparalleled database. I am seeking to extend my research group to realize the scientific potential of this substantial investment (6.5M Eur) of observational resources by the European astronomy community. The timing of the execution of the VLT program is perfectly matched with the timeline of this ERC program.
Max ERC Funding
1 884 875 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym LENA
Project non-LinEar sigNal processing for solving data challenges in Astrophysics
Researcher (PI) Jérôme Bobin
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Astrophysics has arrived at a turning point where the scientific exploitation of data requires overcoming challenging analysis issues, which mandates the development of advanced signal processing methods. In this context, sparsity and sparse signal representations have played a prominent role in astrophysics. Indeed, thanks to sparsity, an extremely clean full-sky map of the Cosmic Microwave Background (CMB) has been derived from the data of Planck [Bobin14], a European space mission that observes the sky at microwave wavelengths. This led to a noticeable breakthrough: we showed that large-scale statistical studies of the CMB can be performed without having to mask the galactic centre anymore, thanks to the achieved high-quality component separation [Rassat14].
Despite the undeniable success of sparsity, standard linear signal processing approaches are too simplistic to capture the intrinsically non-linear properties of physical data. For instance, the analysis of the Planck data in polarization requires new sparse representations to finely capture the properties of polarization vector fields (e.g. rotation invariance), which cannot be tackled by linear approaches. Shifting from the linear to the non-linear signal representation paradigm is an emerging area in signal processing, which builds upon new connections with fields such as deep learning [Mallat13].
Inspired by these active and fertile connections, the LENA project will: i) study a new non-linear signal representation framework to design non-linear models that can account for the underlying physics, and ii) develop new numerical methods that can exploit these models. We will further demonstrate the impact of the developed models and algorithms in tackling data analysis challenges in the scope of the Planck mission and the European radio interferometer LOFAR. We expect the results of the LENA project to impact astrophysical data analysis as significantly as the introduction of sparsity to the field has.
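As a baseline for the linear paradigm the project moves beyond, the sketch below (ours, illustrative) performs classical sparse recovery by iterative soft-thresholding (ISTA) in a fixed dictionary Phi; the non-linear representations targeted by LENA would replace precisely this fixed linear model:

    import numpy as np

    def soft_threshold(x, t):
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def ista(y, Phi, lam, n_iter=200):
        # Solve min_x 0.5*||y - Phi @ x||^2 + lam*||x||_1 by ISTA.
        L = np.linalg.norm(Phi, 2) ** 2   # Lipschitz constant of the gradient
        x = np.zeros(Phi.shape[1])
        for _ in range(n_iter):
            x = soft_threshold(x + Phi.T @ (y - Phi @ x) / L, lam / L)
        return x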
Max ERC Funding
1 497 411 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym LiKo
Project From Liouville to Kolmogorov: 2d quantum gravity, noise sensitivity and turbulent flows
Researcher (PI) Christophe Garban
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary This research project is organized along three seemingly unrelated directions:
(1) Mathematical Liouville gravity deals with the geometry of large random planar maps. Historically, conformal invariance was a key ingredient in the construction of Liouville gravity in the physics literature. Conformal invariance has been restored recently with attempts to understand large random combinatorial planar maps once conformally embedded in the plane. The geometry induced by these embeddings is conjecturally described by the exponential of a highly oscillating distribution, the Gaussian Free Field. This conjecture is part of a broader program aimed at rigorously understanding the celebrated KPZ relation (stated explicitly after this list). The first major goal of my project is to make significant progress towards the completion of this program. I will combine for this several tools such as Liouville Brownian motion, circle packings, QLE processes and Bouchaud trap models.
(2) Euclidean statistical physics is closely related to area (1) through the above KPZ relation. I plan to push further the analysis of critical statistical physics models successfully initiated by the works of Schramm and Smirnov. I will focus in particular on dynamics at and near critical points with a special emphasis on the so-called noise sensitivity of these systems.
(3) 3d turbulence. A more tractable ambition than solving the Navier-Stokes equations is to construct explicit stochastic vector fields which combine key features of experimentally observed velocity fields. I will make the mathematical framework precise by identifying four axioms that need to be satisfied. It has been observed recently that the exponential of a certain log-correlated field, as in (1), could be used to create such a realistic velocity field (a background sketch of this object is given below). I plan to construct and analyse this challenging object by relying on techniques from (1) and (2). This would be the first genuine stochastic model of turbulent flow in the spirit of what Kolmogorov was aiming at.
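As background for (1) and (3): the object obtained by exponentiating a log-correlated field such as the Gaussian Free Field is a Gaussian multiplicative chaos. The displays below record the standard construction and the associated KPZ relation as they appear in the literature, quoted for orientation only; the project's precise normalizations may differ.

% Liouville measure as a Gaussian multiplicative chaos: X_eps denotes the
% log-correlated field mollified at scale eps (standard construction).
\[
  \mu_\gamma(dz) \;=\; \lim_{\epsilon\to 0}
  \exp\!\Big(\gamma X_\epsilon(z) - \tfrac{\gamma^2}{2}\,\mathbb{E}\big[X_\epsilon(z)^2\big]\Big)\,dz,
  \qquad 0 \le \gamma < 2.
\]
% KPZ relation between the Euclidean scaling exponent x and the quantum exponent Delta:
\[
  x \;=\; \frac{\gamma^2}{4}\,\Delta^2 + \Big(1-\frac{\gamma^2}{4}\Big)\,\Delta .
\]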
Max ERC Funding
935 000 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym MagneticYSOs
Project Interpreting Dust Polarization Maps to Characterize the Role of the Magnetic Field in Star Formation Processes
Researcher (PI) Anaëlle Julie Maury
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE9, ERC-2015-STG
Summary "Rotation and angular momentum transport play a critical role in the formation and evolution of astrophysical objects, including the fundamental bricks of astrophysical structures: stars. Stars like our Sun form when rotating dense cores, in the interstellar medium, collapse until they eventually reach temperatures at which nuclear fusion begins; while planets, including the Earth, form in the rotationally supported disks around these same young stars. One of the major challenges of modern astrophysics is the “angular momentum problem"": observations show that a typical star-forming cloud needs to reduce its specific angular momentum by 5 to 10 orders of magnitude to form a typical star such as our Sun. It is also crucial to solve the angular momentum problem to understand the formation of protoplanetary disks, stellar binaries and the initial mass function of newly formed stars. Magnetic fields are one of the key ways of transporting angular momentum in astrophysical structures: understanding how angular momentum is transported to allow star formation requires characterizing the role of magnetic fields in shaping the dynamics of star-forming structures. The MagneticYSOs project aims at characterizing the role of magnetic field in the earliest stage of star formation, during the main accretion phase.
The simultaneous major improvements of instrumental and computational facilities provide us, for the first time, with the opportunity to confront observational information with the predictions of magnetized models. Polarization capabilities on the latest generation of instruments at large facilities are producing sensitive observations of magnetic fields with a great level of detail, while numerical simulations of star formation now include most of the physical ingredients needed for a detailed description of protostellar collapse at all the relevant scales, such as resistive MHD, radiative transfer and chemical networks. These new tools will undoubtedly lead to major discoveries in the fields of planet and star formation in the coming years. It is necessary to conduct comprehensive projects able to combine theory and observations in a detailed fashion, which in turn requires a collaboration with access to cutting-edge observational datasets and numerical models. Through an ambitious multi-faceted program of dedicated observations probing magnetic fields (polarized dust emission and Zeeman effect maps), gas kinematics (molecular line emission maps), ionization rates and dust properties in Class 0 protostars, and their comparison to synthetic observations of MHD simulations of protostellar collapse, we aim to transform our understanding of:
1) The long-standing problem of angular momentum in star formation
2) The origin of the stellar initial mass function
3) The formation of multiple stellar systems and circumstellar disks around young stellar objects (YSOs)
Not only will this project enable a major leap forward in our understanding of low-mass star formation, answering as yet unexplored questions with innovative methods, but it will also spread expertise in interpreting high-angular-resolution (sub-)mm polarization data. Although characterizing magnetic fields in astrophysical structures represents the next frontier in many fields (solar physics, evolved stars, compact objects and galactic nuclei are a few examples), only a handful of astronomers in the EU community are familiar with interferometric polarization data, mostly because of the absence of large European facilities providing such capabilities until the recent advent of ALMA. It is now crucial to strengthen the European position in this research field by training a new generation of physicists with strong expertise in tailoring, analyzing and interpreting high angular resolution polarization data.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym MALIG
Project A mathematical approach to the liquid-glass transition: kinetically constrained models, cellular automata and mixed order phase transitions
Researcher (PI) cristina Toninelli
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary This proposal focuses on the mathematics of three cross-disciplinary, very active and deeply interlaced research themes: interacting particle systems with kinetic constraints, bootstrap percolation cellular automata and mixed order phase transitions. These topics belong to the fertile area of mathematics at the intersection of probability and mathematical statistical mechanics. They are also extremely important in physics. Indeed they are intimately connected to the fundamental problem of understanding the liquid-glass transition, one of the longstanding open questions in condensed matter physics.
The funding of this project will allow the PI to lead a highly qualified team with complementary expertise. Such diversity will allow a novel, interdisciplinary and potentially groundbreaking approach. Although research on each of the above topics has lately been quite lively, there have been very few exchanges and little cross-fertilization among them. One of our main goals is to overcome the barriers among the three different research communities and to explore the interfaces of these as yet unconnected fields. We will open two novel and challenging chapters in the mathematics of interacting particle systems and cellular automata: interacting particle glassy systems and bootstrap percolation models with mixed order critical and discontinuous transitions. In order to achieve our groundbreaking goals we will have to go well beyond the present mathematical knowledge. We believe that the novel concepts and the unconventional approaches that we will develop will have a deep impact also in other areas including combinatorics, the theory of randomized algorithms and complex systems.
The scientific background and expertise of the PI, with original and groundbreaking contributions to each of the above topics and with a broad and clear-cut vision of the mathematics of the proposed research as well as of the fundamental physical questions, make the PI the ideal leader of this project.
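As background on one of the three themes: bootstrap percolation is simple to state and simulate. The sketch below is our own minimal illustration of the classical two-neighbour rule on a square grid (the function name and parameter values are ours), not code from the project.

import numpy as np

def bootstrap_percolation(occupied, threshold=2):
    # Classical bootstrap rule: an empty site becomes occupied once at least
    # `threshold` of its four nearest neighbours are occupied; occupied sites
    # stay occupied. Iterates until the configuration reaches its fixed point.
    state = occupied.astype(bool).copy()
    while True:
        padded = np.pad(state, 1, constant_values=False)
        neighbours = (padded[:-2, 1:-1].astype(int) + padded[2:, 1:-1]
                      + padded[1:-1, :-2] + padded[1:-1, 2:])
        new_state = state | (neighbours >= threshold)
        if np.array_equal(new_state, state):
            return new_state
        state = new_state

rng = np.random.default_rng(0)
initial = rng.random((100, 100)) < 0.06   # i.i.d. occupation with density 0.06
final = bootstrap_percolation(initial)
print(final.mean())   # fraction of sites that are eventually occupied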
Max ERC Funding
883 250 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym MicMactin
Project Dissecting active matter: Microscopic origins of macroscopic actomyosin activity
Researcher (PI) Martin Sylvain Peter Lenz
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE3, ERC-2015-STG
Summary "Biological motion and forces originate from mechanically active proteins operating at the nanometer scale. These individual active elements interact through the surrounding cellular medium, collectively generating structures spanning tens of micrometers whose mechanical properties are perfectly tuned to their fundamentally out-of-equilibrium biological function. While both individual proteins and the resulting cellular behaviors are well characterized, understanding the relationship between these two scales remains a major challenge in both physics and cell biology.
We will bridge this gap through multiscale models of the emergence of active material properties in the experimentally well-characterized actin cytoskeleton. We will thus investigate unexplored, strongly interacting nonequilibrium regimes. We will develop a complete framework for cytoskeletal activity by separately studying all three fundamental processes driving it out of equilibrium: actin filament assembly and disassembly, force exertion by branched actin networks, and the action of molecular motors. We will then recombine these approaches into a unified understanding of complex cell motility processes.
To tackle the cytoskeleton's disordered geometry and many-body interactions, we will design new nonequilibrium self-consistent methods in statistical mechanics and elasticity theory. Our findings will be validated through simulations and close experimental collaborations.
Our work will break new ground in both biology and physics. In the context of biology, it will establish a new framework for understanding how the cell controls its architecture and mechanics through biochemical regulation. On the physics side, it will set up new paradigms for the emergence of original out-of-equilibrium collective behaviors in an experimentally well-characterized system, addressing the foundations of existing macroscopic "active matter" approaches.
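For orientation, the macroscopic "active matter" descriptions referred to above typically supplement the passive stress of the gel with an active contribution set by motor activity. Schematically, in the standard active-gel form (quoted as textbook background, with sign conventions varying across the literature, and not as the project's own model):

% Active stress generated by motors consuming a chemical free energy Delta mu,
% with local filament orientation p and activity coefficient zeta.
\[
  \sigma^{\mathrm{act}}_{ij} \;=\; -\,\zeta\,\Delta\mu\,\Big(p_i\,p_j - \tfrac{1}{3}\,\delta_{ij}\Big).
\]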
Max ERC Funding
1 491 868 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym MicroParticleControl
Project Controlled synthesis of particulate matter in microfluidics
Researcher (PI) Simon Kuhn
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE8, ERC-2015-STG
Summary Despite the many advantages of microchemical systems and their successful applications in chemical engineering research, one major drawback greatly limiting their use is their susceptibility to channel clogging for flows containing particulate matter. Hence, the aim of the proposed research is to overcome the challenge of clogging in microfluidic devices and to design microfluidic systems that can tolerate particulate matter and synthesize solid materials according to their specifications (e.g. size, purity, morphology). To reach this goal, we apply a combined experimental and theoretical approach, in which the experimental results will lead to model development reflecting the particle formation and interaction kinetics and their coupling to the hydrodynamics. The novel concept of the proposal is to devise engineering strategies to handle the particulate matter inside the reactor depending on whether the solid material is i) an unwanted and insoluble by-product of a reaction, or ii) the target compound (e.g. nanoparticle synthesis or crystallization of organic molecules). Depending on the case, we will design different ultrasound application strategies and introduce nucleation sites to control the location of particle formation within the microchannel. This project will provide fundamental insight into the physico-chemical phenomena that result in particle formation, growth and agglomeration processes in continuous-flow microdevices, and will provide a theoretical tool for the prediction of the dynamics of particle-particle, particle-wall and particle-fluid interactions, leading to innovative microreactor designs.
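As a hint of what the kinetic side of such a model can look like, the sketch below integrates the Smoluchowski coagulation equations with a constant collision kernel. It is our own minimal illustration (kernel, size cut-off and initial condition are arbitrary choices), far simpler than the hydrodynamically coupled model the project targets.

import numpy as np
from scipy.integrate import solve_ivp

K, N = 1.0, 50   # constant coagulation kernel and number of tracked size classes

def smoluchowski(t, n):
    # n[k] is the concentration of clusters of size k+1. Aggregation creates
    # size k+1 from pairs (i+1, k-i) and destroys it by collision with anything;
    # mass leaving the truncated range of sizes is simply dropped.
    dn = np.empty_like(n)
    total = n.sum()
    for k in range(N):
        birth = 0.5 * sum(n[i] * n[k - 1 - i] for i in range(k))
        dn[k] = K * birth - K * n[k] * total
    return dn

n0 = np.zeros(N)
n0[0] = 1.0   # monomers only at t = 0
sol = solve_ivp(smoluchowski, (0.0, 5.0), n0, t_eval=[0.0, 1.0, 5.0])
print(sol.y[:3, -1])   # concentrations of monomers, dimers, trimers at t = 5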
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym ModRed
Project The geometry of modular representations of reductive algebraic groups
Researcher (PI) Simon Riche
Host Institution (HI) UNIVERSITE CLERMONT AUVERGNE
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary The main theme of this proposal is the Geometric Representation Theory of reductive algebraic groups over algebraically closed fields of positive characteristic. Our primary goal is to obtain character formulas for simple and for indecomposable tilting representations of such groups, by developing a geometric framework for their categories of representations.
Obtaining such formulas has been one of the main problems in this area since the 1980s. A program outlined by G. Lusztig in the 1990s has led to a formula for the characters of simple representations in the case where the characteristic of the base field is larger than an explicit but huge bound. A recent breakthrough due to G. Williamson has shown that this formula cannot hold for smaller characteristics, however. Nothing is known about characters of tilting modules in general (except for a conjectural formula for some characters, due to Andersen). Our main tools include a new perspective on Soergel bimodules offered by the study of parity sheaves (introduced by Juteau-Mautner-Williamson) and a diagrammatic presentation of their category (due to Elias-Williamson).
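For orientation, the character formulas at stake have the Kazhdan–Lusztig shape: a simple character is an alternating sum of standard (Weyl-module) characters weighted by values of Kazhdan–Lusztig polynomials. The display below only records this generic shape, with the precise parameters, dominance conditions and normalizations omitted; it is background, not the project's statement.

% Generic Kazhdan-Lusztig-type character formula (schematic):
\[
  \operatorname{ch} L(\lambda_w) \;=\; \sum_{y \le w} (-1)^{\ell(w)-\ell(y)}\, P_{y,w}(1)\, \operatorname{ch}\Delta(\lambda_y),
\]
where \(\le\) and \(\ell\) denote the Bruhat order and length function on the relevant (affine) Weyl group and \(P_{y,w}\) are Kazhdan–Lusztig polynomials.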
Max ERC Funding
882 844 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym MOFcat
Project Fundamental and Applied Science on Molecular Redox-Catalysts of Energy Relevance in Metal-Organic Frameworks
Researcher (PI) Sascha Ott
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Consolidator Grant (CoG), PE5, ERC-2015-CoG
Summary Organometallic redox-catalysts of energy relevance, i.e. water and hydrogen oxidation, and proton and carbon dioxide reduction catalysts, will be incorporated into metal-organic frameworks (MOFs). Immobilization and spatial organization of the molecular catalysts will stabilize their molecular integrity and ensure longevity and recyclability of the resulting MOFcats. The organized environment provided by the MOF will enable the control of conformational flexibility, diffusion, charge transport, and higher coordination sphere effects that play crucial roles in enzymes, but cannot be addressed in homogeneous solution and are thus largely unexplored. The effect that the MOF environment has on catalysis will be directly probed electrochemically in MOFcats that are immobilized or grown on electrode surfaces. In combination with spectroscopic techniques in spectroelectrochemical cells, intermediates in the catalytic cycles will be detected and characterized. Kinetic information on the individual steps in the catalytic cycles will be obtained in MOFs that contain both a molecular photosensitizer (PS) and a molecular catalyst (PS-MOFcats). The envisaged systems will allow light-induced electron transfer processes to generate reduced or oxidized catalyst states, the reactivity of which will be studied with high time resolution by transient UV/Vis and IR spectroscopy. The acquired fundamental mechanistic knowledge is far beyond the current state-of-the-art in MOF chemistry and catalysis, and will be used to prepare MOFcat-based electrodes that function at the highest possible rates and lowest overpotentials. PS-MOFcats will be grown on flat semiconductor surfaces, and explored as a novel concept for photoanode and -cathode designs for dye-sensitized solar fuel devices (DSSFDs). The design is particularly appealing as it accommodates high PS concentrations for efficient light-harvesting, while providing potent catalysts close to the solvent interface.
Max ERC Funding
1 968 750 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym MONACAT
Project Magnetism and Optics for Nanoparticle Catalysis
Researcher (PI) Bruno CHAUDRET
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE5, ERC-2015-AdG
Summary MONACAT proposes a novel approach to address the challenge of intermittent energy storage. Specifically, the purpose is to conceive and synthesize novel complex nano-objects displaying both physical and chemical properties that enable catalytic transformations with fast and optimal energy conversion. It builds on over 20 years of research on “organometallic nanoparticles”, an approach to nanoparticle (NP) synthesis in which the first goal is to control the surface of the particles as in molecular organometallic species. Two families of NPs will be studied: 1) magnetic NPs that can be heated by excitation with an alternating magnetic field and 2) plasmonic NPs that absorb visible light and transform it into heat. In all cases, deposition of additional materials as islands or thin layers will improve the NPs' catalytic activity. Iron carbide NPs have recently been shown to heat efficiently upon magnetic excitation and to catalyse CO hydrogenation into hydrocarbons. In order to transform this observation into a viable process, MONACAT will address the following challenges: determination and control of surface temperature using fluorophores or quantum dots, optimization of heating capacity (size, anisotropy of the material, crystallinity, phases: FeCo, FeNi, chemical order), optimization of catalytic properties (islands vs core-shell structures; Ru, Ni for methane, Cu/Zn for methanol), and stability and optimization of energy efficiency. A similar approach will be used for direct light conversion, using Au or Ag NPs coated with Ru as first proofs of concept. Catalytic tests will be performed on two heterogeneous reactions after deposition of the NPs onto a support: CO2 hydrogenation into methane and methanol synthesis. In addition, the potential of catalysis making use of self-heated and magnetically recoverable NPs will be studied in solution (reduction of arenes or oxygenated functions, hydrogenation and hydrogenolysis of biomass platform molecules, Fischer-Tropsch).
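As background on why an alternating field heats such nanoparticles, the power dissipated per unit volume in the linear-response regime is classically estimated by Rosensweig's formula (quoted here as standard background rather than as project data):

% Volumetric heating power of magnetic NPs in an AC field of amplitude H_0
% and frequency f; tau is the Neel or Brownian relaxation time.
\[
  P \;=\; \mu_0\,\pi\,\chi''(f)\,f\,H_0^2,
  \qquad
  \chi''(f) \;=\; \chi_0\,\frac{2\pi f\tau}{1+(2\pi f\tau)^2},
\]
so heating is maximal when the field frequency matches the inverse relaxation time, \(2\pi f\tau \approx 1\).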
Max ERC Funding
2 472 223 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym MOPSA
Project Modular Open Platform for Static Analysis
Researcher (PI) Antoine Miné
Host Institution (HI) UNIVERSITE PIERRE ET MARIE CURIE - PARIS 6
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary The Mopsa project aims at creating methods and tools to make computer software more reliable.
Programming errors are pervasive, with results ranging from user frustration to huge economic or human losses. Traditional test-based methods are insufficient to eliminate all errors. The project will develop static analyses able to detect at compile time whole classes of program defects, leveraging the theory of abstract interpretation to design analyses that are approximate (to scale up to large programs) and sound (no defect is missed). Static analysis has enjoyed recent successes: Astrée, an industrial analyzer I have coauthored, was able to prove the absence of run-time errors in Airbus software. But such results are limited to the specific, well-controlled context of critical embedded systems. I wish to bring static analysis to the next level: target larger, more complex and heterogeneous software, and make it usable by engineers to improve general-purpose software.
We focus on analyzing open-source software, which is readily available, complex, widespread, and important from an economic standpoint (it is used in many infrastructures and companies) but also from societal and educational ones (promoting the development of verified software for and by citizens). A major target we consider is the set of technologies at the core of the Internet, to which static analysis could be applied to ensure a safer Internet. The scientific challenges we must overcome include designing scalable analyses producing relevant information, supporting novel popular languages (such as Python), and analyzing properties more adapted to the continuous development of software common in open-source projects. At the core of the project is the construction of an open-source static analysis platform. It will serve not only to implement and evaluate the results of the project, but also to create momentum encouraging research in static analysis and hasten its adoption in open-source development communities.
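As a pocket-sized illustration of the sound-but-approximate principle (our own toy example, not MOPSA code): an interval domain over-approximates the set of reachable values, so every concrete run is covered even though precision is deliberately lost at control-flow merges.

from dataclasses import dataclass

@dataclass(frozen=True)
class Interval:
    lo: float   # may be -inf
    hi: float   # may be +inf

    def __add__(self, other):
        # Abstract addition: sound for every pair of concrete values.
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def join(self, other):
        # Least upper bound, used where control-flow paths merge.
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

    def contains(self, v):
        return self.lo <= v <= self.hi

# Abstractly execute:  if cond: x = x + 1  else: x = x + 3
x = Interval(0, 10)
after_if = (x + Interval(1, 1)).join(x + Interval(3, 3))
print(after_if)   # Interval(lo=1, hi=13): sound, slightly imprecise
assert all(after_if.contains(v) for v in (1, 4, 13))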
Max ERC Funding
1 773 750 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym NanoPokers
Project Deciphering cell heterogeneity in tumors using arrays of nanowires to controllably poke single cells in longitudinal studies
Researcher (PI) Christelle Nathalie Prinz
Host Institution (HI) LUNDS UNIVERSITET
Call Details Consolidator Grant (CoG), PE5, ERC-2015-CoG
Summary Cancer is responsible for 20% of all deaths in Europe. Current cancer research is based on cell ensemble measurements or on snapshot studies of individual cells. However, cancer is a systemic disease, involving many cells that interact and evolve over time in a complex manner, which cell ensemble studies and snapshot studies cannot grasp. It is therefore crucial to investigate cancer at the single-cell level and in longitudinal studies (over time). Despite the recent developments in micro- and nanotechnologies, combined with live-cell imaging, there is currently no method available that meets the crucial need for global monitoring of individual cell responses to stimuli/perturbation in real time.
This project addresses this crucial need by combining super-resolution live-cell imaging with the development of sensors, as well as injection devices, based on vertical nanowire arrays. The devices will penetrate multiple single cells in a fully controlled manner, with minimal invasiveness.
The objectives of the project are:
1) To develop nanowire-based tools in order to gain controlled and reliable access to the cell interior with minimal invasiveness.
2) To develop mRNA sensing and biomolecule injection capabilities based on nanowires.
3) To perform longitudinal single-cell studies in tumours, including monitoring gene expression in real time, under controlled cell perturbation.
By enabling global, long-term monitoring of individual tumour cells submitted to controlled stimuli, the project will open up new horizons in biology and in medical research. It will enable ground-breaking discoveries in understanding the complexity of molecular events underlying the disease. This cross-disciplinary project will lead to paradigm-shifting research, which will enable the development of optimal treatment strategies. This will be applicable not only to cancer, but also to a broad range of diseases, such as diabetes and neurodegenerative diseases.
Max ERC Funding
2 621 251 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym NanoStaph
Project Force nanoscopy of staphylococcal biofilms
Researcher (PI) Yves Dufrene
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE4, ERC-2015-AdG
Summary Staphylococcus aureus is a leading cause of hospital-acquired infections, which are often complicated by the ability of this pathogen to grow as biofilms on indwelling medical devices. Because biofilms protect the bacteria from host defenses and are resistant to many antibiotics, biofilm-related infections are difficult to fight and represent a tremendous burden on our healthcare system. Today, a true molecular understanding of the fundamental interactions driving staphylococcal adhesion and biofilm formation is still missing, owing to the lack of high-resolution probing techniques. This knowledge would greatly contribute to the development of novel anti-adhesion therapies for combating biofilm infections.
We recently established advanced atomic force microscopy (AFM) techniques for analyzing the nanoscale surface architecture and interactions of microbial cells, allowing us to elucidate key cellular functions. This multidisciplinary project aims at developing an innovative AFM-based force nanoscopy platform for biofilm research, enabling us to understand the molecular mechanisms of S. aureus adhesion in a way that was not possible before, and to optimize the use of anti-adhesion compounds capable of inhibiting biofilm formation by this pathogen.
NanoStaph will have strong scientific, societal and economic impacts. From the technical perspective, force nanoscopy will represent an unconventional methodology for the high-throughput, high-resolution characterization of adhesion forces in living cells, especially in bacterial pathogens. In microbiology, the results will radically transform our perception of the molecular bases of biofilm formation by S. aureus. In medicine, the project will provide a new screening method for the fast, label-free analysis of anti-adhesion compounds targeting S. aureus strains, including antibiotic-resistant clinical isolates that are notoriously difficult to treat, thus paving the way to the development of anti-adhesion therapies.
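As an example of the quantitative output of such measurements, single-molecule force–extension curves recorded by AFM are routinely fitted with the worm-like chain interpolation formula (standard practice in force spectroscopy, cited here as background and not as the proposal's specific method):

% Marko-Siggia worm-like chain interpolation: force F at extension x for a
% polymer of contour length L_c and persistence length l_p.
\[
  F(x) \;=\; \frac{k_B T}{l_p}\left[\frac{1}{4\,(1 - x/L_c)^2} - \frac{1}{4} + \frac{x}{L_c}\right].
\]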
Max ERC Funding
2 481 438 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym NUHGD
Project Non Uniform Hyperbolicity in Global Dynamics
Researcher (PI) Sylvain CROVISIER
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE1, ERC-2015-AdG
Summary An important part of differentiable dynamics has been developed from the uniformly hyperbolic systems. These systems were introduced by Smale in the 1960s in order to address chaotic behavior and are now deeply understood from the qualitative, symbolic and statistical viewpoints. They correspond to the structurally stable dynamics. It appeared that large classes of non-hyperbolic systems also exist. Since the 1980s, different notions of relaxed hyperbolicity have been introduced: non-uniformly hyperbolic measures, partial hyperbolicity, ... These notions made it possible to extend the previous approach to other families of systems and to handle new examples of dynamics: the fine description of the dynamics of Hénon maps, for instance.
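For reference, the uniform notion that these relaxed definitions weaken reads as follows: a compact invariant set \(\Lambda\) of a diffeomorphism \(f\) is uniformly hyperbolic when the tangent bundle splits into invariant directions that are uniformly contracted and expanded,

\[
  T_\Lambda M = E^s \oplus E^u,
  \qquad
  \|Df^n|_{E^s}\| \le C\lambda^n, \quad \|Df^{-n}|_{E^u}\| \le C\lambda^n \quad (n \ge 0),
\]
for constants \(C > 0\) and \(0 < \lambda < 1\).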
The development of local perturbative techniques has brought about a rebirth of the qualitative description of generic systems. It has also opened the door to a more global description of the spaces of differentiable dynamics. For instance, it has allowed recent progress towards the Palis conjecture, which characterizes the absence of uniform hyperbolicity by the homoclinic bifurcations — homoclinic tangencies or heterodimensional cycles. We propose in the present project to develop techniques for realizing more global perturbations, yielding a breakthrough in the subject. This would settle this conjecture for C1 diffeomorphisms and imply other classification results.
In the past years we have understood how the qualitative dynamics of generic systems decompose into invariant pieces. We are now ready to describe more precisely the dynamics inside the pieces. We propose to combine these new geometrical ideas with the ergodic theory of non-uniformly hyperbolic systems. This will significantly improve our understanding of general smooth systems (for instance, provide existence and finiteness of physical measures and of measures of maximal entropy for new classes of systems beyond uniform hyperbolicity).
Max ERC Funding
1 229 255 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym ONE
Project Unified Principles of Interaction
Researcher (PI) Michel Beaudouin-Lafon
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Advanced Grant (AdG), PE6, ERC-2015-AdG
Summary Most of today’s computer interfaces are based on principles and conceptual models created in the late seventies. They are designed for a single user interacting with a closed application on a single device with a predefined set of tools to manipulate a single type of content. But one is not enough! We need flexible and extensible environments where multiple users can truly share content and manipulate it simultaneously, where applications can be distributed across multiple devices, where content and tools can migrate from one device to the next, and where users can freely choose, combine and even create tools to make their own digital workbench.
The goal of ONE is to fundamentally re-think the basic principles and conceptual model of interactive systems to empower users by letting them appropriate their digital environment. The project will address this challenge through three interleaved strands: empirical studies to better understand interaction in both the physical and digital worlds, theoretical work to create a conceptual model of interaction and interactive systems, and prototype development to test these principles and concepts in the lab and in the field. Drawing inspiration from physics, biology and psychology, the conceptual model will combine substrates to manage digital information at various levels of abstraction and representation, instruments to manipulate substrates, and environments to organize substrates and instruments into digital workspaces.
By identifying first principles of interaction, ONE will unify a wide variety of interaction styles and create more open and flexible interactive environments.
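To make the substrate/instrument vocabulary concrete, here is a deliberately minimal toy sketch in code (all class and function names are ours, invented for illustration; the project's actual model remains to be developed). The point is the separation of concerns: content lives in substrates, while instruments are independent, reusable tools that can be applied to any substrate.

class Substrate:
    # Holds content at some level of abstraction; knows nothing about tools.
    def __init__(self, content):
        self.content = content

class Instrument:
    # A reusable tool: any operation that can be applied to a substrate.
    def __init__(self, name, operation):
        self.name, self.operation = name, operation

    def apply(self, substrate):
        substrate.content = self.operation(substrate.content)

# The same instrument works on any substrate, mirroring tools that users
# freely choose, combine and carry across applications and devices.
note = Substrate("hello world")
uppercase = Instrument("uppercase", str.upper)
uppercase.apply(note)
print(note.content)   # HELLO WORLD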
Max ERC Funding
2 456 028 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym ORDERin1D
Project Order in one dimension: Functional hybrids of chirality-sorted carbon nanotubes
Researcher (PI) Sofie Cambré
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Starting Grant (StG), PE4, ERC-2015-STG
Summary The hollow structure of carbon nanotubes (CNTs), with their wide range of diameters, forms an ideal one-dimensional host system in which to study restricted, diameter-dependent molecular transport and to achieve unique polar molecular order. For the ORDERin1D project, I will capitalize on my recent breakthroughs in the processing, filling, chiral sorting and high-resolution spectroscopic characterization of empty and filled CNTs, aiming for a diameter-dependent characterization of the filling with various molecules. This will pave the way for the rational design of ultraselective filter membranes, sensors, nanofluidic devices and nanohybrids with unprecedented control over the structural order at the molecular scale. In particular, I recently found that dipolar molecules naturally align head-to-tail into a polar array inside the CNTs, after which their molecular directional properties, such as their dipole moment and second-order nonlinear optical responses, add up coherently, which is groundbreaking for the development of nanophotonics applications.
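The coherent addition mentioned above can be stated compactly: for N identical dipoles locked head-to-tail (an idealized estimate, not data from the project), the static dipole moments add linearly while the second-harmonic intensity grows quadratically with N,

% Coherent versus incoherent addition of N molecular hyperpolarizabilities beta:
\[
  \mu_{\mathrm{tot}} = N\mu,
  \qquad
  I_{2\omega} \;\propto\; \Big|\sum_{k=1}^{N} \beta\Big|^2 = N^2\beta^2,
\]
whereas randomly oriented dipoles add incoherently, with \(I_{2\omega} \propto N\beta^2\) and a vanishing net dipole moment.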
Max ERC Funding
1 499 425 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym PaDyFlow
Project Particle dynamics in the flow of complex suspensions
Researcher (PI) Anke Lindner
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2015-CoG
Summary Particle-laden flows are ubiquitous in nature and industrial applications. Particle trajectories determine transport in porous media or biomedical conduits, and effective suspension properties dictate flow behavior in food processing or biofluid flow. For better control, it is necessary to know how to predict these processes from the particle and flow properties involved. However, current theory is not able to capture the complexity of the applications, and experiments have been carried out on systems too diverse for a unifying picture to emerge. A systematic experimental approach is now needed to improve the present understanding.
In this experimental project, we will use novel microfabrication and characterization methods to obtain a set of complex anisotropic microscopic particles (complemented by selected bioparticles) with tunable properties covering size, shape, deformability and activity. The transport of these particles, isolated or at small concentrations, will be studied in chosen microfluidic model flows of simple fluids or polymer solutions. The many degrees of freedom of this problem will be addressed by systematically combining different relevant particle and flow properties. The macroscopic properties of dilute suspensions are particularly interesting from a fundamental point of view, as they are a direct consequence of individual particle-flow interactions, and will be measured using original microfluidic rheometers of outstanding resolution.
This project will lead to a comprehensive understanding of fluid-structure interactions at small Reynolds numbers. Our findings will constitute the basis for novel numerical approaches based on experimentally validated hypotheses. Using this knowledge, local flow sensors, targeted delivery systems and novel microfluidic filtration or separation devices can be designed. Combining particles of chosen properties with selected suspending fluids will allow the fabrication of suspensions with unprecedented tailored properties.
Max ERC Funding
1 971 750 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym PhotoMedMet
Project Towards Novel Inert (Photo-)toxic Ru(II) Polypyridyl Complexes
Researcher (PI) Gilles Albert Gasser
Host Institution (HI) ECOLE NATIONALE SUPERIEURE DE CHIMIE DE PARIS
Call Details Consolidator Grant (CoG), PE5, ERC-2015-CoG
Summary In this grant application, I propose to investigate in depth the potential of novel inert Ru(II) polypyridyl complexes as anticancer drug candidates. Such compounds were investigated by Dwyer and Shulman in the 1950s and 1960s, both in vitro and in vivo, with relatively promising results. This impressive seminal work was unfortunately not followed up. The lack of additional studies was recently attributed, at least in part, to the observed neurotoxicity of the complexes. Nonetheless, in recent years there has been a revival of important in vitro studies of such inert Ru(II) polypyridyl complexes for anticancer purposes. However, without further in vivo studies, it is reasonable to think that neurotoxicity similar to that observed by Dwyer and Shulman could be encountered. In order to tackle these (potential) drawbacks, I propose to use a prodrug approach.
Furthermore, I also intend to investigate the potential of inert Ru(II) polypyridyl complexes as photosensitizers (PSs) in photodynamic therapy (PDT). In the search for an alternative to chemotherapy, PDT has proven to be a promising, effective and non-invasive treatment modality. In order to increase the potential of the PSs presented in this project even further, I propose to also excite them via simultaneous two-photon absorption (TPA) in so-called two-photon excitation PDT (2PE-PDT). Importantly, the new Ru(II)-based PSs will be coupled to cancer-cell-specific peptides or antibodies. This double selectivity (targeting vector and photo-activation) should limit the side effects frequently encountered with (metal-based) anticancer drugs. Another important aim of this second part of the project is to use the Ru(II)-based PSs to kill bacteria. PDT has recently been shown to be a promising alternative for fighting bacteria. I therefore intend to couple Ru(II)-based (2PE-)PSs to bacteria-specific peptides to confer bacterial specificity.
Max ERC Funding
662 015 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym Photonis
Project Isotope Fractionation of Light Elements Upon Ionization: Cosmochemical and Geochemical Implications
Researcher (PI) Bernard MARTY
Host Institution (HI) UNIVERSITE DE LORRAINE
Call Details Advanced Grant (AdG), PE10, ERC-2015-AdG
Summary Light elements such as hydrogen and nitrogen present large isotope variations among solar system objects and reservoirs (including planetary atmospheres) that remain unexplained at present. Theoretical studies are model-dependent and have not reached a consensus. Laboratory experiments are required to elucidate the underlying physical mechanisms. The aim of the project is to investigate the origins of, and processes responsible for, isotope variations of the light elements and noble gases in the Solar System through an experimental approach involving ionization of gaseous species. We will also investigate mechanisms and processes of isotope fractionation of atmophile elements in planetary atmospheres irradiated by solar UV photons, with particular reference to Mars and the early Earth. Three pathways will be considered: (i) plasma ionisation of gas mixtures (H2-CO-N2-noble gases) in a custom-built reactor; (ii) photo-ionisation and photo-dissociation of the relevant gas species and mixtures using synchrotron light; and (iii) UV irradiation of ices containing the species of interest. The results of this study will shed light on early Solar System evolution and on processes of planetary formation.
Max ERC Funding
2 810 229 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym PrintPack
Project Arranging the Particles: Step Changing Chemical Measurement Technology
Researcher (PI) Gert DESMET
Host Institution (HI) VRIJE UNIVERSITEIT BRUSSEL
Call Details Advanced Grant (AdG), PE8, ERC-2015-AdG
Summary The progress in liquid chromatography (LC), which has essentially followed a Moore's-law pace over the last decade, will soon come to a halt. LC is the current state-of-the-art chemical separation method for measuring the composition of complex mixtures. Driven by the ever-growing complexity of samples in, e.g., environmental and biomedical research, LC is constantly pushed to higher efficiencies. Using highly optimized, monodisperse spherical particles randomly packed in high-pressure columns, progress in LC has until now been realized by reducing the particle size and concomitantly increasing the pressure. With pressures already as high as 1500 bar, groundbreaking progress is still badly needed, e.g., to fully unravel the complex reaction networks in human cells.
For this purpose, it is proposed to leave the randomly-packed-bed paradigm and move to structures wherein the 1 to 5 micrometer particles currently used in LC are arranged in perfectly ordered, open geometries. This is now possible, as the latest advances in nano-manufacturing and positioning allow an inventive high-throughput particle assembly and deposition strategy to be proposed and developed. The PI's experience in developing new areas of chromatography will be used to rationally optimize the many possible geometries accessible through this disruptive new technology, and to identify those structures that cope best with any remaining degree of disorder. Using the PI's experimental know-how on microfluidic chromatography systems, these structures will be used to pursue the disruptive gain (on the order of a factor of 100 in separation speed) that is expected based on general chromatography theory.
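For context, the expected gain can be read off the classical van Deemter plate-height relation of general chromatography theory (a textbook formula, quoted here as an illustration rather than taken from the proposal):

    H = A + B/u + C*u

where H is the theoretical plate height (smaller is better), u the mobile-phase velocity, B the longitudinal-diffusion term and C the mass-transfer resistance term. The A term, eddy dispersion, arises from velocity differences between the random flow paths in a packed bed; perfectly ordered particle arrangements largely eliminate it, which is one way to motivate the large speed gains pursued here.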
Testing this groundbreaking new generation of LC columns together with world-leading bio-analytical scientists will illustrate their potential for making new discoveries in biology and the life sciences. The new nano-assembly strategies might also be extended to other applications, such as photonic crystals.
Max ERC Funding
2 488 813 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym PULSAR
Project Pushing ultrafast laser material processing into a new regime of plasma-controlled ablation
Researcher (PI) Francois Courvoisier
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2015-CoG
Summary Ultra-intense femtosecond laser pulses promise to become a fast, universal, predictable and green tool for material processing at the micro- and nanometric scale. The recent tremendous increase in commercially available femtosecond laser energy at high repetition rates opens a wealth of novel perspectives for mass production. But even at high energy, laser processing remains limited to high-speed, point-by-point scanning removal of ultra-thin nanometric layers from the material surface. This is because the uncontrolled laser-generated free-electron plasma shields against the light and prevents extreme internal temperatures from being reached at a precise nanometric scale.
PULSAR aims at breaking this barrier and developing a radically different laser material modification regime based on free-electron plasma control. PULSAR's unconventional concept is to control plasma generation, confinement, excitation and stability. An ambitious experimental and numerical research program will push the frontiers of laser processing to unprecedented precision, speed and predictability. PULSAR's key concept is highly generic, and the results will initiate new research across laser and plasma material processing, plasma physics and ultrafast optics.
Max ERC Funding
1 996 581 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym QUANTMATT
Project Dynamics and transport of quantum matter --- exploring the interplay of topology, interactions and localization
Researcher (PI) Jens Hjörleifur Bárðarson
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE3, ERC-2015-STG
Summary Quantum matter is condensed matter whose properties are dominated by the quantum nature of its constituents. The two most fundamental properties of quantum mechanics are interference and entanglement. How do these properties, and their derivatives, show up in an experiment? And how does one control them? These are the fundamental questions addressed in this proposal.
The study is divided into three main parts: many-body localization, topological insulator nanowires, and topological semimetals. Many-body localization is concerned with the interplay of interference and entanglement and is central to questions about quantum thermalization. I aim to understand experimental signatures of many-body localization, as well as to devise simulation schemes that allow us to conduct numerical experiments on many-body localization for larger system sizes than has so far been possible. The interplay of interference, topology and geometry is the central theme of the work on topological insulator nanowires. I have in the past theoretically demonstrated the signatures of fundamental quantum phenomena in these systems, including the perfectly transmitted mode and Majorana fermions. The major goal of this part of the project is to collaborate closely with experimental groups seeking to verify my past theories, by providing new and more detailed predictions for these systems. This requires a deeper understanding of experimental details and the development of theoretical devices and simulation techniques based on them. The final part, on topological semimetals, is particularly timely in view of recent experimental realizations of Dirac semimetals and the impending realization of Weyl semimetals, both of which can be roughly thought of as 3D analogs of graphene. I seek to understand their unique transport signatures and the interplay of disorder with 3D Dirac fermions. The three parts feed into and off each other through unified concepts and a common methodology.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym QUASIFT
Project Quantum Algebraic Structures In Field Theories
Researcher (PI) Vasily PESTUN
Host Institution (HI) INSTITUT DES HAUTES ETUDES SCIENTIFIQUES
Call Details Starting Grant (StG), PE2, ERC-2015-STG
Summary Quantum Field Theory is a universal framework for addressing quantum physical systems with infinitely many interacting degrees of freedom, applicable both at the level of fundamental interactions, such as the subnuclear physics of quarks and gluons, and at the phenomenological level, such as the physics of quantum fluids and superconductivity.
Traditionally, weakly interacting quantum field theory is formulated as a perturbative deformation of the linear theory of freely propagating quantum waves or particles, with interactions described by Feynman diagrams. For strongly non-linear quantum field theories, the method of Feynman diagrams is not adequate.
The main goal of this proposal is to develop novel tools and techniques to address strongly non-linear quantum field theories.
To achieve this goal, we will search for hidden algebraic structures in quantum field theories that lead to efficient algorithms for computing physical observables of interest. In particular, we will identify non-linear quantum field theories with exactly solvable sectors of physical observables.
In this project we will focus on three objectives:
- build a general theory of localization in supersymmetric Yang-Mills theory for arbitrary geometrical backgrounds
- find all realizations of symplectic and supersymplectic completely integrable systems in gauge theories
- construct finite supersymmetric Yang-Mills theory in terms of the algebra of locally supersymmetric loop observables for maximally supersymmetric gauge theory
The realization of the above objectives will uncover hidden quantum algebraic structures and consequently will bring ground-breaking results in our knowledge of quantum field theories and the fundamental interactions.
Max ERC Funding
1 498 750 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym REALISM
Project Reproducing EArthquakes in the Laboratory: Imaging, Speed and Mineralogy
Researcher (PI) Alexandre Jean-Marie Schubnel
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE10, ERC-2015-CoG
Summary We propose a simple idea: to reproduce earthquakes in the laboratory. Because earthquakes are spectacular examples of uncontrollable catastrophes, the opportunity to study them under controlled conditions in the laboratory is unique and is, in fact, the only way to understand the details of the earthquake source physics.
The aim of the project is interdisciplinary, at the frontiers between Rock Fracture Mechanics, Seismology, and Mineralogy. Its ultimate goal is to improve, on the basis of integrated experimental data, our understanding of earthquake source physics. We have already shown that both deep and shallow laboratory earthquakes are not mere 'analogs' of earthquakes, but real events, though very small ones [Passelègue et al. 2013, Schubnel et al. 2013]. During laboratory earthquakes, by measuring all of the physical quantities related to the rupturing process, we will unravel what controls the rupture speed, rupture arrest and the earthquake rupture energy budget, as well as the common role played by mineralogy in both shallow and deep earthquakes. We will also perform experiments on rock samples drilled from actual active fault zones. Our work will provide insights for earthquake hazard mitigation, constrain ubiquitously observed seismological statistical laws (Omori, Gutenberg-Richter) and produce unprecedented data sets on rock fracture dynamics at in-situ conditions with which to test seismic slip inversion and dynamic rupture modelling techniques.
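As a point of reference for the Gutenberg-Richter law mentioned above: it states that the number N of events with magnitude at least M follows log10 N = a - b*M, and the b-value can be estimated from an earthquake catalogue with Aki's classical maximum-likelihood formula. A minimal sketch in Python (function and variable names are illustrative, not from the project):

import numpy as np

def b_value(magnitudes, m_c):
    # Aki (1965) maximum-likelihood estimate of the Gutenberg-Richter b-value,
    # using only events at or above the completeness magnitude m_c.
    m = np.asarray(magnitudes)
    m = m[m >= m_c]
    return np.log10(np.e) / (m.mean() - m_c)

# Synthetic catalogue with a true b-value of 1.0 (under the G-R law, magnitudes
# above m_c are exponentially distributed with mean log10(e)/b):
rng = np.random.default_rng(0)
mags = 1.5 + rng.exponential(scale=np.log10(np.e), size=10_000)
print(b_value(mags, m_c=1.5))  # close to 1.0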
The new infrastructure we plan to install will reproduce the temperatures and pressures of the depths at which earthquakes occur in the crust and upper mantle of the Earth, with a spatio-temporal imaging resolution never achieved to date. It will be a valuable research asset for the European community, as it will eventually open the door to a better understanding of all the processes happening under stress within the first hundreds of kilometres of the Earth.
Max ERC Funding
2 748 188 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym SEAQUEL
Project Structured Ensembles of Atoms for Quantum Engineering of Light
Researcher (PI) Alexei Ourjoumtsev
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE2, ERC-2015-STG
Summary This project aims at building a new versatile platform for quantum engineering of light, with the unique ability to create deterministic coherent photon-photon interactions tunable in range, strength and dimensionality. It will explore a new avenue towards this goal, combining cutting-edge advances of atomic physics with ideas inspired by nanophotonics: a cold micro-structured gas of interacting atoms will act as a Bragg mirror saturable by a single photon, strongly coupling a controlled number of spatial modes in an optical resonator. This flexible, efficient, dynamically-controlled system will be used to test the limits of fundamental no-go theorems in quantum logic, measure physical quantities inaccessible to standard detectors, and deterministically engineer massively entangled light beams for Heisenberg-limited sensing. Ultimately, it will give access to a yet unexplored regime where intracavity photons form a strongly correlated quantum fluid, with spatial and temporal dynamics ideally suited to perform real-time, single-particle-resolved simulations of non-trivial topological effects appearing in condensed-matter systems.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym SINCAT
Project Single Nanoparticle Catalysis
Researcher (PI) Christoph Langhammer
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Starting Grant (StG), PE4, ERC-2015-STG
Summary Imagine a sustainable society where clean energy is produced from sunlight, and water is converted into hydrogen to fuel a fuel cell, which produces electric energy to power the electric motor in a car. At the same time, CO2 emissions are captured and converted to hydrocarbons that are again used as fuel or as resource for fine chemical synthesis. At the heart of this vision is heterogeneous catalysis. Hence, for it to become reality, tailored highly efficient catalyst materials are of paramount importance. The goal of this research program is therefore to establish a new experimental paradigm, which allows the detailed scrutiny of individual catalyst nanoparticles and their reaction products under application conditions.
The catalytic performance of nanoparticles is directly controlled by their size, shape and chemical composition. Current studies are, however, conducted on ensembles of nanoparticles. Therefore, such studies are plagued by averaging effects, which deny access to the key details related to how size, shape and composition control catalyst performance. To eliminate this problem, we will nanofabricate a unique nanofluidic reactor device that will enable us to scrutinize catalytic processes and products at the individual catalyst nanoparticle level. In a second step, we will integrate plasmonic optical probes with the nanoreactor to be able to simultaneously monitor the dynamics of the catalyst particle state during reaction.
Finally, we will apply the nanoreactor to investigate the role of the catalyst oxidation state in Fischer-Tropsch catalysis. In parallel, we will explore novel plasmon-induced hot electron-mediated reaction pathways for catalytic CO2 reduction, as part of a carbon-neutral energy cycle. We anticipate unprecedented insight into the role of catalyst particle state, size and shape in these processes. This will facilitate the development of more efficient catalyst materials in the quest for an energy-efficient and sustainable future.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym SLAB
Project Signal processing and Learning Applied to Brain data
Researcher (PI) Alexandre Marc Gramfort
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Understanding how the brain works in healthy and pathological conditions is considered one of the major challenges of the 21st century. After the first electroencephalography (EEG) measurements in 1929, the 1990s saw the birth of modern functional brain imaging, with the first functional MRI and full-head magnetoencephalography (MEG) systems. In the last twenty years, imaging has revolutionized clinical and cognitive neuroscience.
After pioneering works in physics and engineering, the field of neuroscience now faces two major challenges.
First, the size of the datasets keeps growing. Second, the answers to neuroscience questions are limited by the complexity of the observed signals: non-stationarity, high noise levels, heterogeneity of sensors, and a lack of accurate models.
SLAB will provide the next generation of models and algorithms for mining electrophysiology signals, which offer unique ways to image the brain at a millisecond time scale.
SLAB will develop dedicated machine learning and signal processing methods and favor the emergence of new challenges for these fields. SLAB focuses on five objectives: 1) source localization with M/EEG for brain imaging at high temporal resolution; 2) representation learning to boost statistical power and reduce acquisition costs; 3) fusion of heterogeneous sensors; 4) modeling of non-stationary spectral interactions to identify functional coupling between neural ensembles; and 5) development of fast algorithms that are easy for non-experts to use.
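To make objective 1 concrete: M/EEG source localization is a linear inverse problem y = Gx + noise, where the known forward (gain) matrix G maps source amplitudes x to sensor measurements y, and having many more sources than sensors makes the problem ill-posed. A minimal minimum-norm (L2-regularized) solver in Python, offered only as a sketch; the names and the simple regularization choice are our assumptions, not the project's methods:

import numpy as np

def minimum_norm_estimate(G, y, lam=1e-2):
    # Closed-form solution of min_x ||y - G @ x||^2 + lam * ||x||^2,
    # i.e. x = G.T @ (G @ G.T + lam * I)^{-1} @ y.
    n_sensors = G.shape[0]
    K = G @ G.T + lam * np.eye(n_sensors)
    return G.T @ np.linalg.solve(K, y)

# Toy example: 32 sensors, 500 candidate sources, one truly active source.
rng = np.random.default_rng(0)
G = rng.standard_normal((32, 500))
x_true = np.zeros(500)
x_true[42] = 1.0
y = G @ x_true + 0.01 * rng.standard_normal(32)
x_hat = minimum_norm_estimate(G, y)
print(int(np.argmax(np.abs(x_hat))))  # typically recovers index 42 in this toy setup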
SLAB aims to strengthen the mathematical and computational foundations of brain data analysis. The methods developed will have applications across fields (computational biology, astronomy, econometrics). Yet, the primary impact of SLAB will be on neuroscience. The tools and high-quality open software produced in SLAB will facilitate the analysis of electrophysiology data, offering new perspectives for understanding how the brain works at the mesoscale and for clinical applications (epilepsy, autism, tremor, sleep disorders).
Max ERC Funding
1 492 253 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym SmartCast
Project Smart casting of concrete structures by active control of rheology
Researcher (PI) Geert De schutter
Host Institution (HI) UNIVERSITEIT GENT
Call Details Advanced Grant (AdG), PE8, ERC-2015-AdG
Summary Concrete production processes do not take full advantage of the rheological potential of fresh cementitious materials, and are still largely labour-driven and sensitive to the human factor. SmartCast proposes a new concrete casting concept to transform the concrete industry into a highly automated technological industry. Currently, the rheological properties of concrete are defined by the mix design and mixing procedure, without any further active adjustment during casting. The goal of this proposal is the active control of concrete rheology during casting, and the active triggering of early stiffening as soon as the concrete is put in place. The ground-breaking idea for achieving this goal is to develop concrete with actively controllable rheology by adding admixtures that respond to externally applied electromagnetic frequencies. Interdisciplinary insights are important for achieving these goals, including inputs from concrete technology, polymer science, electrochemistry, rheology and computational fluid dynamics.
We will develop four new experimental test set-ups that allow active rheology control to be studied during different phases of the casting process: 1) concrete pumping (control of the slip layer); 2) flow in the formwork (bulk control of rheology); 3) flow through formwork joints (control of formwork tightness); and 4) once the concrete is in its final position (triggering of stiffening). Well-designed polymers with the desired response to the applied activation will be added to the concrete during mixing. The experiments will be analysed by advanced computational flow modelling based on fundamental rheological laws. Special attention will be paid to the compatibility of all responsive polymers selected for the different control phases. SmartCast will mean a paradigm shift for formwork-based concrete casting. The active rheology control developed here will provide a fundamental basis for the development of future-proof 3D printing techniques in the concrete industry.
Max ERC Funding
2 498 750 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym SOLCRIMET
Project Solvometallurgy for critical metals
Researcher (PI) Koen Binnemans
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE8, ERC-2015-AdG
Summary The recent “rare-earth crisis” has brought about the widespread realisation that the long-term availability and cost stability of many materials – not just the rare earths – can no longer be guaranteed. Increasing the levels of critical metal recycling from pre-consumer manufacturing waste and from complex, multicomponent end-of-life consumer products is arguably the most important and realistic mitigation strategy. However, extracting a critical metal from complex waste is a very different challenge from producing a pure metal from a primary ore deposit. SOLCRIMET therefore develops a ground-breaking, novel approach called “solvometallurgy”, a new branch of metallurgy alongside conventional hydro- and pyrometallurgy. SOLCRIMET's aim is to successfully apply this approach to the extraction of specific critical metals: the rare earths, tantalum, niobium, cobalt, indium, gallium, germanium and antimony. As these critical metals are essential components of clean-tech and high-tech applications, they are key enablers of the required transition to a low-carbon, circular economy. The approach involves the discovery of immiscible non-aqueous solvent pairs that allow the extraction of metal complexes at moderate temperatures, leading to high-purity recycled metals. The idea is certainly high-risk, but the preliminary results already obtained are highly encouraging. The main outcomes of the project will be lab-scale demonstrators that show the enhanced efficiency, utility and applicability of the new solvometallurgical process with respect to conventional hydro- and pyrometallurgy. SOLCRIMET's impact on chemistry, chemical technology, metallurgy and materials engineering science will be game-changing. The possibility of recycling critical metals with energy-efficient, low-cost processes could significantly raise the global recycling rates of these metals.
Max ERC Funding
2 496 250 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym STAQAMOF
Project Statistical modelling across price and time scales: a quantitative approach to modern financial regulation
Researcher (PI) Mathieu Felix Rosenbaum
Host Institution (HI) ECOLE POLYTECHNIQUE
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary This project aims at providing a new quantitative approach to financial regulation, notably in the context of high-frequency trading. The key idea of our method is to build relevant statistical models across price and time scales, connecting the microstructure of financial markets to the long-term behavior of prices. Doing so, we will be able to understand and quantify the macroscopic consequences of regulatory measures that modify the microscopic design of the market. Succeeding in this modelling task will require addressing several intricate statistical problems. In particular, new results will be needed in the fields of limit theory for semi-martingales, multifractal processes, rough stochastic differential equations, Hawkes processes and high-dimensional statistics. Hence, through this project, we hope not only to provide groundbreaking tools for the worldwide regulation of financial markets but also to answer important and challenging mathematical problems. In terms of analyzing concrete regulatory measures, particular attention will be devoted to the choice of a proper tick value, that is, the minimal price increment allowed on a financial market. Indeed, the tick value is the tool favored by most policy makers for regulating high-frequency trading.
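For readers unfamiliar with the Hawkes processes listed above: they are self-exciting point processes whose intensity jumps after every event, a standard model for order arrivals in market microstructure. A minimal simulation via Ogata's thinning algorithm, sketched in Python with an exponential kernel (a textbook construction, not code from the project):

import numpy as np

def simulate_hawkes(mu, alpha, beta, t_max, seed=0):
    # Hawkes process with intensity
    #   lambda(t) = mu + sum over past events t_i of alpha * exp(-beta * (t - t_i)),
    # simulated by Ogata's thinning; stability requires alpha < beta.
    rng = np.random.default_rng(seed)
    events, t, lam_bar = [], 0.0, mu
    while t < t_max:
        t += rng.exponential(1.0 / lam_bar)      # candidate point under the bound
        lam_t = mu + sum(alpha * np.exp(-beta * (t - s)) for s in events)
        if rng.uniform() <= lam_t / lam_bar:     # accept with prob lambda(t)/bound
            events.append(t)
            lam_bar = lam_t + alpha              # intensity jumps by alpha at an event
        else:
            lam_bar = lam_t                      # decaying intensity: bound stays valid
    return [s for s in events if s <= t_max]

# Mean count over [0, 100] is mu * t_max / (1 - alpha/beta) = 200 here:
print(len(simulate_hawkes(mu=1.0, alpha=0.5, beta=1.0, t_max=100.0)))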
Max ERC Funding
1 165 625 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym Synth
Project Synthesising Inductive Data Models
Researcher (PI) Luc DE RAEDT
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE6, ERC-2015-AdG
Summary Inspired by recent successes towards automating highly complex jobs like programming and scientific experimentation, the ultimate goal of this project is to automate the task of the data scientist when developing intelligent systems, which is to extract knowledge from data in the form of models. More specifically, this project will develop the foundations of a theory and methodology for automatically synthesising inductive data models.
An inductive data model (IDM) consists of 1) a data model (DM) that specifies an adequate data structure for the dataset (just like a database), and 2) a set of inductive models (IMs), that is, a set of patterns and models that have been discovered in the data. While the DM can be used to retrieve information about the dataset and to answer questions about specific data points, the IMs can be used to make predictions, propose values for missing data, find inconsistencies and redundancies, etc. The task addressed in this project is to automatically synthesise such IDMs from past data and to use them to support the user when making decisions.
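As a minimal illustration of this two-part structure, sketched in Python; all class and field names are our own illustrative assumptions, not the SYNTH design:

from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class DataModel:
    # The DM: an adequate data structure for the dataset (here, named tables).
    tables: dict

    def query(self, table, predicate):
        # Retrieve rows, as a database would.
        return [row for row in self.tables[table] if predicate(row)]

@dataclass
class InductiveModel:
    # An IM: a pattern/model discovered in the data, predicting one column.
    target: str
    predict: Callable[[dict], Any]

@dataclass
class InductiveDataModel:
    dm: DataModel
    ims: list = field(default_factory=list)

    def impute(self, row):
        # Use the IMs to propose values for missing (None) fields.
        out = dict(row)
        for im in self.ims:
            if out.get(im.target) is None:
                out[im.target] = im.predict(out)
        return out

# Illustrative use: an IM that imputes a missing 'total' from 'price' and 'qty'.
dm = DataModel(tables={"orders": [{"price": 2.0, "qty": 3, "total": None}]})
im = InductiveModel(target="total", predict=lambda r: r["price"] * r["qty"])
idm = InductiveDataModel(dm=dm, ims=[im])
print(idm.impute(dm.tables["orders"][0]))  # {'price': 2.0, 'qty': 3, 'total': 6.0}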
It will be assumed that the dataset consists of a set of tables, that the end-user interacts with the IDM via a visual interface, and that the data scientist interacts with it via a unifying IDM language offering a number of core IMs and learning algorithms.
The key challenges to be tackled in SYNTH are: 1) the synthesis system must “learn the learning task”, that is, it should identify the right learning tasks and learn appropriate IMs for each of these; 2) the system may need to restructure the data set before IM synthesis can start; and 3) a unifying IDM language for a set of core patterns and models must be developed.
The approach will be implemented in open source software and evaluated on two challenging application areas: rostering and sports analytics.
Max ERC Funding
2 458 656 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym TissueMaps
Project Integrating spatial and genetic information via automated image analysis and interactive visualization of tissue data
Researcher (PI) Ewa Asa Carolina Wahlby
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary Digital imaging of tissue samples and genetic analysis by next generation sequencing are two rapidly emerging fields in pathology. The exponential growth in digital imaging in pathology is catalyzed by more advanced imaging hardware, comparable to the complete shift from analog to digital images that took place in radiology a couple of decades ago: Entire glass slides can be digitized at near the optical resolution limits in only a few minutes’ time, and fluorescence as well as bright field stains can be imaged in parallel.
Genetic analysis, and particularly transcriptomics, is rapidly evolving thanks to the impressive development of next generation sequencing technologies, enabling genome-wide single-cell analysis of DNA and RNA in thousands of cells at constantly decreasing costs. However, most of today’s available technologies result in a genetic analysis that is decoupled from the morphological and spatial information of the original tissue sample, while many important questions in tumor- and developmental biology require single cell spatial resolution to understand tissue heterogeneity.
The goal of the proposed project is to develop computational methods that bridge these two emerging fields. We want to combine spatially resolved high-throughput genomics analysis of tissue sections with digital image analysis of tissue morphology. Together with collaborators from the biomedical field, we propose two approaches for spatially resolved genomics: one based on sequencing mRNA transcripts directly in tissue samples, and one based on spatially resolved cellular barcoding followed by single cell sequencing. Both approaches require development of advanced digital image processing methods. Thus, we will couple genetic analysis with digital pathology. Going beyond visual assessment of this rich digital data will be a fundamental component for the future development of histopathology, both as a diagnostic tool and as a research field.
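As a hedged sketch of one such coupling step, assuming a labeled segmentation mask from the image-analysis side and decoded transcript coordinates from the sequencing side, per-cell gene counts could be assembled roughly as follows (the function and column names are illustrative only):

```python
import numpy as np
import pandas as pd

def spots_to_cell_counts(label_mask: np.ndarray, spots: pd.DataFrame) -> pd.DataFrame:
    """label_mask[y, x] is 0 for background and a positive cell id otherwise;
    `spots` needs columns x, y (pixel coordinates) and gene (decoded barcode)."""
    cell_of_spot = label_mask[spots["y"].round().astype(int),
                              spots["x"].round().astype(int)]
    in_cells = spots.assign(cell=cell_of_spot).query("cell > 0")
    # Rows: cells, columns: genes, values: transcripts counted inside each cell.
    return in_cells.pivot_table(index="cell", columns="gene",
                                aggfunc="size", fill_value=0)
```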
Max ERC Funding
1 738 690 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym TRANSEP
Project Flow physics and interaction of laminar-turbulent transition and flow separation studied by direct numerical simulations
Researcher (PI) Dan Henningson
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE8, ERC-2015-AdG
Summary The vision spelled out in this proposal is to overcome the failure of Computational Fluid Dynamics to tackle one of the central unsolved fluid physics problems, namely predicting the sensitive flow physics associated with laminar-turbulent transition and flow separation. A recent, highly influential report by NASA (Slotnick et al., 2014) clearly states that the major shortcoming of CFD is its “… inability to accurately and reliably predict turbulent flows with significant regions of separation”, most often associated with laminar-turbulent transition.
The research proposed here will address this shortcoming and develop and utilize computational methods that are able to predict, understand and control the sensitive interplay between laminar-turbulent transition and flow separation in boundary layers on wings and other aerodynamic bodies.
We will be able to understand enigmas such as the recent results from the experiments of Saric et al. at Texas A&M University, where the laminar area of a wing grows after a smooth surface has been painted (increased roughness), or the drastic changes of laminar-turbulent transition and separation locations on unsteady wings, or the notoriously difficult interaction of multiple separation and transition regions on high-lift wing configurations. For such flows there has been little understanding of the flow physics and few computational prediction capabilities. Here we will perform simulations that give completely new possibilities to visualize, understand and control the flow around such wings and aerodynamic bodies, including the possibility to compute and harness the flow sensitivities.
We will tackle these outstanding flow and turbulence problems using the new possibilities enabled by multi-petascale computing.
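To make the notion of flow sensitivities concrete, a deliberately simplified toy (far from the 3D DNS the project targets) measures the response of a 1D viscous Burgers flow to a small upstream disturbance by finite differences; all numerical choices below are illustrative:

```python
import numpy as np

def final_energy(eps, n=256, nu=0.05, dt=1e-3, steps=2000):
    """Mean kinetic energy at t = steps*dt of u_t + u u_x = nu u_xx on [0, 2*pi),
    starting from sin(x) plus an upstream Gaussian bump of amplitude eps."""
    x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    dx = x[1] - x[0]
    u = np.sin(x) + eps * np.exp(-((x - 1.0) ** 2) / 0.05)
    for _ in range(steps):
        ux = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)        # central differences
        uxx = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2  # periodic Laplacian
        u = u + dt * (nu * uxx - u * ux)                          # explicit Euler step
    return 0.5 * np.mean(u ** 2)

eps = 1e-4  # finite-difference sensitivity of the observable to the disturbance
sens = (final_energy(eps) - final_energy(-eps)) / (2.0 * eps)
print(f"d(mean energy)/d(bump amplitude) = {sens:.4e}")
```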
Max ERC Funding
2 097 520 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym ULT-MAS-DNP
Project Dynamic Nuclear Polarization at ultra-fast sample spinning and ultra-low temperature
Researcher (PI) Gaël De Paëpe
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Consolidator Grant (CoG), PE4, ERC-2015-CoG
Summary The goal of the project is to develop a new hyperpolarization approach called Magic Angle Spinning Dynamic Nuclear Polarization (MAS-DNP) to reach levels of sensitivity and resolution that have never been achieved, in order to tackle highly relevant chemical and biological questions that remain unanswered so far. Firstly, this will provide major advances in NMR crystallography (solving 3D structures by NMR) by showing that distance measurements between nuclei (13C, 15N, etc.) as well as 17O quadrupolar parameters can be extracted from NMR measurements without requiring isotopic labeling. This will be applied to systems that cannot be easily isotopically enriched and for which X-ray analysis is often not suitable. Secondly, we propose an innovative strategy to hyperpolarize nuclear spins using MAS-DNP: rather than polarizing the entire system uniformly, we will selectively “light up” regions where we wish to gather important structural information. This will be developed to study protein-ligand interactions (with unprecedented resolution) to answer specific structural questions and potentially impact the field of drug engineering. Finally, we will show that the unique experimental setup developed in this project will open up NMR to the routine study of “exotic”, yet ubiquitous and highly informative, nuclei such as 43Ca and 67Zn. Specifically, we will show that MAS-DNP can become a technique of choice for the study of diamagnetic metal binding sites, complementing EPR for the study of metalloproteins. These goals will be achieved thanks to the development of original methods and advanced instrumentation, allowing sustainable access to low temperatures (down to 10-20 K) and fast pneumatic sample spinning under microwave irradiation. We expect to improve the current sensitivity to such an extent that four orders of magnitude of experimental time savings are obtained, resulting in completely new research directions and regimes.
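The quoted four orders of magnitude follow from the standard signal-averaging argument: SNR grows as the square root of averaging time, so a signal enhancement ε reduces the required experiment time by a factor of ε². A back-of-the-envelope check:

```python
def time_saving(enhancement: float) -> float:
    """Averaging-time reduction for a given DNP signal enhancement,
    using the standard SNR ~ sqrt(time) scaling of signal averaging."""
    return enhancement ** 2

for eps in (10, 100, 300):
    print(f"enhancement x{eps:>3} -> experiment time divided by {time_saving(eps):,.0f}")
# An enhancement near x100 already accounts for the four orders of magnitude quoted above.
```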
Max ERC Funding
1 999 805 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym WHIPLASH
Project WHat next? an Integrated PLanetary Atmosphere Simulator: from Habitable worlds to Hot jupiters
Researcher (PI) jeremy Jean Maurice Henri Leconte
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2015-STG
Summary Thousands of exoplanets have now been found. In the next decade, the grand challenge is to characterize their atmospheres. This is the only way to unravel the origin of the wild, unexpected diversity we have uncovered. For this task, there are several planned missions—JWST being our next best opportunity. However, to be ready for the analysis and interpretation of such high-precision observations, we need new-generation tools fit to address the multiple challenges they will raise. Indeed, until now, most atmospheric characterization observations—e.g. transit/eclipse spectroscopy—are analyzed with spherically symmetric, steady state 1D models that cannot accurately represent the very anisotropic atmospheres of most transiting exoplanets. This issue is worsened by the ubiquity of clouds, whose inhomogeneous spatial distribution—patchiness—prevents any satisfactory treatment in 1D.
In this project, we will develop a new framework to constrain the physics and composition of exo-atmospheres that will allow us to overcome these difficulties when analyzing and interpreting observations. This will be done by exploiting a new 3D planetary atmosphere simulator that integrates a global climate model and a 3D Monte Carlo radiative transfer code to generate observables. Using such an innovative approach, this ERC project will thus answer the following fundamental questions:
- What are the necessary conditions to sustain liquid water on terrestrial exoplanets? How can we infer observationally whether an atmosphere meeting these requirements is actually present?
- Can clouds explain the puzzling features of observed hot, gaseous exoplanets? What can these observations tell us about the dynamical and microphysical properties of clouds inside these atmospheres?
If we want theory to keep pace with the quality of future data, such a project is the necessary counterpart to the huge ongoing observational effort made by the community.
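A toy numerical illustration of why patchiness defeats 1D treatments and why Monte Carlo radiative transfer is the natural remedy (the setup and numbers are illustrative only): transmission through a half-cloudy layer is the average of the per-column transmissions, not the transmission of a column with the averaged optical depth.

```python
import numpy as np

rng = np.random.default_rng(1)
n_photons = 100_000
tau_cloud, cloud_fraction = 5.0, 0.5  # grey cloud optical depth; half the columns cloudy

# Each photon path crosses either a cloudy or a clear column.
hits_cloud = rng.uniform(size=n_photons) < cloud_fraction
tau_along_path = np.where(hits_cloud, tau_cloud, 0.0)
survives = rng.exponential(size=n_photons) > tau_along_path  # free path ~ Exp(1)

print(f"Monte Carlo transmission (patchy):     {survives.mean():.3f}")
print(f"1D column with averaged optical depth: {np.exp(-cloud_fraction * tau_cloud):.3f}")
# ~0.50 versus ~0.08: smearing patchy clouds into one uniform layer is badly wrong.
```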
Max ERC Funding
1 480 421 €
Duration
Start date: 2016-09-01, End date: 2021-08-31