Project acronym AArteMIS
Project Aneurysmal Arterial Mechanics: Into the Structure
Researcher (PI) Pierre Joseph Badel
Host Institution (HI) ASSOCIATION POUR LA RECHERCHE ET LE DEVELOPPEMENT DES METHODES ET PROCESSUS INDUSTRIELS
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary The rupture of an Aortic Aneurysm (AA), which is often lethal, is a mechanical phenomenon that occurs when the wall stress state exceeds the local strength of the tissue. Our current understanding of arterial rupture mechanisms is poor, and the physics taking place at the microscopic scale in these collagenous structures remains an open area of research. Understanding, modelling, and quantifying the micro-mechanisms which drive the mechanical response of such tissue and locally trigger rupture represents the most challenging and promising pathway towards predictive diagnosis and personalized care of AA.
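As illustrative background (not part of the proposal text), the rupture criterion stated above can be written schematically with the thin-walled Laplace estimate of wall stress for an idealized spherical aneurysm of internal pressure P, radius r and wall thickness t:
\[
\sigma_{\mathrm{wall}} \approx \frac{P\,r}{2\,t}, \qquad \text{rupture when } \sigma_{\mathrm{wall}} > \sigma_{\mathrm{strength}}(\mathbf{x}).
\]
Real aneurysm walls are heterogeneous and anisotropic, which is precisely why the project targets micro-scale, patient-specific criteria rather than such idealized estimates.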
The PI's group was recently able to detect, in advance, at the macro-scale, rupture-prone areas in bulging arterial tissues. The next step is to get into the details of the arterial microstructure to elucidate the underlying mechanisms.
Through the achievements of AArteMIS, the local mechanical state of the fibrous microstructure of the tissue, especially close to its rupture state, will be quantitatively analyzed using multi-photon confocal microscopy and numerically reconstructed to establish quantitative micro-scale rupture criteria. AArteMIS will also address the development of micro-macro models based on the collected quantitative data.
The entire project will be carried out in collaboration with medical doctors and engineers with expertise in all the fields required for the success of AArteMIS.
AArteMIS is expected to open longed-for pathways for research in soft tissue mechanobiology focused on the cell environment, and to enable essential clinical applications for the quantitative assessment of AA rupture risk. It will significantly contribute to understanding fatal vascular events and improving cardiovascular treatments. It will provide a tremendous source of data and inspiration for subsequent applications and research by answering the most fundamental questions on AA rupture behaviour, enabling ground-breaking clinical changes to take place.
Max ERC Funding
1 499 783 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym AEROFLEX
Project AEROelastic instabilities and control of FLEXible Structures
Researcher (PI) Olivier Pierre MARQUET
Host Institution (HI) OFFICE NATIONAL D'ETUDES ET DE RECHERCHES AEROSPATIALES
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary Aeroelastic instabilities are at the origin of large deformations of structures and limit the performance of products in various industrial branches such as aeronautics, the marine industry, and wind electricity production. While suppressing aeroelastic instabilities is an ultimate goal, a paradigm shift in technological development is to take advantage of these instabilities to achieve other objectives, such as reducing the drag of these flexible structures. The ground-breaking challenges addressed in this project are to design fundamentally new theoretical methodologies for (i) describing aeroelastic instabilities mathematically, (ii) suppressing them, and (iii) using them to reduce the mean drag of structures at a low energetic cost. To that aim, two types of aeroelastic phenomena will be specifically studied: flutter, which arises from an unstable coupling between two stable dynamics, that of the structure and that of the flow, and vortex-induced vibrations, which appear when the fluid dynamics is unstable. An aeroelastic global stability analysis will first be developed and applied to problems of increasing complexity, starting from two-dimensional free-vibrating rigid structures and progressing towards three-dimensional free-deforming elastic structures. The control of these aeroelastic instabilities will then be addressed with two different objectives: their suppression or their use for flow control. A theoretical passive control methodology will be established for suppressing linear aeroelastic instabilities, and extended to high Reynolds number flows and experimental configurations. New perturbation methods for solving strongly nonlinear problems and adjoint-based control algorithms will make it possible to use these aeroelastic instabilities for drag reduction. This project will allow innovative control solutions to emerge, not only in flutter or vortex-induced vibration problems, but also in a much broader class of fluid-structure problems.
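As a purely illustrative sketch (not a model from the proposal), a linear stability analysis of this kind reduces to an eigenvalue problem for the linearized coupled fluid-structure operator: the system dx/dt = A x is unstable as soon as an eigenvalue of A acquires a positive real part. The toy matrix below is made up for illustration only.

```python
import numpy as np

def is_unstable(A):
    """A linear system dx/dt = A x is unstable if some eigenvalue has positive real part."""
    eigvals = np.linalg.eigvals(A)
    return eigvals.real.max() > 0.0, eigvals

# Toy 2-state coupled system; k mimics the effect of increasing flow speed on the coupling.
for k in (0.5, 1.5):
    A = np.array([[-0.1, 1.0],
                  [-1.0 + k, -0.1]])
    unstable, ev = is_unstable(A)
    print(f"k={k}: eigenvalues {np.round(ev, 3)}, unstable={unstable}")
```

In a global stability analysis, A would be the (very large) linearization of the fluid-structure equations around a base state, and the same eigenvalue criterion identifies the onset of instabilities such as flutter or vortex-induced vibrations.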
Max ERC Funding
1 377 290 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym AEROSOL
Project Astrochemistry of old stars:direct probing of unique chemical laboratories
Researcher (PI) Leen Katrien Els Decin
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Consolidator Grant (CoG), PE9, ERC-2014-CoG
Summary The gas and dust in the interstellar medium (ISM) drive the chemical evolution of galaxies, the formation of stars and planets, and the synthesis of complex prebiotic molecules. The prime birth places for this interstellar material are the winds of evolved (super)giant stars. These winds are unique chemical laboratories, in which a large variety of gas and dust species radially expand away from the star.
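For reference (a standard textbook relation, not specific to the proposal), a spherically symmetric, radially expanding wind with mass-loss rate \(\dot{M}\) obeys the continuity relation
\[
\dot{M} = 4\pi r^{2}\,\rho(r)\,v(r),
\]
which gives the baseline density and velocity profiles onto which clumps and other structures studied in this project are superimposed.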
Recent progress on the observations of these winds has been impressive thanks to Herschel and ALMA. The next challenge is to unravel the wealth of chemical information contained in these data. This is an ambitious task since (1) a plethora of physical and chemical processes interact in a complex way, (2) laboratory data to interpret these interactions are lacking, and (3) theoretical tools to analyse the data do not meet current needs.
To boost the knowledge of the physics and chemistry characterizing these winds, I propose a world-leading multi-disciplinary project combining (1) high-quality data, (2) novel theoretical wind models, and (3) targeted laboratory experiments. The aim is to pinpoint the dominant chemical pathways, unravel the transition from gas-phase to dust species, elucidate the role of clumps on the overall wind structure, and study the reciprocal effect between various dynamical and chemical phenomena.
Now is the right time for this ambitious project thanks to the availability of (1) high-quality multi-wavelength data, including ALMA and Herschel data of the PI, (2) supercomputers enabling a homogeneous analysis of the data using sophisticated theoretical wind models, and (3) novel laboratory equipment to measure the gas-phase reaction rates of key species.
This project will have far-reaching impact on (1) the field of evolved stars, (2) the understanding of the chemical lifecycle of the ISM, (3) chemical studies of dynamically more complex systems, such as exoplanets, protostars, supernovae etc., and (4) it will guide new instrument development.
Max ERC Funding
2 605 897 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym ALKAGE
Project Algebraic and Kähler geometry
Researcher (PI) Jean-Pierre, Raymond, Philippe Demailly
Host Institution (HI) UNIVERSITE GRENOBLE ALPES
Call Details Advanced Grant (AdG), PE1, ERC-2014-ADG
Summary The purpose of this project is to study basic questions in algebraic and Kähler geometry. It is well known that the structure of projective or Kähler manifolds is governed by positivity or negativity properties of the curvature tensor. However, many fundamental problems are still wide open. Since the mid-1980s, I have developed a large number of key concepts and results that have led to important progress in transcendental algebraic geometry. Let me mention the discovery of holomorphic Morse inequalities, systematic applications of L² estimates with singular Hermitian metrics, and a much improved understanding of Monge-Ampère equations and of singularities of plurisubharmonic functions. My first goal will be to investigate the Green-Griffiths-Lang conjecture asserting that an entire curve drawn in a variety of general type is algebraically degenerate. The subject is intimately related to important questions concerning Diophantine equations, especially higher dimensional generalizations of Faltings' theorem - the so-called Vojta program. One can rely here on a breakthrough I made in 2010, showing that all such entire curves must satisfy algebraic differential equations. A second closely related area of research of this project is the analysis of the structure of projective or compact Kähler manifolds. It can be seen as a generalization of the classification theory of surfaces by Kodaira, and of the more recent results for dimension 3 (Kawamata, Kollár, Mori, Shokurov, ...) to other dimensions. My plan is to combine powerful recent results obtained on the duality of positive cohomology cones with an analysis of the instability of the tangent bundle, i.e. of the Harder-Narasimhan filtration. On these ground-breaking questions, I intend to go much further and to enhance my national and international collaborations. These subjects already attract many young researchers and postdocs throughout the world, and the grant could be used to create even stronger interactions.
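For reference, the conjecture mentioned above can be stated as follows (standard formulation, paraphrased from the literature, not from the proposal):
\[
\textbf{Green-Griffiths-Lang conjecture.}\quad \text{If } X \text{ is a projective variety of general type, then every nonconstant entire curve } f:\mathbb{C}\to X \text{ is algebraically degenerate, i.e. } f(\mathbb{C}) \text{ lies in a proper algebraic subvariety of } X.
\]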
Max ERC Funding
1 809 345 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym AlmaCrypt
Project Algorithmic and Mathematical Cryptology
Researcher (PI) Antoine Joux
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE6, ERC-2014-ADG
Summary Cryptology is a foundation of information security in the digital world. Today's internet is protected by a form of cryptography based on complexity theoretic hardness assumptions. Ideally, they should be strong to ensure security and versatile to offer a wide range of functionalities and allow efficient implementations. However, these assumptions are largely untested and internet security could be built on sand.
The main ambition of Almacrypt is to remedy this issue by challenging the assumptions through an advanced algorithmic analysis.
In particular, this proposal questions the two pillars of public-key encryption: factoring and discrete logarithms. Recently, the PI contributed to showing that in some cases, the discrete logarithm problem is considerably weaker than previously assumed. A main objective is to assess the security of other cases of the discrete logarithm problem, including elliptic curves, and of factoring. We will study the generalization of the recent techniques and search for new algorithmic options with comparable or better efficiency.
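As background on the hardness assumption at stake (illustrative only, not a method from the proposal), a generic attack such as baby-step giant-step already solves the discrete logarithm problem in a group of order about p in roughly sqrt(p) operations; security therefore rests on no substantially better algorithm existing for the groups used in practice. A minimal sketch in Python, assuming a prime modulus p and Python 3.8+ for the modular inverse:

```python
from math import isqrt

def bsgs(g, h, p):
    """Baby-step giant-step: return x with g**x == h (mod p), or None if none is found."""
    m = isqrt(p) + 1
    baby = {pow(g, j, p): j for j in range(m)}   # baby steps: g^j mod p
    step = pow(g, -m, p)                          # giant step factor: g^(-m) mod p (Python 3.8+)
    gamma = h % p
    for i in range(m):
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = (gamma * step) % p
    return None

p, g = 101, 2                 # toy parameters; real deployments use moduli of thousands of bits
h = pow(g, 47, p)
print(bsgs(g, h, p))          # -> 47
```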
We will also study hardness assumptions based on codes and subset-sum, two candidates for post-quantum cryptography. We will consider the applicability of recent algorithmic and mathematical techniques to the resolution of the corresponding putative hard problems, refine the analysis of the algorithms and design new algorithmic tools.
Cryptology is not limited to the above assumptions: other hard problems have been proposed to aim at post-quantum security and/or to offer extra functionalities. Should the security of these other assumptions become critical, they would be added to Almacrypt's scope. They could also serve to demonstrate other applications of our algorithmic progress.
In addition to its scientific goal, Almacrypt also aims at seeding a strengthened research community dedicated to algorithmic and mathematical cryptology.
Max ERC Funding
2 403 125 €
Duration
Start date: 2016-01-01, End date: 2021-12-31
Project acronym aLzINK
Project Alzheimer's disease and Zinc: the missing link ?
Researcher (PI) Christelle Sandrine Florence HUREAU-SABATER
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary Alzheimer's disease (AD) is one of the most serious diseases mankind is now facing, as its social and economic impacts are increasing rapidly. AD is very complex and the amyloid-β (Aβ) peptide as well as metallic ions (mainly copper and zinc) have been linked to its aetiology. While the deleterious impact of Cu is widely acknowledged, the involvement of Zn is certain but remains to be elucidated.
The main objective of the present proposal, which is strongly anchored in the bio-inorganic chemistry field at the interface with spectroscopy and biochemistry, is to design, synthesize and study new drug candidates (ligands L) capable of (i) targeting Cu(II) bound to Aβ within the synaptic cleft, where Zn is co-localized, ultimately to achieve Zn-driven Cu(II) removal from Aβ, and (ii) disrupting the aberrant Cu(II)-Aβ interactions involved in ROS production and Aβ aggregation, two deleterious events in AD. The drug candidates will thus have high Cu(II)-over-Zn selectivity to preserve the crucial physiological role of Zn in the neurotransmission process. Zn is always underestimated (if not completely neglected) in current therapeutic approaches targeting Cu(II), despite the known interference of Zn with Cu(II) binding.
To reach this objective, it is absolutely necessary to first understand metal-ion trafficking in the presence of Aβ alone at the molecular level (i.e. without the drug candidates). This includes: (i) determination of the Zn binding site on Aβ and its impact on Aβ aggregation and cell toxicity, and (ii) determination of the mutual influence of Zn and Cu on their coordination to Aβ and its impact on Aβ aggregation, ROS production and cell toxicity.
Methods used will span from organic synthesis to studies of neuronal model cells, with a major contribution of a wide panel of spectroscopic techniques including NMR, EPR, mass spectrometry, fluorescence, UV-Vis, circular-dichroism, X-ray absorption spectroscopy...
Max ERC Funding
1 499 948 €
Duration
Start date: 2015-03-01, End date: 2020-02-29
Project acronym ANT
Project Automata in Number Theory
Researcher (PI) Boris Adamczewski
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE1, ERC-2014-CoG
Summary Finite automata are fundamental objects in Computer Science, of great importance on one hand for theoretical aspects (formal language theory, decidability, complexity) and on the other for practical applications (parsing). In number theory, finite automata are mainly used as simple devices for generating sequences of symbols over a finite set (e.g., digital representations of real numbers), and for recognizing some sets of integers or more generally of finitely generated abelian groups or monoids. One of the main features of these automatic structures comes from the fact that they are highly ordered without necessarily being trivial (i.e., periodic). With their rich fractal nature, they lie somewhere between order and chaos, even if, in most respects, their rigidity prevails. Over the last few years, several ground-breaking results have led to a great renewed interest in the study of automatic structures in arithmetic.
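A concrete example of such an automatic sequence (a standard textbook example, given here only as illustration) is the Thue-Morse sequence, generated by a two-state automaton reading the base-2 digits of n; the n-th term is the parity of the number of 1s in the binary expansion of n.

```python
def thue_morse(n):
    """n-th Thue-Morse term: parity of the number of 1s in the binary expansion of n."""
    return bin(n).count("1") % 2

print([thue_morse(n) for n in range(16)])
# [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
```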
A primary objective of the ANT project is to exploit this opportunity by developing new directions and interactions between automata and number theory. In this proposal, we outline three lines of research concerning fundamental number-theoretical problems that have baffled mathematicians for decades. They include the study of integer base expansions of classical constants, of arithmetical linear differential equations and their link with enumerative combinatorics, and of arithmetic in positive characteristic. At first glance, these topics may seem unrelated, but, surprisingly enough, the theory of finite automata will serve as a natural guideline. We stress that this new point of view on classical questions is a key part of our methodology: we aim at creating a powerful synergy between the different approaches we propose to develop, placing automata theory and related methods at the heart of the subject. This project provides a unique opportunity to create the first international team focusing on these different problems as a whole.
Max ERC Funding
1 438 745 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym ARPEMA
Project Anionic redox processes: A transformational approach for advanced energy materials
Researcher (PI) Jean-Marie Tarascon
Host Institution (HI) COLLEGE DE FRANCE
Call Details Advanced Grant (AdG), PE5, ERC-2014-ADG
Summary Redox chemistry provides the fundamental basis for numerous energy-related electrochemical devices, among which Li-ion batteries (LIB) have become the premier energy storage technology for portable electronics and vehicle electrification. Throughout its history, LIB technology has relied on cationic redox reactions as the sole source of energy storage capacity. This is no longer true. In 2013 we demonstrated that Li-driven reversible formation of (O2)n peroxo-groups in new layered oxides led to extraordinary increases in energy storage capacity. This finding, which is receiving worldwide attention, represents a transformational approach for creating advanced energy materials for not only energy storage, but also water splitting applications as both involve peroxo species. However, as is often the case with new discoveries, the fundamental science at work needs to be rationalized and understood. Specifically, what are the mechanisms for ion and electron transport in these Li-driven anionic redox reactions?
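Schematically (a common shorthand from the literature, not a mechanism claimed by the proposal), the anionic redox process underlying the extra capacity can be written as the reversible oxidation of oxide anions into peroxo-like species,
\[
2\,\mathrm{O}^{2-} \;\rightleftharpoons\; (\mathrm{O}_2)^{2-} + 2\,e^-,
\]
in contrast with the conventional cationic redox of the transition metal, \( \mathrm{M}^{n+} \rightleftharpoons \mathrm{M}^{(n+1)+} + e^- \).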
To address these seminal questions and to widen the spectrum of materials (transition metal and anion) showing anionic redox chemistry, we propose a comprehensive research program that combines experimental and computational methods. The experimental methods include structural and electrochemical analyses (both ex-situ and in-situ), and computational modeling will be based on first-principles DFT for identifying the fundamental processes that enable anionic redox activity. The knowledge gained from these studies, in combination with our expertise in inorganic synthesis, will enable us to design a new generation of Li-ion battery materials that exhibit substantial increases (20-30%) in energy storage capacity, with additional impacts on the development of Na-ion batteries and the design of water-splitting catalysts, with the potential to surpass current water-splitting efficiencies via novel (O2)n-based electrocatalysts.
Max ERC Funding
2 249 196 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym aSCEND
Project Secure Computation on Encrypted Data
Researcher (PI) Hoe Teck Wee
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Recent trends in computing have prompted users and organizations to store an increasingly large amount of sensitive data at third-party locations in the cloud, outside of their direct control. Storing data remotely poses an acute security threat as these data are outside our control and could potentially be accessed by untrusted parties. Indeed, the reality of these threats has been borne out by the Snowden leaks and hundreds of data breaches each year. In order to protect our data, we will need to encrypt it.
Functional encryption is a novel paradigm for public-key encryption that enables both fine-grained access control and selective computation on encrypted data, as is necessary to protect big, complex data in the cloud. Functional encryption also enables searches on encrypted travel records and surveillance video as well as medical studies on encrypted medical records in a privacy-preserving manner; we can give out restricted secret keys that reveal only the outcome of specific searches and tests. These mechanisms allow us to maintain public safety without compromising on civil liberties, and to facilitate medical break-throughs without compromising on individual privacy.
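As a purely illustrative sketch (not part of the proposal), the interface of a functional encryption scheme (Setup, KeyGen, Enc, Dec) can be mocked in Python as below; the mock provides no security whatsoever and every name in it is illustrative.

```python
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class FunctionalKey:
    f: Callable[[Any], Any]       # the function this restricted key is authorised to evaluate

@dataclass
class Ciphertext:
    payload: Any                  # a real scheme would hide this; the mock stores it in the clear

def setup():                      # -> (master public key, master secret key)
    return {"mpk": "public-parameters"}, {"msk": "master-secret"}

def keygen(msk, f):               # restricted secret key for the function f
    return FunctionalKey(f)

def encrypt(mpk, x):
    return Ciphertext(x)

def decrypt(sk: FunctionalKey, ct: Ciphertext):
    # In a real scheme, decryption reveals only f(x) and nothing else about x.
    return sk.f(ct.payload)

mpk, msk = setup()
sk_mean = keygen(msk, lambda xs: sum(xs) / len(xs))
ct = encrypt(mpk, [120, 135, 128])        # e.g. encrypted medical readings
print(decrypt(sk_mean, ct))               # reveals only the mean, about 127.7
```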
The goals of the aSCEND project are (i) to design pairing and lattice-based functional encryption that are more efficient and ultimately viable in practice; and (ii) to obtain a richer understanding of expressive functional encryption schemes and to push the boundaries from encrypting data to encrypting software. My long-term vision is the ubiquitous use of functional encryption to secure our data and our computation, just as public-key encryption is widely used today to secure our communication. Realizing this vision requires new advances in the foundations of functional encryption, which is the target of this project.
Max ERC Funding
1 253 893 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym BIOLOCHANICS
Project Localization in biomechanics and mechanobiology of aneurysms: Towards personalized medicine
Researcher (PI) Stéphane Henri Anatole Avril
Host Institution (HI) ASSOCIATION POUR LA RECHERCHE ET LE DEVELOPPEMENT DES METHODES ET PROCESSUS INDUSTRIELS
Call Details Consolidator Grant (CoG), PE8, ERC-2014-CoG
Summary Rupture of Aortic Aneurysms (AA), which kills more than 30 000 persons every year in Europe and the USA, is a complex phenomenon that occurs when the wall stress exceeds the local strength of the aorta due to degraded properties of the tissue. The state of the art in AA biomechanics and mechanobiology reveals that major scientific challenges still have to be addressed to permit patient-specific computational predictions of AA rupture and enable localized repair of the structure with targeted pharmacologic treatment. A first challenge relates to ensuring an objective prediction of the localized mechanisms preceding rupture. A second challenge relates to modelling the patient-specific evolution of material properties leading to the localized mechanisms preceding rupture. Addressing these challenges is the aim of the BIOLOCHANICS proposal. We will take into account the internal length scales controlling localization mechanisms preceding AA rupture by implementing an enriched, also known as nonlocal, continuum damage theory in the computational models of AA biomechanics and mechanobiology. We will also develop very advanced experiments, based on full-field optical measurements, aimed at characterizing localization mechanisms occurring in aortic tissues and at identifying local distributions of material properties at different stages of AA progression. A first in vivo application will be performed on genetic and pharmacological mouse and rat models of AA. Finally, a retrospective clinical study involving more than 100 patients at the Saint-Etienne University Hospital will permit calibration of AA rupture risk estimates obtained with our novel approaches and their infusion into future clinical practice. Through the achievements of BIOLOCHANICS, nonlocal mechanics may be extended to other soft tissues for applications in orthopaedics, oncology, sport biomechanics, interventional surgery, human safety, cell biology, etc.
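For background, one standard integral-type formulation of such a nonlocal (enriched) damage theory, given here only as an illustration and not necessarily the specific formulation adopted in the project, replaces a local variable Y driving damage by its weighted spatial average over an internal length \(\ell\):
\[
\bar{Y}(\mathbf{x}) \;=\; \frac{\displaystyle\int_{\Omega} \alpha\!\left(\lVert \mathbf{x}-\boldsymbol{\xi}\rVert\right) Y(\boldsymbol{\xi})\, d\boldsymbol{\xi}}{\displaystyle\int_{\Omega} \alpha\!\left(\lVert \mathbf{x}-\boldsymbol{\xi}\rVert\right) d\boldsymbol{\xi}},
\qquad
\alpha(r) = \exp\!\left(-\frac{r^{2}}{2\ell^{2}}\right),
\]
so that the internal length \(\ell\), rather than the numerical discretization, controls the size of the localization zone.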
Max ERC Funding
1 999 396 €
Duration
Start date: 2015-05-01, End date: 2020-04-30
Project acronym BLOC
Project Mathematical study of Boundary Layers in Oceanic Motions
Researcher (PI) Anne-Laure Perrine Dalibard
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary Boundary layer theory is a large component of fluid dynamics. It is ubiquitous in Oceanography, where boundary layer currents, such as the Gulf Stream, play an important role in the global circulation. Comprehending the underlying mechanisms in the formation of boundary layers is therefore crucial for applications. However, the treatment of boundary layers in ocean dynamics remains poorly understood at a theoretical level, due to the variety and complexity of the forces at stake.
The goal of this project is to develop several tools to bridge the gap between the mathematical state of the art and the physical reality of oceanic motion. There are four points on which we will mainly focus: degeneracy issues, including the treatment of Stewartson boundary layers near the equator; rough boundaries (meaning boundaries with small amplitude and high frequency variations); the inclusion of the advection term in the construction of stationary boundary layers; and the linear and nonlinear stability of the boundary layers. We will address separately Ekman layers and western boundary layers, since they are ruled by equations whose mathematical behaviour is very different.
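For reference (textbook background, not part of the proposal), the classical Ekman layer mentioned above is governed, in its simplest linear form, by a balance between the Coriolis force and vertical friction acting on the departure (u, v) from the geostrophic velocity (u_g, v_g):
\[
-f\,(v - v_g) = \nu\,\partial_{zz} u, \qquad f\,(u - u_g) = \nu\,\partial_{zz} v,
\]
with solutions decaying over the Ekman depth \( \delta_E = \sqrt{2\nu/f} \); degeneracies such as the vanishing of the Coriolis parameter f near the equator, and rough boundaries, are among the issues listed above.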
This project will allow us to have a better understanding of small scale phenomena in fluid mechanics, and in particular of the inviscid limit of incompressible fluids.
The team will be composed of the PI, two PhD students and three two-year postdocs over the whole period. We will also rely on the historical expertise of the host institution on fluid mechanics and asymptotic methods.
Max ERC Funding
1 267 500 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym BRAIN MICRO SNOOPER
Project A mimetic implant for low perturbation, stable stimulation and recording of neural units inside the brain.
Researcher (PI) Gaelle Offranc piret
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary Developing brain implants is crucial to better decipher neuronal information and to intervene very precisely on neural networks using microstimulation. This project aims to address two major challenges: to achieve a highly mechanically stable implant, allowing long-term connection between neurons and microelectrodes, and to provide neural implants with high temporal and spatial resolution. To do so, the present project will develop implants with structural and mechanical properties that resemble those of the natural brain environment. According to the literature, using electrodes and electric leads with a size of a few microns allows for better neural tissue reconstruction around the implant. Also, the mechanical mismatch between the usually stiff implant material and the soft brain tissue affects the adhesion between tissue cells and electrodes. With the objective of implanting a highly flexible free-floating microelectrode array in the brain tissue, we will develop a new method using micro-nanotechnology steps as well as a combination of polymers. Moreover, the literature and preliminary studies indicate that some surface chemistries and nanotopographies can promote neurite outgrowth while limiting glial cell proliferation. Implants will be nanostructured so as to support neural tissue growth and provided with a highly adhesive surface, which will ensure stable contact with the brain neural tissue over time. Implants with different microelectrode configurations and numbers will be tested in vitro and in vivo for their biocompatibility and their ability to record and stimulate neurons with high stability. This project will produce high-performance generic implants that can be used for various fundamental studies and applications, including neural prostheses and brain-machine interfaces.
Max ERC Funding
1 499 850 €
Duration
Start date: 2015-08-01, End date: 2021-07-31
Project acronym BrightSens
Project Ultrabright Turn-on Fluorescent Organic Nanoparticles for Amplified Molecular Sensing in Living Cells
Researcher (PI) Andrii Andrey Klymchenko
Host Institution (HI) UNIVERSITE DE STRASBOURG
Call Details Consolidator Grant (CoG), PE5, ERC-2014-CoG
Summary Existing fluorescent molecular probes, due to limited brightness, do not allow imaging individual biomolecules directly in living cells, whereas bright fluorescent nanoparticles are unable to respond to single molecular stimuli and their inorganic core is not biodegradable. The aim of BrightSens is to develop ultrabright fluorescent organic nanoparticles (FONs) capable of converting single molecular stimuli into a collective turn-on response of >100 encapsulated dyes, and to apply them in amplified molecular sensing of specific targets at the cell surface (receptors) and in the cytosol (mRNA). The project is composed of three work packages. (1) Synthesis of FONs: Dye-doped polymer and micellar FONs will be obtained by self-assembly. Molecular design of dyes and the use of bulky hydrophobic counterions will enable precise control of dye organization inside FONs, which will resolve the fundamental problems of self-quenching and cooperative on/off switching in dye ensembles. (2) Synthesis of nanoprobes: Using cooperative Förster resonance energy transfer from FONs to an originally designed acceptor-sensor unit, we propose the synthesis of the first nanoprobes that (a) undergo complete turn-on or colour switch in response to single molecular targets and (b) harvest light energy into photochemical disruption of cell membrane barriers. (3) Cellular applications: The obtained nanoprobes will be applied in 2D and 3D cultures of cancer cells for background-free single-molecule detection of membrane receptors and intracellular mRNA, which are important markers of cancer and apoptosis. An original concept of amplified photochemical internalization is proposed to trigger, by light, the entry of nanoprobes into the cytosol. This high-risk/high-gain multidisciplinary project will result in new organic nanomaterials with unique photophysical properties that will enable visualization of biomolecules at work in living cells, with expected impact on cancer research.
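For reference, the efficiency of the Förster resonance energy transfer exploited in work package (2) follows the standard distance dependence (textbook relation, given here only as background):
\[
E \;=\; \frac{1}{1 + \left(r/R_0\right)^{6}},
\]
where r is the donor-acceptor distance and \(R_0\) the Förster radius, typically a few nanometres.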
Max ERC Funding
1 999 750 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym CARBONFIX
Project Towards a Self-Amplifying Carbon-Fixing Anabolic Cycle
Researcher (PI) Joseph Moran
Host Institution (HI) CENTRE INTERNATIONAL DE RECHERCHE AUX FRONTIERES DE LA CHIMIE FONDATION
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary How can simple molecules self-organize into a growing synthetic reaction network like biochemical metabolism? This proposal takes a novel synthesis-driven approach to the question by mimicking a central self-amplifying CO2-fixing biochemical reaction cycle known as the reductive tricarboxylic acid cycle. The intermediates of this cycle are the synthetic precursors to all major classes of biomolecules and are built from CO2, an anhydride and electrons from simple reducing agents. Based on the nature of the reactions in the cycle and the specific structural features of the intermediates that comprise it, we propose that the entire cycle may be enabled in a single reaction vessel with a surprisingly small number of simple, mutually compatible catalysts from the recent synthetic organic literature. However, since one of the required reactions does not yet have an efficient synthetic equivalent in the literature and since those that do have not yet been carried out sequentially in a single reaction vessel, we will first independently develop the new reaction and sequences before attempting to combine them into the entire cycle. The new reaction and sequences will be useful green synthetic methods in their own right. Most significantly, this endeavour could provide the first experimental evidence of an exciting new alternative model for early biochemical evolution that finally illuminates the origins and necessity of biochemistry’s core reactions.
Max ERC Funding
1 500 000 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym chemREPEAT
Project Structure and Dynamics of Low-Complexity Regions in Proteins: The Huntingtin Case
Researcher (PI) Pau Bernado Pereto
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Consolidator Grant (CoG), PE4, ERC-2014-CoG
Summary Proteins hosting regions highly enriched in one or a few amino acids, the so-called Low-Complexity Regions (LCRs), are very common in eukaryotes and play crucial roles in biology. Homorepeats, a subfamily of LCRs that present stretches of the same amino acid, perform very specialized functions facilitated by the localized enrichment of the same physicochemical property. In contrast, numerous severe pathologies have been associated with abnormally long repetitions. Despite the relevance of homorepeats, their high-resolution characterization by traditional structural biology techniques is hampered by the degeneracy of the amino acid environments and their intrinsic flexibility. In chemREPEAT, I will develop strategies to incorporate isotopically labelled and unnatural amino acids at specific positions within homorepeats that will overcome present limitations. These labelled positions will be unique probes to investigate, for the first time, the structure and dynamics of homorepeats at the atomic level using complementary biophysical techniques. Computational tools will be specifically developed to derive three-dimensional conformational ensembles of homorepeats by synergistically integrating the experimental data.
chemREPEAT strategies will be developed on huntingtin (Htt), the prototypical repetitive protein. Htt hosts a glutamine tract that is linked to Huntington’s disease (HD), a deadly neuropathology appearing in individuals with more than 35 consecutive glutamine residues, a number that represents the pathological threshold. The application of the developed approaches to several Htt constructs with different numbers of glutamines will reveal the structural bases of the pathological threshold in HD and the role played by the regions flanking the glutamine tract.
The strategies designed in chemREPEAT will expand the present frontiers of structural biology to unveil structure/function relationships for LCRs. This capacity will pave the way for rational intervention in the associated diseases.
Max ERC Funding
1 999 844 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym CoqHoTT
Project Coq for Homotopy Type Theory
Researcher (PI) Nicolas Tabareau
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Every year, software bugs cost hundreds of millions of euros to companies and administrations. Software quality is therefore a prevalent concern, and interactive theorem provers based on type theory have proven effective at establishing the correctness of important pieces of software, such as the C compiler of the CompCert project. One main interest of such theorem provers is the ability to extract code directly from the proof. Unfortunately, their democratization suffers from a major drawback: the mismatch between equality in mathematics and equality in type theory. As a result, significant Coq developments have only been carried out by virtuosos playing with advanced concepts of computer science and mathematics. Recently, an extension of type theory with homotopical concepts such as univalence has been gaining traction because it reconciles, for the first time, the expected principles of equality. But the univalence principle has so far been treated as a new axiom, which breaks one fundamental property of mechanized proofs: the ability to compute with programs that make use of this axiom. The main goal of the CoqHoTT project is to provide a new generation of proof assistants with a computational version of univalence and to use them as a basis for implementing effective logical model transformations, so that the power of the internal logic of the proof assistant needed to prove the correctness of a program can be decided and changed at compile time, according to a trade-off between efficiency and logical expressivity. Our approach is based on a radically new compilation technique into a core type theory, which modularizes the difficulty of finding a decidable type-checking algorithm for homotopy type theory.
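For reference, the univalence principle can be stated informally as the assertion that the canonical map sending an identification of types to an equivalence of types is itself an equivalence, so that, loosely,
(A = B) \simeq (A \simeq B) for all types A and B.
Postulated as an axiom, this statement comes with no computation rule, which is exactly why proofs that use it currently lose the ability to compute.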
The impact of the CoqHoTT project will be very strong. Although Coq is already a success, this project will promote it as a major proof assistant for both computer scientists and mathematicians. CoqHoTT will become an essential tool for program certification and the formalization of mathematics.
Max ERC Funding
1 498 290 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym CurvedSusy
Project Dynamics of Supersymmetry in Curved Space
Researcher (PI) Guido Festuccia
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), PE2, ERC-2014-STG
Summary Quantum field theory provides a theoretical framework to explain quantitatively natural phenomena as diverse as the fluctuations in the cosmic microwave background, superconductivity, and elementary particle interactions in colliders. Even though quantum field theories are used in many different settings, their structure and dynamics are still largely mysterious. Weakly coupled systems can be studied perturbatively; however, many natural phenomena are characterized by strong self-interactions (e.g. high-temperature superconductors, nuclear forces), and their analysis requires going beyond perturbation theory. Supersymmetric field theories are very interesting in this respect because they can be studied exactly even at strong coupling, and their dynamics displays phenomena, such as confinement or the breaking of chiral symmetries, that occur in nature and are very difficult to study analytically.
Recently it was realized that many interesting insights on the dynamics of supersymmetric field theories can be obtained by placing these theories in curved space preserving supersymmetry. These advances have opened new research avenues but also left many important questions unanswered. The aim of our research programme will be to clarify the dynamics of supersymmetric field theories in curved space and use this knowledge to establish new exact results for strongly coupled supersymmetric gauge theories. The novelty of our approach resides in the systematic use of the interplay between the physical properties of a supersymmetric theory and the geometrical properties of the space-time it lives in. The analytical results we will obtain, while derived for very symmetric theories, can be used as a guide in understanding the dynamics of many physical systems. Besides providing new tools to address the dynamics of quantum field theory at strong coupling this line of investigation could lead to new connections between Physics and Mathematics.
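As a flat-space point of reference (an illustration only, not the curved-space construction itself), the four-dimensional N=1 supersymmetry algebra ties the supercharges to space-time translations through
\{ Q_\alpha , \bar{Q}_{\dot{\beta}} \} = 2 \sigma^\mu_{\alpha\dot{\beta}} P_\mu .
Preserving some supercharges on a curved manifold deforms such relations; one widely used route, introduced by Festuccia and Seiberg, couples the field theory to background supergravity fields, so that the allowed supersymmetric backgrounds become a question about the geometry of the space-time, as emphasized above.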
Max ERC Funding
1 145 879 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym DBA
Project Distributed Biological Algorithms
Researcher (PI) Amos Korman
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE6, ERC-2014-CoG
Summary This project proposes a new application for computational reasoning. More specifically, the purpose of this interdisciplinary project is to demonstrate the usefulness of an algorithmic perspective in studies of complex biological systems. We focus on the domain of collective behavior, and demonstrate the benefits of using techniques from the field of theoretical distributed computing in order to establish algorithmic insights regarding the behavior of biological ensembles. The project includes three related tasks, for which we have already obtained promising preliminary results. Each task contains a purely theoretical algorithmic component as well as one which integrates theoretical algorithmic studies with experiments. Most experiments are strategically designed by the PI based on computational insights, and are physically conducted by experimental biologists who have been carefully chosen by the PI. In turn, experimental outcomes will be theoretically analyzed from an algorithmic perspective. By this integration, we aim to decipher how a biological individual (such as an ant) “thinks”, without having direct access to the neurological processes within its brain, and how such limited individuals assemble into ensembles that appear to be far greater than the sum of their parts. The ultimate vision behind this project is to enable the formation of a new scientific field, called algorithmic biology, that bases biological studies on theoretical algorithmic insights.
Max ERC Funding
1 894 947 €
Duration
Start date: 2015-05-01, End date: 2021-04-30
Project acronym DuaLL
Project Duality in Formal Languages and Logic - a unifying approach to complexity and semantics
Researcher (PI) Mai Gehrke
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE6, ERC-2014-ADG
Summary Dualities between algebraic and topological structure are pervasive in mathematics, and toggling back and forth between them has often been associated with important breakthroughs. The main objective of this project is to bring this important tool to bear on a number of subjects in theoretical computer science thereby advancing, systematising, and unifying them.
One subject of focus is the search for robust extensions of the theory of regular languages. A powerful technical tool for classifying regular languages and proving decidability results is Eilenberg-Reiterman theory, which assigns classes of finite monoids or single profinite algebras to classes of languages. Recent results by the PI and her co-authors show that the theory may be seen as a special case of Stone duality for Boolean algebras with operators. We want to:
- Develop an Eilenberg-Reiterman theory beyond regular languages with the goal of obtaining new tools and separation results for Boolean circuit classes, an active area in the search for lower bounds in complexity theory.
- Systematise and advance the search for robust generalisations of regularity to other structures such as infinite words, finite and infinite trees, cost functions, and words with data.
The second subject of focus is the development of duality theoretic methods for logics with categorical semantics. We want to approach the problem incrementally:
- View duality for categorical semantics through a spectrum of intermediate cases going from regular languages over varying alphabets, Ghilardi-Zawadowski duality for finitely presented Heyting algebras, and the Bodirsky-Pinsker topological Birkhoff theorem to Makkai's, Awodey and Forssell's, and Coumans' recent work on first-order logic duality, thus unifying topics in semantics and formal languages.
Our main tools come from Stone duality in various forms including the Jonsson-Tarski canonical extensions and profinite algebra, and from universal algebra and category theory.
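For readers outside the field, the prototypical duality in question is classical Stone duality: the category of Boolean algebras is dually equivalent to the category of Stone spaces (compact, Hausdorff, totally disconnected spaces),
\mathbf{BA} \simeq \mathbf{Stone}^{op},
with a Boolean algebra sent to its space of ultrafilters and a space sent to its algebra of clopen subsets; the extended dualities used in the project, such as Stone duality for Boolean algebras with operators, build on this correspondence.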
Max ERC Funding
2 348 938 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym DYNPOR
Project First principle molecular dynamics simulations for complex chemical transformations in nanoporous materials
Researcher (PI) Véronique Van Speybroeck
Host Institution (HI) UNIVERSITEIT GENT
Call Details Consolidator Grant (CoG), PE4, ERC-2014-CoG
Summary Chemical transformations in nanoporous materials are vital in many application domains, such as catalysis, molecular separations, sustainable chemistry, etc. Model-guided design is indispensable for tailoring materials at the nanometer scale.
Under real operating conditions, chemical transformations taking place at the nanometer scale have a very complex nature, due to the interplay of several factors such as the number of particles present in the pores of the material, framework flexibility, competitive pathways, entropy effects, etc. The textbook concept of a single transition state is far too simplistic in such cases. A restricted number of configurations of the potential energy surface is not sufficient to capture the complexity of the transformation.
My objective is to simulate complex chemical transformations in nanoporous materials using first-principles molecular dynamics methods at real operating conditions, capturing the full complexity of the free energy surface. To achieve these goals, advanced sampling methods will be used to explore the interesting regions of the free energy surface. The number of guest molecules at real operating conditions will be derived, and the diffusion of small molecules through pores with blocking molecules will be studied. New theoretical models will be developed to keep track of both the framework flexibility and the entropy of the lattice.
The selected applications are timely and rely on an extensive network with prominent experimental partners. The applications will encompass contemporary catalytic conversions in zeolites, active site engineering in metal organic frameworks and structural transitions in nanoporous materials, and the expected outcomes will have the potential to yield groundbreaking new insights.
The results are expected to have an impact far beyond the horizon of the current project, as they will contribute to the transition from static to dynamics-based modeling tools within heterogeneous catalysis.
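As a generic illustration of what capturing the free energy surface entails (standard statistical mechanics, not the project's specific models), the free energy along a chosen collective variable q is related to its equilibrium probability density P(q) by
F(q) = -k_B T \ln P(q) + const,
and enhanced-sampling techniques such as umbrella sampling or metadynamics exist precisely to reconstruct F(q) in regions that plain molecular dynamics would visit too rarely at operating conditions.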
Max ERC Funding
1 993 750 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym EQUEMI
Project Entanglement and Quantum Engineering with optical Microcavities
Researcher (PI) Jakob Reichel
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE2, ERC-2014-ADG
Summary I propose to leverage the unique properties of optical fiber Fabry-Perot (FFP) microcavities pioneered by my group to advance the field of quantum engineering. We will take quantum-enhanced measurement from its current proof-of-principle state to a true metrological level by applying cavity-based spin squeezing to a compact atomic clock, aiming to improve the clock stability beyond one part in 10^13 at one second of averaging. In a new experiment, we will generate multiparticle entangled states with high metrological gain by applying cavity-based entanglement schemes to alkaline earth-like atoms, the atomic species used in today’s most precise atomic clocks. In a second phase, a miniature quantum gas microscope will be added to this experiment, creating a rich new situation at the interface of quantum information, metrology, and cutting-edge quantum gas research. Finally, we will further improve the FFP microcavity technology itself to enable novel atom-light interfaces with a currently unavailable combination of strong coupling, efficient fiber coupling, and open access. This will open new horizons for light-matter interfaces not only in our experiments, but also in our partner groups working with trapped ions, diamond color centers, semiconductor quantum dots, carbon nanotubes and in quantum optomechanics.
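For context, a standard figure of merit for such experiments (quoted here only as an illustration, not as the project's specific protocol) is the Wineland spin-squeezing parameter for N atoms,
\xi^2 = N (\Delta S_\perp)^2 / |\langle \vec{S} \rangle|^2,
where an uncorrelated ensemble gives \xi^2 \ge 1 and a phase resolution of 1/\sqrt{N} (the standard quantum limit), whereas cavity-generated entanglement with \xi^2 < 1 improves the short-term clock stability by roughly the factor \xi.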
Max ERC Funding
2 422 750 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym EQuO
Project Electron Quantum optics in quantum Hall edge channels
Researcher (PI) Gwendal Feve
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2014-CoG
Summary Quantum effects on photon propagation have been studied in the context of quantum optics since the second half of the last century. In particular, using single-photon emitters, fundamental tests of quantum mechanics were explored by manipulating single to few photons in Hanbury Brown and Twiss and Hong-Ou-Mandel experiments.
In nanophysics, there is a growing interest in translating these concepts of quantum optics to electrons propagating in nanostructures. Single-electron emitters have been realized, such that single elementary electronic excitations can now be manipulated in analogues of the pioneering quantum optics experiments.
Electron quantum optics goes beyond the mere reproduction of optical setups using electron beams, as electrons, being interacting fermions, differ strongly from photons. Contrary to optics, understanding the propagation of an elementary excitation requires replacing the single-body description by a many-body one.
The purpose of this proposal is to specifically explore the emergence of many-body physics and its effects on electronic propagation using the setups and concepts of electron quantum optics. The motivations are numerous: firstly, single-particle emission initializes a simple and well-controlled state. I will take this unique opportunity to test birth, life and death scenarios of Landau quasiparticles and observe the emergence of many-body physics. Secondly, I will address the generation of entangled few-electron quantum coherent states and study how they are affected by interactions. Finally, I will attempt to apply electron quantum optics concepts to a regime where the ground state itself is a strongly correlated state of matter. In such a situation, elementary excitations are no longer electrons but carry a fractional charge and obey fractional statistics. No manipulation of single quasiparticles has been reported yet, and the determination of some quasiparticle characteristics, such as their fractional statistics, remains elusive.
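As a point of comparison with photon optics (a textbook relation, not a result of this project), when two single particles arrive simultaneously on the two inputs of a balanced beam splitter, the coincidence probability at the outputs is
P_cc = (1 \mp |\langle \psi_1 | \psi_2 \rangle|^2) / 2,
with the minus sign for bosons (bunching) and the plus sign for fermions (antibunching); coincidence or current-noise measurements thus give direct access to the overlap of the two wave packets, and deviations from this single-particle picture in conductors are a signature of the many-body effects targeted here.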
Max ERC Funding
1 997 878 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym FAnFArE
Project Fourier Analysis For/And Partial Differential Equations
Researcher (PI) Frédéric Jérôme Louis Bernicot
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary "This project aims to develop the field of Harmonic Analysis, and more precisely to study problems at the interface between Fourier Analysis and PDEs (and also some Geometry).
We are interested in two aspects of the Fourier Analysis:
(1) The Euclidean Fourier Analysis, where a deep analysis can be performed using specificities as the notion of "frequencies" (involving the Fourier transform) or the geometry of the Euclidean balls. By taking advantage of them, this proposal aims to pursue the study and bring novelties in three fashionable topics: the study of bilinear/multilinear Fourier multipliers, the development of the "space-time resonances" method in a systematic way and for some specific PDEs, and the study of nonlinear transport equations in BMO-type spaces (as Euler and Navier-Stokes equations).
(2) A Functional Fourier Analysis, which can be performed in a more general situation using the notion of "oscillation" adapted to a heat semigroup (or semigroup of operators). This second Challenge is (at the same time) independent of the first one and also very close. It is very close, due to the same point of view of Fourier Analysis involving a space decomposition and simultaneously some frequency decomposition. However they are quite independent because the main goal is to extend/develop an analysis in the more general framework given by a semigroup of operators (so without using the previous Euclidean specificities). By this way, we aim to transfer some results known in the Euclidean situation to some Riemannian manifolds, Fractals sets, bounded open set setting, ... Still having in mind some applications to the study of PDEs, such questions make also a connexion with the geometry of the ambient spaces (by its Riesz transform, Poincaré inequality, ...). I propose here to attack different problems as dispersive estimates, ""L^p""-version of De Giorgi inequalities and the study of paraproducts, all of them with a heat semigroup point of view."
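For concreteness, the linear Fourier multipliers referred to in topic (1) are the operators T_m acting on the Fourier side by
\widehat{T_m f}(\xi) = m(\xi) \hat{f}(\xi),
and their bilinear analogues act as T_m(f,g)(x) = \iint m(\xi,\eta) \hat{f}(\xi) \hat{g}(\eta) e^{2\pi i x\cdot(\xi+\eta)} d\xi d\eta; the central question is then the boundedness of such operators on L^p-type spaces under suitable conditions on the symbol m.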
Max ERC Funding
940 540 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym FELICITY
Project Foundations of Efficient Lattice Cryptography
Researcher (PI) Vadim Lyubashevsky
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Public key cryptography is the backbone of internet security. Yet it is very likely that within the next few decades some government or corporate entity will succeed in building a general-purpose quantum computer that is capable of breaking all of today's public key protocols. Lattice cryptography, which appears to be resilient to quantum attacks, is currently viewed as the most promising candidate to take over as the basis for cryptography in the future. Recent theoretical breakthroughs have additionally shown that lattice cryptography may even allow for constructions of primitives with novel capabilities. But even though the progress in this latter area has been considerable, the resulting schemes are still extremely impractical.
The central objective of the FELICITY project is to substantially expand the boundaries of efficient lattice-based cryptography. This includes improving on the most crucial cryptographic protocols, some of which are already considered practical, as well as pushing towards efficiency in areas that currently seem out of reach. The methodology that we propose to use differs from the bulk of the research being done today. Rather than directly working on advanced primitives in which practical considerations are ignored, the focus of the project will be on finding novel ways to break the most fundamental barriers that stand in the way of practicality. For this, I believe it is productive to concentrate on building schemes that stand at the frontier of what is considered efficient, because it is there that the most critical barriers are most apparent. And since cryptographic techniques usually propagate from simple to advanced primitives, improved solutions for the fundamental ones will eventually serve as building blocks for practical constructions of schemes with advanced capabilities.
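To fix ideas, a standard example of the hardness assumptions behind lattice-based cryptography (an illustration, not necessarily the exact assumption used in the project) is the Learning With Errors problem: given a uniformly random matrix A in Z_q^{m x n} and
b = A s + e (mod q),
where s is a secret vector and e is a short random error vector, it is believed to be hard, even for quantum computers, to recover s or to distinguish (A, b) from uniform; practical schemes typically work with structured variants over polynomial rings (Ring- or Module-LWE) to reduce key sizes and speed up computation.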
Max ERC Funding
1 311 688 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym FireBar-Concept
Project MULTI-CONCEPTUAL DESIGN OF FIRE BARRIER: A SYSTEMIC APPROACH
Researcher (PI) Serge Bourbigot
Host Institution (HI) UNIVERSITE DES SCIENCES ET TECHNOLOGIES DE LILLE - LILLE I
Call Details Advanced Grant (AdG), PE8, ERC-2014-ADG
Summary The development of science and technology makes sophisticated products widely available but, at the same time, increases the use of combustible materials, in particular organic materials. Such materials are easily flammable and must be flame retarded to make them safer. In case of fire, people must be protected by materials that confine and stop the fire. It is one of the goals of the FireBar-Concept project to design materials, and assemblies of materials, exhibiting low flammability, protecting substrates and limiting fire spread.
The objective of FireBar-Concept is to make a fire barrier that forms at the right time, at the right location and reacts appropriately to the thermal constraint (fire scenario). This fire barrier can be developed in several ways according to the chemical nature of the material and/or its formulation:
- Heat barrier formed by inherently flame-retarded materials (e.g. mineral fibers, ceramics, …) exhibiting low thermal conductivity (note that the assembly of such materials can also provide low thermal conductivity by controlling porosity and its distribution)
- Release of reactive radicals that poison the flame and form a protective ‘umbrella’ preventing combustion of the material
- Additives promoting charring of the materials and forming an expanding carbonaceous protective coating or barrier (intumescence)
- Additives forming a physical barrier limiting mass transfer of the degradation products to the flame
The FireBar-Concept project is multidisciplinary and requires expertise in materials science, chemical engineering, chemistry, thermal science and physics. The approach consists of five actions linked together by three transverse developments, according to the following scheme: (i) fundamentals of fire barriers, (ii) multi-materials and combinations of concepts, (iii) modeling and numerical simulation, (iv) design and development of experimental protocols, and (v) optimization of the systems.
Max ERC Funding
2 429 988 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym FORCASTER
Project Force, Motion and Positioning of Microtubule Asters
Researcher (PI) Nicolas David Minc
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2014-CoG
Summary Cells must move and position internal components to perform their function. Here we focus on the physical designs which allow microtubule (MT) asters to exert forces in order to move and position themselves in vivo. These are arrays of MTs radiating from the centrosome, which fill up large portions of cells. They orchestrate nuclear positioning and spindle orientation for polarity, division and development. Forces that move asters are generated at nanometer and second scales by MT-associated motors from sites in the cytoplasm or at the cell surface. How MTs and force-generators self-organize to control aster motion and position at millimeter and hour scales is not known. We will use a suite of biophysical experiments and models to address how aster micro-mechanics contribute to aster migration, centration, de-centration and orientation in a single in vivo system, using the early stages of sea urchin development as a quantitative model.
We aim to: 1) Elucidate mechanisms that drive aster large-scale motion, using sperm aster migration after fertilization during which asters grow and move rapidly and persistently to the large-egg center. We will investigate how speeds and trajectories depend on boundary conditions and on the dynamic spatial organization of force-generators.
2) Implement magnetic-based subcellular force measurements on MT asters. We will use this to understand how single force events are integrated at the scale of asters, and how global forces may evolve with aster size and shape in centration and de-centration processes, using various stages of development and cell manipulation; and to compute aster friction.
3) Couple computational models and 3D imaging to understand and predict the stereotyped division patterns driven by successive aster positioning and aster-pair orientation in the early divisions of sea urchin embryos and in other tissues.
This framework bridging multiple scales will bring unprecedented insights on the physics of living active matter.
Max ERC Funding
2 199 310 €
Duration
Start date: 2015-07-01, End date: 2020-12-31
Project acronym GAN
Project Groups, Actions and von Neumann algebras
Researcher (PI) Cyril Houdayer
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary This research project focuses on the structure, classification and rigidity of three closely related objects: group actions on measure spaces, orbit equivalence relations and von Neumann algebras. Over the last 15 years, the study of interactions between these three topics has led to a process of mutual enrichment, providing both striking theorems and outstanding conjectures.
Some fundamental questions, such as Connes' rigidity conjecture, the structure of von Neumann algebras associated with higher rank lattices, or the fine classification of factors of type III, still remain untouched. The general aim of the project is to tackle these problems and other related questions by developing a deeper analysis and understanding of the interplay between von Neumann algebra theory on the one hand and ergodic and group theory on the other. To do so, I will use and combine several tools, and develop new ones, arising from Popa's Deformation/Rigidity theory, Lie group theory (lattices, boundaries), topological and geometric group theory, and group representation theory (amenability, property (T)). More specifically, the main directions of my research project are:
1) The structure of the von Neumann algebras arising from Voiculescu's Free Probability theory: Shlyakhtenko's free Araki-Woods factors, amalgamated free product von Neumann algebras and the free group factors.
2) The structure and the classification of the von Neumann algebras and the measured equivalence relations arising from lattices in higher rank semisimple connected Lie groups.
3) The measure equivalence rigidity of the Baumslag-Solitar groups and several other classes of discrete groups acting on trees.
Max ERC Funding
876 750 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym GATIPOR
Project Guaranteed fully adaptive algorithms with tailored inexact solvers for complex porous media flows
Researcher (PI) Martin Vohralik
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Consolidator Grant (CoG), PE1, ERC-2014-CoG
Summary Efficient use of computational resources with a reliable outcome is a definite target in numerical simulations of partial differential equations (PDEs). Although this has been an important subject of numerical analysis and scientific computing for decades, still, surprisingly, often more than 90% of the CPU time in numerical simulations is literally wasted and the accuracy of the final outcome is not guaranteed. The reason is that addressing this complex issue rigorously is extremely challenging, as it stems from linking several rather disconnected domains like modeling, analysis of PDEs, numerical analysis, numerical linear algebra, and scientific computing. The goal of this project is to design novel inexact algebraic and linearization solvers, with each step being adaptively steered by optimal (guaranteed and robust) a posteriori error estimates, thus online interconnecting all parts of the numerical simulation of complex environmental porous media flows. The key novel ingredients will be multilevel algebraic solvers, tailored to porous media simulations, with problem- and discretization-dependent restriction, prolongation, and smoothing, yielding mass balance on all grid levels, accompanied by local adaptive stopping criteria. We shall theoretically prove the convergence of the new algorithms and justify their optimality, with in particular guaranteed (without any unknown constant) error reduction and overall computational load. Implementation into established numerical simulation codes and assessment on renowned academic and industrial benchmarks will consolidate the theoretical results. As a final outcome, the total simulation error will be certified and the current computational burden cut by orders of magnitude. This would represent a cardinal technological advance, both theoretically and practically, in urgent environmental applications, namely nuclear waste storage and the geological sequestration of CO2.
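The word "guaranteed" can be made precise on a model problem (a generic template, not the project's specific porous media estimates): for a numerical approximation u_h of the solution u of, say, a Poisson problem, one constructs a fully computable estimator \eta(u_h) such that
\| \nabla(u - u_h) \|_{L^2(\Omega)} \le \eta(u_h)
with no unknown constant; localized versions of \eta then steer mesh adaptivity and provide stopping criteria for the algebraic and linearization solvers, so that no solver iteration is performed beyond what the discretization error warrants.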
Max ERC Funding
1 283 088 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym GEM
Project From Geometry to Motion: inverse modeling of complex mechanical structures
Researcher (PI) Florence Bertails-Descoubes
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary With the considerable advances of automatic image-based capture in Computer Vision and Computer Graphics in recent years, it has become affordable to acquire, quickly and precisely, the full 3D geometry of many mechanical objects featuring intricate shapes. Yet, while more and more geometrical data are collected and shared among the communities, there is currently very little work on how to infer the underlying mechanical properties of the captured objects merely from their geometrical configurations.
The GEM challenge consists in developing a non-invasive method for inferring the mechanical properties of complex objects from a minimal set of geometrical poses, in order to predict their dynamics. In contrast to classical inverse reconstruction methods, my proposal is built upon the claim that 1/ the mere geometrical shape of physical objects reveals a lot about their underlying mechanical properties and 2/ this property can be fully leveraged for a wide range of objects featuring rich geometrical configurations, such as slender structures subject to frictional contact (e.g., folded cloth or twined filaments).
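As a minimal, hypothetical illustration of claim 1/ above: for a uniformly loaded cantilever the small-deflection tip sag obeys delta = q L^4 / (8 EI), so a single observed static pose already determines the bending stiffness EI. The snippet below (Python) simply inverts that textbook relation; all numbers are invented, and the actual GEM problems (nonsmooth, frictional, dynamic) are of course far richer.

    # Infer the bending stiffness EI of a cantilever from one observed static pose.
    # Small-deflection relation for a uniform load q per unit length:
    #     delta_tip = q * L**4 / (8 * EI)
    L = 0.30           # beam length [m]             (assumed)
    q = 0.5            # distributed load [N/m]      (assumed, e.g. self-weight)
    delta_obs = 0.012  # observed tip deflection [m] (assumed measurement)

    EI = q * L**4 / (8 * delta_obs)        # inverse of the forward model
    print(f"inferred bending stiffness EI = {EI:.3e} N*m^2")
    print(f"forward check, predicted sag  = {q * L**4 / (8 * EI):.4f} m")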
To achieve this goal, we shall develop an original inverse modeling strategy based upon a/ the design of reduced and high-order discrete models for slender mechanical structures including rods, plates and shells, b/ a compact and well-posed mathematical formulation of our nonsmooth inverse problems, both in the static and dynamic cases, c/ the design of robust and efficient numerical tools for solving such complex problems, and d/ a thorough experimental validation of our methods relying on the most recent capturing tools.
In addition to significant advances in fast image-based measurement of diverse mechanical materials stemming from physics, biology, or manufacturing, this research is expected in the long run to ease considerably the design of physically realistic virtual worlds, as well as to boost the creation of dynamic human doubles.
Max ERC Funding
1 498 570 €
Duration
Start date: 2015-09-01, End date: 2021-08-31
Project acronym IChaos
Project Intermediate Chaos
Researcher (PI) Alexander Bufetov
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE1, ERC-2014-CoG
Summary "The transition from order to chaos has been a central theme of investigation in dynamical systems in the last two decades. Structures that exhibit a mix of deterministic and chaotic properties, for example, quasi-crystals, naturally arise in problems of geometry and mathematical physics. Despite intense study, key questions about these structures remain wide open.
The proposed research is an investigation of intermediate chaos in ergodic theory of dynamical systems. Specific examples include systems of geometric origin such as interval exchange maps, translation and Hamiltonian flows on surfaces of higher genus, symbolic substitution systems important in the study of quasi-crystals as well as dynamical systems arising in asymptotic combinatorics and mathematical physics such as determinantal and Pfaffian point processes. Specific tasks include computation of the Hausdorff dimension for the spectral measure of interval exchange maps (problem posed by Ya. Sinai), limit theorems for Hamiltonian flows on surfaces of higher genus (question of A. Katok), development of entropy theory and functional limit theorems for determinantal point processes and a description of the ergodic decomposition for infinite orthogonally-invariant measures on the space of infinite real matrices (the real case of the problem, posed in 2000 by A. Borodin and G. Olshanski, of harmonic analysis on the infinite-dimensional analogue of the Grassmann manifold). The project consolidates the proposer's past work, in particular, his limit theorems for translation flows (Annals of Math. 2014), his proof of the 1985 Vershik-Kerov entropy conjecture (GAFA 2012) and his solution of the complex case of the Borodin-Olshanski problem (preprint 2013). The proposer is currently PI of project ANR-11-IDEX-0001-02 (1.11.2013--30.10.2015; budget 360000 euro) under the Programme "Investissements d'avenir" of the Government of the French Republic."
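For readers outside the field, a hedged reminder of one central object named above: a point process is called determinantal when all of its correlation functions are determinants built from a single kernel K with respect to a reference measure, which is the structure behind the entropy theory and ergodic-decomposition tasks listed here. In LaTeX notation:

    % k-point correlation functions of a determinantal point process with kernel K
    \rho_k(x_1,\dots,x_k) \;=\; \det\bigl[\, K(x_i,x_j) \,\bigr]_{i,j=1}^{k},
    \qquad k = 1, 2, \dots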
Max ERC Funding
1 696 937 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym IMAGINE
Project Imaging magnetic fields at the nanoscale with a single spin microscope
Researcher (PI) Vincent, Henri Jacques
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE3, ERC-2014-STG
Summary Detecting and imaging magnetic fields with high sensitivity and nanoscale resolution is a topic of crucial importance for a wealth of research domains, from materials science to mesoscopic physics and the life sciences. This is also a key requirement for fundamental studies in nanomagnetism and the design of innovative magnetic materials with tailored properties for applications in spintronics. Although a remarkable number of magnetic microscopy techniques have been developed over the last decades, imaging magnetism at the nanoscale remains a challenging task.
It was recently realized that the experimental methods allowing for the detection of single spins in the solid-state, which were initially developed for quantum information science, open new avenues for high sensitivity magnetometry. In that spirit, it was recently proposed to use the electronic spin of a single nitrogen-vacancy (NV) defect in diamond as a nanoscale quantum sensor for scanning probe magnetometry. This approach promises significant advances in magnetic imaging since it provides quantitative and vectorial magnetic field measurements, with an unprecedented combination of spatial resolution and magnetic sensitivity, even under ambient conditions.
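A hedged numerical note on how quantitative this readout is: the splitting between the two NV spin resonances is 2 x gamma_NV x B_parallel, with gamma_NV of about 28 GHz/T, so the field projected on the NV axis follows directly from the measured ODMR splitting. The example splitting below is invented for illustration (Python).

    # Convert a measured NV ODMR splitting into the field along the NV axis:
    #     splitting = 2 * gamma_NV * B_parallel
    GAMMA_NV = 28.0e9    # NV gyromagnetic ratio [Hz/T] (~2.8 MHz/G)
    splitting = 6.0e6    # measured resonance splitting [Hz] (invented example)

    B_parallel = splitting / (2 * GAMMA_NV)
    print(f"field along the NV axis ~ {B_parallel * 1e6:.0f} microtesla")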
The IMAGINE project intends to exploit the unique performance of scanning-NV magnetometry to achieve major breakthroughs in nanomagnetism. We will first explore the structure of domain walls and individual skyrmions in ultrathin magnetic wires, which both promise disruptive applications in spintronics. This will (i) settle an important academic debate regarding the inner structure of domain walls and (ii) lead to the first detection of individual skyrmions in ultrathin magnetic wires under ambient conditions. This might result in a new paradigm for spin-based applications in nanoelectronics. We will then explore orbital magnetism in graphene, which has never been observed experimentally and is the subject of surprising theoretical predictions.
Max ERC Funding
1 498 810 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym JSPEC
Project Josephson Junction Spectroscopy of Mesoscopic Systems
Researcher (PI) Caglar Ozgun Girit
Host Institution (HI) COLLEGE DE FRANCE
Call Details Starting Grant (StG), PE3, ERC-2014-STG
Summary Spectroscopy is a powerful tool to probe matter. By measuring the spectrum of elementary excitations, one reveals the symmetries and interactions inherent in a physical system. Mesoscopic devices, which preserve quantum coherence over lengths larger than the atomic scale, offer a unique possibility to both engineer and investigate excitations at the single-quantum level. Unfortunately, conventional spectroscopy techniques are inadequate for coupling radiation to mesoscopic systems and detecting their small absorption signals. I propose an on-chip, Josephson-junction based spectrometer which surpasses state-of-the-art instruments and is ideally suited for probing elementary excitations in mesoscopic systems. It has an original design providing uniform wideband coupling from 2 to 2000 GHz, low background noise, high sensitivity, and narrow linewidth.
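As a hedged back-of-the-envelope on that coupling band: a junction biased at voltage V exchanges photons at the Josephson frequency f = 2eV/h, about 483.6 GHz per millivolt, so 2 to 2000 GHz corresponds to bias voltages of roughly 4 microvolts to 4 millivolts. The snippet below (Python) only evaluates that standard relation.

    # Josephson frequency-voltage relation: f = 2 e V / h  (about 483.6 GHz per mV)
    E_CHARGE = 1.602176634e-19   # elementary charge [C]
    H_PLANCK = 6.62607015e-34    # Planck constant [J s]

    def bias_for_freq(freq_hz):
        """Bias voltage [V] addressing a spectral line at freq_hz [Hz]."""
        return H_PLANCK * freq_hz / (2 * E_CHARGE)

    for f in (2e9, 2000e9):      # edges of the 2-2000 GHz band quoted above
        print(f"{f/1e9:7.1f} GHz  <->  {bias_for_freq(f)*1e6:8.2f} microvolt bias")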
I describe the operating principle and design of the spectrometer, show preliminary results demonstrating proof-of-concept, and outline three experiments which exploit the spectrometer to address important issues in condensed matter physics. The experiments are: measuring the lifetime of single quasiparticle and excited Cooper pair states in superconductors, a topic relevant for quantum information processing; determining whether graphene has a bandgap, a fundamental yet unresolved question; and recording a clear spectroscopic signature of Majorana bound states in topological superconductor weak links.
Various applications of the superconducting circuits developed for the spectrometer include a Josephson vector network analyzer, a cryogenic mixer, a THz camera, a detector for radioastronomy, and a scanning microwave impedance microscope. In itself the proposed JJ spectrometer is a general purpose tool that will benefit researchers studying mesoscopic systems. Ultimately, Josephson junction spectroscopy should not only be useful to detect existing elementary excitations but also to discover new ones.
Max ERC Funding
1 997 498 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym LEGO
Project Multimodal glycoconjugates: a molecular Lego approach for antitumoral immunotherapy
Researcher (PI) Olivier Pierre Renaudet
Host Institution (HI) UNIVERSITE GRENOBLE ALPES
Call Details Consolidator Grant (CoG), PE5, ERC-2014-CoG
Summary Despite significant progress in cancer therapy, current treatments are still controversial due to intolerable side effects. Targeted immunotherapy has recently emerged as an ideal alternative to improve treatment modalities for cancer patients. However, very limited approaches are available today and major issues remain to be addressed. The ERC grant offers a unique opportunity to propose a new paradigm for treating cancer. Through a ground-breaking interdisciplinary program, at the crossroads of supramolecular chemistry, synthetic chemistry, molecular engineering, biophysics, biochemistry, immunochemistry and glycoscience, it is my ambition to design, synthesize and study smart biomolecular structures with unprecedented combinations, complexity and immunological properties against cancers. To achieve this purpose, I will develop a “molecular LEGO” approach to construct synthetic molecules capable of redirecting endogenous antibodies present in the human bloodstream against tumors without preliminary immunization. Efficient tumoral killing by immune effectors will be provided by molecules combining innovative antibody and tumor binding modules that will be selected in vitro beforehand. To be successful, I will address fundamental questions that are still unresolved in chemical and biological sciences. The expected breakthroughs will represent a landmark achievement in these fields and will open promising horizons in cancer immunotherapy. Beyond this, it can be expected that our findings will pave the way for the future development of synthetic molecules embedded with recognition, labeling, and/or therapeutic functions. They will thus find wider medicinal, diagnostic and even theranostic applications for which the development of more effective and selective biomolecular systems is of the utmost importance.
Max ERC Funding
2 000 000 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym LITHIUM
Project From planetary birth with aperture masking interferometry to nulling with Lithium Niobate technology
Researcher (PI) Sylvestre Mathieu André Lacour
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2014-STG
Summary Observing the process of planetary accretion is crucial to inform models of planet formation. Most of the key action is expected to happen in the gaps of protostellar disks – a spatial realm over which aperture masking interferometry has demonstrated a unique ability to deliver incisive imaging. Masking offers the twin advantages of higher dynamic range at the diffraction limit (lambda/D) than differential imaging, while at the same time giving nearly complete Fourier coverage compared to long baseline interferometry. The founding objective of this proposal is to create the expertise and technology to understand the astrophysical phenomena so far only glimpsed in faint detections in stellar gaps such as those published in T Cha (Huelamo et al. 2011), HD142527 (Biller et al. 2012) and FL Cha (Cieza et al. 2013). But the central goal of this project is to further advance the experimental technique. Reaching even higher dynamic range for fainter detections is essential for probing planetary birth. The way to improve the dynamic range is clear: increase the accuracy of the primary closure phase observable. To do so, we will follow two paths. The first will use laboratory experiments to analyse and understand the sources of bias to the closure phase. The resulting end-product will be better software offered to the community, and better techniques for the next generation of aperture masking devices. The second path is to amplify the closure phase signal by combining nulling with closure phase (Lacour et al. 2014). This second path is the most challenging, but will be an important breakthrough for the field. Nulling is to aperture masking what coronagraphy is to classical imaging. Without a first level of nulling, the aperture masking technique will always be limited by the photon noise due to the stellar light. We propose to build on our experience of Lithium Niobate integrated optics devices to bring aperture masking to a new level of performance in high dynamic range imaging.
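A hedged reminder of why the closure phase is the observable of choice: summing the measured baseline phases around a closed triangle of apertures cancels any aperture-based (e.g. atmospheric or instrumental) phase error, as the toy numbers below illustrate (Python, invented values).

    # Closure phase: the sum of baseline phases around an aperture triangle.
    # True visibility phases on the three baselines [rad] (invented)
    phi_12, phi_23, phi_31 = 0.30, -0.10, -0.15
    # Phase error above each aperture [rad] (invented)
    e1, e2, e3 = 0.8, -1.3, 0.4

    # Each measured baseline phase picks up the difference of aperture errors...
    m_12 = phi_12 + e1 - e2
    m_23 = phi_23 + e2 - e3
    m_31 = phi_31 + e3 - e1

    # ...which cancels in the closure sum: the observable is immune to these biases.
    closure_true = phi_12 + phi_23 + phi_31
    closure_meas = m_12 + m_23 + m_31
    print(round(closure_true, 6), round(closure_meas, 6))   # identical values

Residual closure-phase biases come from higher-order effects, and tracking those down is exactly the purpose of the first path described above.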
Max ERC Funding
1 851 881 €
Duration
Start date: 2015-03-01, End date: 2020-02-29
Project acronym LOFAR
Project Searching for The Origin of Cosmic Rays and Neutrinos with LOFAR
Researcher (PI) Stijn Buitink
Host Institution (HI) VRIJE UNIVERSITEIT BRUSSEL
Call Details Starting Grant (StG), PE9, ERC-2014-STG
Summary The origin of cosmic rays remains one of the largest mysteries in astrophysics. Innovative and accurate radio measurements of cosmic rays and neutrinos with LOFAR promise to provide new answers.
It is generally believed that ultra-high-energy cosmic rays are produced in extragalactic sources like gamma-ray bursts or active galactic nuclei, while the lower energy cosmic rays come from our own Galaxy. At what energy this transition takes place is still unknown. Here we focus on disentangling Galactic and extragalactic components by studying the mass composition between 10^17 and 10^18 eV, a regime that is also crucial for understanding the origin of the extraterrestrial neutrinos detected by IceCube.
We do this with LOFAR, the first radio telescope that can detect individual cosmic rays with hundreds of antennas. This incredible level of detail allowed us to finally understand the complicated radiation mechanism and to perform the first-ever accurate mass analysis based on radio measurements. Our first data reveal a strong proton component below 10^18 eV, suggesting an early transition to an extragalactic component. With upgrades to our detector and techniques we will be able to improve our sample size by an order of magnitude, resolve more mass components, and identify the origin of high-energy cosmic rays and neutrinos.
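A heavily hedged sketch of the composition logic: in the simple superposition picture, a nucleus of mass number A and energy E develops like A proton showers of energy E/A, so its mean depth of shower maximum is roughly that of a proton at E/A. The reference depth and elongation rate below are rough illustrative numbers, not values fitted to LOFAR data (Python).

    import math

    # Illustrative parameters (NOT fitted values):
    X_P_1EEV = 750.0    # proton <Xmax> at 10^18 eV [g/cm^2], approximate
    ELONGATION = 60.0   # shift of <Xmax> per decade of energy [g/cm^2], approximate

    def xmax_proton(energy_ev):
        return X_P_1EEV + ELONGATION * math.log10(energy_ev / 1e18)

    def xmax_nucleus(energy_ev, mass_number):
        # superposition model: a nucleus of mass A ~ A protons of energy E/A
        return xmax_proton(energy_ev / mass_number)

    for A, name in [(1, "proton"), (4, "helium"), (56, "iron")]:
        print(f"{name:6s} at 10^17.5 eV: <Xmax> ~ {xmax_nucleus(10**17.5, A):.0f} g/cm^2")

The roughly 100 g/cm^2 separation between proton and iron showers is what a sufficiently precise radio reconstruction of Xmax must resolve, which is why the radio technique gives access to the mass composition discussed above.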
The technique may be scaled up to higher energies, measured at the Pierre Auger Observatory, where mass information is needed to correlate cosmic rays with their astrophysical sources and to confirm the nature of the cutoff at ~10^19.6 eV.
We can even search for particles beyond the GZK limit. With the Westerbork telescope we have already set the best limit on cosmic rays and neutrinos above 10^23 eV. With LOFAR we will achieve a much better sensitivity at lower energies, also probing for new physics, like the decays of cosmic strings predicted by supersymmetric theories.
Max ERC Funding
1 500 000 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym LUCKY STAR
Project Exploring the outer solar system beyond Neptune using stellar occultations
Researcher (PI) Bruno SICARDY
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE9, ERC-2014-ADG
Summary The solar system beyond Neptune contains largely unaltered material from the primordial circum-solar disk. It has also kept the memory of the early planetary migrations, and thus contains essential information on the origin and evolution of our planetary system.
Here I propose to study Trans-Neptunian Objects (TNOs) using the stellar occultation technique. It consists in observing the passage of remote TNOs in front of those “Lucky Stars”, which reveal the shapes, atmospheres and rings of bodies from sub-km to thousand-km in size. Very few teams in the world master this method. The European-led network that I coordinate is now the leader in predictions, instrumentation, observations and analysis related to stellar occultations, with innovative approaches and unprecedented results.
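A hedged illustration of the basic geometry: at each station the star disappears for a time interval, and the chord length across the occulting body is simply the body's sky-plane velocity multiplied by that interval; several chords then constrain the size and shape of the limb. All numbers below are invented (Python).

    # Occultation chord = sky-plane velocity of the body x duration of the star's disappearance.
    v_sky = 24.0           # km/s, assumed sky-plane velocity of the event
    t_disappear = 12.35    # s, star disappearance time at one station (invented)
    t_reappear = 61.85     # s, star reappearance time at the same station (invented)

    chord_km = v_sky * (t_reappear - t_disappear)
    print(f"chord length at this station: {chord_km:.0f} km")
    # Combining chords from several stations constrains the full limb (size and shape).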
In the last decade, our group led the field by discovering rings around the asteroid-like object Chariklo, detecting sub-km TNOs and drastic variations of Pluto’s atmospheric pressure. Based on those noteworthy discoveries and unique skills of ours, I will coordinate the following work packages:
(1) Rings around small bodies - Understand the newly found Chariklo’s rings, tackle the theory of rings’ origins and evolutions around small bodies, discover new ring systems around other bodies.
(2) Very small, sub-km TNOs and Oort Cloud objects - Constrain the collisional history of our early outer solar system, and possibly detect Oort Cloud objects.
(3) Pluto’s atmosphere – Explore Pluto’s atmosphere and its atypical seasonal cycle, search for atmospheres around other TNOs.
(4) Explore specific, large TNOs – Provide their sizes, shapes, albedos and densities.
These programs are timely in view of the NASA/New Horizons Pluto flyby in July 2015, and of the ESA/GAIA mission, which is expected to provide a greatly improved astrometric catalog release in 2016.
Most of the budget will be dedicated to personnel to conduct the observations and their analysis, plus the associated travel and telescope-time expenses.
Max ERC Funding
2 423 750 €
Duration
Start date: 2015-11-01, End date: 2020-10-31
Project acronym MAGIC
Project Monsoons of Asia caused Greenhouse to Icehouse Cooling
Researcher (PI) Guillaume Dupont-nivet
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE10, ERC-2014-CoG
Summary Unraveling the cause of Cenozoic global climate cooling is one of the most important unresolved questions challenging the Earth and Environmental sciences community today. Increased erosion and weathering of the uplifted Tibetan Plateau and Himalayas is advocated as the primary cause of the enigmatic pCO2 drawdown that led to global cooling 50 to 34 Myrs ago, from the warm, ice-free Greenhouse world to the bi-polar Icehouse conditions still prevailing today. Asian Monsoons are genetically linked to the high orography associated with the India-Asia collision starting ca. 50 Myrs ago; however, the relation between Greenhouse-to-Icehouse cooling and the Asian Monsoons remains to be explored, as the monsoons were previously thought to intensify only much later, ca. 25 Myrs ago. Our recent findings of monsoonal activity in Asia since at least 45 Myrs ago raise the fascinating possibility that the Asian Monsoons may have triggered global cooling from Greenhouse to Icehouse conditions. Testing this novel hypothesis and exploring its implications for feedback mechanisms between regional environments, Asian Monsoons and global climate will constitute the stimulating objectives of MAGIC. Three PhD students will provide well-dated, end-member monsoonal archives of the greenhouse-to-icehouse cooling from three key locations (NE Tibet, SE Asia and the Paratethys Sea). These will be analyzed by three postdocs, experts in novel proxy methods tailored for MAGIC, to infer temperatures, precipitation, salinity, seasonality, paleoaltimetry, wind patterns, paleoecology and paleogeography at infra-annual, orbital and tectonic time scales. Ultimately, these records and boundary conditions will be integrated into climate models by a dedicated postdoc to unravel the role and behavior of the Asian Monsoons with respect to long-term Greenhouse-to-Icehouse cooling and pCO2 levels, as well as global hyperthermal and cooling events such as the PETM, MECO and EOT.
Max ERC Funding
1 999 999 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym MAGNETO
Project Active Magnetorheological Elastomers: from Hierarchical Composite Materials to tailored Instabilities
Researcher (PI) Konstantinos Danas
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary In recent years, there has been an increased effort by scientists to obtain new composite materials with extreme properties. Inspired by natural and biological processes, scientists have proposed the use of hierarchical architectures (i.e., assemblies of structural components) spanning several length scales, from nanometer to centimeter sizes. Depending, in each case, on the desired properties of the composite material, optimization with respect to its stiffness, weight, density, toughness and other properties is carried out. In the present project, the interest is in magneto-mechanical coupling and tailored instabilities. Hierarchical materials such as magnetorheological elastomers (MREs), which combine magnetic particles (at the scale of nanometers and micrometers) embedded in a soft polymeric non-magnetic matrix, give rise to a coupled magneto-mechanical response at the macroscopic scale (of order millimeters and centimeters) when they are subjected to combined magneto-mechanical external stimuli. These composite materials can deform at very large strains, without fracturing, due to the presence of the soft polymeric matrix. From an unconventional point of view, a remarkable property of these materials is that, while they can become unstable under combined magneto-mechanical loading, their response is well controlled in the post-instability regime. This, in turn, allows us to try to operate these materials in this critically stable region, similar to most biological systems. These instabilities can lead to extreme responses such as wrinkles (for haptic applications), actively controlled stiffness (for cell growth) and acoustic properties, with only marginal changes in the externally applied magnetic fields. Unlike the current modeling of hierarchical composites, MREs require the development of novel experimental techniques and advanced coupled nonlinear magneto-mechanical models in order to tailor the desired macroscopic instability response at finite strains.
Max ERC Funding
1 499 206 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym MAMSIE
Project Mixing and Angular Momentum tranSport of massIvE stars
Researcher (PI) Conny Aerts
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE9, ERC-2014-ADG
Summary With the CoRoT & Kepler data analysed, the time is optimal to move from observational asteroseismology to innovative stellar modelling of the metal factories of the Universe. With MAMSIE, we follow in the footsteps of helioseismologists some 30 years after them, but this time we shall be developing inversion methods for stellar structure based on gravity-mode oscillations that probe the deep stellar interior. MAMSIE will lead to new models for a variety of single and binary stars with masses between 3 and 30 M⊙ whose space photometry and high-resolution spectroscopy reveal sufficient seismic information on their gravity modes to invert the frequencies and compute the stars’ structure. In contrast to the conventional theoretical approach to stellar evolution, the data-driven approach of MAMSIE will allow us to include angular momentum transport due to internal gravity waves, as well as mixing prescriptions for turbulent entrainment, by coupling the output of 3D hydrodynamical simulations of these phenomena to specialised seismic observables of relevance for massive stars. Our sample includes slow and fast rotators, with and without a magnetic field, and with and without a stellar wind. The new models will be placed in an evolutionary context for an optimal assessment of the evolution of internal rotation, angular momentum, and chemical mixing throughout the lives of massive stars. The output of the stellar modelling will provide fundamentals for all topics in modern astrophysics that rely on massive star models. MAMSIE is overarching and will require a multidisciplinary team led by an expert in gravity-mode oscillations working in close collaboration with a 3D hydrodynamics expert; it will offer a highly competitive environment for PhD and postdoctoral research on the astrophysics of massive stars.
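For context, a hedged statement of the seismic diagnostic behind such gravity-mode inversions: in the asymptotic regime, high-order g modes of spherical degree ell are nearly uniformly spaced in period, with a spacing set by the buoyancy (Brunt-Väisälä) frequency N integrated over the mode cavity; in LaTeX notation,

    % asymptotic gravity-mode period spacing for modes of degree ell
    \Delta P_\ell \;=\; \frac{2\pi^2}{\sqrt{\ell(\ell+1)}}
    \left( \int_{r_1}^{r_2} \frac{N(r)}{r}\,\mathrm{d}r \right)^{-1}

Departures from this uniform spacing are the kind of seismic observable that encodes near-core mixing and rotation in the space photometry mentioned above.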
Max ERC Funding
2 498 941 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym MaTissE
Project Magnetic approaches for Tissue Mechanics and Engineering
Researcher (PI) Claire Wilhelm
Host Institution (HI) UNIVERSITE PARIS DIDEROT - PARIS 7
Call Details Consolidator Grant (CoG), PE3, ERC-2014-CoG
Summary "While magnetic nanomaterials are increasingly used as clinical agents for imaging and therapy, their use as a tool for tissue engineering opens up challenging perspectives that have rarely been explored. Lying at the interface between biophysics and nanomedicine, and based on magnetic techniques, the proposed project aims to magnetically design functional tissues and to explore the tissular fate of nanomaterials. Magnetic nanoparticles will be safely introduced into therapeutic cells, thus allowing them to be remotely manipulated by external magnets. 3D manipulations of the magnetized cells (patented in 2012) will be used to form tissues with a controlled size and shape through the development of a unique magnetic bioreactor. In a self-integrating all-in-one process, 3D tissue will be shaped from cellular "bricks" without the need for a scaffold. The magnetic tissue will be amenable to mechanical stimulation and in situ imaging at each step of its maturation. The project is inherently multidisciplinary:
1) From a biophysics standpoint, controlled tissue stimulation, forced cell alignment, and mapping of cell-cell forces, will be used to answer pressing questions on the role of physical stresses in cell and tissue functions, such as differentiation.
2) From a regenerative medicine standpoint, this magnetic technology will be applied to cartilage and cardiac tissue repair. The functionality of the constructs and their centimetric size range, combined with a surgeon-friendly tissue handling with a dedicated magnetic tool, and the inherent magnetic resonance imaging properties of the constructs will be major advantages for clinical translation.
3) From a nanomaterials standpoint, nanomaterial fate will be explored in situ using nanomagnetic methods, both at the tissue scale (macroscopic) and at the nanoscale. This is a necessary corollary for the use of nanomaterials in regenerative medicine, and one that is largely unexplored."
Max ERC Funding
1 589 000 €
Duration
Start date: 2015-07-01, End date: 2020-12-31
Project acronym MESOPROBIO
Project Mesoscopic models for propagation in biology
Researcher (PI) Vincent CALVEZ
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary According to biologists, there is a need for quantitative models that are able to cope with the complexity of problems arising in the field of life sciences. Here, complexity refers to the interplay between various scales that are not clearly separate. The great challenge of the MESOPROBIO project is to analyse complex PDE models for biological propagation phenomena at the mesoscale. By analogy with the kinetic theory of gases, this is an intermediate level of description between the microscale (individual-based models) and the macroscale (parabolic reaction-transport-diffusion equations). The specific feature common to all the models involved in the project is the local heterogeneity with respect to a structure variable (velocity, phenotypical trait, age) which requires new mathematical methods. I propose to push analysis beyond classical upscaling arguments and to track the local heterogeneity all along the analysis.
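As a hedged, generic example of the mesoscopic level of description meant here, consider a BGK-type kinetic equation for a density f(t, x, v) of individuals structured by velocity, relaxing towards a normalised, centred velocity equilibrium M(v); in LaTeX notation,

    % a generic kinetic (mesoscopic) model with relaxation towards M(v)
    \partial_t f + v \cdot \nabla_x f \;=\; \frac{1}{\tau}\bigl( M(v)\,\rho_f - f \bigr),
    \qquad \rho_f(t,x) = \int_V f(t,x,v)\,\mathrm{d}v .

In the diffusive (long-time, large-space) limit the macroscopic density formally solves a diffusion equation with diffusivity D = \tau \int_V v \otimes v\, M(v)\,\mathrm{d}v; keeping track of the velocity (or trait, or age) heterogeneity instead of passing directly to such limits is the methodological point made above.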
The biological applications are: concentration waves of bacteria, evolutionary aspects of structured populations (with respect to dispersal ability or life-history traits), and anomalous diffusion. The mathematical challenges are: multiscale analysis of PDEs having different properties in different directions of the phase space, including nonlocal terms (scattering, competition), and possibly lacking basic features of reaction-diffusion equations such as the maximum principle. The outcomes are: travelling waves, accelerating fronts, approximation of geometric optics, nonlocal Hamilton-Jacobi equations, optimal foraging strategies and evolutionary dynamics of phenotypical traits. Emphasis will be placed on quantitative results with strong feedback towards biology.
The project will be conducted in Lyon, a French hub for mathematical biology and hyperbolic equations. There will be close interaction with biologists in order to establish the most appropriate questions to answer. Several collaborations in Europe (UK, Austria) will be developed.
Max ERC Funding
1 091 688 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym MorePheno
Project Collider Phenomenology and Event Generators
Researcher (PI) Håkan Torbjörn Sjöstrand
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE2, ERC-2014-ADG
Summary Collider physics is about exploring the smallest constituents of matter, and unravelling the basic laws of the Universe. Unfortunately there can be a huge gap between a one-line formula of a fundamental theory and the experimental reality it implies. Phenomenology is intended to fill that gap, e.g. to explore the consequences of a theory such that it can be directly compared with data.
Nowhere is the gap more striking than for QCD, the theory of strong interactions, which dominates in most high-energy collisions, like at the LHC (Large Hadron Collider) at CERN. And yet, when such collisions produce hundreds of outgoing particles, calculational complexity is insurmountable. Instead ingenious but approximate QCD-inspired models have to be invented.
Such models are especially powerful if they can be cast in the form of computer code, and combined to provide a complete description of the collision process. An event generator is such a code, where random numbers are used to emulate the quantum mechanical uncertainty that leads to no two collision events being quite identical.
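A heavily simplified, hedged sketch of that random-number core: with a constant emission rate c in the logarithm of the evolution variable, the no-emission (Sudakov-like) probability between scales t and t_max is (t/t_max)^c, so successive, ordered emission scales can be drawn by inverse transform as t = t_max * r^(1/c) with r uniform in (0, 1). The rate and cutoff below are toy values, not anything taken from PYTHIA (Python).

    import random

    def toy_shower(t_start, t_cut, c=0.4, rng=random.random):
        """Ordered emission scales of a toy shower: with emission density c/t,
        P(no emission between t and t_start) = (t/t_start)**c, so the next
        scale is t_start * r**(1/c) for r uniform in (0, 1)."""
        scales = []
        t = t_start
        while True:
            t *= rng() ** (1.0 / c)   # inverse-transform sampling of the no-emission probability
            if t < t_cut:             # stop at the low-scale cutoff
                break
            scales.append(t)
        return scales

    random.seed(1)
    event = toy_shower(t_start=1.0e4, t_cut=1.0)   # toy units
    print(len(event), "emissions at scales", [round(t, 2) for t in event])

Real generators replace the constant rate by the full QCD splitting kernels and running coupling, typically via the veto algorithm, but the event-by-event randomness enters in exactly this way.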
The Principal Investigator is the main author of PYTHIA, the most widely used event generator of the last 30 years and vital for physics studies at the LHC. It is in a state of continuous extension: new concepts are invented, new models developed, new code written, to provide an increasingly accurate understanding of collider physics. But precise LHC data have put demands on far more precise descriptions, and have also shown that some models need to be rethought from the ground up.
This project, at its core, is about conducting more frontline research with direct implications for event generators, embedded in a broader phenomenology context. In addition to the PI, the members of the theoretical high energy physics group in Lund and of the PYTHIA collaboration will participate in this project, as well as graduate students and postdocs.
Max ERC Funding
1 990 895 €
Duration
Start date: 2015-11-01, End date: 2020-10-31
Project acronym Nano Harvest
Project Flexible nanowire devices for energy harvesting
Researcher (PI) Maria Tchernycheva
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary The goal of NanoHarvest is to explore novel solutions for flexible photovoltaic and piezoelectric converters enabled by semiconductor nanowires. The first objective is to demonstrate an innovative concept of flexible solar cells based on free-standing polymer-embedded nanowires which can be applied to almost any supporting material such as plastic, metal foil or even fabrics. The second objective is to develop high-efficiency flexible and compact piezo-generators based on ordered arrays of nanowire heterostructures. The crucial ingredient - and also the common basis - of the two proposed research axes is the advanced nanowire heterostructures: we will develop nanowires with new control-by-design functionalities by engineering their structure at the nanoscale. The main focus of NanoHarvest will be on the III-nitride semiconductors, which are characterized by a strong piezoelectric response and have also demonstrated their ability for efficient photon harvesting in the blue and green parts of the solar spectrum. Our strategy is to address the physical mechanisms governing the energy conversion from the single nanowire level up to the macroscopic device level. The deep understanding gained at the nanoscale will guide the optimization of the device architecture, of the material growth and of the fabrication process. We will make use of Molecular Beam Epitaxy to achieve ultimate control over the nanowire morphology and composition and to produce control-by-design model systems for fundamental studies and for exploration of device physics. The original transfer procedure of the ordered nanowire arrays onto flexible substrates will enable lightweight flexible devices with ultimate performance, which will serve as energy harvesters for nomadic applications.
Max ERC Funding
1 496 571 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym NanoSOFT
Project Fluid transport at the nano- and meso- scales : from fundamentals to applications in energy harvesting and desalination process
Researcher (PI) Alessandro SIRIA
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE3, ERC-2014-STG
Summary New models of fluid transport are expected to emerge from the confinement of liquids at the nanoscale, where the behaviour of matter strongly departs from common expectations.
This is the field of nanofluidics: taking inspiration from the solutions found by evolved biological systems, new functionalities will emerge from the nanometre scale, with potential applications in ultrafiltration, desalination and energy conversion.
Nevertheless, advancing our fundamental understanding of fluid transport on the smallest scales requires mass and ion dynamics to be ultimately characterized across channels with dimensions close to the molecular size. A major challenge for nanofluidics thus lies in building distinct and well-controlled nanochannels, amenable to the systematic exploration of their properties.
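One number that makes the "dimensions close to the molecular size" regime concrete is the Debye screening length, which sets the thickness of the charged liquid layer at a channel wall; the short Python sketch below is a generic textbook estimate (not one of the project's set-ups) showing that it ranges from sub-nanometre to roughly ten nanometres over common salt concentrations, i.e. comparable to the channels of interest.

import math

def debye_length_nm(c_mol_per_L, T=298.15):
    """Debye length of a symmetric 1:1 electrolyte in water (c in mol/L)."""
    eps0 = 8.8541878128e-12      # vacuum permittivity, F/m
    eps_r = 78.5                 # relative permittivity of water (approx.)
    kB = 1.380649e-23            # Boltzmann constant, J/K
    e = 1.602176634e-19          # elementary charge, C
    NA = 6.02214076e23           # Avogadro constant, 1/mol
    n = c_mol_per_L * 1e3 * NA   # number density of each ion species, 1/m^3
    lam = math.sqrt(eps0 * eps_r * kB * T / (2 * n * e**2))
    return lam * 1e9

for c in (1e-3, 1e-2, 1e-1, 1.0):
    print(f"c = {c:6.3f} M  ->  Debye length ~ {debye_length_nm(c):5.2f} nm")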
This project will tackle several complementary challenges. On the one hand, the realization of a new kind of fluidic device allowing the study of fluid and ion transport at the nanoscale: these new experimental devices will be obtained by using nanostructures as building blocks, as already shown by realising a fluidics set-up based on transmembrane nanotubes; in parallel, a dedicated platform for the characterization of fluid transport will be developed, based on electrokinetics and optical detection set-ups. On the other hand, profiting from such experimental set-ups, I will look for the limits of the classical description of fluid dynamics, focusing on new functionalities emerging from exotic behaviour of fluids at the nanometre level. This will be done by studying different kinds of nanofluidic set-ups, such as carbon and boron-nitride nanotubes, ultrathin pierced graphene and h-BN sheets, and composite materials.
I aim to create a link between fundamental research on soft matter and nanoscience/condensed matter, with particular attention to the energy production domain, ensuring a fruitful transfer between the fundamental findings and new industrial applications.
Max ERC Funding
1 494 000 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym NAPOLI
Project Nanoporous Asymmetric Poly(Ionic Liquid) Membrane
Researcher (PI) Jiayin Yuan
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary Nanoporous polymer membranes (NPMs) play a crucial, irreplaceable role in fundamental research and industrial usage, including separation, filtration, water treatment and environmental sustainability. The vast majority of advances concentrate on neutral or weakly charged polymers, such as the ongoing interest in self-assembled block copolymer NPMs. There is an urgent need to process polyelectrolytes into NPMs that critically combine a high charge density with nanoporous morphology. Additionally, engineering structural asymmetry/gradient simultaneously in the membrane is equally beneficial, as it would improve membrane performance by building up compartmentalized functionalities. For example, a gradient in pore size provides high pressure resistance coupled with improved selectivity. Nevertheless, developing such highly charged, nanoporous and gradient membranes has remained a challenge, owing to the water solubility and ionic nature of conventional polyelectrolytes, which are poorly processable into a nanoporous state via common routes.
Recently, my group first reported an easy-to-perform production of nanoporous polyelectrolyte membranes. Building on this important but rather preliminary advance, I propose to develop the next generation of NPMs, nanoporous asymmetric poly(ionic liquid) membranes (NAPOLIs). The aim is to produce NAPOLIs bearing diverse gradients, understand the unique transport behavior, improve the membrane stability/sustainability/applicability, and finally apply them in the active fields of energy and environment. Both the currently established route and the newly proposed ones will be employed for the membrane fabrication.
This proposal is inherently interdisciplinary, as it must combine polymer chemistry/engineering, physical chemistry, membrane/materials science, and nanoscience for its success. This research will fundamentally advance nanoporous membrane design for a wide scope of applications and reveal unique physical processes in an asymmetric context.
Max ERC Funding
1 500 000 €
Duration
Start date: 2015-03-01, End date: 2021-01-31
Project acronym NSECPROBE
Project Probing quantum fluctuations of single electronic channels in model interacting systems
Researcher (PI) Carles Oriol Altimiras Martin
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE3, ERC-2014-STG
Summary The fluctuation-dissipation theorem is a prominent milestone in physics: it links the dissipative response of a physical system to its fluctuations, and provides a microscopic understanding of macroscopic irreversibility. Recent theoretical advances, which have generalized the original fluctuation-dissipation theorem to non-linear quantum systems even far from equilibrium, call for an experimental test; providing such a test is the aim of the project. We will measure the current fluctuations and dissipative response of driven quantum systems whose non-linearity arises from strong interactions. We will exploit the flexibility offered by nano-patterned high purity 2D electron gases in order to realize single electron channels in different regimes: 1/ interacting strongly with a single electromagnetic mode (Dynamical Coulomb Blockade of a quantum point contact), 2/ interacting with a single magnetic impurity (Kondo effect in quantum dots), 3/ driving the 2D gas in the fractional quantum Hall effect where current is carried by strongly correlated 1D channels prototypical of Luttinger liquids. Last, we will address a fundamental issue raised in the early days of quantum mechanics: how long does it take for a particle to cross a classically forbidden barrier? While Wigner-Smith’s theorem links the issue to the density fluctuations within the barrier, the fluctuation-dissipation theorem links it further to a quantum relaxation resistance. A full investigation of fluctuation-dissipation relations including quantum effects requires measurements at frequencies hf > k_B T. With the available dilution refrigeration techniques, this implies measuring in the few GHz range. Since quantum conductors have an impedance h/e^2 ~ 25.8 kΩ, much larger than the 50 Ω impedance of microwave components, new microwave methods able to deal with large impedance values will be developed. They will be based on the extension to finite magnetic field of the wide-band impedance matching methods recently developed by the PI.
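To put numbers on the hf > k_B T requirement and on the impedance mismatch, the short Python sketch below (illustrative arithmetic only) evaluates the crossover frequency k_B T / h at typical dilution-refrigerator temperatures and the resistance quantum h/e^2.

# Quantum-vs-thermal crossover frequency and resistance quantum (illustrative arithmetic)
h = 6.62607015e-34      # Planck constant, J*s
kB = 1.380649e-23       # Boltzmann constant, J/K
e = 1.602176634e-19     # elementary charge, C

for T_mK in (10, 20, 50, 100):
    f_GHz = kB * (T_mK * 1e-3) / h / 1e9
    print(f"T = {T_mK:3d} mK  ->  k_B*T/h = {f_GHz:4.2f} GHz (quantum regime requires f above this)")

print(f"resistance quantum h/e^2 = {h / e**2 / 1e3:.1f} kOhm, versus 50 Ohm microwave lines")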
Max ERC Funding
1 500 000 €
Duration
Start date: 2015-05-01, End date: 2020-04-30
Project acronym NuQFT
Project The Hall Plateau Transition and non-unitary Quantum Field Theory
Researcher (PI) Hubert Saleur
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE2, ERC-2014-ADG
Summary I propose to solve the Quantum Field Theory (QFT) describing the transition between plateaus of quantized Hall conductance in the Integer Quantum Hall Effect (IQHE).
The existence of the plateaus and their topological origin are certainly well understood. In sharp contrast, the transition, which mixes the effects of disorder, magnetic field and possibly interactions, remains very mysterious. Numerical studies of lattice models are plagued by disorder. The QFT description involves physics at very strong coupling, and requires a non-perturbative solution before quantitative predictions can be made.
Finding such a solution is very difficult because the QFT for the plateau transition is ‘non-unitary’ - it involves a non-Hermitian ‘Hamiltonian’. Non-unitary QFT is a challenging, almost unexplored topic that must first be developed before the plateau transition can be addressed.
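As a minimal numerical illustration of what a non-Hermitian 'Hamiltonian' entails (a two-level toy matrix invented here for illustration, not a model from the project), the Python sketch below shows that such an operator can have an entirely real spectrum in one parameter regime and a complex one in another, behaviour outside ordinary unitary intuition.

import numpy as np

def toy_spectrum(g, k):
    """Eigenvalues of a toy non-Hermitian 2x2 matrix with balanced gain/loss g and coupling k."""
    H = np.array([[1j * g, k],
                  [k, -1j * g]])
    assert not np.allclose(H, H.conj().T)    # genuinely non-Hermitian
    return np.linalg.eigvals(H)              # analytically: +/- sqrt(k**2 - g**2)

print("g = 0.5, k = 1.0 ->", toy_spectrum(0.5, 1.0))   # real pair of eigenvalues
print("g = 1.5, k = 1.0 ->", toy_spectrum(1.5, 1.0))   # complex-conjugate pair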
I propose to carry out this task with a cross-disciplinary strategy that uses ideas and tools from conformal field theory, statistical mechanics, and mathematics. Key to this strategy is a new and powerful way of analyzing lattice regularizations of the QFTs by focussing on their algebraic properties directly on the lattice, with a mix of advanced representation theory and numerical techniques.
The results - in particular, concerning conformal invariance and renormalization group flows in the non-unitary case - will then be used to solve the QFT models for the plateau transition in the IQHE and in other universality classes of 2D Anderson insulators. This will be a landmark step in our understanding of the localization/delocalization transition in two dimensions, and allow a long-delayed comparison of theory with experiment. The results will, more generally, impact many other areas of physics where non-unitary QFT plays a central role - from disordered systems of statistical mechanics to the string theory side of the AdS/CFT duality, to the effective description of open quantum systems.
Max ERC Funding
2 098 158 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym P-MEM-NMR
Project Structure of paramagnetic integral membrane metalloproteins by MAS-NMR
Researcher (PI) Guido Pintacuda
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE4, ERC-2014-CoG
Summary Integral membrane metalloproteins are involved in the transport and homeostasis of metal ions, as well as in key redox reactions that have a tremendous impact on many fields within life sciences, environment, energy, and industry.
Most of our understanding of fine details of biochemical processes derives from atomic or molecular structures obtained by diffraction methods on single crystal samples. However, in the case of integral membrane systems, single crystals large enough for X-ray diffraction cannot be easily obtained, and the problem of structure elucidation is largely unsolved.
We have recently pioneered a breakthrough approach using Magic-Angle Spinning Nuclear Magnetic Resonance (MAS-NMR) for the atomic-level characterization of paramagnetic materials and complex biological macromolecules. The proposed project aims to leverage these new advances through a series of new concepts i) to improve the resolution and sensitivity of MAS-NMR from nuclei surrounding a paramagnetic metal ion, such as cobalt, nickel and iron, and ii) to extend its applicability to large integral membrane proteins in lipid membrane environments. With these methods, we will enable the determination of structure-activity relationships in integral membrane metalloenzymes and transporters, by combining the calculation of global structure and dynamics with measurement of the electronic features of metal ions.
These goals require a leap forward with respect to today’s protocols, and we propose to achieve this through a combination of innovative NMR experiments and isotopic labeling, faster MAS rates and high magnetic fields. As outlined here, the approaches go well beyond the frontier of current research. The project will yield a broadly applicable method for the structural characterization of essential cellular processes and thereby will provide a powerful tool to solve challenges at the forefront of molecular and chemical sciences today.
Max ERC Funding
2 499 375 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym PHOROSOL
Project Integrating photochemistry in nanoconfined carbon-based porous materials in technological processes
Researcher (PI) Maria Concepcion Ovin Ania
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE8, ERC-2014-CoG
Summary The aim of this proposal is to exploit the potentialities of confined pore spaces in technological processes related to applied photochemistry for gas sensing, energy conversion and environmental protection. I will focus on new light-responsive nanoporous carbons whose characteristics can be tailored at two levels (pore void at the nanometric scale and surface functionalization) during the synthesis to modulate their selectivity towards a given molecule (i.e. gas sensing) or their efficiency in a given reaction (i.e. energy conversion, environmental protection).
The dual nature of the nanoporous carbons with ad-hoc designed pore architectures acting as nanoreactors (confinement) and photoactivity defined by composition (chromophoric groups) offers new perspectives in the fields of light harvesting and applied photochemistry, and raises a multitude of fundamental questions that are worth investigating to exploit this concept. Understanding of the confinement effects and the light/solid/molecule interactions is the key for integrating carbon nanostructures in a whole new array of applications. An example would be the design of multifunctional spatially organized photoactive carbons with high electron mobility, multimodal pore systems and chromophoric groups. These systems are expected to show enhanced diffusion and mass transport, with great potential in gas sensing applications where a fast, sensitive and selective response is needed.
I plan to work with functionalized light-responsive polymeric nanoporous carbons (mainly gels, graphene-oxide frameworks). A smart design of hybrid nanostructures introducing other confined photoactive elements will also be studied. The outcome of the proposal is to understand the fundamentals of photochemistry of carbon nanostructures for the implementation of best performing materials in different technological processes related to photochemical energy conversion for H2 and O2 generation, gas sensing and environmental protection.
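As a back-of-the-envelope anchor for the photochemical H2 and O2 generation target, the Python sketch below compares visible-photon energies with the 1.23 eV thermodynamic minimum for water splitting; these are generic textbook numbers, not properties of the materials to be developed, and real devices need additional overpotential.

# Photon energy across the visible range vs. the 1.23 eV water-splitting minimum
H_PLANCK = 6.62607015e-34    # Planck constant, J*s
C_LIGHT = 2.99792458e8       # speed of light, m/s
EV = 1.602176634e-19         # joules per eV
WATER_SPLITTING_EV = 1.23    # thermodynamic minimum, excludes kinetic overpotentials

for wavelength_nm in (400, 450, 500, 550, 600, 700):
    e_ev = H_PLANCK * C_LIGHT / (wavelength_nm * 1e-9) / EV
    margin = e_ev - WATER_SPLITTING_EV
    print(f"{wavelength_nm} nm photon: {e_ev:.2f} eV ({margin:+.2f} eV relative to 1.23 eV)")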
Max ERC Funding
1 994 180 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym PLANETDIVE
Project Planetary diversity: the experimental terapascal perspective
Researcher (PI) Guillaume, Marie, Bernard Fiquet
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE9, ERC-2014-ADG
Summary The discovery of extra-solar planets orbiting other stars has been one of the major breakthroughs in astronomy of the past decades. Exoplanets are common objects in the universe and planetary systems seem to be more diverse than originally predicted. The use of radius-mass relationships has been generalized as a means for understanding exoplanet compositions, in combination with equations of state of main planetary components extrapolated to terapascal (TPa) pressures.
In the most current description, Earth-like planets are assumed to be fully differentiated and made of a metallic core surrounded by a silicate mantle, and possibly volatile elements at their surfaces in supercritical, liquid or gaseous states. This model is currently used to infer mass-radius relationships for planets up to 100 Earth masses but rests on poorly known equations of state for iron alloys and silicates, as well as even more poorly known melting properties at TPa pressures.
This proposal thus aims at providing experimental references for equations of state and melting properties up to the TPa pressure range, with the combined use of well-calibrated static experiments (laser-heated diamond-anvil cells) and laser-compression experiments capable of developing several Mbar pressures at high temperature, coupled with synchrotron or XFEL X-ray sources. I propose to establish benchmarking values for equations of state, phase diagrams and melting curve relations at unprecedented P-T conditions. The proposed experiments will be focused on simple silicates, oxides and carbides (SiO2, MgSiO3, MgO, SiC), iron alloys (Fe-S, Fe-Si, Fe-O, Fe-C) and more complex metals (Fe,Si,O,S) and silicates (Mg,Fe)SiO3. In this proposal, I will address key questions concerning planets with 1-5 Earth masses as well as fundamental questions about the existence of heavy rocky cores in giant planets.
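To illustrate why the quality of the equations of state matters at these conditions, the Python sketch below evaluates a third-order Birch-Murnaghan equation of state with rough MgO-like parameters (K0 of order 160 GPa, K0' near 4; values quoted purely for illustration) and shows how strongly the extrapolated TPa-range pressure depends on a modest change of K0'.

def birch_murnaghan_3(V0_over_V, K0_GPa, K0_prime):
    """Third-order Birch-Murnaghan pressure in GPa at compression x = V0/V."""
    x = V0_over_V
    return (1.5 * K0_GPa * (x ** (7.0 / 3.0) - x ** (5.0 / 3.0))
            * (1.0 + 0.75 * (K0_prime - 4.0) * (x ** (2.0 / 3.0) - 1.0)))

K0, K0p = 160.0, 4.0          # illustrative MgO-like parameters; published fits differ
for x in (1.5, 2.0, 2.5, 3.0):
    p_a = birch_murnaghan_3(x, K0, K0p) / 1000.0          # in TPa
    p_b = birch_murnaghan_3(x, K0, K0p + 0.5) / 1000.0    # in TPa, with K0' shifted by 0.5
    print(f"V0/V = {x:.1f}:  P ~ {p_a:.2f} TPa  (vs {p_b:.2f} TPa if K0' = {K0p + 0.5})")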
Max ERC Funding
3 498 938 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym PLANTMOVE
Project Plant movements and mechano-perception: from biophysics to biomimetics
Researcher (PI) Yoel Stephane Forterre
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2014-CoG
Summary How to transport fluids, move solids or perceive mechanical signals without the equivalent of pumps, muscles or nerves? This ongoing challenge, which is relevant from microfluidics to robotics, has long been solved by plants. In this project, I wish to gather my cross-disciplinary background in plant mechanics, soft matter physics and granular materials to address some of the fundamental mechanisms used by plants to perceive mechanical stimuli and generate motion. The project focuses on three major issues in plant biophysics, which all involve the coupling between a fluid (water in the vascular network or in the plant cell, cellular cytoplasm) and a solid (plant cell wall, starch grains in gravity-sensing cells):
(i) How mechanical signals are perceived and transported within the plant and what is the role of the water pressure in this long-distance signalling.
(ii) How plants sense and respond to gravity and how this response is related to the granular nature of the sensor at the cellular level.
(iii) How plants perform rapid motion and what is the role of osmotic motors and cell wall actuation in this process, using the carnivorous plant Venus flytrap as a paradigm for study.
The global approach will combine experiments on physical systems mimicking the key features of plant tissue and in situ experiments on plants, in strong collaboration with plant physiologists and agronomists. Experiments will be performed both at the organ level (growth kinematics, response to strain and force stimuli) and at the tissue and cellular level (cell imaging, micro-indentation, cell pressure probe). This multi-disciplinary and multi-scale approach should help to fill the gap in our understanding of basic plant functions and offer new strategies to design smart soft materials and fluids inspired by plant sensors and motility mechanisms.
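As a rough order-of-magnitude reference for the osmotic motors in point (iii) above, the Python sketch below uses the van 't Hoff relation Pi = cRT to show that physiologically plausible solute concentrations translate into pressures of several bars; this is a generic estimate, not a measurement planned in the project.

# Osmotic pressure from the van 't Hoff relation: Pi = c * R * T
R = 8.314        # gas constant, J/(mol*K)
T = 298.0        # temperature, K
for c_mM in (50, 100, 200, 500):         # solute concentration in mmol/L (= mol/m^3)
    pi_MPa = c_mM * R * T / 1e6
    print(f"{c_mM:3d} mM  ->  Pi ~ {pi_MPa:.2f} MPa ({pi_MPa * 10:.1f} bar)")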
Max ERC Funding
1 933 996 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym PLASMA
Project Running away and radiating
Researcher (PI) Tünde-Maria Fülöp
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Consolidator Grant (CoG), PE2, ERC-2014-CoG
Summary Particle acceleration and radiation in plasmas have a wide variety of applications, ranging from cancer therapy and lightning initiation, to the improved design of fusion devices for large scale energy production. The goal of this project is to build a flexible ensemble of theoretical and numerical models that describes the acceleration processes and the resulting fast particle dynamics in two focus areas: magnetic fusion plasmas and laser-produced plasmas. This interdisciplinary approach is a new way of studying charged particle acceleration. It will lead to a deeper understanding of the complex interactions that characterise fast particle behaviour in plasmas. Plasmas are complex systems, with many kinds of interacting electromagnetic (EM) waves and charged particles. For such a system it is infeasible to build one model which captures both the small scale physics and the large scale phenomena. Therefore we aim to develop several complementary models, in one common framework, and make sure they agree in overlapping regions. The common framework will be built layer-by-layer, using models derived from first principles in a systematic way, with theory closely linked to numerics and validated by experimental observations. The key object of study is the evolution of the velocity-space particle distribution in time and space. The main challenge is the strong coupling between the distribution and the EM-field, which requires models with self-consistent coupling of Maxwell’s equations and kinetic equations. For the latter we will use Vlasov-Fokker-Planck solvers extended with advanced collision operators. Interesting aspects include non-Maxwellian distributions, instabilities, shock-wave formation and avalanches. The resulting theoretical framework and the corresponding code-suite will be a novel instrument for advanced studies of charged particle acceleration. Due to the generality of our approach, the applicability will reach far beyond the two focus areas.
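Schematically, the self-consistent system referred to above couples a kinetic equation for each species' distribution function f(x, p, t) to Maxwell's equations; a generic form (the collision operator and source terms depend on the focus area) reads

\frac{\partial f}{\partial t} + \mathbf{v}\cdot\nabla_{\mathbf{x}} f + q\,(\mathbf{E} + \mathbf{v}\times\mathbf{B})\cdot\nabla_{\mathbf{p}} f = C_{\mathrm{FP}}\{f\} + S,

with a Fokker-Planck collision operator C_FP and sources S (e.g. avalanche generation of fast particles), closed by Maxwell's equations through the charge and current densities \rho = \sum_s q_s \int f_s\,\mathrm{d}^3p and \mathbf{j} = \sum_s q_s \int \mathbf{v}\, f_s\,\mathrm{d}^3p.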
Max ERC Funding
1 948 750 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym preQFT
Project Strategic Predictions for Quantum Field Theories
Researcher (PI) John Joseph Carrasco
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE2, ERC-2014-STG
Summary Ambitious Questions:
* How does the relatively calm macroscopic universe survive and emerge from the violent quantum fluctuations of its underlying microphysics?
* How do classical notions of space and time emerge from fundamental principles, and what governs their evolution?
These questions are difficult to answer---perhaps impossible given current ideas and frameworks---but I believe a strategic path forward is to thoroughly understand the quantum predictions of our Yang-Mills and Gravity theories, and unambiguously identify their non-perturbative UV completions. The first step forward, and the goal of this project, is to move towards the trivialization of perturbative calculations.
Consider the notion of failure-point calculations -- calculations that push modern methods and world-class technologies to their breaking point. Such calculations, for their very success, engender the chance of cultivating and exploiting previously unappreciated structure. In doing so, such calculations advance the state of the art to some degree, depending on the class of problems and the nature of the solution. With scattering amplitude calculations, we battle against (naive) combinatorial complexity as we go either higher in order of quantum correction (loop order), or higher in the number of external particles scattering (multiplicity), so our advances must be revolutionary to lift us forward. Yet I and others have shown that the very complications of generalized gauge freedom promise a potential salvation at least as powerful as the complications that confront us. The potential reward is enormous, a rewriting of perturbative quantum field theory to make these principles manifest and calculation natural, an ambitious but now realistic goal. The path forward is optimized through strategic calculations.
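To give a feel for the combinatorial growth with multiplicity, the Python sketch below counts tree-level diagrams in a cubic scalar toy theory, where the number of distinct trees with n external legs is (2n-5)!!; this is only a rough proxy for the gauge and gravity amplitudes of interest, quoted to show the growth rather than to describe the project's methods.

def double_factorial(k):
    """k!! for odd k >= -1."""
    result = 1
    while k > 1:
        result *= k
        k -= 2
    return result

# Distinct tree-level diagrams with n external legs in a cubic scalar theory: (2n - 5)!!
for n in range(4, 13):
    print(f"n = {n:2d} external legs  ->  {double_factorial(2 * n - 5):>12,d} tree diagrams")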
Max ERC Funding
1 299 958 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym PRIMCHEM
Project Primitive chemistry in planetary atmospheres: From the upper atmosphere down to the surface
Researcher (PI) Nathalie, Marie Carrasco
Host Institution (HI) UNIVERSITE DE VERSAILLES SAINT-QUENTIN-EN-YVELINES.
Call Details Starting Grant (StG), PE9, ERC-2014-STG
Summary The presence of organic compounds was essential to the emergence of life on Earth 3.5 to 3.8 billion years ago. Such compounds may have had several different origins; amongst them the ocean-atmosphere coupled system (the primordial soup theory), or exogenous inputs by meteorites, comets and Interplanetary Dust Particles.
Titan, the largest moon of Saturn, is the best-known observable analogue of the Early Earth. I recently identified a totally new source of prebiotic material for this system: the upper atmosphere. Nucleobases have been highlighted as components of the solid aerosol analogues produced in a reactor mimicking the chemistry that occurs in the upper atmosphere. The specificity of this external layer is that it receives harsh solar UV radiation, enabling the chemical activation of molecular nitrogen N2 and driving a nitrogen-rich organic chemistry of high prebiotic interest.
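The "harsh solar UV" requirement can be made quantitative with a simple photon-energy conversion: breaking or ionizing N2 takes roughly 9.8 eV and 15.6 eV respectively, corresponding to far- and extreme-UV wavelengths. The Python sketch below performs this textbook estimate; the numbers are generic reference values, not results of the project.

# Wavelength thresholds for photo-activating N2, from E = h*c / lambda
HC_EV_NM = 1239.84              # h*c expressed in eV*nm
N2_DISSOCIATION_EV = 9.79       # approximate N2 bond dissociation energy
N2_IONIZATION_EV = 15.58        # approximate N2 ionization energy

for label, e_ev in (("dissociation", N2_DISSOCIATION_EV), ("ionization", N2_IONIZATION_EV)):
    print(f"N2 {label} ({e_ev:.2f} eV) requires photons with lambda < {HC_EV_NM / e_ev:.0f} nm")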
As organic solid aerosols are initiated in the upper atmosphere of Titan, a new question is raised that I will address: what is the evolution of these organic prebiotic seeds when sedimenting down to the surface? Aerosols will indeed undergo the bombardment of charged particles, further UV radiation, and/or coating by condensable species at lower altitudes. I expect possible changes to the aerosols themselves, but also to the budget of the gas phase through emissions of new volatile organic compounds. The aging of the aerosols may therefore impact the whole atmospheric system.
An original methodology will be developed to address this novel issue. The successive aging sequences will be experimentally simulated in chemical reactors combining synchrotron and plasma sources. The interpretation of the experimental results will moreover be supported by modelling of the processes. This complementary approach will make it possible to decipher the aerosol evolution under laboratory conditions and to extrapolate the impact on Titan's atmospheric system.
Max ERC Funding
1 487 500 €
Duration
Start date: 2015-09-01, End date: 2021-08-31
Project acronym PRISTINE
Project High precision isotopic measurements of heavy elements in extra-terrestrial materials: origin and age of the solar system volatile element depletion
Researcher (PI) Frédéric, Pierre, Louis Moynier
Host Institution (HI) INSTITUT DE PHYSIQUE DU GLOBE DE PARIS
Call Details Starting Grant (StG), PE10, ERC-2014-STG
Summary "The objectives of this proposal, PRISTINE (high PRecision ISotopic measurements of heavy elements in extra-Terrestrial materials: origIN and age of the solar system volatile Element depletion), are to develop new cutting edge high precision isotopic measurements to understand the origin of the Earth, Moon and solar system volatile elements and link their relative depletion in the different planets to their formation mechanism. In addition, the understanding of the origin of the volatile elements will have direct consequences for the understanding of the origin of the Earth’s water. To that end, we will approach the problem from two angles: 1) Develop and use novel stable isotope systems for volatile elements (e.g. Zn, Ga, Cu, and Rb) in terrestrial, lunar and meteoritic materials to constrain the origin of solar system’s volatile element depletion 2) Determine the age of the volatile element depletion by using a novel and original approach: calculate the original Rb/Sr ratio of the Solar Nebula by measuring the isotopic composition of the Sun with respect to Sr via the isotopic composition of solar wind implanted in lunar soil grains.
The stable isotope composition (goal #1) will give us new constraints on the mechanisms (e.g. evaporation following a giant impact or incomplete condensation) that have shaped the abundances of the volatile elements in terrestrial planets, while the timing (goal #2) will be used to differentiate between nebular events (early) from planetary events (late). These new results will have major implications on our understanding of the origin of the Earth and of the Moon, and they will be used to test the giant impact hypothesis of the Moon and the origin of the Earth’s water."
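For reference, the chronometer behind goal #2 is the standard Rb-Sr system, in which 87Rb decays to 87Sr with a decay constant of roughly 1.4 x 10^-11 per year; the textbook isochron relation (quoted here only as background, not as a result of the proposal) is

\left(\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}}\right)_{\mathrm{measured}} = \left(\frac{^{87}\mathrm{Sr}}{^{86}\mathrm{Sr}}\right)_{0} + \left(\frac{^{87}\mathrm{Rb}}{^{86}\mathrm{Sr}}\right)\left(e^{\lambda t} - 1\right), \qquad \lambda(^{87}\mathrm{Rb}) \approx 1.4\times10^{-11}\ \mathrm{yr}^{-1},

which links measured Sr isotope compositions, Rb/Sr ratios and time, and is how a solar Sr isotope measurement can be turned into a constraint on the Rb/Sr ratio of the Solar Nebula and on the timing of the volatile element depletion.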
Max ERC Funding
1 487 500 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym PROMISE
Project Origins of the Molecular Cloud Structure
Researcher (PI) Jouni Tapani Kainulainen
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Starting Grant (StG), PE9, ERC-2014-STG
Summary Understanding the physical processes that control the life-cycle of the interstellar medium (ISM) is one of the key themes in the astrophysics of galaxies today. This importance originates from the role of the ISM as the birthplace of new stars, and therefore, as an indivisible component of galaxy evolution. Exactly how the conversion of the ISM to stars takes place is intricately linked to how the internal structure of the cold, molecular clouds in the ISM forms and evolves. Despite this pivotal role, our picture of the molecular cloud structure has a fundamental gap: it is based largely on observations of low-mass molecular clouds. Yet, it is the massive, giant molecular clouds (GMCs) in which most stars form and which have the greatest impact on the ISM of galaxies. I present a program that will fill this gap and make profound progress in the field. We have developed a new observational technique that provides an unparalleled view of the structure of young GMCs. I have also developed a powerful tool to study the most important structural characteristics of molecular clouds, e.g., the probability distribution of volume densities, which have not been accessible before. With this program, the full potential of these tools will be put into use. We will produce a unique, high-fidelity column density data set for a statistically interesting volume in the Galaxy, including thousands of molecular clouds. The data set will be unmatched in its quality and extent, providing an unprecedented basis for statistical studies. We will then connect this outstanding observational view with state-of-the-art numerical simulations. This approach allows us to address the key question in the field: Which processes drive the structure formation in massive molecular clouds, and how do they do it? Most crucially, we will create a new, observationally constrained framework for the evolution of the molecular cloud structure over the entire mass range of molecular clouds and star formation in the ISM.
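For orientation, the probability distribution of volume densities mentioned above is commonly parameterised, for isothermal supersonic turbulence, as a lognormal in the logarithmic density s = ln(rho/rho_0); the expressions below are the standard literature forms and are quoted only as context, not as results of this project:

    p(s) = \frac{1}{\sqrt{2\pi\sigma_{s}^{2}}}\exp\!\left[-\frac{(s-s_{0})^{2}}{2\sigma_{s}^{2}}\right], \qquad s_{0} = -\tfrac{1}{2}\sigma_{s}^{2}, \qquad \sigma_{s}^{2} \simeq \ln\!\left(1+b^{2}\mathcal{M}^{2}\right)

Here M is the sonic Mach number and b ≈ 0.3–1 depends on the turbulence forcing; deviations from this lognormal, such as power-law tails at high densities, are among the structural signatures that wide, high-fidelity column-density data sets can constrain.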
Max ERC Funding
1 266 750 €
Duration
Start date: 2016-02-01, End date: 2021-01-31
Project acronym QAffine
Project Representations of quantum affine algebras and applications
Researcher (PI) David Christophe Hernandez
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE1, ERC-2014-CoG
Summary Quantum affine algebras are important examples of Drinfeld-Jimbo quantum groups. They can be defined as quantizations of affine Kac-Moody algebras or as affinizations of finite type quantum groups (Drinfeld Theorem).
The representation theory of quantum affine algebras is very rich. It has been studied intensively during the past twenty-five years from different points of view, in particular in connection with various fields in mathematics and physics, such as geometry (geometric representation theory, geometric Langlands program), topology (invariants in small dimension), combinatorics (crystals, positivity problems) and theoretical physics (Bethe Ansatz, integrable systems).
In particular, the category C of finite-dimensional representations of a quantum affine algebra is one of the most studied objects in quantum group theory. However, many important and fundamental questions are still unsolved in this field. The aim of the research project is to make significant advances in the understanding of the category C as well as of its applications in the following five directions, which seem to us the most promising for this field in the coming years:
1. Asymptotical representations and applications to quantum integrable systems,
2. G-bundles on elliptic curves and quantum groups at roots of 1,
3. Categorifications (of cluster algebras and of quantum groups),
4. Langlands duality for quantum groups,
5. Proof of (geometric) character formulas and applications.
The resources would be used for the following:
(1) Hiring of 2 PhD students (in 2015 and 2017).
(2) Hiring of 2 Postdocs (in 2015 and 2017).
(3) Invitations and travel for ongoing and future scientific collaborations.
(4) Organization of a summer school in Paris on quantum affine algebras.
Max ERC Funding
1 182 000 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym QUEST
Project QUantum Hall Edge State Tunnelling spectroscopy
Researcher (PI) Benjamin Pierre Alexis Sacépé
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE3, ERC-2014-STG
Summary The quantum nature of an electronic fluid is ubiquitous in many solid-state systems subject to correlations or confinement. This is particularly true for two-dimensional electron gases (2DEGs) in which fascinating quantum states of matter, such as the integer and fractional quantum Hall (QH) states, arise under strong magnetic fields. The understanding of QH systems relies on the existence of one-dimensional (1D) conducting channels that propagate unidirectionally along the edges of the system, following the confining potential. Because the 2DEGs commonly built in semiconducting heterostructures are buried, the rich real-space structure of this 1D electronic fluid and its energy spectrum remain largely unexplored.
This project consists of exploring, at the local scale, the intimate link between the spatial structure of QH edge states, coherent transport and the coupling with superconductivity at interfaces. We will use graphene as a surface-accessible 2DEG to perform a pioneering local investigation of normal and superconducting transport through QH edge states. A new and unique hybrid Atomic Force Microscope and Scanning Tunneling Microscope (STM) operating in the extreme conditions required for this physics, i.e. below 0.1 kelvin and up to 14 teslas, will be developed and will allow unprecedented access to the edge of a graphene flake where QH edge states propagate.
Overall, the original combination of magnetotransport measurements with scanning tunnelling spectroscopy will solve fundamental questions on the rich real-space structure of integer and fractional QH edge states impinged by either normal or superconducting electrodes. Our world-unique approach, which will provide the first STM imaging and spectroscopy of QH edge channels, promises to open a new field of investigation of the local-scale physics of the QH effect.
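As background, the transport and tunnelling quantities at play follow textbook forms (quoted here for context, not as project results): a QH state at filling factor nu carries a Hall conductance

    \sigma_{xy} = \nu\,\frac{e^{2}}{h}

through its edge channels, and tunnelling from a metallic tip into a fractional edge at nu = 1/(2n+1) is predicted by chiral-Luttinger-liquid theory to be non-Ohmic,

    I \propto V^{1/\nu}, \qquad \frac{dI}{dV} \propto V^{1/\nu - 1},

a power-law signature that local STM spectroscopy along the edge could probe directly.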
Max ERC Funding
1 761 412 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym QUTE
Project Quantum Tensor Networks and Entanglement
Researcher (PI) Frank Paul Bernard Verstraete
Host Institution (HI) UNIVERSITEIT GENT
Call Details Consolidator Grant (CoG), PE2, ERC-2014-CoG
Summary One of the major challenges in theoretical physics is the development of systematic methods for describing and simulating quantum many-body systems with strong interactions. Given the huge experimental progress and technological potential in manipulating strongly correlated atoms and electrons, there is a pressing need for a better theory.
The study of quantum entanglement holds the promise of being a game changer for this question. By mapping out the entanglement structure of the low-energy wavefunctions of quantum spin systems on the lattice, the prototypical example of strongly correlated systems, we have found that the associated wavefunctions can be very well modeled by a novel class of variational wavefunctions, called tensor network states. Tensor networks are changing the ways in which strongly correlated systems can be simulated, classified and understood: as opposed to the usual many body methods, these tensor networks are generic and describe non-perturbative effects in a very natural way.
The goal of this proposal is to advance the scope and use of tensor networks in several directions, both from the numerical and theoretical point of view. We plan to study the differential geometric character of the manifold of tensor network states and the associated nonlinear differential equations of motion on it, develop post tensor network methods in the form of effective theories on top of the tensor network vacuum, study tensor networks in the context of lattice gauge theories and topologically ordered systems, and investigate the novel insights that tensor networks are providing to the renormalization group and the holographic principle.
Colloquially, we believe that tensor networks and the theory of entanglement provide a basic new vocabulary for describing strongly correlated quantum systems, and the main goal of this proposal is to develop the syntax and semantics of that new language.
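To make the notion of a tensor network state concrete, the toy sketch below builds a random matrix product state and contracts it to obtain amplitudes; the chain length, bond dimension and random tensors are arbitrary placeholders, and the snippet only illustrates the ansatz, it is not one of the project's codes.

    import numpy as np

    # Toy matrix product state (MPS): N sites with physical dimension d,
    # internal (bond) dimension chi; all tensors are random placeholders.
    N, d, chi = 8, 2, 4
    rng = np.random.default_rng(0)
    dims = [1] + [chi] * (N - 1) + [1]
    tensors = [rng.normal(size=(dims[i], d, dims[i + 1])) for i in range(N)]

    def amplitude(config):
        """Contract the MPS to obtain the amplitude <config|psi>."""
        mat = np.ones((1, 1))
        for A, s in zip(tensors, config):
            mat = mat @ A[:, s, :]          # absorb one site at a time
        return mat[0, 0]

    # The full state vector has d**N entries, but the MPS stores only
    # O(N * d * chi**2) numbers -- the point of the tensor-network ansatz.
    psi = np.array([amplitude(np.unravel_index(i, [d] * N)) for i in range(d ** N)])
    psi /= np.linalg.norm(psi)
    print("norm:", float(np.vdot(psi, psi).real))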
Max ERC Funding
1 927 500 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym ReactiveFronts
Project Mixing interfaces as reactive hotspots of porous media flows: theoretical upscaling, experimental imaging and field scale validation
Researcher (PI) Tanguy Eugene Le Borgne
Host Institution (HI) UNIVERSITE DE RENNES I
Call Details Consolidator Grant (CoG), PE8, ERC-2014-CoG
Summary In porous media, mixing interfaces such as contaminant plume fringes or boundaries between water bodies create highly reactive localized hotspots of chemical and microbiological activity, whether in engineered or natural systems. These reactive fronts are characterized by high concentration gradients, complex flow dynamics, variable water saturation, fluctuating redox conditions and multifunctional biological communities. The spatial and temporal variability of velocity gradients is expected to elongate mixing interfaces and steepen concentration gradients, thus strongly affecting biochemical reactivity. However, a major issue with porous media flows is that these essential micro-scale interactions are inaccessible to direct observation. Furthermore, the lack of a validated upscaling framework from fluid- to system-scale represents a major barrier to the application of reactive transport models to natural or industrial problems.
The ambition of the ReactiveFronts project is to address this knowledge gap by setting up a high-level interdisciplinary team that will provide a new theoretical understanding and novel experimental imaging capacities for micro-scale interactions between flow, mixing and reactions and their impact on reactive front kinetics at the system scale. ReactiveFronts will develop an original approach to this long-standing problem, combining theoretical, laboratory and field experimental methods. The focus on reactive interface dynamics, which represents a paradigm shift for reactive transport modelling in porous media, will require the development of original theoretical approaches (WP1) and novel microfluidic experiments (WP2). This will form a strong basis for the study of complex features at increasing spatial scales, including the coupling between fluid dynamics and biological activity (WP4), the impact of 3D flow topologies and chaotic mixing on effective reaction kinetics (WP3), and the field-scale assessment of these interactions (WP5).
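For orientation, the interplay of stretching, diffusion and reaction at such fronts is conventionally written with the advection-diffusion-reaction equation and its dimensionless groups (textbook forms, quoted here as background; the symbols are generic):

    \frac{\partial c_{i}}{\partial t} + \mathbf{u}\cdot\nabla c_{i} = D_{i}\nabla^{2}c_{i} + r_{i}(c_{1},\dots,c_{n}), \qquad Pe = \frac{UL}{D}, \qquad Da = \frac{\tau_{\mathrm{mix}}}{\tau_{\mathrm{react}}}

Large Péclet and Damköhler numbers, where advective stretching of the interface outpaces diffusion and the chemistry is fast compared with mixing, delimit precisely the regime in which velocity-gradient-driven elongation of the front controls the effective kinetics.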
Max ERC Funding
1 998 747 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym ResiBots
Project Robots with animal-like resilience
Researcher (PI) Jean-Baptiste Nicolas Mouret
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Despite over 50 years of research in robotics, most existing robots are far from being as resilient as the simplest animals: they are fragile machines that easily stop functioning in difficult conditions. The goal of this proposal is to radically change this situation by providing the algorithmic foundations for low-cost robots that can autonomously recover from unforeseen damage in a few minutes. The current approach to fault tolerance is inherited from safety-critical systems (e.g. spaceships or nuclear plants). It is inappropriate for low-cost autonomous robots because it relies on diagnostic procedures, which require expensive proprioceptive sensors, and on contingency plans, which cannot cover all the possible situations that an autonomous robot can encounter. It is here contended that trial-and-error learning algorithms provide an alternative approach that requires neither diagnostics nor pre-defined contingency plans. In this project, we will develop and study a novel family of such learning algorithms that make it possible for autonomous robots to quickly discover compensatory behaviors. We will thus shed new light on one of the most fundamental questions of robotics: how can a robot be as adaptive as an animal? The techniques developed in this project will substantially increase the lifespan of robots without increasing their cost and open new research avenues for adaptive machines.
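A minimal sketch of such a trial-and-error recovery loop is given below; the behaviour repertoire, the noisy test function and the upper-confidence selection rule are placeholders chosen for illustration, not the project's actual algorithm.

    import numpy as np

    rng = np.random.default_rng(1)

    # Repertoire of candidate behaviours with prior performance estimates
    # (e.g. gait parameters scored beforehand on the intact robot).
    n_behaviours = 50
    prior_mean = rng.uniform(0.4, 1.0, size=n_behaviours)

    def test_on_damaged_robot(i):
        """Placeholder for one physical trial: the damage degrades some
        behaviours more than others, and the measurement is noisy."""
        damage_penalty = 0.6 if i % 3 == 0 else 0.0   # arbitrary damage model
        return max(0.0, prior_mean[i] - damage_penalty + 0.05 * rng.normal())

    mean = prior_mean.copy()
    var = np.full(n_behaviours, 0.25)
    tried = set()
    target = 0.8
    for trial in range(15):
        # Pick the most promising untried behaviour (upper confidence bound).
        ucb = mean + np.sqrt(var)
        ucb[list(tried)] = -np.inf
        i = int(np.argmax(ucb))
        perf = test_on_damaged_robot(i)
        mean[i], var[i] = perf, 0.01                  # trust the measurement
        tried.add(i)
        print(f"trial {trial}: behaviour {i}, performance {perf:.2f}")
        if perf >= target:
            print("compensatory behaviour found")
            break

The point of the sketch is that recovery emerges from a handful of informed physical trials rather than from a diagnosis of the damage.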
Max ERC Funding
1 499 501 €
Duration
Start date: 2015-05-01, End date: 2020-04-30
Project acronym RotaNut
Project Rotation and Nutation of a wobbly Earth
Researcher (PI) Veronique Dehant
Host Institution (HI) KONINKLIJKE STERRENWACHT VAN BELGIE
Call Details Advanced Grant (AdG), PE10, ERC-2014-ADG
Summary The rotation of the Earth has long been used as a measure of time, and the stars as reference points to determine travellers’ whereabouts on the globe. Today, precise timescales are provided using atomic clocks and precise positioning is determined using geodetic techniques such as GPS, grounded on two reference frames: the terrestrial frame, fixed relative to the Earth and rotating synchronously with the planet, and the celestial frame, which is immobile in space, where artificial satellites such as those of GPS are moving. The relationship between these frames is complicated by the fact that the rotation and orientation of the Earth are subject to irregularities induced by global mass redistributions with time and by external forcing such as the gravitational pull of the Sun and the Moon. With the advance of observation precision, the causes of Earth orientation changes are progressively being identified by geodesists and geophysicists. The term ‘precession’ describes the long-term trend of the orientation of the axis of spin, while ‘nutation’ is the name given to shorter-term periodic variations, which are the prime focus of the present project. The rotation axis of the Earth is moving in space at the level of 1.5 km/year due to precession and shows periodic variations at the level of 600 metres as seen from space in a plane tangent to the pole. Present observations allow scientists to measure these at the sub-centimetre level, enabling them to identify further physics of the Earth’s interior to be taken into account in Earth orientation models, such as the coupling mechanisms at the boundary between the liquid core and the viscoelastic mantle, as well as many other factors (sometimes not yet definitely identified). The proposed research will address many of these and will result in improved models of the global orientation of the Earth with unprecedented, sub-centimetre accuracy.
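The 1.5 km/year figure quoted above can be checked with a back-of-envelope estimate (round numbers only, not project values): the general precession rate is roughly 50.3 arcseconds per year, so the point where the spin axis pierces a sphere of Earth radius R ≈ 6371 km moves by about

    d \approx R\,\theta \approx 6371\ \mathrm{km}\times\frac{50.3}{206265} \approx 1.55\ \mathrm{km\ per\ year}

consistent with the stated value; the several-hundred-metre periodic excursions correspond, by the same conversion, to nutation amplitudes of roughly ten to twenty arcseconds.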
Max ERC Funding
2 500 000 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym SEED
Project Learning to See in a Dynamic World
Researcher (PI) Cristian Sminchisescu
Host Institution (HI) LUNDS UNIVERSITET
Call Details Consolidator Grant (CoG), PE6, ERC-2014-CoG
Summary The goal of SEED is to fundamentally advance the methodology of computer vision by exploiting a dynamic analysis perspective in order to acquire accurate yet tractable models that can automatically learn to sense our visual world, localize still and animate objects (e.g. chairs, phones, computers, bicycles or cars, people and animals), actions and interactions, as well as qualitative geometrical and physical scene properties, by propagating and consolidating temporal information, with minimal system training and supervision. SEED will extract descriptions that identify the precise boundaries and spatial layout of the different scene components, and the manner in which they move, interact, and change over time. For this purpose, SEED will develop novel high-order compositional methodologies for the semantic segmentation of video data acquired by observers of dynamic scenes, by adaptively integrating figure-ground reasoning based on bottom-up and top-down information, and by using weakly supervised machine learning techniques that support continuous learning towards an open-ended number of visual categories. The system will be able not only to recover detailed models of dynamic scenes, but also to forecast future actions and interactions in those scenes, over long time horizons, by contextual reasoning and inverse reinforcement learning. Two demonstrators are envisaged, the first corresponding to scene understanding and forecasting in indoor office spaces, and the second to urban outdoor environments. The methodology emerging from this research has the potential to impact fields as diverse as automatic personal assistance for people, video editing and indexing, robotics, environmental awareness, augmented reality, human-computer interaction, or manufacturing.
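A toy sketch of the temporal consolidation idea described above, i.e. propagating per-pixel class probabilities from the previous frame and fusing them with the current prediction; the integer flow field, the blending weight and the random inputs are placeholder assumptions for illustration, not the project's method.

    import numpy as np

    def warp_probs(prev_probs, flow):
        """Warp per-pixel class probabilities from the previous frame into
        the current frame using an integer (dy, dx) flow field."""
        h, w, _ = prev_probs.shape
        ys, xs = np.mgrid[0:h, 0:w]
        src_y = np.clip(ys - flow[..., 0], 0, h - 1)
        src_x = np.clip(xs - flow[..., 1], 0, w - 1)
        return prev_probs[src_y, src_x]

    def consolidate(current_probs, warped_probs, alpha=0.6):
        """Blend the per-frame prediction with temporally propagated evidence."""
        fused = alpha * current_probs + (1.0 - alpha) * warped_probs
        return fused / fused.sum(axis=-1, keepdims=True)

    # Toy usage: 2 classes, 4x4 frames, a uniform one-pixel flow.
    rng = np.random.default_rng(0)
    prev_probs = rng.dirichlet([1.0, 1.0], size=(4, 4))
    curr_probs = rng.dirichlet([1.0, 1.0], size=(4, 4))
    flow = np.ones((4, 4, 2), dtype=int)
    print(consolidate(curr_probs, warp_probs(prev_probs, flow)).shape)  # (4, 4, 2)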
Max ERC Funding
1 999 412 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym SINGWAVES
Project Singularity formation in nonlinear evolution equations
Researcher (PI) PIERRE HENRI ALEXANDRE RAPHAEL
Host Institution (HI) UNIVERSITE DE NICE SOPHIA ANTIPOLIS
Call Details Consolidator Grant (CoG), PE1, ERC-2014-CoG
Summary "Non linear wave equations are central in the description of many canonical models in physics from nonlinear optics to fluid mechanics. A phenomenon of particular interest is singularity formation which corresponds to the concentration of the energy of the wave packet. The existence and description of such dynamics is still mostly mysterious, but fundamental progress have been made in the past ten years on canonical models like nonlinear wave and Schr\"odinger equations, with in particular the discovery of the fundamental role played by a specific class of nonlinear wave packets: the solitary waves. These very recent works open up a huge field of investigation on problems which were considered out of reach ten years ago. The aim of the SINGWAVES project is to strenghten our research group in the setting of an intense international activity with two main directions of investigation: the construction and classification of singular bubbles for some canonical models like non linear Schr\"odinger equations, the exploration of new deeply nonlinear dynamics in connection with classical models at the frontier of current research."
Max ERC Funding
1 211 055 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym SM-GRAV
Project Gravity, Holography and The Standard Model
Researcher (PI) Ilias Kyritsis
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE2, ERC-2014-ADG
Summary "The main thrust of this proposal is to investigate implications of a recent correspondence (string theory (ST) vs. gauge theory) to the physics beyond the Standard Model (SM) and its coupling to gravity. Instead of relying on the string picture of the unification of all interactions with gravity, I propose to look at its dual version: 4d quantum field
theories (QFT). The different perspective is expected to provide 3 distinct results:
(a) A QFT view of the SM embedding in string theory
(b) Novel phenomena and properties that are hard to see in the string theory picture.
(c) A ""dual"" view that would be valid in non-stringy regimes.
The key idea is that gravity, as observed in nature, is emergent: it is the avatar of a (hidden) large-N (near)
CFT that is interacting with the SM at high energy (the Planck scale). Such an approach provides an appealing UV completion to the SM+gravity: a UV complete four-dimensional QFT. There are, however, many questions that need to be
addressed in order for this setup to be a viable physical theory:
1. Why is the gravitational force four-dimensional (instead of higher-dimensional as suggested by standard holography)?
2. Why does the coupling of the gravitational force to the SM satisfy the equivalence principle to such a high accuracy?
3. What are other universal interactions with the SM model implied in this picture? What are their phenomenological consequences?
4. How can one construct, precise and controllable models for this setup?
5. How is Cosmology emerging in this picture? How do the important problems associated with it get resolved?
SM-GRAV will address all of the above questions using the tools of QFT, of string theory and the AdS-CFT correspondence. The outcome of the proposed research is expected to be a concrete and quantitative model/scenario for the emergence and coupling of the ""gravitational sector fields"" to the SM model and the novel phenomenological implications for particle physics and cosmology."
Max ERC Funding
1 649 238 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym SMART DESIGN
Project Spin-orbit mechanism in adaptive magnetization-reversal techniques, for magnetic memory design
Researcher (PI) Ioan Mihai Miron
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE3, ERC-2014-STG
Summary Compared to existing Random Access Memories, the Magnetic RAM (MRAM) has the advantage of being non-volatile. Though the basic requirements for reading and writing a single memory element are fulfilled, the present approach based on Spin Transfer Torque (STT) suffers from an innate lack of flexibility.
The solution that I propose is based on the discovery of a novel phenomenon, where instead of transferring spin angular momentum from a neighbouring layer, magnetization reversal is achieved by angular momentum transfer directly from the crystal lattice. There is a long list of advantages that this novel approach has compared to STT, but the goal of this project is to focus only on their most generic difference: flexibility.
The distinctive feature of spin-orbit torque (SOT) is that the in-plane current injection geometry decouples the “read” and “write” mechanisms. This decoupling is essential: unlike STT, where the pillar shape of the magnetic trilayer sets the current path, in the case of SOT the composing elements may be shaped separately. The freedom to shape the current distribution makes it possible to spatially modulate the torque exerted on the local magnetization.
The central goal of my project is to explore the new magnetization dynamics, specific to the Spin-Orbit Torque (SOT) geometry, and design novel magnetization switching schemes.
I will begin by tackling the fundamental questions about the origin of SOT and will try to control it by mastering its dependence on the layer structure. Materials with on-demand SOT will serve as a playground for testing a broad range of magnetization reversal techniques. The most successful among them will become the building blocks of complex magnetic objects whose switching behaviour is tightly related to their shape. To study their magnetization dynamics, I plan to build a time-resolved near-field magneto-optical microscope, a unique tool for the ultimate spatial and temporal resolution.
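For orientation, the magnetization dynamics at stake is commonly written as a Landau-Lifshitz-Gilbert equation augmented with damping-like and field-like spin-orbit torque terms (a standard phenomenological form, quoted as background; the torque coefficients are material- and stack-dependent):

    \frac{\partial\mathbf{m}}{\partial t} = -\gamma\,\mathbf{m}\times\mathbf{H}_{\mathrm{eff}} + \alpha\,\mathbf{m}\times\frac{\partial\mathbf{m}}{\partial t} + \tau_{\mathrm{DL}}\,\mathbf{m}\times(\boldsymbol{\sigma}\times\mathbf{m}) + \tau_{\mathrm{FL}}\,\boldsymbol{\sigma}\times\mathbf{m}

with the spin polarisation σ set by the in-plane current direction (σ ∝ ẑ × j); since both torque amplitudes scale with the local current density, shaping the current path shapes the torque map, which is the flexibility argued for above.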
Max ERC Funding
1 476 000 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym SPIRE
Project Stars: dynamical Processes driving tidal Interactions, Rotation and Evolution
Researcher (PI) Stephane Frédéric Noël Paul Mathis
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Consolidator Grant (CoG), PE9, ERC-2014-CoG
Summary The rotational dynamics of stars strongly impacts their evolution and that of their planetary and galactic environments. Space helio- and asteroseismology have recently allowed an observational revolution in this domain. They revealed, e.g., that the core of the Sun is close to uniform rotation while those of subgiant and red giant stars slow down drastically during their evolution. These important results demonstrate that powerful dynamical mechanisms (internal waves, magnetic fields, turbulence) are in action to extract angular momentum all along the evolution of stars.
Simultaneously, a very large diversity of stellar systems has been discovered and their number will strongly increase thanks to new space missions (K2, TESS, PLATO). It is thus urgent to progress on our understanding of star-planet and star-star interactions: highly complex dynamical processes leading to tidal dissipation in stars play a key role to shape the orbital architecture of their systems and they may deeply modify their evolution.
To interpret these observational breakthroughs, it is now necessary to develop new frontier theoretical and numerical long-term evolution models of rotating magnetic stars and of their systems. To reach this ambitious objective, the SPIRE project will develop new groundbreaking equations, prescriptions, and scaling laws that coherently describe all the dynamical mechanisms that transport angular momentum and drive tidal dissipation in stars, using advanced semi-analytical modeling and numerical simulations. They will be implemented in the new-generation dynamical stellar evolution code STAREVOL and the N-body code ESPER. This will allow us to provide state-of-the-art ab-initio integrated and coupled models for the long-term evolution of stars and of their systems, which cannot yet be directly simulated in 3D. SPIRE will thus provide key inputs for the whole astrophysical community: understanding the dynamics of stars is a fundamental step towards understanding our Universe.
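As background to the tidal dissipation that these prescriptions will quantify, dissipation efficiency is conventionally folded into a tidal quality factor (textbook definition, quoted for context):

    \frac{1}{Q} = \frac{1}{2\pi E_{0}}\oint\left(-\frac{dE}{dt}\right)dt

i.e. the energy dissipated over one tidal cycle normalised by the peak energy E_0 stored in the tide; since orbital migration and circularisation timescales scale roughly with Q (or with the modified factor Q' = 3Q/2k_2), the frequency- and structure-dependence of the dissipation that SPIRE will model directly controls the predicted orbital architectures.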
Max ERC Funding
1 839 634 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym SPOOC
Project Automated Security Proofs of Cryptographic Protocols: Privacy, Untrusted Platforms and Applications to E-voting Protocols
Researcher (PI) Steve Kremer
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Consolidator Grant (CoG), PE6, ERC-2014-CoG
Summary The rise of the Internet and the ubiquity of electronic devices have deeply changed our way of life. Many face-to-face and paper transactions nowadays have digital counterparts: home banking, e-commerce, e-voting, etc. The security of such transactions is ensured by means of cryptographic protocols. While historically the main goals of protocols were to ensure confidentiality and authentication, the situation has changed. The ability of people to stay constantly connected, combined with ill-conceived systems, seriously threatens people’s privacy. E-voting protocols need to guarantee privacy of votes, while ensuring transparency of the voting process; RFID and mobile telephone protocols have to guarantee that people cannot be traced. Moreover, due to viruses and malware, personal computers and mobile phones can no longer be considered trustworthy; yet they have to be used to execute protocols that need to achieve security goals. To detect flaws, prove the security of protocols and propose new design principles, the Spooc project will develop solid foundations and practical tools to analyze and formally prove security properties that ensure the privacy of users, as well as techniques for executing protocols on untrusted platforms. We will
- develop foundations and practical tools for specifying and formally verifying new security properties, in particular privacy properties;
- develop techniques for the design and automated analysis of protocols that have to be executed on untrusted platforms;
- apply these methods in particular to novel e-voting protocols, which aim to provide strong security guarantees without the need to trust the voter client software.
The Spooc project will significantly advance the formal verification of security protocols and contribute to the development of a rich framework that provides techniques and tools to analyze and design security protocols guaranteeing users’ privacy and relaxing trust assumptions on the execution platforms.
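A toy illustration of the symbolic (Dolev-Yao style) reasoning that underlies such automated analysis is sketched below; it is not the project's tool chain (which would typically rely on dedicated protocol verifiers), just a minimal intruder-deduction check over pairs and symmetric encryption, with all message names invented for the example.

    # Terms: atoms are strings; ("pair", a, b) and ("enc", m, k) are constructors.
    def can_derive(knowledge, goal):
        """Saturate the attacker's knowledge under projection and decryption
        with known keys, then check whether the goal can be composed."""
        known = set(knowledge)
        changed = True
        while changed:
            changed = False
            for t in list(known):
                if isinstance(t, tuple) and t[0] == "pair":
                    new = {t[1], t[2]} - known            # projections
                elif isinstance(t, tuple) and t[0] == "enc" and t[2] in known:
                    new = {t[1]} - known                  # decrypt with known key
                else:
                    new = set()
                if new:
                    known |= new
                    changed = True

        def compose(term):
            if term in known:
                return True
            if isinstance(term, tuple) and term[0] in ("pair", "enc"):
                return compose(term[1]) and compose(term[2])
            return False

        return compose(goal)

    # The attacker observed {vote}_k and later learned k: the vote leaks.
    print(can_derive([("enc", "vote", "k"), "k"], "vote"))   # True
    print(can_derive([("enc", "vote", "k")], "vote"))        # False

Privacy properties such as vote secrecy are in practice checked with richer, equivalence-based notions, but the same symbolic term manipulation is at their core.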
Max ERC Funding
1 903 500 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym Spray-Imaging
Project Detailed Characterization of Spray Systems using Novel Laser Imaging Techniques
Researcher (PI) Edouard Jean Jacques Berrocal
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary The multiple scattering of light is a complex phenomenon, commonly encountered but rarely desired. In imaging, it induces strong blurring on the recorded photographs, limiting the range of applicability and the accuracy of modern optical instruments. A typical example concerns the laser diagnostics of spray systems. In 2008 the PI introduced a technique based on structured illumination with the important capability of removing the contributions from multiple light scattering, opening the unique possibility of visualising through dense sprays. Based on this acquired knowledge, the aim of this proposal is to develop and apply three novel imaging techniques for the complete characterization of spray systems:
The first technique will focus on visualizing, with both high contrast and high resolution, various spray phenomena that have not been observed in the past, such as complex spray breakup mechanisms in the near-nozzle region.
The second technique is related to the characterization of the formed droplet field. This concerns the accurate measurement of both droplet size and concentration using a three-dimensional imaging approach.
Finally, a third important task is the mapping of the spray temperature over the whole spray system. This information would lead to the determination of heat transfer and evaporation rate, which are key factors in the performance of combustion devices.
By extracting these important quantities - dynamics, droplet size/concentration and thermometry - we will provide fundamental insights that are still missing to fully understand the process of atomization. This will also serve to validate modern CFD models, leading to reliable predictions of spray behaviours. Even though this work can directly benefit a large number of medical and industrial spray applications, it will mostly focus on fuel spray injection used in combustion devices.
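As background, structured-illumination suppression of multiple scattering of the kind pioneered by the PI typically acquires three images of the spray with the line pattern phase-shifted by 2π/3 and demodulates them; the standard reconstruction (quoted here for context, implementation details may differ in the project) is

    I_{s} = \frac{\sqrt{2}}{3}\sqrt{(I_{1}-I_{2})^{2} + (I_{1}-I_{3})^{2} + (I_{2}-I_{3})^{2}}

which retains only the light that preserves the imposed spatial modulation, dominated by singly scattered photons, and rejects the blurred multiply scattered background.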
Max ERC Funding
1 500 000 €
Duration
Start date: 2015-03-01, End date: 2020-02-29
Project acronym STRIGES
Project Escaping from the Franck-Condon region : a theoretical approach to describe molecular STructural ReorganIzation for reversible EnerGy and information storage at the Excited State
Researcher (PI) ilaria Ciofini
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE4, ERC-2014-CoG
Summary STRIGES is a theoretical project aimed at developing new computational approaches and descriptors, essentially rooted in Density Functional Theory, enabling the design of new single-molecule architectures able to undergo significant light-induced electronic and structural reorganization.
In this respect, beside fundamental goals such as the development of new theoretical approaches for the description of photochemical and photophysical processes in molecular systems, the present project concerns the description and prediction of photoinduced phenomena, which is of fundamental importance in many research fields of technological relevance, ranging from artificial photosynthesis to molecular electronics.
To this end, we will develop, implement and apply suitable theoretical tools enabling the accurate description of potential energy surfaces of the lowest lying excited states not exclusively within the Franck-Condon region. From the application point of view, the end point of this project is the in-silico design and optimization of two new classes of photomolecular devices.
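To make the notion of leaving the Franck-Condon region concrete, the following is a standard bookkeeping of the quantities that such descriptors must capture (textbook definitions, quoted here only as an illustration):
E_{\mathrm{vert}} = E_{\mathrm{ES}}(R_{\mathrm{GS}}) - E_{\mathrm{GS}}(R_{\mathrm{GS}}), \qquad E_{\mathrm{adia}} = E_{\mathrm{ES}}(R_{\mathrm{ES}}) - E_{\mathrm{GS}}(R_{\mathrm{GS}}), \qquad \lambda_{\mathrm{ES}} = E_{\mathrm{vert}} - E_{\mathrm{adia}},
where R_GS and R_ES denote the ground-state and relaxed excited-state geometries. Structural reorganization at the excited state is precisely the passage from R_GS (the Franck-Condon point) to R_ES, and describing it requires excited-state potential energy surfaces well beyond the vertical-excitation picture.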
Max ERC Funding
1 202 500 €
Duration
Start date: 2015-11-01, End date: 2020-10-31
Project acronym SUSPINTRONICS
Project Magnetic, electric-field and light induced control of spin-polarized supercurrents: fundamentals for an offbeat electronics
Researcher (PI) Javier Eulogio Villegas Hernandez
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2014-CoG
Summary This project aims at establishing the basis for high-temperature superconducting spintronics. The innovative idea is to use spin-polarized superconducting pairs - instead of normal electrons - to convey and manipulate information, taking advantage of the coherent transport inherent to superconductivity. To further increase the potential of this approach, we intend to create multiple control knobs: magnetic field, the classical one in spintronics, as well as the knobs customary in conventional electronics: electric field and light. This will endow superconducting spintronics with a magnetic and electric memory, as well as with photosensitivity. The basic ingredient for this ambitious project is complex-oxide heterostructures. The approach consists of combining the following fundamental effects:
(a) Superconducting proximity effects, in order to transfer superconductivity into ferromagnets.
(b) Ferroelectric field-effects, in order to modulate the superconductor/ferromagnet interactions and tune Josephson coupling.
(c) Spin-torque and ferromagnetic resonance effects, in order to couple superconductivity and magnetization dynamics.
(d) Photoconductivity and photoelectric effects, in order to manipulate the interactions between superconductors and ferroics.
This research is essentially fundamental, but the novel concepts pursued will increase the technological possibilities of superconductivity and spintronics - whose applications are at present completely disconnected.
Max ERC Funding
1 997 729 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym SymplecticEinstein
Project The symplectic geometry of anti-self-dual Einstein metrics
Researcher (PI) Joel Fine
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Consolidator Grant (CoG), PE1, ERC-2014-CoG
Summary This project is founded on a new formulation of Einstein's equations in dimension 4, which I developed together with my co-authors. This new approach reveals a surprising link between four-dimensional Einstein manifolds and six-dimensional symplectic geometry. My project will exploit this interplay in both directions: using Riemannian geometry to prove results about symplectic manifolds and using symplectic geometry to prove results about Riemannian manifolds.
Our new idea is to rewrite Einstein's equations using the language of gauge theory. The fundamental objects are no longer Riemannian metrics, but instead certain connections over a 4-manifold M. A connection A defines a metric g_A via its curvature, analogous to the relationship between the electromagnetic potential and field in Maxwell's theory. The total volume of (M,g_A) is an action S(A) for the theory, whose critical points give Einstein metrics. At the same time, the connection A also determines a symplectic structure \omega_A on an associated 6-manifold Z which fibres over M.
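In schematic notation, the formulation just described can be summarized as follows (the precise construction of g_A and \omega_A from the curvature of A is not reproduced here):
S(A) = \mathrm{Vol}(M, g_A) = \int_M \mathrm{dvol}_{g_A}, \qquad \delta S(A) = 0 \;\Longrightarrow\; g_A \ \text{is Einstein}, \qquad \omega_A \in \Omega^2(Z), \quad \pi : Z \to M.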
My project has two main goals. The first is to classify the symplectic manifolds which arise this way. Classification of general symplectic 6-manifolds is beyond current techniques of symplectic geometry, making my aims here very ambitious. My second goal is to provide an existence theory both for anti-self-dual Poincaré--Einstein metrics and for minimal surfaces in such manifolds. Again, my aims here go decisively beyond the state of the art. In all of these situations, a fundamental problem is the formation of singularities in degenerating families. What makes new progress possible is the fresh input coming from the symplectic manifold Z. I will combine this with techniques from Riemannian geometry and gauge theory to control the singularities which can occur.
Max ERC Funding
1 162 880 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym TAMING
Project Taming non convexity?
Researcher (PI) Jean-Bernard Lasserre
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE6, ERC-2014-ADG
Summary In many important areas and applications of science one has to solve non-convex optimization problems, and ideally one would like to find the global optimum. However, in most cases one is faced with NP-hard problems, and in practice one is therefore often satisfied with a local optimum obtained with some ad-hoc (local) optimization algorithm.
TAMING intends to provide a systematic methodology for solving hard non-convex polynomial optimization problems in all areas of science. Indeed, the last decade has witnessed the emergence of Polynomial Optimization as a new field in which powerful positivity certificates from real algebraic geometry have made it possible to develop an original and systematic approach for solving (at global optimality) optimization problems with polynomial (and even semi-algebraic) data. The backbone of this powerful methodology is the "moment-SOS" approach, also known as the "Lasserre hierarchy", which has attracted a lot of attention in many areas (e.g., optimization, applied mathematics, quantum computing, engineering, theoretical computer science) with important potential applications. It is now a basic tool for analyzing hardness of approximation in combinatorial optimization and the best candidate algorithm to prove or disprove the famous Unique Games Conjecture. Recently it has also become a promising new method for solving the important Optimal Power Flow Problem in the strategic domain of Energy Networks (being the only method that could solve certain types of such problems to optimality).
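For illustration, a schematic statement of the hierarchy in its sums-of-squares form (quoted here only to fix ideas): for a polynomial problem f^* = \min\{ f(x) : g_1(x) \ge 0, \dots, g_m(x) \ge 0 \}, the relaxation of order d computes
\rho_d = \sup_{\lambda,\,\sigma_j} \Big\{ \lambda \ :\ f - \lambda = \sigma_0 + \sum_{j=1}^{m} \sigma_j g_j,\ \ \sigma_j \ \text{sums of squares},\ \deg(\sigma_0),\,\deg(\sigma_j g_j) \le 2d \Big\},
each level being a semidefinite program whose size grows rapidly with d and with the number of variables. Under an Archimedean assumption on the feasible set, \rho_d increases monotonically to f^* as d grows, which is precisely the source of the computational cost and problem-size limitations discussed below.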
However, in its present form this promising methodology carries a high computational cost and a (too) severe limitation on problem size, which precludes its application to many important real-life problems of significant size. Proving that this methodology can indeed fulfil its promises and solve important practical problems in various areas poses major theoretical and practical challenges.
Max ERC Funding
1 450 625 €
Duration
Start date: 2015-09-01, End date: 2019-08-31
Project acronym ThermoTex
Project Woven and 3D-Printed Thermoelectric Textiles
Researcher (PI) Christian Müller
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary Imagine a world in which countless embedded microelectronic components continuously monitor our health and allow us to seamlessly interact with our digital environment. One particularly promising platform for the realisation of this concept is based on wearable electronic textiles. In order for this technology to become truly pervasive, a myriad of devices will have to operate autonomously over an extended period of time without the need for additional maintenance, repair or battery replacement. The goal of this research programme is to realise textile-based thermoelectric generators that, without additional cost, can power built-in electronics by harvesting one of the most ubiquitous energy sources available to us: our body heat.
Current thermoelectric technologies rely on toxic inorganic materials that are both expensive to produce and fragile by design, which renders them unsuitable for wearable applications in particular. Instead, in this programme we will use polymer semiconductors and nanocomposites. Initially, we will focus on the preparation of materials with a thermoelectric performance significantly beyond the state of the art. Then, we will exploit the ease of shaping polymers into light-weight and flexible articles such as fibres, yarns and fabrics. We will explore both traditional weaving methods and emerging 3D-printing techniques in order to realise low-cost thermoelectric textiles.
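As a point of reference (textbook relations rather than results of this programme), the thermoelectric performance of a material is commonly summarized by the dimensionless figure of merit, and the maximum electrical power a generator can deliver to a matched load scales with the square of the temperature difference:
ZT = \frac{S^2 \sigma}{\kappa}\,T, \qquad P_{\max} = \frac{(S\,\Delta T)^2}{4 R_{\mathrm{int}}},
where S is the Seebeck coefficient, \sigma the electrical conductivity, \kappa the thermal conductivity and R_int the internal resistance of the generator. Since the temperature difference available across a garment is only a few kelvin, both high-ZT polymer materials and large-area, low-cost fabrication are essential.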
Finally, within the scope of this programme we will demonstrate the ability of prototype thermoelectric textiles to harvest a small fraction of the wearer’s body heat under realistic conditions. We will achieve this through integration into clothing to power off-the-shelf sensors for health care and security applications. Eventually, it can be anticipated that the thermoelectric design paradigms investigated here will be of significant benefit to the European textile and health care sectors as well as society in general.
Max ERC Funding
1 500 000 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym TOSSIBERG
Project Theory of Stein Spaces in Berkovich Geometry
Researcher (PI) Jérôme, Jacques, René Poineau
Host Institution (HI) UNIVERSITE DE CAEN NORMANDIE
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary Complex Stein spaces may be thought of as analytic analogues of the affine schemes of algebraic geometry. They may be characterized in several manners: using convergence of holomorphic functions, topological properties or potential-theoretic properties, for instance. Especially useful for applications is the fact that their coherent cohomology vanishes. Despite the crucial importance of this theory in complex analytic geometry, its p-adic counterpart has hardly been sketched.
In the setting of Berkovich geometry (one among the several notions of p-adic geometry), recent developments have made it possible to gain a fine understanding of the topology of the spaces (work of Berkovich and Hrushovski-Loeser) and to define the basic tools of potential theory (work of Baker-Rumely, Thuillier, Boucksom-Favre-Jonsson and Chambert-Loir-Ducros). The conditions for a comprehensive study of p-adic Stein spaces are now met; this will be our first goal. The theory will then be used to investigate envelopes of holomorphy and meromorphy. As an application, I plan to derive rationality criteria for power series over function fields.
The second part of the project is devoted to the theory of Stein spaces for Berkovich spaces over rings of integers of number fields (where all the places appear on an equal footing). Those spaces have hardly been studied and only a very small part of the usual analytic machinery is available in this setting. Here, my main goal will consist in proving the basic and fundamental fact that relative polydisks are Stein spaces (in the cohomological sense). This will allow a deeper investigation of rings of convergent arithmetic power series (i.e. with integral coefficients) and will lead to properties related to commutative algebra but also to the inverse Galois problem. Knowing that the coherent cohomology of polydisks vanishes also opens the road towards computing global cohomology groups for projective analytic spaces over rings of integers of number fields.
Max ERC Funding
1 153 750 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym TRACES
Project Tracing ancient microbial cells embedded in silica
Researcher (PI) Mark Adriaan Van Zuilen
Host Institution (HI) INSTITUT DE PHYSIQUE DU GLOBE DE PARIS
Call Details Consolidator Grant (CoG), PE10, ERC-2014-CoG
Summary Reconstructing the nature and habitat of early life is a difficult task that strongly depends on the study of rare microfossils in the ancient rock record. The preservation of such organic structures critically depends on rapid entombment in a mineral matrix. Throughout most of Earth’s history the oceans were silica-supersaturated, leading to precipitation of opal deposits that incorporated superbly preserved microbial cells. As we trace this record of life back in deep time, however, three important obstacles are encountered: 1) microorganisms lack sufficient morphologic complexity to be easily distinguished from each other and from certain abiologic microstructures, 2) the ancient rock record has been subjected to increased pressures and temperatures causing variable degradation of different types of microorganisms, and 3) early habitats of life were dominated by hydrothermal processes that can generate abiologic organic microstructures. TRACES will study the critical transformations that occur when representative groups of microorganisms are subjected to artificial silicification and thermal alteration. At incremental steps during these experiments, the (sub)micron-scale changes in structure and composition of organic cell walls are monitored. This will be compared with fossilized life in diagenetic hot spring sinters and metamorphosed Precambrian chert deposits. The combined work will lead to a dynamic model for microfossil transformation in progressively altered silica matrices. The critical question will be answered whether certain types of microorganisms are more likely to be preserved than others. In addition, the critical nano-scale structural differences will be determined between abiologic artefacts – such as carbon coatings on botryoidal quartz or adsorbed carbon on silica biomorphs – and true microfossils in hydrothermal cherts. This will provide a solid scientific basis for tracing life in the oldest, most altered part of the rock record.
Max ERC Funding
1 999 250 €
Duration
Start date: 2015-05-01, End date: 2020-04-30
Project acronym TUCLA
Project Towards a deepened understanding of combustion processes using advanced laser diagnostics
Researcher (PI) Lars Eric Marcus Aldén
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE8, ERC-2014-ADG
Summary The field of combustion is of utmost societal/industrial importance while at the same time posing outstanding scientific challenges. In order to handle these, it is extremely important to develop and apply non-intrusive laser-diagnostic techniques with high spatial and temporal resolution for measurements of key parameters such as species concentrations and temperatures. Such techniques have been developed and applied by the PI for more than thirty-five years, and the home institute has some of the most advanced instrumentation in academia worldwide.
The proposed activities are divided into two areas comprising five main work packages:
1. Development of new diagnostic techniques. We will concentrate on concepts based on structured illumination which will add a new dimension to present diagnostics based on temporal, intensity and spectral properties. It will allow for multiscalar measurements and efficient suppression of background light. Furthermore, we will work with femto/picosecond lasers for investigating the diagnostic applicability of filamentation, new aspects of non-linear techniques, and diagnostic aspects of photodissociation phenomena.
2. Phenomenological combustion studies using advanced laser diagnostics. A very important aspect of the project is to use the developed and available diagnostic techniques to secure experimental data in extremely challenging environments and, together with modeling experts, enhance the understanding of combustion phenomena. Studies will be carried out on three different topics:
- Flame structures in laminar flames at high pressure as well as turbulent flames at atmospheric/high pressure.
- Biomass gasification, where complex fuels require new techniques to measure nitrogen, alkali, chlorine and sulfur compounds, as well as for measurements inside fuel particles.
- Combustion improvement by electric activation which can be introduced to handle flame oscillations and instabilities.
Max ERC Funding
2 442 000 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym ULT-NEMS
Project Ultra-Cold Nano-Mechanics: from Classical to Quantum Complexity
Researcher (PI) Eddy Charles Eric Collin
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2014-CoG
Summary Nano-electro-mechanical devices (NEMS) are extremely small objects that can be actuated and detected by electric means. They are in the first place transducers that can be used as probes for forces down to the molecular level. Top-down fabricated NEMS using conventional microelectronics techniques are simple devices that intimately link mechanical and electrical degrees of freedom. As such, they can be viewed as model systems from basic (linear) harmonic motion up to complex nonlinear dynamics.
The most intriguing experimental situation is attained when the devices are cold enough to behave according to the laws of quantum mechanics, instead of classical physics. This leads to a unique approach of the classical-to-quantum crossover with truly macroscopic position-states. Complementarily, at low temperatures the forces sensed by the NEMS arise from materials themselves cold enough to exhibit exotic quantum properties, originating either in the devices’ constitutive amorphous materials and their intrinsic elusive Tunneling Systems, or from their interaction with a sophisticated fluid like superfluid 3He.
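To give a sense of the temperature scales involved, the following illustrative estimate (the 10 MHz resonance frequency is an assumed example, not a device of the project) evaluates the thermal phonon occupation of a mechanical mode, which must drop well below one for the mode to approach its quantum ground state:

    # Thermal phonon occupation n = 1/(exp(hbar*omega/(kB*T)) - 1) of a mechanical mode.
    # The frequency and temperatures are assumed examples for illustration only.
    import math

    hbar = 1.054571817e-34  # Planck constant / 2*pi, in J*s
    kB = 1.380649e-23       # Boltzmann constant, in J/K

    def thermal_occupation(freq_hz, temp_k):
        """Bose-Einstein occupation of a harmonic mode of frequency freq_hz at temperature temp_k."""
        x = hbar * 2.0 * math.pi * freq_hz / (kB * temp_k)
        return 1.0 / math.expm1(x)

    freq = 10e6  # assumed 10 MHz nanomechanical resonance
    for T in (4.2, 0.010, 0.0001):  # helium bath, dilution refrigerator, sub-millikelvin stage
        print(f"T = {T*1e3:8.2f} mK  ->  mean phonon number n = {thermal_occupation(freq, T):.2e}")

For a 10 MHz mode the occupation remains above unity even at 1 mK and only falls well below one in the sub-millikelvin range, which is why ultra-low-temperature techniques (or much higher mode frequencies) are central to reaching truly macroscopic quantum position-states.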
I propose unique research linking ultra-low temperature physics and nano-mechanics, building on my knowledge of both fields and my experience in superconducting quantum circuits. The research has two identified axes, which aim at pushing both the “sensor” and “model system” aspects of NEMS down to their quantum limits. Macroscopic quantum position-states can be engineered with a hybrid quantum circuit arrangement (a combination of NEMS, microwaves and a quantum bit), while topological states of confined superfluid 3He with their elementary excitations can be mechanically probed by dedicated NEMS (measuring friction). The scientific impact of this research is extremely wide, tackling fundamental questions like: what/where is the boundary between quantum and classical worlds, and do Majorana particles (potentially obtained in topological 3He) exist at all?
Max ERC Funding
1 990 574 €
Duration
Start date: 2015-11-01, End date: 2020-10-31
Project acronym WAPITI
Project Water-mass transformation and Pathways In The Weddell Sea: uncovering the dynamics of a global climate chokepoint from In-situ measurements
Researcher (PI) Jean-Baptiste Bruno Sallée
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Starting Grant (StG), PE10, ERC-2014-STG
Summary Deep water formed around the Antarctic continent drives the world ocean circulation. 50-70% of this deep water is formed within only about 10% of the Antarctic circumpolar band: the Weddell Sea. Subtle changes in the circulation of the Weddell Sea can lead to major changes in floating ice-shelves, with critical implications for global sea-level, the production of deep water and the global ocean overturning circulation. Despite these critical climate implications, the Antarctic shelf circulation remains poorly understood.
I propose an ambitious project at the crossroads of experimental and numerical oceanography. By drawing on the strengths of each discipline I will explore the regional water-mass pathways in the Weddell Sea: an uncharted cornerstone for understanding the polar ocean circulation and its links to global climate. A key issue facing climate scientists will be addressed: “What sets the three-dimensional water-mass structure and pathways in the Weddell Sea and modulates the flow of deep waters between the Antarctic ice shelves and the global ocean circulation?”
To address this question I propose to investigate several key aspects of the Weddell Sea system: the dynamical forcing of the Weddell gyre and its response to atmospheric variability; the forcing and the circulation on the continental shelf and its interaction with the gyre; and the time-scale and mixing associated with bottom water sinking along the continental shelf. WAPITI approaches these objectives through a series of innovations, including (i) an ambitious field experiment to investigate the shelf circulation and processes, (ii) a powerful conceptual framework applied for the first time to a realistic eddy-resolving model of the Weddell gyre, and (iii) a novel instrument that will be developed to directly observe the sinking of deep water into the abyssal ocean for the first time. Collectively, the project will contribute a new insight into global climate feedbacks.
Max ERC Funding
1 998 125 €
Duration
Start date: 2015-05-01, End date: 2021-04-30
Project acronym WATER
Project Probing the Structure and Dynamics of Water in its Various States
Researcher (PI) Anders Nilsson
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Advanced Grant (AdG), PE3, ERC-2014-ADG
Summary We propose to address some of the most important outstanding questions for a microscopic understanding of water: What are the structure and dynamics of the hydrogen-bonding network that give rise to all the unique properties of water? How are the structure and dynamics affected by temperature, pressure and by perturbation through interaction with solutes and interfaces? Here we point to the opportunity to exploit the completely new avenues that the novel x-ray free-electron lasers open up for probing both structure and dynamics of water from hot temperatures down to the deeply supercooled regime where the anomalous properties become extreme. We plan to further develop fast cooling and ultrafast x-ray probing allowing access to below the homogeneous ice nucleation limit, to probe equilibrium dynamics through probe-probe techniques based on x-ray correlation spectroscopy, to access low-energy vibrational mode dynamics through THz pump and x-ray scattering probe, and to transfer x-ray spectroscopies into the time domain.
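For the x-ray correlation spectroscopy mentioned above, the measured quantity is the standard intensity autocorrelation of the coherent scattering (a textbook relation quoted here for context):
g_2(q,\tau) = \frac{\langle I(q,t)\, I(q,t+\tau)\rangle}{\langle I(q,t)\rangle^{2}} = 1 + \beta\,\lvert f(q,\tau)\rvert^{2},
where f(q,\tau) is the intermediate scattering function and \beta the speckle contrast; with free-electron-laser pulses this gives access to equilibrium structural dynamics on the length and time scales of the hydrogen-bonding network.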
We will address one of the currently most debated issues, related to a potential liquid-liquid transition and second critical point in liquid water. The goal is to determine experimentally if water, as hypothesized in certain models, can really exist as two liquids, if there is a reversible phase transition between the hypothesized liquids, if these hypothesized liquids can equilibrate on a time scale faster than the rate of ice nucleation, and if there exists a critical point that can explain the fluctuations related to the diverging response functions. We will continue to critically investigate our proposed hypothesis that water at ambient temperature encompasses fluctuations around two local structures and that the dominating structure is a strongly distorted hydrogen-bonded environment. We will investigate if these concepts can be used to describe the observed perturbations of water structure by solutes and interfaces.
Max ERC Funding
2 486 951 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym WATU
Project Wave turbulence: beyond weak turbulence
Researcher (PI) Nicolas Mordant
Host Institution (HI) UNIVERSITE GRENOBLE ALPES
Call Details Consolidator Grant (CoG), PE3, ERC-2014-CoG
Summary Wave turbulence and fluid turbulence belong to the same class of turbulent states made of a large number of nonlinearly coupled degrees of freedom driven far from equilibrium. The Weak Turbulence Theory is a statistical theory of low-amplitude turbulent waves. The predicted phenomenology (energy cascade) is very similar to that of fluid turbulence, which badly lacks such a statistical theory. Weak Turbulence is thus a promising mathematical framework for turbulence in general. It is observed in many systems such as planetary atmospheres, astrophysical plasmas, tokamak fusion plasmas, superfluid turbulence or Bose-Einstein condensates, for example. The theory is much less advanced in the strong wave turbulence case, for which a richer phenomenology appears due to the generation of coherent structures. Furthermore, to a large extent the theory lacks experimental validation.
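As an example of the kind of weak-turbulence prediction that the experiments will confront (quoted for illustration; the exact spectra depend on the wave system), the Kolmogorov-Zakharov solution for the direct energy cascade of deep-water gravity waves predicts a surface-elevation frequency spectrum of the form
E(\omega) \propto g\,\varepsilon^{1/3}\,\omega^{-4},
where \varepsilon is the energy flux through scales. Measured deviations from such power laws, and from the predicted scaling with the flux, are among the clearest signatures that the weak-turbulence assumptions break down and that coherent structures take over.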
My project aims at studying several physical systems (vibrating elastic plate, 1D and 2D water surface waves, 3D internal waves in a stratified fluid) specifically chosen to highlight various features of wave turbulence in both the weak and strong regimes. Under strong forcing, coherent structures will appear, such as developable cones (elastic plates), solitons and sharp water wave ridges (water surface waves) or even fluid turbulence for overturning 3D internal waves. I will specifically use two unique large-scale facilities available in LEGI (Grenoble, France): the 30 m 1D wave flume for surface water waves and the 13 m-diameter Coriolis turntable for water surface waves and internal waves. I will set up advanced space-time resolved profilometry and velocimetry techniques adapted to the dimensionality and size of each one of these systems. Advanced statistical tools applied to massive datasets will provide a profound insight into the coupling between waves and structures in the various regimes of wave turbulence.
Max ERC Funding
1 991 611 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym Weakinteract
Project Weak interactions in self-organizations studied by NMR spectroscopy in the supramolecular solid-state
Researcher (PI) Antoine Loquet
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE4, ERC-2014-STG
Summary Self-assembly is a fundamental process by which individual subunits organize into ordered supramolecular entities, usually through weak interactions. A longstanding goal is to engineer synthetic self-organized structures, often inspired by protein assemblies found in the context of living cells, to design materials of high potential, e.g. for drug delivery, scaffolding or electronic applications. There is tremendous interest in physical chemistry in understanding the role of weak interactions at supramolecular interfaces. However, self-organizations usually form soft materials, lacking crystalline order and at the same time exhibiting poor solubility. As a consequence, standard techniques for structural investigation such as X-ray crystallography or solution NMR usually fail or deliver only partial information, preventing an atomic-level understanding and therefore the design of new architectures.
The Weakinteract project aims at developing NMR spectroscopy in the relevant supramolecular solid state for these non-crystalline and insoluble self-organizations. Weakinteract will exploit strategic isotope labeling, state-of-the-art solid-state NMR methods and the integration of hybrid approaches to elucidate the assembly mechanisms, revealing the weak interactions at the supramolecular interfaces. The project comprises three different aspects of growing complexity: (1) Elaboration of a proof-of-concept for atomic-resolution structure determination of self-assembled nanotubes in hydrogel form. (2) Determination of the structural basis of bacterial filaments. (3) Investigation of the phenomenon of heterogeneous supramolecular templating, in the context of amyloid fold initiation. One major aim of Weakinteract is to provide a robust approach dedicated to chemists, biophysicists and structural biologists in order to tackle weak interactions in the relevant assembled state, ultimately delivering atomic-level structures and an understanding of the assembly process.
Max ERC Funding
1 472 425 €
Duration
Start date: 2015-09-01, End date: 2020-08-31