Project acronym 3CBIOTECH
Project Cold Carbon Catabolism of Microbial Communities underpinning a Sustainable Bioenergy and Biorefinery Economy
Researcher (PI) Gavin James Collins
Host Institution (HI) NATIONAL UNIVERSITY OF IRELAND GALWAY
Call Details Starting Grant (StG), LS9, ERC-2010-StG_20091118
Summary The applicant will collaborate with Irish, European and U.S.-based colleagues to develop a sustainable biorefinery and bioenergy industry in Ireland and Europe. The focus of this ERC Starting Grant will be the application of classical microbiological, physiological and real-time polymerase chain reaction (PCR)-based assays to qualitatively and quantitatively characterize the microbial communities underpinning novel, low-temperature, anaerobic waste (and other biomass) conversion technologies, including municipal wastewater treatment and demonstration- and full-scale biorefinery applications.
Anaerobic digestion (AD) is a naturally occurring process, which is widely applied for the conversion of waste to methane-containing biogas. Low-temperature (<20 °C) AD has been applied by the applicant as a cost-effective alternative to mesophilic (c. 35 °C) AD for the treatment of several waste categories. However, the microbiology of low-temperature AD is poorly understood. The applicant will work with microbial consortia isolated from anaerobic bioreactors that have been operated for long-term experiments (>3.5 years), and which include organic acid-oxidizing, hydrogen-producing syntrophic microbes and hydrogen-consuming methanogens. A major focus of the project will be the ecophysiology of psychrotolerant and psychrophilic methanogens already identified and cultivated by the applicant. The project will also investigate the role(s) of poorly understood Crenarchaeota populations and homoacetogenic bacteria in complex consortia. The host organization is a leading player in the microbiology of waste-to-energy applications. The applicant will train a team of scientists in all aspects of the microbiology and bioengineering of biomass conversion systems.
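For orientation, quantification in such real-time PCR assays is typically reported with the comparative threshold-cycle method; the formula below is standard background, not quoted from the proposal.

```latex
% Comparative C_t (Livak) method -- standard relative quantification in
% real-time PCR; C_t is the cycle at which fluorescence crosses threshold.
\[
  \text{fold change} = 2^{-\Delta\Delta C_t},
  \qquad
  \Delta\Delta C_t =
    \bigl(C_t^{\mathrm{target}} - C_t^{\mathrm{reference}}\bigr)_{\text{sample}}
  - \bigl(C_t^{\mathrm{target}} - C_t^{\mathrm{reference}}\bigr)_{\text{control}} .
\]
```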
Max ERC Funding
1 499 797 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym ACTIVIA
Project Visual Recognition of Function and Intention
Researcher (PI) Ivan Laptev
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "Computer vision is concerned with the automated interpretation of images and video streams. Today's research is (mostly) aimed at answering queries such as ""Is this a picture of a dog?"", (classification) or sometimes ""Find the dog in this photo"" (detection). While categorisation and detection are useful for many tasks, inferring correct class labels is not the final answer to visual recognition. The categories and locations of objects do not provide direct understanding of their function i.e., how things work, what they can be used for, or how they can act and react. Such an understanding, however, would be highly desirable to answer currently unsolvable queries such as ""Am I in danger?"" or ""What can happen in this scene?"". Solving such queries is the aim of this proposal.
My goal is to uncover the functional properties of objects and the purpose of actions by addressing visual recognition from a different and yet unexplored perspective. The main novelty of this proposal is to leverage observations of people, i.e., their actions and interactions to automatically learn the use, the purpose and the function of objects and scenes from visual data. The project is timely as it builds upon the two key recent technological advances: (a) the immense progress in visual recognition of objects, scenes and human actions achieved in the last ten years, as well as (b) the emergence of a massive amount of public image and video data now available to train visual models.
ACTIVIA addresses fundamental research issues in automated interpretation of dynamic visual scenes, but its results are expected to serve as a basis for ground-breaking technological advances in practical applications. The recognition of functional properties and intentions as explored in this project will directly support high-impact applications such as detection of abnormal events, which are likely to revolutionise today's approaches to crime protection, hazard prevention, elderly care, and many others."
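To make the contrast between today's queries and the proposal's target queries concrete, here is a schematic sketch; the function names and signatures are hypothetical illustrations, not an API from the project.

```python
# Schematic illustration only: hypothetical signatures contrasting the
# query types discussed above; no actual recognition model is implied.
from typing import List, Tuple

Box = Tuple[int, int, int, int]  # (x, y, width, height)

def classify(image) -> str:
    """Today's classification query: 'Is this a picture of a dog?'"""
    raise NotImplementedError

def detect(image) -> List[Tuple[str, Box]]:
    """Today's detection query: 'Find the dog in this photo.'"""
    raise NotImplementedError

def infer_function_and_intent(video) -> List[str]:
    """ACTIVIA-style query: 'What can happen in this scene?', to be
    learned from observing people's actions and interactions."""
    raise NotImplementedError
```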
Max ERC Funding
1 497 420 €
Duration
Start date: 2013-01-01, End date: 2018-12-31
Project acronym ADAPT
Project Theory and Algorithms for Adaptive Particle Simulation
Researcher (PI) Stephane Redon
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "During the twentieth century, the development of macroscopic engineering has been largely stimulated by progress in digital prototyping: cars, planes, boats, etc. are nowadays designed and tested on computers. Digital prototypes have progressively replaced actual ones, and effective computer-aided engineering tools have helped cut costs and reduce production cycles of these macroscopic systems.
The twenty-first century is most likely to see a similar development at the atomic scale. Indeed, recent years have seen tremendous progress in nanotechnology, in particular in the ability to control matter at the atomic scale. Similar to what has happened with macroscopic engineering, powerful and generic computational tools will be needed to engineer complex nanosystems through modeling and simulation. As a result, a major challenge is to develop efficient simulation methods and algorithms.
NANO-D, the INRIA research group I started in January 2008 in Grenoble, France, aims at developing efficient computational methods for modeling and simulating complex nanosystems, both natural and artificial. In particular, NANO-D develops SAMSON (Software for Adaptive Modeling and Simulation Of Nanosystems), a software application which gathers all the algorithms designed by the group and its collaborators.
In this project, I propose to develop a unified theory, and associated algorithms, for adaptive particle simulation. The proposed theory will avoid problems that plague current popular multi-scale or hybrid simulation approaches by simulating a single potential throughout the system, while allowing users to finely trade precision for computational speed.
I believe the full development of the adaptive particle simulation theory will have an important impact on current modeling and simulation practices, and will enable practical design of complex nanosystems on desktop computers, which should significantly boost the emergence of generic nano-engineering.
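The following toy sketch illustrates the flavour of the adaptive idea under my own assumptions; it is not SAMSON's actual algorithm. A single potential governs all particles, but each step spends its budget only on the particles subject to the largest forces, trading precision for computational speed.

```python
import numpy as np

# Toy 1-D sketch (mine, not the project's algorithm) of adaptive particle
# simulation: one potential for the whole system, effort concentrated on
# the k most "urgent" particles; the rest stay frozen this step.
def lj_force(x):
    """Pairwise Lennard-Jones-style forces for 1-D positions x."""
    dx = x[:, None] - x[None, :]           # signed pairwise separations
    np.fill_diagonal(dx, np.inf)           # ignore self-interaction
    f = 24 * (2 / dx**13 - 1 / dx**7)      # -dV/dr, sign carried by dx
    return f.sum(axis=1)

def adaptive_step(x, v, dt=1e-3, k=4):
    f = lj_force(x)
    active = np.argsort(np.abs(f))[-k:]    # k particles with largest forces
    v[active] += dt * f[active]            # integrate active particles only
    x[active] += dt * v[active]            # frozen particles keep their state
    return x, v

rng = np.random.default_rng(0)
x = np.linspace(1.0, 9.0, 8) + 0.05 * rng.standard_normal(8)
v = np.zeros(8)
for _ in range(100):
    x, v = adaptive_step(x, v)             # raise k for precision, lower for speed
```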
Max ERC Funding
1 476 882 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym AlgTateGro
Project Constructing line bundles on algebraic varieties -- around conjectures of Tate and Grothendieck
Researcher (PI) François CHARLES
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary The goal of this project is to investigate two conjectures in arithmetic geometry pertaining to the geometry of projective varieties over finite and number fields. These two conjectures, formulated by Tate and Grothendieck in the 1960s, predict which cohomology classes are Chern classes of line bundles. They both form an arithmetic counterpart of a theorem of Lefschetz, proved in the 1940s, which itself is the only known case of the Hodge conjecture. These two long-standing conjectures are among the aspects of a more general web of questions regarding the topology of algebraic varieties which were emphasized by Grothendieck and have since had a central role in modern arithmetic geometry. Special cases of these conjectures, appearing for instance in the work of Tate, Deligne, Faltings, Schneider-Lang and Masser-Wüstholz, have all had important consequences.
My goal is to investigate different lines of attack towards these conjectures, building on recent work of myself and Jean-Benoît Bost on related problems. The two main directions of the proposal are as follows. Over finite fields, the Tate conjecture is related to finiteness results for certain cohomological objects. I want to understand how to relate these to hidden boundedness properties of algebraic varieties that have appeared in my recent geometric proof of the Tate conjecture for K3 surfaces. The existence and relevance of a theory of Donaldson invariants for moduli spaces of twisted sheaves over finite fields seems to be a promising and novel direction. Over number fields, I want to combine the geometric insight above with algebraization techniques developed by Bost. In a joint project, we want to investigate how these can be used to first understand geometrically major results in transcendence theory and then attack the Grothendieck period conjecture for divisors via a number-theoretic and complex-analytic understanding of universal vector extensions of abelian schemes over curves.
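For the reader's orientation, one standard formulation of the Tate conjecture for divisors, with notation of my choosing (the abstract itself does not state it):

```latex
% Tate conjecture for divisors (T^1), standard formulation: X a smooth
% projective variety over a field k finitely generated over its prime
% field, G_k = Gal(kbar/k), ell a prime invertible in k. The cycle
% class map
\[
  \operatorname{Pic}(X) \otimes_{\mathbb{Z}} \mathbb{Q}_{\ell}
  \;\longrightarrow\;
  H^{2}_{\mathrm{\acute{e}t}}\bigl(X_{\bar{k}}, \mathbb{Q}_{\ell}(1)\bigr)^{G_{k}}
\]
% is conjectured to be surjective: every Galois-invariant class is a
% Q_ell-linear combination of classes of line bundles.
```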
Max ERC Funding
1 222 329 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym ANADEL
Project Analysis of Geometrical Effects on Dispersive Equations
Researcher (PI) Danela Oana IVANOVICI
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2017-STG
Summary We are concerned with localization properties of solutions to hyperbolic PDEs, especially problems with a geometric component: how do boundaries and heterogeneous media influence the spreading and concentration of solutions? While our first focus is on wave and Schrödinger equations on manifolds with boundary, strong connections exist with phase space localization for (clusters of) eigenfunctions, which are of independent interest. Motivations come from nonlinear dispersive models (in physically relevant settings), properties of eigenfunctions in quantum chaos (related both to the physics of optical fiber design and to number-theoretic questions), and harmonic analysis on manifolds.
Wave propagation in real-life physics occurs in media which are neither homogeneous nor spatially infinite. The birth of radar/sonar technologies (and the rise of computed tomography) greatly motivated numerous developments in microlocal analysis and the linear theory. Only recently have toy nonlinear models been studied on a curved background, sometimes compact or rough. Understanding how to extend such tools, dealing with wave dispersion or focusing, will allow us to progress significantly in our mathematical understanding of physically relevant models. There, boundaries appear naturally, and most earlier developments related to propagation of singularities in this context have limited scope with respect to crucial dispersive effects. Despite great progress over the last decade, driven by the study of quasilinear equations, our knowledge is still very limited. Going beyond this recent activity requires new tools whose development is at the heart of this proposal, including good approximate solutions (parametrices) going over arbitrarily large numbers of caustics, sharp pointwise bounds on Green functions, the development of efficient wave packet methods, and quantitative refinements of propagation of singularities (with direct applications in control theory), to name only a few important ones.
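The model equations in question, in a standard presentation with notation of my choosing: on a Riemannian manifold (M, g) with boundary, one studies

```latex
% Wave and Schroedinger equations with Dirichlet boundary conditions,
% the two model dispersive PDEs of the proposal (standard form).
\[
  \partial_t^2 u - \Delta_g u = 0,
  \qquad
  i\,\partial_t u + \Delta_g u = 0,
  \qquad
  u|_{\partial M} = 0 .
\]
% On flat R^d, the free Schroedinger flow disperses as
\[
  \bigl\| e^{it\Delta} u_0 \bigr\|_{L^\infty(\mathbb{R}^d)}
  \;\lesssim\; |t|^{-d/2}\, \| u_0 \|_{L^1(\mathbb{R}^d)} ,
\]
% and the project asks how boundaries and geometry degrade or preserve
% this kind of decay.
```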
Max ERC Funding
1 293 763 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym ANTICS
Project Algorithmic Number Theory in Computer Science
Researcher (PI) Andreas Enge
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "During the past twenty years, we have witnessed profound technological changes, summarised under the terms of digital revolution or entering the information age. It is evident that these technological changes will have a deep societal impact, and questions of privacy and security are primordial to ensure the survival of a free and open society.
Cryptology is a main building block of any security solution, and at the heart of projects such as electronic identity and health cards, access control, digital content distribution or electronic voting, to mention only a few important applications. During the past decades, public-key cryptology has established itself as a research topic in computer science; tools of theoretical computer science are employed to “prove” the security of cryptographic primitives such as encryption or digital signatures and of more complex protocols. It is often forgotten, however, that all practically relevant public-key cryptosystems are rooted in pure mathematics, in particular number theory and arithmetic geometry. In fact, the so-called security “proofs” are all conditional on the algorithmic intractability of certain number-theoretic problems, such as the factorisation of large integers or discrete logarithms in algebraic curves. Unfortunately, there is a large cultural gap between computer scientists using a black-box security reduction to a supposedly hard problem in algorithmic number theory and number theorists, who are often interested in solving small and easy instances of the same problem. The theoretical grounds on which current algorithmic number theory operates are actually rather shaky, and cryptologists are generally unaware of this fact.
The central goal of ANTICS is to rebuild algorithmic number theory on the firm grounds of theoretical computer science.
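A toy illustration of the gap mentioned above (my own example, not from the proposal): trial division happily factors a number theorist's "small and easy instance", yet is hopeless against the 2048-bit moduli on which deployed cryptography rests.

```python
# Trial division solves small instances of the factorisation problem in
# O(sqrt(n)) divisions -- utterly infeasible for cryptographic moduli of
# hundreds of digits, which is exactly the hardness assumption at stake.
def trial_division(n):
    """Return the prime factorization of n as a list of primes."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # remaining cofactor is prime
    return factors

print(trial_division(2**32 + 1))  # [641, 6700417], Euler's factorization of F_5
```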
Max ERC Funding
1 453 507 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym aSCEND
Project Secure Computation on Encrypted Data
Researcher (PI) Hoe Teck Wee
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Recent trends in computing have prompted users and organizations to store an increasingly large amount of sensitive data at third-party locations in the cloud, outside of their direct control. Storing data remotely poses an acute security threat as these data are outside our control and could potentially be accessed by untrusted parties. Indeed, the reality of these threats has been borne out by the Snowden leaks and by hundreds of data breaches each year. In order to protect our data, we will need to encrypt it.
Functional encryption is a novel paradigm for public-key encryption that enables both fine-grained access control and selective computation on encrypted data, as is necessary to protect big, complex data in the cloud. Functional encryption also enables searches on encrypted travel records and surveillance video, as well as medical studies on encrypted medical records, in a privacy-preserving manner; we can give out restricted secret keys that reveal only the outcome of specific searches and tests. These mechanisms allow us to maintain public safety without compromising on civil liberties, and to facilitate medical breakthroughs without compromising on individual privacy.
The goals of the aSCEND project are (i) to design pairing- and lattice-based functional encryption schemes that are more efficient and ultimately viable in practice; and (ii) to obtain a richer understanding of expressive functional encryption schemes and to push the boundaries from encrypting data to encrypting software. My long-term vision is the ubiquitous use of functional encryption to secure our data and our computation, just as public-key encryption is widely used today to secure our communication. Realizing this vision requires new advances in the foundations of functional encryption, which is the target of this project.
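The functionality (though none of the security) of the paradigm fits in a few lines. The sketch below is a deliberately insecure toy with invented names, meant only to show the interface: a key bound to a function f decrypts a ciphertext of x to f(x) and nothing more.

```python
# Toy, INSECURE sketch of functional encryption's Setup/KeyGen/Enc/Dec
# interface; names and structure are mine, not any real scheme's.
from dataclasses import dataclass
from typing import Any, Callable

@dataclass
class FunctionalKey:
    f: Callable[[Any], Any]   # a real scheme binds f cryptographically

class ToyFE:
    """Illustrates the functionality only; provides NO security."""
    def setup(self):
        self.msk = object()   # stand-in for a master secret key

    def keygen(self, f):
        return FunctionalKey(f)   # real KeyGen derives sk_f from msk

    def encrypt(self, x):
        return {"payload": x}     # real Enc actually hides x

    def decrypt(self, sk, ct):
        return sk.f(ct["payload"])  # reveals f(x) and nothing else, by design

fe = ToyFE(); fe.setup()
sk_mean = fe.keygen(lambda xs: sum(xs) / len(xs))
ct = fe.encrypt([4, 8, 15])
print(fe.decrypt(sk_mean, ct))  # 9.0 -- only the aggregate, not the records
```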
Max ERC Funding
1 253 893 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym BLOC
Project Mathematical study of Boundary Layers in Oceanic Motions
Researcher (PI) Anne-Laure Perrine Dalibard
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary Boundary layer theory is a large component of fluid dynamics. It is ubiquitous in Oceanography, where boundary layer currents, such as the Gulf Stream, play an important role in the global circulation. Comprehending the underlying mechanisms in the formation of boundary layers is therefore crucial for applications. However, the treatment of boundary layers in ocean dynamics remains poorly understood at a theoretical level, due to the variety and complexity of the forces at stake.
The goal of this project is to develop several tools to bridge the gap between the mathematical state of the art and the physical reality of oceanic motion. There are four points on which we will mainly focus: degeneracy issues, including the treatment of Stewartson boundary layers near the equator; rough boundaries (meaning boundaries with small-amplitude and high-frequency variations); the inclusion of the advection term in the construction of stationary boundary layers; and the linear and nonlinear stability of the boundary layers. We will address Ekman layers and western boundary layers separately, since they are governed by equations whose mathematical behaviour is very different.
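A standard model problem behind Ekman layers, with notation of my choosing (the abstract does not write it out): the incompressible Navier-Stokes equations with a strong Coriolis force,

```latex
% Rotating incompressible Navier-Stokes system at Rossby number eps;
% near a horizontal boundary {z = 0} with a no-slip condition, solutions
% develop an Ekman boundary layer of typical thickness sqrt(nu * eps).
\[
  \partial_t u + (u \cdot \nabla) u
  + \frac{e_3 \times u}{\varepsilon}
  - \nu \,\Delta u + \nabla p = 0,
  \qquad
  \nabla \cdot u = 0,
  \qquad
  u|_{z=0} = 0 .
\]
```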
This project will allow us to have a better understanding of small scale phenomena in fluid mechanics, and in particular of the inviscid limit of incompressible fluids.
The team will be composed of the PI, two PhD students and three two-year postdocs over the whole period. We will also rely on the historical expertise of the host institution on fluid mechanics and asymptotic methods.
Max ERC Funding
1 267 500 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym COMBINEPIC
Project Elliptic Combinatorics: Solving famous models from combinatorics, probability and statistical mechanics, via a transversal approach of special functions
Researcher (PI) Kilian RASCHEL
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2017-STG
Summary I aim to solve several well-known models from combinatorics, probability theory and statistical mechanics: the Ising model on isoradial graphs, dimer models, spanning forests, random walks in cones, and occupation time problems. Although completely unrelated a priori, these models share the feature of being presumed “exactly solvable” models, for which surprising and spectacular formulas should exist for the quantities of interest. This is captured by the title “Elliptic Combinatorics”, the word elliptic referring to the use of special functions in a broad sense: algebraic/differentially finite (or holonomic)/diagonals/(hyper)elliptic/hypergeometric/etc.
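For one-variable power series over a field of characteristic zero, the classes just listed are nested; the following standard definitions and inclusions are recalled for orientation (not quoted from the proposal).

```latex
% F is algebraic if P(t, F(t)) = 0 for some nonzero polynomial P;
% D-finite (holonomic) if p_r(t) F^{(r)} + ... + p_0(t) F = 0 with
% polynomial coefficients, not all zero; the diagonal of a multivariate
% series G is Delta G = sum_n [t_1^n ... t_d^n] G * t^n.  In
% characteristic zero one has the strict inclusions
\[
  \{\text{algebraic}\}
  \;\subsetneq\;
  \{\text{diagonals of rational functions}\}
  \;\subsetneq\;
  \{\text{D-finite}\} .
\]
```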
Besides the exciting nature of the models which we aim at solving, one main strength of our project lies in the variety of modern methods and fields that we cover: combinatorics, probability, algebra (representation theory), computer algebra, algebraic geometry, with a spectrum going from applied to pure mathematics.
We propose in addition two major applications, in finance (Markovian order books) and in population biology (evolution of multitype populations). We plan to work in close collaborations with researchers from these fields, to eventually apply our results (study of extinction probabilities for self-incompatible flower populations, for instance).
Max ERC Funding
1 242 400 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym CombiTop
Project New Interactions of Combinatorics through Topological Expansions, at the crossroads of Probability, Graph theory, and Mathematical Physics
Researcher (PI) Guillaume CHAPUY
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary "The purpose of this project is to use the ubiquitous nature of certain combinatorial topological objects called maps in order to unveil deep connections between several areas of mathematics. Maps, that describe the embedding of a graph into a surface, appear in probability theory, mathematical physics, enumerative geometry or graph theory, and different combinatorial viewpoints on these objects have been developed in connection with each topic. The originality of our project will be to study these approaches together and to unify them.
The outcome will be threefold, as we will:
1. build a new, well structured branch of combinatorics of which many existing results in different areas of enumerative and algebraic combinatorics are only first fruits;
2. connect and unify several aspects of the domains related to it, most importantly between probability and integrable hierarchies thus proposing new directions, new tools and new results for each of them;
3. export the tools of this unified framework to reach new applications, especially in random graph theory and in a rising domain of algebraic combinatorics related to Tamari lattices.
The methodology to reach the unification will be the study of some strategic interactions at different places involving topological expansions, that is to say, places where enumerative problems dealing with maps appear and their genus invariant plays a natural role, in particular: 1. the combinatorial theory of maps developed by the "French school" of combinatorics, and the study of random maps; 2. the combinatorics of fermions underlying the theory of the KP and 2-Toda hierarchies; 3. the Eynard-Orantin "topological recursion" coming from mathematical physics.
We present a set of key tasks in view of relating these different topics. The pertinence of the approach is demonstrated by recent research of the principal investigator.
Max ERC Funding
1 086 125 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym CONSTRAINTS
Project Ecophysiological and biophysical constraints on domestication in crop plants
Researcher (PI) Cyrille (Fabrice) Violle
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS9, ERC-2014-STG
Summary A fundamental question in biology is how constraints drive phenotypic changes and the diversification of life. We know little about the role of these constraints on crop domestication, nor how artificial selection can escape them. CONSTRAINTS questions whether crop domestication has shifted ecophysiological and biophysical traits related to resource acquisition, use and partitioning, and how trade-offs between them have constrained domestication and can limit future improvements in both optimal and sub-optimal conditions.
The project is based on three objectives: 1. revealing the existence (or lack) of a generic resource-use domestication syndrome in crop science; 2. elucidating ecophysiological and biophysical trade-offs within crop science and delineating the envelope of constraints for artificial selection; 3. examining the shape of ecophysiological and biophysical trade-offs in crop species when grown in sub-optimal environmental conditions. This project will be investigated within and across crop species thanks to a core panel of 12 studied species (maize, sunflower, Japanese rice, sorghum, durum wheat, bread wheat, alfalfa, orchardgrass, silvergrass, pea, colza, vine) for which data and collections (ca. 1,300 genotypes total) are already available to the PI, and additional high-throughput phenotyping using automatons. Additional species will be used for specific tasks: (i) a panel of 30 species for a comparative analysis of crop species and their wild progenitors; (ii) 400 worldwide accessions of Arabidopsis thaliana for a genome-wide association study of resource-use traits. Collectively, we will use a multi-tool approach: field measurements, high-throughput phenotyping, common-garden experiments, comparative analyses using databases, modelling, and genomics.
The ground-breaking nature of the project lies in the questions asked and in the unique opportunity to transfer knowledge from ecology and evolutionary biology to crop species.
Max ERC Funding
1 499 979 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym CONTACTMATH
Project Legendrian contact homology and generating families
Researcher (PI) Frédéric Bourgeois
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Starting Grant (StG), PE1, ERC-2009-StG
Summary A contact structure on an odd-dimensional manifold is a maximally non-integrable hyperplane field. It is the odd-dimensional counterpart of a symplectic structure. Contact and symplectic topology is a recent and very active area that studies intrinsic questions about existence, (non-)uniqueness and rigidity of contact and symplectic structures. It is intimately related to many other important disciplines, such as dynamical systems, singularity theory, knot theory, Morse theory, complex analysis, ... Legendrian submanifolds are a distinguished class of submanifolds in a contact manifold, which are tangent to the contact distribution. These manifolds are of particular interest in contact topology. Important classes of Legendrian submanifolds can be described using generating families, and this description can be used to define Legendrian invariants via Morse theory. On the other hand, Legendrian contact homology is an invariant for Legendrian submanifolds based on holomorphic curves. The goal of this research proposal is to study the relationship between these two approaches. More precisely, we plan to show that the generating family homology and the linearized Legendrian contact homology can be defined for the same class of Legendrian submanifolds, and are isomorphic. This correspondence should be established using a parametrized version of symplectic homology, being developed by the Principal Investigator in collaboration with Oancea. Such a result would give an entirely new type of information about holomorphic curve invariants. Moreover, it can be used to obtain more general structural results on linearized Legendrian contact homology, to extend recent results on the existence of Reeb chords, and to gain a much better understanding of the geography of Legendrian submanifolds.
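In local coordinates the standard model reads as follows (textbook definitions, recalled for the reader):

```latex
% Standard contact structure on R^{2n+1} with coordinates
% (x_1,...,x_n, y_1,...,y_n, z): the hyperplane field xi = ker(alpha),
\[
  \alpha = dz - \sum_{i=1}^{n} y_i \, dx_i ,
  \qquad
  \alpha \wedge (d\alpha)^{n} \neq 0 ,
\]
% the second condition expressing maximal non-integrability.  A
% Legendrian submanifold is an n-dimensional submanifold L that is
% everywhere tangent to xi, i.e. TL contained in xi.
```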
Max ERC Funding
710 000 €
Duration
Start date: 2009-11-01, End date: 2014-10-31
Project acronym CoqHoTT
Project Coq for Homotopy Type Theory
Researcher (PI) Nicolas Tabareau
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Every year, software bugs cost hundreds of millions of euros to companies and administrations. Hence, software quality is a prevalent notion, and interactive theorem provers based on type theory have shown their efficiency in proving the correctness of important pieces of software, like the C compiler of the CompCert project. One main interest of such theorem provers is the ability to extract the code directly from the proof. Unfortunately, their democratization suffers from a major drawback: the mismatch between equality in mathematics and in type theory. Thus, significant Coq developments have only been done by virtuosos playing with advanced concepts of computer science and mathematics. Recently, an extension of type theory with homotopical concepts such as univalence has been gaining traction, because it allows, for the first time, the expected principles of equality to be reconciled. But the univalence principle has so far been treated as a new axiom, which breaks one fundamental property of mechanized proofs: the ability to compute with programs that make use of this axiom. The main goal of the CoqHoTT project is to provide a new generation of proof assistants with a computational version of univalence and to use them as a base to implement effective logical model transformation, so that the power of the internal logic of the proof assistant needed to prove the correctness of a program can be decided and changed at compile time, according to a trade-off between efficiency and logical expressivity. Our approach is based on a radically new compilation-phase technique into a core type theory, to modularize the difficulty of finding a decidable type-checking algorithm for homotopy type theory.
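Stated compactly (the standard formulation from homotopy type theory, recalled here for orientation): for types A and B in a universe, there is a canonical map turning identifications into equivalences, and univalence asserts that this map is itself an equivalence.

```latex
% Univalence: identity of types coincides with equivalence of types.
% Postulated as an axiom, idtoeqv has no computation rule; giving it
% computational content is precisely the obstacle CoqHoTT targets.
\[
  \mathsf{idtoeqv} : (A =_{\mathcal{U}} B) \longrightarrow (A \simeq B)
  \quad \text{is an equivalence.}
\]
```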
The impact of the CoqHoTT project will be very strong. Even if Coq is already a success, this project will promote it as a major proof assistant, for both computer scientists and mathematicians. CoqHoTT will become an essential tool for program certification and formalization of mathematics.
Max ERC Funding
1 498 290 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym CORFRONMAT
Project Correlated frontiers of many-body quantum mathematics and condensed matter physics
Researcher (PI) Nicolas ROUGERIE
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2017-STG
Summary One of the main challenges in condensed matter physics is to understand strongly correlated quantum systems. Our purpose is to approach this issue from the point of view of rigorous mathematical analysis. The goals are twofold: to develop a mathematical framework applicable to physically relevant scenarios, and to take inspiration from the physics to introduce new topics in mathematics. The scope of the proposal thus goes from physically oriented questions (theoretical description and modelling of physical systems) to analytical ones (rigorous derivation and analysis of reduced models) in several cases where strong correlations play the key role.
In a first part, we aim at developing mathematical methods of general applicability to go beyond mean-field theory in different contexts. Our long-term goal is to forge new tools to attack important open problems in the field. Particular emphasis will be put on the structural properties of large quantum states as a general tool.
A second part is concerned with so-called fractional quantum Hall states, which host the fractional quantum Hall effect. Despite the appealing structure of their built-in correlations, their mathematical study is in its infancy. They nevertheless constitute an excellent testing ground for developing ideas of possibly wider applicability. In particular, we introduce and study a new class of many-body variational problems.
In the third part we discuss so-called anyons, exotic quasi-particles thought to emerge as excitations of highly correlated quantum systems. Their modelling gives rise to rather unusual, strongly interacting many-body Hamiltonians with a topological content. Mathematical analysis will help us shed light on those, clarifying the characteristic properties that could ultimately be experimentally tested.
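The defining property fits in one line (standard physics background, notation mine): exchanging two identical particles in two dimensions multiplies the wavefunction by a phase that need not be plus or minus one.

```latex
% Exchange statistics in two dimensions: alpha = 0 gives bosons,
% alpha = 1 gives fermions; intermediate alpha gives anyons.
\[
  \psi(x_2, x_1) = e^{i \pi \alpha}\, \psi(x_1, x_2) .
\]
```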
Max ERC Funding
1 056 664 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym CoVeCe
Project Coinduction for Verification and Certification
Researcher (PI) Damien Gabriel Jacques Pous
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Software and hardware bugs cost hundreds of millions of euros every year to companies and administrations. Formal methods like verification provide automatic means of finding some of these bugs. Certification, using proof assistants like Coq or Isabelle/HOL, makes it possible to guarantee the absence of bugs (up to a certain point).
These two kinds of tools are crucial in order to design safer programs and machines. Unfortunately, state-of-the-art tools are not yet satisfactory. Verification tools often face state-explosion problems and require more efficient algorithms; certification tools need more automation: they currently require too much time and expertise, even for basic tasks that could be handled easily through verification.
In recent work with Bonchi, we have shown that an extremely simple idea from concurrency theory could give rise to algorithms that are often exponentially faster than the algorithms currently used in verification tools.
My claim is that this idea could scale to richer models, revolutionising existing verification tools and providing algorithms for problems whose decidability is still open.
Moreover, the expected simplicity of those algorithms will make it possible to implement them inside certification tools such as Coq, to provide powerful automation techniques based on verification techniques. In the end, we will thus provide efficient and certified verification tools going beyond the state-of-the-art, but also the ability to use such tools inside the Coq proof assistant, to alleviate the cost of certification tasks.
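To make the coinductive idea concrete, the following minimal sketch (ours, in Python; the work cited above treats the harder nondeterministic case with bisimulations up to congruence) checks language equivalence of deterministic automata in the style of Hopcroft and Karp, where the union-find structure implements the "up to equivalence" pruning:

# Hedged sketch: coinductive language-equivalence check for DFAs, a
# bisimulation "up to equivalence" in the style of Hopcroft and Karp.
# The DFA encoding (dicts keyed by state) is our own illustrative choice.

def equivalent(delta, accept, x, y, alphabet):
    """Return True iff states x and y accept the same language.
    delta[state][letter] -> state; accept = set of accepting states."""
    parent = {}

    def find(s):                     # union-find (no compression, for clarity)
        parent.setdefault(s, s)
        while parent[s] != s:
            s = parent[s]
        return s

    todo = [(x, y)]
    while todo:
        p, q = todo.pop()
        rp, rq = find(p), find(q)
        if rp == rq:                 # pair already proved equivalent:
            continue                 # skip it (the coinductive "up-to" step)
        if (p in accept) != (q in accept):
            return False             # concrete counterexample reached
        parent[rp] = rq              # merge the two classes
        for a in alphabet:           # and explore all successor pairs
            todo.append((delta[p][a], delta[q][a]))
    return True

# Tiny usage example: two DFAs over {a}, both accepting words of odd length.
delta = {0: {'a': 1}, 1: {'a': 0}, 2: {'a': 3}, 3: {'a': 2}}
assert equivalent(delta, accept={1, 3}, x=0, y=2, alphabet=['a'])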
Max ERC Funding
1 407 413 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym CriBLaM
Project Critical behavior of lattice models
Researcher (PI) Hugo DUMINIL-COPIN
Host Institution (HI) INSTITUT DES HAUTES ETUDES SCIENTIFIQUES
Call Details Starting Grant (StG), PE1, ERC-2017-STG
Summary Statistical physics is a theory allowing the derivation of the statistical behavior of macroscopic systems from the description of the interactions of their microscopic constituents. For more than a century, lattice models (i.e. random systems defined on lattices) have been introduced as discrete models describing the phase transition for a large variety of phenomena, ranging from ferroelectrics to lattice gas.
In the last decades, our understanding of percolation and the Ising model, two classical examples of lattice models, progressed greatly. Nonetheless, major questions remain open on these two models.
The goal of this project is to break new ground in the understanding of phase transition in statistical physics by using and aggregating in a pioneering way multiple techniques from probability, combinatorics, analysis and integrable systems. In this project, we will focus on three main goals:
Objective A Provide a solid mathematical framework for the study of universality for Bernoulli percolation and the Ising model in two dimensions.
Objective B Advance the understanding of the critical behavior of Bernoulli percolation and the Ising model in dimensions greater than or equal to 3.
Objective C Greatly improve the understanding of planar lattice models obtained by generalizations of percolation and the Ising model, through the design of an innovative mathematical theory of phase transition dedicated to graphical representations of classical lattice models, such as Fortuin-Kasteleyn percolation, Ashkin-Teller models and loop models.
Most of the questions that we propose to tackle are notoriously difficult open problems. We believe that breakthroughs in these fundamental questions would reshape significantly our mathematical understanding of phase transition.
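As a hedged illustration (ours, not from the proposal) of the objects at stake: Bernoulli bond percolation on an n x n square lattice has critical parameter p_c = 1/2, and a few lines of Monte Carlo already make the sharp change in crossing probability around that point visible:

# Monte Carlo estimate of the left-right crossing probability for Bernoulli
# bond percolation on an n x n square lattice (p_c = 1/2 for this model).
import random

def crosses(n, p, rng):
    """One sample: is there an open left-right crossing?"""
    parent = list(range(n * n + 2))          # lattice sites + two virtual sides
    LEFT, RIGHT = n * n, n * n + 1

    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]    # path halving
            a = parent[a]
        return a

    def union(a, b):
        parent[find(a)] = find(b)

    for i in range(n):
        union(i * n, LEFT)                   # first column touches the left side
        union(i * n + n - 1, RIGHT)          # last column touches the right side
        for j in range(n):
            v = i * n + j
            if j + 1 < n and rng.random() < p:   # open horizontal bond
                union(v, v + 1)
            if i + 1 < n and rng.random() < p:   # open vertical bond
                union(v, v + n)
    return find(LEFT) == find(RIGHT)

rng = random.Random(0)
for p in (0.4, 0.5, 0.6):
    hits = sum(crosses(32, p, rng) for _ in range(200))
    print(f"p={p}: crossing frequency {hits / 200:.2f}")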
Max ERC Funding
1 499 912 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym CRYSP
Project CRYSP: A Novel Framework for Collaboratively Building Cryptographically Secure Programs and their Proofs
Researcher (PI) Karthikeyan Bhargavan
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2010-StG_20091028
Summary The field of software security analysis stands at a critical juncture. Applications have become too large for security experts to examine by hand, automated verification tools do not scale, and the risks of deploying insecure software are too great to tolerate anything less than mathematical proof. A radical shift of strategy is needed if programming and analysis techniques are to keep up in a networked world where increasing amounts of governmental and individual information are generated, manipulated, and accessed through web-based software applications.
The basic tenet of this proposal is that the main roadblock to the security verification of a large program is not its size, but rather the lack of precise security specifications for the underlying libraries and security-critical application code. Since large-scale software is often a collaborative effort, no single programmer knows all the design goals. Hence, this proposal advocates a collaborative specification and verification framework that helps teams of programmers write detailed security specifications incrementally and then verify that they are satisfied by the source program.
The main scientific challenge is to develop new program verification techniques that can be applied collaboratively, incrementally, and modularly to application and library code written in mainstream programming languages. The validation of this approach will be through substantial case studies. Our aim is to produce the first verified open source cryptographic protocol library and the first web applications with formal proofs of security.
The proposed project is bold and ambitious, but it is certainly feasible, and has the potential to change how software security is analyzed for years to come.
Max ERC Funding
1 406 726 €
Duration
Start date: 2010-11-01, End date: 2015-10-31
Project acronym D3
Project Interpreting Drawings for 3D Design
Researcher (PI) Adrien BOUSSEAU
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2016-STG
Summary Designers draw extensively to externalize their ideas and communicate with others. However, drawings are currently not directly interpretable by computers. To test their ideas against physical reality, designers have to create 3D models suitable for simulation and 3D printing. However, the visceral and approximate nature of drawing clashes with the tediousness and rigidity of 3D modeling. As a result, designers only model finalized concepts, and have no feedback on feasibility during creative exploration.
Our ambition is to bring the power of 3D engineering tools to the creative phase of design by automatically estimating 3D models from drawings. However, this problem is ill-posed: a point in the drawing can lie anywhere in depth. Existing solutions are limited to simple shapes, or require user input to “explain” to the computer how to interpret the drawing. Our originality is to exploit professional drawing techniques that designers developed to communicate shape most efficiently. Each technique provides geometric constraints that help viewers understand drawings, and that we shall leverage for 3D reconstruction.
Our first challenge is to formalize common drawing techniques and derive how they constrain 3D shape. Our second challenge is to identify which techniques are used in a drawing. We cast this problem as the joint optimization of discrete variables indicating which constraints apply, and continuous variables representing the 3D model that best satisfies these constraints. But evaluating all constraint configurations is impractical. To solve this inverse problem, we will first develop forward algorithms that synthesize drawings from 3D models. Our idea is to use this synthetic data to train machine learning algorithms that predict the likelihood that constraints apply in a given drawing.
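A drastically simplified sketch (ours; toy "depth" unknowns and stand-in parameters, far from the real drawing setting) of this joint discrete/continuous formulation: binary variables switch candidate constraints on or off, the continuous unknowns are then recovered by least squares, and brute-force enumeration over configurations, exactly what the learned likelihoods are meant to avoid at scale, picks the best-scoring one:

# Toy joint discrete/continuous fitting (ours). Unknowns are depths z_i per
# stroke point, data are noisy depth cues, and each candidate constraint
# ties two depths to be equal. Weights and penalty are assumed values.
import itertools
import numpy as np

def fit(cues, candidates, penalty=0.05):
    """Brute-force over constraint configurations; the proposal's point is
    precisely that this does not scale, motivating learned likelihoods."""
    n = len(cues)
    best = None
    for mask in itertools.product([0, 1], repeat=len(candidates)):
        rows = [np.eye(n)[i] for i in range(n)]      # data terms: z_i ~ cue_i
        rhs = list(cues)
        for on, (i, j) in zip(mask, candidates):
            if on:                                   # active constraint: z_i = z_j
                row = np.zeros(n)
                row[i], row[j] = 10.0, -10.0         # stiff weight (assumed)
                rows.append(row)
                rhs.append(0.0)
        z, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
        # score: data fit minus a small reward per explained (active) constraint
        score = np.sum((z - cues) ** 2) - penalty * sum(mask)
        if best is None or score < best[0]:
            best = (score, mask, z)
    return best

cues = np.array([1.02, 0.98, 2.0])          # points 0 and 1 plausibly coplanar
score, mask, z = fit(cues, candidates=[(0, 1), (1, 2)])
print("active constraints:", mask, "depths:", np.round(z, 2))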
In addition to tackling the long-standing problem of single-image 3D reconstruction, our research will significantly tighten design and engineering for rapid prototyping.
Max ERC Funding
1 482 761 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym DANSEINCELL
Project Modeling cytoplasmic trafficking and molecular delivery in cellular microdomains
Researcher (PI) David Holcman
Host Institution (HI) ECOLE NORMALE SUPERIEURE
Call Details Starting Grant (StG), PE1, ERC-2007-StG
Summary Cytoplasmic motion is a key determinant of organelle transport, protein-protein interactions, RNA transport and drug delivery, to name but a few cellular phenomena. Nucleic acid trafficking is important in antisense and gene therapy based on viral and synthetic vectors. This proposal is dedicated to the theoretical study of intracellular transport of proteins, organelles and DNA particles. We propose to construct a mathematical model to quantify and predict the spatiotemporal dynamics of complex structures in the cytosol and the nucleus, based on the physical characteristics and the micro-rheology of the environment (viscosity). We model the passive motion of proteins or DNA as free or confined diffusion, while for organelle and virus motion we will include active cytoskeleton-dependent transport. The proposed mathematical model of cellular trafficking is based on physical principles. We propose to estimate the mean time and the probability for viruses and plasmid DNA to arrive at a nuclear pore. The motion will be described by stochastic dynamics, containing both a drift (along microtubules) and a Brownian (free diffusion) component. The analysis of the equations requires the development of new asymptotic methods for the calculation of the probability and the mean arrival time of a particle to a small hole on the nucleus surface. We will extend the analysis to DNA movement in the nucleus after cellular irradiation, when the nucleus contains single- and double-strand DNA breaks (dsbDNAs). The number of remaining DNA breaks determines the activation of the repair machinery and the cell's decision to enter apoptosis. We will study the dsbDNA repair machinery engaged in the task of finding the DNA damage. We will formulate and analyze, both numerically and analytically, the equations that link the level of irradiation to apoptosis. The present project belongs to a new class of initiatives toward a quantitative analysis of intracellular trafficking.
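As a numerical companion (our own sketch, not the asymptotic method proposed here), the mean arrival time to a small absorbing patch can be estimated by direct Euler-Maruyama simulation of the drift-diffusion dynamics dX = b dt + sqrt(2D) dW in a spherical toy geometry:

# Monte Carlo estimate of the mean arrival time of a Brownian particle
# (with an optional drift toward the target) to a small absorbing cap on
# an otherwise reflecting sphere -- a toy narrow-escape geometry (ours).
import numpy as np

def mean_arrival_time(n_walkers=100, dt=1e-3, D=1.0, drift=0.0,
                      R=1.0, eps=0.3, seed=0):
    """Absorbing polar cap of angular radius eps around the north pole."""
    rng = np.random.default_rng(seed)
    times = np.empty(n_walkers)
    for k in range(n_walkers):
        x = np.zeros(3)                          # start at the centre
        t = 0.0
        while True:
            x += np.array([0.0, 0.0, drift]) * dt \
                 + np.sqrt(2.0 * D * dt) * rng.standard_normal(3)
            t += dt
            r = np.linalg.norm(x)
            if r >= R:
                if np.arccos(x[2] / r) < eps:    # hit the small hole: absorbed
                    break
                x *= (2.0 * R - r) / r           # otherwise reflect inward
        times[k] = t
    return times.mean()

# Narrow-escape asymptotics predict the mean time blowing up as the hole
# shrinks; a drift toward the hole (active transport) shortens it markedly.
print(mean_arrival_time(drift=0.0), mean_arrival_time(drift=2.0))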
Max ERC Funding
750 000 €
Duration
Start date: 2009-01-01, End date: 2014-06-30
Project acronym DEEPSEA
Project Parallelism and Beyond: Dynamic Parallel Computation for Efficiency and High Performance
Researcher (PI) Umut Acar
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary We propose to radically extend the frontiers of two major themes in computing, parallelism and dynamism, and develop a novel paradigm of computing: dynamic-parallelism. To this end, we will follow two lines of research. First, we will develop techniques for extracting efficiency and high performance from parallel programs written in high-level programming languages. Second, we will develop the dynamic-parallelism model, where computations can respond to a wide variety of dynamic changes to their data automatically and efficiently, by developing novel abstractions (calculi), high-level programming-language constructs, and compilation techniques. The research will culminate in a language that extends the C programming language with support for parallel and dynamic-parallel programming.
The proposal is motivated by urgent needs driven by the advent of multicore chips, which is making parallelism mainstream, and the increasing ubiquity of software, which requires applications to operate on highly dynamic data. These advances demand parallel and highly dynamic software, which remains too difficult and labor-intensive to develop. The urgency is further underlined by the increasing data and problem sizes---online data grows exponentially, doubling every few years---that require similarly powerful advances in performance.
The proposal will achieve profound impact by dramatically simplifying the development of high-performing dynamic and dynamic-parallel software. As a result, programmer productivity and software quality including correctness, reliability, performance, and resource (e.g., time and energy) consumption will improve significantly. The proposal will not only open new research opportunities in parallel computing, programming languages, and compilers, but also in other fields where parallel and dynamic problems abound, e.g., algorithms, computational biology, geometry, graphics, machine learning, and software systems.
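A toy sketch (ours, in Python; the project itself targets compiled, parallel settings) of the dynamic half of this paradigm, in the spirit of self-adjusting computation: derived values record which inputs they read, and changing an input re-runs only the computations that depend on it:

class Cell:
    """An input value that remembers which derived computations read it."""
    def __init__(self, value):
        self.value, self.readers = value, set()

    def get(self, reader=None):
        if reader is not None:
            self.readers.add(reader)        # record the dependency
        return self.value

    def set(self, value):
        if value != self.value:
            self.value = value
            for r in list(self.readers):
                r.recompute()               # propagate only to dependents

class Derived:
    """A computation re-run automatically when an input it read changes."""
    def __init__(self, fn):
        self.fn = fn
        self.recompute()

    def recompute(self):
        self.value = self.fn(self)          # fn reads cells via cell.get(self)

xs = [Cell(v) for v in [3, 1, 4, 1, 5]]
total = Derived(lambda d: sum(c.get(d) for c in xs))
print(total.value)                          # 14
xs[2].set(10)                               # one input changes...
print(total.value)                          # ...and the total updates to 20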
Max ERC Funding
1 076 570 €
Duration
Start date: 2013-06-01, End date: 2018-05-31
Project acronym DiGGeS
Project Discrete Groups and Geometric Structures
Researcher (PI) Fanny Solveig KASSEL
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary Discrete subgroups of Lie groups, whose study originated in Fuchsian differential equations and crystallography at the end of the 19th century, are the basis of a large aspect of modern geometry. They are the object of fundamental theories such as Teichmüller theory, Kleinian groups, rigidity theories for lattices, homogeneous dynamics, and most recently Higher Teichmüller theory. They are closely related to the notion of a geometric structure on a manifold, which has played a crucial role in geometry since Thurston. In summary, discrete subgroups are a meeting point of geometry with Lie theory, differential equations, complex analysis, ergodic theory, representation theory, algebraic geometry, number theory, and mathematical physics, and these fascinating interactions make the subject extremely rich.
In real rank one, important classes of discrete subgroups of semisimple Lie groups are known for their good geometric, topological, and dynamical properties, such as convex cocompact or geometrically finite subgroups. In higher real rank, discrete groups beyond lattices remain quite mysterious. The goal of the project is to work towards a classification of discrete subgroups of semisimple Lie groups in higher real rank, from two complementary points of view. The first is actions on Riemannian symmetric spaces and their boundaries: important recent developments, in particular in the theory of Anosov representations, give hope to identify a number of meaningful classes of discrete groups which generalise in various ways the notions of convex cocompactness and geometric finiteness. The second point of view is actions on pseudo-Riemannian symmetric spaces: some very interesting geometric examples are now well understood, and recent links with the first point of view give hope to transfer progress from one side to the other. We expect powerful applications, both to the construction of proper actions on affine spaces and to the spectral theory of pseudo-Riemannian manifolds.
Max ERC Funding
1 049 182 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym DIOCLES
Project Discrete bIOimaging perCeption for Longitudinal Organ modElling and computEr-aided diagnosiS
Researcher (PI) Nikolaos Paragyios
Host Institution (HI) ECOLE CENTRALE DES ARTS ET MANUFACTURES
Call Details Starting Grant (StG), PE6, ERC-2010-StG_20091028
Summary Recent hardware developments from medical device manufacturers have made possible the non-invasive, in-vivo acquisition of anatomical and physiological measurements. One can cite numerous emerging modalities (e.g. PET, fMRI, DTI). The nature (3D/multi-phase/vectorial) and the volume of these data make their interpretation by humans impossible in practice. On the other hand, these modalities can be used for early screening, evaluation of therapeutic strategies, and evaluation of bio-markers for drug development. Despite enormous progress in the field of biomedical image analysis, a huge gap still exists between clinical research and clinical use. The aim of this proposal is three-fold. First, we would like to introduce a novel biomedical image perception framework for clinical use towards disease screening and drug evaluation. Such a framework is expected to be modular (usable in various clinical settings), computationally efficient (not requiring specialized hardware), and able to provide quantitative and qualitative anatomo-pathological indices. Second, we will leverage progress in machine learning along with novel, efficient, compact representations of clinical bio-markers toward computer-aided diagnosis. Last, using these emerging multi-dimensional signals, we would like to perform longitudinal modelling and understand the effects of aging on a number of organs and diseases that do not present pre-disease indicators, such as brain neurological diseases, muscular diseases, certain forms of cancer, etc.
Such a challenging and pioneering effort lies at the interface of medicine (clinical context), biomedical imaging (choice of signals/modalities), machine learning (manifold representations of heterogeneous multivariate variables), discrete optimization (computationally efficient inference of higher-order models), and bio-medical image inference (measurement extraction and multi-modal fusion of heterogeneous information sources).
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-09-01, End date: 2016-08-31
Project acronym DISPEQ
Project Qualitative study of nonlinear dispersive equations
Researcher (PI) Nikolay Tzvetkov
Host Institution (HI) UNIVERSITE DE CERGY-PONTOISE
Call Details Starting Grant (StG), PE1, ERC-2010-StG_20091028
Summary We plan to further improve the understanding of nonlinear dispersive wave propagation phenomena. In particular, we plan to develop tools allowing a statistical description of the corresponding flows, and methods to study transverse stability independently of the very particular arguments based on inverse scattering. We also plan to study critical problems in strongly non-Euclidean geometries.
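For orientation (our addition, consistent with the standard setting of this field), a canonical example of the equations in question is the nonlinear Schrödinger equation

\[
  i\,\partial_t u + \Delta u = \pm |u|^{p-1} u, \qquad u = u(t, x),
\]

posed on Euclidean space or on a manifold. A statistical description of the flow then means propagating a measure on initial data, for instance a Gibbs measure built from the conserved Hamiltonian, along the evolution, rather than tracking individual solutions.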
Max ERC Funding
880 270 €
Duration
Start date: 2010-10-01, End date: 2015-09-30
Project acronym DOFOCO
Project Do forests cool the Earth? Reconciling sustained productivity and minimum climate response with portfolios of contrasting forest management strategies
Researcher (PI) Sebastiaan Luyssaert
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), LS9, ERC-2009-StG
Summary Forests, of which globally 70% are managed, play a particularly important role in the global carbon cycle. Recently, forest management became a top priority on the agenda of the political negotiations to mitigate climate change, because forest plantations may remove atmospheric CO2 and, if used for energy production, the wood is a substitute for fossil fuel. However, this political imperative is at present running well ahead of the science required to deliver it. Despite the key implications of forest management for: 1) the carbon-energy-water balance, and 2) production, recreation and environmental protection, there are no integrated studies of its effects on the Earth's climate. The overall goal of DOFOCO is to quantify and understand the role of forest management in mitigating climate change. Specifically, I want to challenge the current focus on the carbon cycle and replace it with a total climate impact approach. Hence, the whole forest management spectrum, ranging from short-rotation coppice to old-growth forests, will be analyzed for its effects on the water, energy and carbon cycles. The climate response of forests will be quantified by means of albedo, evapotranspiration, greenhouse gas sources and sinks, and the resulting climate feedback mechanisms. The anticipated new quantitative results will be used to lay the foundations for a portfolio of management strategies which will sustain wood production while minimizing climate change impacts. DOFOCO is interdisciplinary and groundbreaking because it brings together state-of-the-art data and models from applied life and Earth system sciences; it will deliver the first quantitative insights into how forest management strategies can be linked to climate change mitigation.
Max ERC Funding
1 296 125 €
Duration
Start date: 2010-02-01, End date: 2015-10-31
Project acronym EQUALIS
Project EQualIS : Enhancing the Quality of Interacting Systems
Researcher (PI) Patricia Bouyer-Decitre
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary The ubiquitous use of computerized systems, and their increasing complexity, demand formal evidence of their correctness. While current formal-verification techniques have already been applied to a number of case studies, they are not yet sufficient to fully analyze several aspects of complex systems such as communication networks, embedded systems or industrial controllers. There are three important characteristics of these systems which need to be tackled:
- the rich interaction that crucially constrains the behaviour of such systems is poorly taken into account in the actual models;
- the imprecisions and uncertainty inherent to systems that are implemented (e.g. on a digital processor), or which interact via a network, or which control physical equipment, are mostly ignored by the verification process;
- the deployment of large interacting systems emphasizes the lack of a modular approach to the synthesis of systems.
The goal of this project is to develop a systematic approach to the formal analysis of interacting systems. We will use models from game theory to properly take into account the interaction in those systems, and will propose quantitative measures of correctness and quality that take into account possible perturbations in the systems. The core of the project will be the development of various algorithms for synthesizing high-quality interactive systems. We will pay particular attention to the modularity of the approach and to the development of efficient algorithms. The EQualIS project will deeply impact the design and verification of interacting systems by providing a rich framework that will increase our confidence in the analysis of such systems.
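The simplest instance of such a game-based synthesis algorithm is the textbook attractor computation for two-player reachability games on finite graphs; the sketch below (ours, in Python) computes the set of vertices from which player 0 can force the play into a target set:

# Attractor computation for two-player reachability games (standard
# algorithm; graph encoding and example are our own illustrative choices).

def attractor(vertices, edges, owner, target):
    """edges: dict v -> list of successors; owner[v] in {0, 1}.
    Returns the vertices from which player 0 can force reaching target."""
    win = set(target)
    preds = {v: [] for v in vertices}
    out = {v: len(edges[v]) for v in vertices}   # player 1's remaining escapes
    for v in vertices:
        for w in edges[v]:
            preds[w].append(v)
    frontier = list(win)
    while frontier:
        w = frontier.pop()
        for v in preds[w]:
            if v in win:
                continue
            out[v] -= 1
            # player 0 needs one edge into win; player 1 must have no escape
            if owner[v] == 0 or out[v] == 0:
                win.add(v)
                frontier.append(v)
    return win

# Usage: v0 (player 0) can move into the target; v1 (player 1) escapes to s.
V = ['v0', 'v1', 's', 't']
E = {'v0': ['t', 'v1'], 'v1': ['t', 's'], 's': ['s'], 't': ['t']}
print(attractor(V, E, owner={'v0': 0, 'v1': 1, 's': 1, 't': 0}, target={'t'}))
# -> {'t', 'v0'}: from v1, player 1 avoids the target by moving to s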
Max ERC Funding
1 497 431 €
Duration
Start date: 2013-01-01, End date: 2019-02-28
Project acronym EXPLOREMAPS
Project Combinatorial methods, from enumerative topology to random discrete structures and compact data representations
Researcher (PI) Gilles Schaeffer
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2007-StG
Summary "Our aim is to built on recent combinatorial and algorithmic progress to attack a series of deeply connected problems that have independantly surfaced in enumerative topology, statistical physics, and data compression. The relation between these problems lies in the notion of ""combinatorial map"", the natural discrete mathematical abstraction of objects with a 2-dimensional structures (like geographical maps, computer graphics' meshes, or 2d manifolds). A whole new set of properties of these maps has been uncovered in the last few years under the impulsion of the principal investigator. Rougly speaking, we have shown that classical graph exploration algorithms, when correctly applied to maps, lead to remarkable decompositions of the underlying surfaces. Our methods resort to algorithmic and enumerative combinatorics. In statistical physics, these decompositions offer an approach to the intrinsec geometry of discrete 2d quantum gravity: our method is here the first to outperform the celebrated ""topological expansion of matrix integrals"" of Brezin-Itzykson-Parisi-Zuber. Exploring its implications for the continuum limit of these random geometries is our great challenge now. From a computational geometry perspective, our approach yields the first encoding schemes with asymptotically optimal garanteed compression rates for the connectivity of triangular or polygonal meshes. These schemes improve on a long series of heuristically efficient but non optimal algorithms, and open the way to optimally compact data structures. Finally we have deep indications that the properties we have uncovered extend to the realm of ramified coverings of the sphere. Intriguing computations on the fundamental Hurwitz's numbers have been obtained using the ELSV formula, famous for its use by Okounkov et al. to rederive Kontsevich's model. We believe that further combinatorial progress here could allow to bypass the formula and obtaine an elementary explanation of these results."
Max ERC Funding
750 000 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym EXPLORERS
Project EXPLORERS Exploring epigenetic robotics: raising intelligence in machines
Researcher (PI) Pierre-Yves Oudeyer
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2009-StG
Summary In spite of considerable work in artificial intelligence, machine learning, and pattern recognition in the past 50 years, we have no machine capable of adapting to the physical and social environment with the flexibility, robustness and versatility of a 6-month-old human child. Instead of trying to simulate directly the adult's intelligence, EXPLORERS proposes to focus on the developmental principles that give rise to intelligence in infants, by re-implementing them in machines. Framed in the developmental/epigenetic robotics research agenda, and grounded in research in developmental psychology, its main target is to build robotic machines capable of autonomously learning and re-using a variety of skills and know-how that were not specified at design time, with initially limited knowledge of the body and of the environment in which they will operate. This implies several fundamental issues: How can a robot discover its body and its relationships with the physical and social environment? How can it learn new skills without the intervention of an engineer? What internal motivations shall guide its exploration of vast spaces of skills? Can it learn through natural social interactions with humans? How should the learnt skills be represented so that they can be re-used? EXPLORERS attacks those questions directly by proposing a series of fundamental scientific and technological advances, including computational intrinsic motivation systems for learning basic sensorimotor skills, reused for the grounded acquisition of the meaning of new words. The project not only addresses fundamental scientific questions, but also relates to important societal issues: personal home robots are bound to become part of everyday life in the 21st century, in particular as helpful social companions in an aging society. EXPLORERS' objectives converge to the challenges implied by this vision: robots will have to be able to adapt and learn new skills in the unknown homes of users who are not engineers.
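A minimal sketch (ours; the error histories are simulated stand-ins) of the intrinsic-motivation principle involved: the agent prefers activities where its prediction error is improving fastest (its learning progress), which steers it away both from what it has already mastered and from unlearnable noise:

# Learning-progress-based curiosity (toy of ours): pick the activity whose
# prediction error is dropping fastest, with a little random exploration.
import random

random.seed(1)
errors = {                       # recent prediction-error history per activity
    "mastered":    [0.05, 0.05, 0.05, 0.05],
    "learnable":   [0.90, 0.70, 0.50, 0.30],
    "white_noise": [0.80, 0.85, 0.78, 0.83],
}

def learning_progress(history, window=2):
    """Drop in mean error between the older and newer halves of the window."""
    old = sum(history[-2 * window:-window]) / window
    new = sum(history[-window:]) / window
    return old - new

def choose_activity(errors, eps=0.1):
    if random.random() < eps:                 # occasional random exploration
        return random.choice(list(errors))
    return max(errors, key=lambda a: learning_progress(errors[a]))

print(choose_activity(errors))                # -> "learnable": highest progress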
Max ERC Funding
1 572 215 €
Duration
Start date: 2009-12-01, End date: 2015-05-31
Project acronym EXPROTEA
Project Exploring Relations in Structured Data with Functional Maps
Researcher (PI) Maksims OVSJANIKOVS
Host Institution (HI) ECOLE POLYTECHNIQUE
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary We propose to lay the theoretical foundations and design efficient computational methods for analyzing, quantifying and exploring relations and variability in structured data sets, such as collections of geometric shapes, point clouds, and large networks or graphs, among others. Unlike existing methods that are tied and often limited to the underlying data representation, our goal is to design a unified framework in which variability can be processed in a way that is largely agnostic to the underlying data type.
In particular, we propose to depart from the standard representations of objects as collections of primitives, such as points or triangles, and instead to treat them as functional spaces that can be easily manipulated and analyzed. Since real-valued functions can be defined on a wide variety of data representations and as they enjoy a rich algebraic structure, such an approach can provide a completely novel unified framework for representing and processing different types of data. Key to our study will be the exploration of relations and variability between objects, which can be expressed as operators acting on functions and thus treated and analyzed as objects in their own right using the vast number of tools from functional analysis in theory and numerical linear algebra in practice.
Such a unified computational framework of variability will enable entirely novel applications, including accurate shape matching, efficient tracking and highlighting of the most relevant changes in evolving systems, such as dynamic graphs, and analysis of shape collections. Thus, it will permit us not only to compare or cluster objects, but also to reveal where and how they differ and what makes instances unique, which can be especially useful in medical imaging applications. Ultimately, we expect our study to create a new rigorous, unified paradigm for computational variability, providing a common language and sets of tools applicable across diverse underlying domains.
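At its computational core, a functional map between two objects reduces to a small dense matrix aligning coefficients in truncated functional bases, solvable in closed form by least squares; the sketch below (ours, with synthetic data standing in for real shape descriptors) shows this step:

# Functional-map estimation as linear least squares (ours, synthetic data):
# columns of A and B are the same descriptor functions expressed in k basis
# functions on a source and a target object; C maps one basis to the other.
import numpy as np

rng = np.random.default_rng(0)
k, d = 20, 60                       # basis size, number of descriptors
A = rng.standard_normal((k, d))     # descriptor coefficients on the source
C_true = np.diag(rng.uniform(0.5, 1.5, k))           # hypothetical true map
B = C_true @ A + 0.01 * rng.standard_normal((k, d))  # descriptors on target

# Solve min_C ||C A - B||_F^2: transpose to the standard form A^T C^T = B^T.
C = np.linalg.lstsq(A.T, B.T, rcond=None)[0].T

print("relative recovery error:",
      np.linalg.norm(C - C_true) / np.linalg.norm(C_true))
# A point-to-point correspondence can then be read off from C by
# nearest-neighbour search between basis embeddings of the two objects.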
Max ERC Funding
1 499 845 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym FAnFArE
Project Fourier Analysis For/And Partial Differential Equations
Researcher (PI) Frederic, Jérôme, Louis Bernicot
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary "This project aims to develop the field of Harmonic Analysis, and more precisely to study problems at the interface between Fourier Analysis and PDEs (and also some Geometry).
We are interested in two aspects of the Fourier Analysis:
(1) The Euclidean Fourier Analysis, where a deep analysis can be performed using specificities as the notion of "frequencies" (involving the Fourier transform) or the geometry of the Euclidean balls. By taking advantage of them, this proposal aims to pursue the study and bring novelties in three fashionable topics: the study of bilinear/multilinear Fourier multipliers, the development of the "space-time resonances" method in a systematic way and for some specific PDEs, and the study of nonlinear transport equations in BMO-type spaces (as Euler and Navier-Stokes equations).
(2) A Functional Fourier Analysis, which can be performed in a more general situation using the notion of "oscillation" adapted to a heat semigroup (or semigroup of operators). This second Challenge is (at the same time) independent of the first one and also very close. It is very close, due to the same point of view of Fourier Analysis involving a space decomposition and simultaneously some frequency decomposition. However they are quite independent because the main goal is to extend/develop an analysis in the more general framework given by a semigroup of operators (so without using the previous Euclidean specificities). By this way, we aim to transfer some results known in the Euclidean situation to some Riemannian manifolds, Fractals sets, bounded open set setting, ... Still having in mind some applications to the study of PDEs, such questions make also a connexion with the geometry of the ambient spaces (by its Riesz transform, Poincaré inequality, ...). I propose here to attack different problems as dispersive estimates, ""L^p""-version of De Giorgi inequalities and the study of paraproducts, all of them with a heat semigroup point of view."
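For orientation (our addition, using the standard definition), the first of the Euclidean topics above concerns operators of the form

\[
  T_m(f, g)(x) = \int_{\mathbb{R}^d} \int_{\mathbb{R}^d} m(\xi, \eta)\, \widehat{f}(\xi)\, \widehat{g}(\eta)\, e^{2 i \pi x \cdot (\xi + \eta)}\, d\xi\, d\eta,
\]

and the basic question is for which symbols m one has bounds of the form \|T_m(f,g)\|_{L^r} \lesssim \|f\|_{L^p} \|g\|_{L^q} with 1/r = 1/p + 1/q.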
Max ERC Funding
940 540 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym FELICITY
Project Foundations of Efficient Lattice Cryptography
Researcher (PI) Vadim Lyubashevsky
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Public key cryptography is the backbone of internet security. Yet it is very likely that within the next few decades some government or corporate entity will succeed in building a general-purpose quantum computer that is capable of breaking all of today's public key protocols. Lattice cryptography, which appears to be resilient to quantum attacks, is currently viewed as the most promising candidate to take over as the basis for cryptography in the future. Recent theoretical breakthroughs have additionally shown that lattice cryptography may even allow for constructions of primitives with novel capabilities. But even though the progress in this latter area has been considerable, the resulting schemes are still extremely impractical.
The central objective of the FELICITY project is to substantially expand the boundaries of efficient lattice-based cryptography. This includes improving on the most crucial cryptographic protocols, some of which are already considered practical, as well as pushing towards efficiency in areas that currently seem out of reach. The methodology that we propose to use differs from the bulk of the research being done today. Rather than directly working on advanced primitives in which practical considerations are ignored, the focus of the project will be on finding novel ways in which to break the most fundamental barriers that are standing in the way of practicality. For this, I believe it is productive to concentrate on building schemes that stand at the frontier of what is considered efficient -- because it is there that the most critical barriers are most apparent. And since cryptographic techniques usually propagate themselves from simple to advanced primitives, improved solutions for the fundamental ones will eventually serve as building blocks for practical constructions of schemes having advanced capabilities.
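For concreteness, here is a minimal sketch of Regev-style public-key encryption from the Learning With Errors problem, the prototypical lattice-based construction. This is our own illustration, not the project's work, and the parameters are toy values chosen for readability; they offer no security.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, q = 32, 64, 3329          # toy parameters, far too small for real security

def keygen():
    A = rng.integers(0, q, size=(m, n))
    s = rng.integers(0, q, size=n)
    e = rng.integers(-2, 3, size=m)         # small noise, the heart of LWE
    b = (A @ s + e) % q
    return (A, b), s

def encrypt(pk, bit):
    A, b = pk
    r = rng.integers(0, 2, size=m)          # random 0/1 combination of samples
    u = (A.T @ r) % q
    c = (b @ r + bit * (q // 2)) % q        # encode the bit near 0 or q/2
    return u, c

def decrypt(sk, ct):
    u, c = ct
    d = (c - u @ sk) % q                    # ≈ noise (bit 0) or ≈ q/2 + noise (bit 1)
    return int(min(d, q - d) > q // 4)

pk, sk = keygen()
for bit in (0, 1):
    assert decrypt(sk, encrypt(pk, bit)) == bit
print("toy LWE scheme round-trips correctly")
```

The noise term is what ties security to worst-case lattice problems; making such schemes simultaneously compact and fast is exactly the kind of barrier the project addresses.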
Max ERC Funding
1 311 688 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym FLEXABLE
Project Deformable Multiple-View Geometry and 3D Reconstruction, with Application to Minimally Invasive Surgery
Researcher (PI) Adrien Bartoli
Host Institution (HI) UNIVERSITE CLERMONT AUVERGNE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary Project FLEXABLE lies in the field of 3D Computer Vision, which seeks to recover depth or the 3D shape of the observed environment from images. One of the most successful and mature techniques in 3D Computer Vision is Shape-from-Motion which is based on the well-established theory of Multiple-View Geometry. This uses multiple images and assumes that the environment is rigid.
The world, however, is made of objects that move and undergo deformations. Researchers have tried to extend Shape-from-Motion to deformable environments for about a decade, yet with only very limited success to date. We believe there are two main reasons for this. Firstly, there is still no solid theory of Deformable Shape-from-Motion. Fundamental questions, such as which kinds of deformation permit unambiguous 3D reconstruction, are not yet answered. Secondly, practical solutions have not yet come about: for accurate and dense 3D shape results, the Motion cue must be combined with other visual cues, since it is certainly weaker in the deformable case. This may require strong object-specific priors, bridging the gap with object recognition.
This project develops these two key areas. It includes three main lines of research: theory, its computational implementation, and its real-world application. Deformable Multiple-View Geometry will generalize the existing rigid theory and will provide researchers with a rigorous mathematical framework that underpins the use of Motion as a proper visual cue for Deformable 3D Reconstruction. Our theory will require us to introduce new mathematical tools from differentiable projective manifolds. Our implementation will study and develop new computational means for solving the difficult inverse problems formulated in our theory. Finally, we will develop cutting-edge applications of our framework specific to Minimally Invasive Surgery, for which there is a very high need for 3D computer vision.
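As a reference point, here is a minimal sketch of the rigid Multiple-View Geometry baseline mentioned above, which the project sets out to generalize: linear (DLT) triangulation of a 3D point from two views with known camera matrices. The code and the toy camera setup are our illustration, not project material.

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover the 3D point X with x1 ~ P1 X and
    x2 ~ P2 X, where P1, P2 are 3x4 camera projection matrices and x1, x2
    are the matched image coordinates (x, y). Valid only for a rigid scene."""
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)      # null vector of A = homogeneous solution
    X = Vt[-1]
    return X[:3] / X[3]              # de-homogenize

# Toy rig: identity intrinsics, second camera translated along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0])
project = lambda P, X: (P @ np.append(X, 1.0))[:2] / (P @ np.append(X, 1.0))[2]
X_hat = triangulate_dlt(P1, P2, project(P1, X_true), project(P2, X_true))
print(np.allclose(X_hat, X_true))    # True in the noise-free rigid case
```

When the scene deforms between views, the system above no longer has a consistent solution, which is precisely why a deformable theory and additional cues are needed.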
Max ERC Funding
1 481 294 €
Duration
Start date: 2013-01-01, End date: 2018-12-31
Project acronym FOVEDIS
Project Formal specification and verification of distributed data structures
Researcher (PI) Constantin Enea
Host Institution (HI) UNIVERSITE PARIS DIDEROT - PARIS 7
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary The future of the computing technology relies on fast access, transformation, and exchange of data across large-scale networks such as the Internet. The design of software systems that support high-frequency parallel accesses to high-quantity data is a fundamental challenge. As more scalable alternatives to traditional relational databases, distributed data structures (DDSs) are at the basis of a wide range of automated services, for now, and for the foreseeable future.
This proposal aims to improve our understanding of the theoretical foundations of DDSs. The design and the usage of DDSs are based on new principles, for which we currently lack rigorous engineering methodologies. Specifically, we lack design procedures based on precise specifications, and automated reasoning techniques for enhancing the reliability of the engineering process.
The targeted breakthrough of this proposal is developing automated formal methods for rigorous engineering of DDSs. A first objective is to define coherent formal specifications that provide precise requirements at design time and explicit guarantees during their usage. Then, we will investigate practical programming principles, compatible with these specifications, for building applications that use DDSs. Finally, we will develop efficient automated reasoning techniques for debugging or validating DDS implementations against their specifications. The principles underlying automated reasoning are also important for identifying best practices in the design of these complex systems to increase confidence in their correctness. The developed methodologies based on formal specifications will thus benefit both the conception and automated validation of DDS implementations and the applications that use them.
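As a hedged toy illustration of validating a DDS implementation against a specification (our sketch, not the project's formalism), consider a grow-only counter CRDT, one well-studied class of distributed data structures. Its specification says that replicas exchanging states must converge to the total number of increments, whatever the interleaving:

```python
import random

class GCounter:
    """Grow-only counter CRDT: one slot per replica; merge = pointwise max.
    Specification: after all replicas exchange states, they agree, and the
    agreed value is the total number of increments performed anywhere."""
    def __init__(self, n_replicas, rid):
        self.v = [0] * n_replicas
        self.rid = rid
    def incr(self):
        self.v[self.rid] += 1
    def merge(self, other):
        self.v = [max(a, b) for a, b in zip(self.v, other.v)]
    def value(self):
        return sum(self.v)

# Randomized validation against the specification.
random.seed(1)
reps = [GCounter(3, i) for i in range(3)]
total = 0
for _ in range(100):
    r = random.choice(reps)
    if random.random() < 0.7:
        r.incr(); total += 1
    else:
        r.merge(random.choice(reps))
for a in reps:          # full pairwise exchange, then check convergence
    for b in reps:
        a.merge(b)
assert all(r.value() == total for r in reps)
print("all replicas converged to", total)
```

Random testing of this kind can only refute; turning such convergence requirements into precise specifications amenable to automated proof is the harder problem the project targets.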
Max ERC Funding
1 300 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym GADA
Project Group Actions: Interactions between Dynamical Systems and Arithmetic
Researcher (PI) Emmanuel Breuillard
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Starting Grant (StG), PE1, ERC-2007-StG
Summary "Our main goal is to apply the powerful analytical tools that are now emerging from areas of more ""applicable"" parts of mathematics such as ergodic theory, random walks, harmonic analysis and additive combinatorics to some longstanding open problems in more theoretical parts of mathematics such as group theory and number theory. The recent work of Green and Tao about arithmetic progressions of prime numbers, or Margulis' celebrated solution of the Oppenheim Conjecture about integer values of quadratic forms are examples of the growing interpenetration of such seemingly unrelated fields. We have in mind an explicit set of problems: a uniform Tits alternative, the equidistribution of dense subgroups, the Andre-Oort conjecture, the spectral gap conjecture, the Lehmer problem. All these questions involve group theory in various forms (discrete subgroups of Lie groups, representation theory and spectral theory, locally symmetric spaces and Shimura varieties, dynamics on homogeneous spaces of arithmetic origin, Cayley graphs of large finite groups, etc) and have also a number theoretic flavor. Their striking common feature is that each of them enjoys some intimate relationship, whether by the foreseen methods to tackle it or by its consequences, with ergodic theory on the one hand and harmonic analysis and combinatorics on the other. We believe that the new methods being currently developed in those fields will bring crucial insights to the problems at hand. This proposed research builds on previous results obtained by the author and addresses some of the most challenging open problems in the field."
Max ERC Funding
750 000 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym GALE
Project Games and Automata for Logic Extensions
Researcher (PI) Thomas Colcombet
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2010-StG_20091028
Summary This proposal aims at generalising the central decidability results of Büchi and Rabin to more general settings, and understanding the consequences of those extensions both at theoretical and applicative levels.
The original results of Büchi and Rabin state the decidability of monadic second-order logic (monadic logic for short) over infinite words and trees respectively. Those results are of such importance that Rabin's theorem is also called the `Mother of all decidability results'.
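To make the automata-logic connection concrete, here is a small sketch of our own (not the project's): deciding acceptance of an ultimately periodic word u·v^ω by a deterministic Büchi automaton, the basic machine behind Büchi's theorem. Deterministic Büchi automata capture only a fragment of monadic logic on infinite words, which is what keeps the sketch short.

```python
def accepts_lasso(delta, q0, accepting, u, v):
    """Acceptance of the ultimately periodic word u * v^omega by a
    deterministic Buchi automaton (delta: (state, letter) -> state).
    Run the stem u, then iterate the loop v; the state at the start of
    each v-iteration must eventually repeat, and the run is accepting
    iff an accepting state occurs inside that repeating cycle."""
    q = q0
    for a in u:
        q = delta[(q, a)]
    seen = {}      # state at a v-boundary -> index of that iteration
    hits = []      # did iteration i visit an accepting state?
    while q not in seen:
        seen[q] = len(hits)
        hit = False
        for a in v:
            q = delta[(q, a)]
            hit = hit or (q in accepting)
        hits.append(hit)
    return any(hits[seen[q]:])   # accepting state inside the cycle?

# Toy automaton over {a, b} accepting words with infinitely many a's:
# state 1 is entered exactly on reading 'a' and is the accepting state.
delta = {(0, 'a'): 1, (0, 'b'): 0, (1, 'a'): 1, (1, 'b'): 0}
print(accepts_lasso(delta, 0, {1}, "bb", "ab"))  # True:  (ab)^w has infinitely many a
print(accepts_lasso(delta, 0, {1}, "ab", "b"))   # False: b^w eventually has no a
```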
The primary goal of this project is to demonstrate that it is possible to go significantly beyond the expressiveness of monadic logic, while retaining similar decidability results. We are considering extensions in two distinct directions. The first consists of enriching the logic with the ability to speak in a weak form about set cardinality. The second direction is an extension of monadic logic by topological capabilities. Those two branches form the core of the proposal.
The second aspect of this proposal is the study of the `applicability' of this theory. Three tasks are devoted to this. The first task is to precisely determine the cost of using the more complex techniques we will be developing rather than using the classical theory. The second task will be devoted to the description and the study of weaker formalisms allowing better complexities, namely corresponding temporal logics. Finally, the last task will be devoted to the study of related model checking problems.
The result of the completion of this program would be twofold. At a theoretical level, new deep results would be obtained, and new techniques in automata, logic and games would be developed for solving them. At a more applicative level, this program would define and validate new directions of research in the domain of the verification of open systems and related problems.
Max ERC Funding
931 760 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym GAN
Project Groups, Actions and von Neumann algebras
Researcher (PI) Cyril Houdayer
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary This research project focuses on the structure, classification and rigidity of three closely related objects: group actions on measure spaces, orbit equivalence relations and von Neumann algebras. Over the last 15 years, the study of interactions between these three topics has led to a process of mutual enrichment, providing both striking theorems and outstanding conjectures.
Some fundamental questions, such as Connes' rigidity conjecture, the structure of von Neumann algebras associated with higher rank lattices, or the fine classification of factors of type III, still remain untouched. The general aim of the project is to tackle these problems and other related questions by developing a deeper analysis and understanding of the interplay between von Neumann algebra theory on the one hand, and ergodic and group theory on the other. To do so, I will use and combine several tools, and develop new ones, arising from Popa's Deformation/Rigidity theory, Lie group theory (lattices, boundaries), topological and geometric group theory, and the representation theory of groups (amenability, property (T)). More specifically, the main directions of my research project are:
1) The structure of the von Neumann algebras arising from Voiculescu's Free Probability theory: Shlyakhtenko's free Araki-Woods factors, amalgamated free product von Neumann algebras and the free group factors.
2) The structure and the classification of the von Neumann algebras and the measured equivalence relations arising from lattices in higher rank semisimple connected Lie groups.
3) The measure equivalence rigidity of the Baumslag-Solitar groups and several other classes of discrete groups acting on trees.
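For reference, the central object tying these directions together, recalled in standard notation (our recollection, not text from the proposal):

```latex
% Group von Neumann algebra of a countable discrete group Gamma: the
% double commutant of the left regular representation on l^2(Gamma)
\[
  L(\Gamma) \;=\; \{\lambda_g : g\in\Gamma\}'' \;\subseteq\; B(\ell^2\Gamma),
  \qquad (\lambda_g\xi)(h) = \xi(g^{-1}h).
\]
% L(Gamma) is a II_1 factor precisely when Gamma has infinite conjugacy
% classes; the free group factors L(F_n) of direction 1) are the guiding
% examples, and their isomorphism problem is famously open.
```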
Max ERC Funding
876 750 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym GECOMETHODS
Project Geometric control methods for heat and Schroedinger equations
Researcher (PI) Ugo Boscain
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2009-StG
Summary "The aim of this project of 5 years is to create a research group on geometric control methods in PDEs with the arrival of the PI at the CNRS Laboratoire CMAP (Centre de Mathematiques Appliquees) of the Ecole Polytechnique in Paris (in January 09). With the ERC-Starting Grant, the PI plans to hire 4 post-doc fellows, 2 PhD students and also to organize advanced research schools and workshops. One of the main purpose of this project is to facilitate the collaboration with my research group which is quite spread across France and Italy. The PI plans to develop a research group studying certain PDEs for which geometric control techniques open new horizons. More precisely the PI plans to exploit the relation between the sub-Riemannian distance and the properties of the kernel of the corresponding hypoelliptic heat equation and to study controllability properties of the Schroedinger equation. In the last years the PI has developed a net of high level international collaborations and, together with his collaborators and PhD students, has obtained many important results via a mixed combination of geometric methods in control (Hamiltonian methods, Lie group techniques, conjugate point theory, singularity theory etc.) and noncommutative Fourier analysis. This has allowed to solve open problems in the field, e.g., the definition of an intrinsic hypoelliptic Laplacian, the explicit construction of the hypoelliptic heat kernel for the most important 3D Lie groups, and the proof of the controllability of the bilinear Schroedinger equation with discrete spectrum, under some ""generic"" assumptions. Many more related questions are still open and the scope of this project is to tackle them. All subjects studied in this project have real applications: the problem of controllability of the Schroedinger equation has direct applications in Nuclear Magnetic Resonance; the problem of nonisotropic diffusion has applications in models of human vision."
Max ERC Funding
785 000 €
Duration
Start date: 2010-05-01, End date: 2016-04-30
Project acronym GEM
Project From Geometry to Motion: inverse modeling of complex mechanical structures
Researcher (PI) Florence Bertails-Descoubes
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary With the considerable advance of automatic image-based capture in Computer Vision and Computer Graphics these latest years, it becomes now affordable to acquire quickly and precisely the full 3D geometry of many mechanical objects featuring intricate shapes. Yet, while more and more geometrical data get collected and shared among the communities, there is currently very little study about how to infer the underlying mechanical properties of the captured objects merely from their geometrical configurations.
The GEM challenge consists in developing a non-invasive method for inferring the mechanical properties of complex objects from a minimal set of geometrical poses, in order to predict their dynamics. In contrast to classical inverse reconstruction methods, my proposal is built upon the claim that 1/ the mere geometrical shape of physical objects reveals a lot about their underlying mechanical properties and 2/ this property can be fully leveraged for a wide range of objects featuring rich geometrical configurations, such as slender structures subject to frictional contact (e.g., folded cloth or twined filaments).
To achieve this goal, we shall develop an original inverse modeling strategy based upon a/ the design of reduced and high-order discrete models for slender mechanical structures including rods, plates and shells, b/ a compact and well-posed mathematical formulation of our nonsmooth inverse problems, both in the static and dynamic cases, c/ the design of robust and efficient numerical tools for solving such complex problems, and d/ a thorough experimental validation of our methods relying on the most recent capturing tools.
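As a toy instance of the geometry-to-mechanics inversion advocated above (our sketch under strong simplifying assumptions: an Euler-Bernoulli cantilever under uniform load, no contact or friction, and hypothetical parameter values), one can recover a bending stiffness EI from an observed static deflection by linear least squares:

```python
import numpy as np

# Cantilever of length L under uniform load w: deflection y(x) = phi(x) / EI
# with phi(x) = w x^2 (6 L^2 - 4 L x + x^2) / 24  (Euler-Bernoulli model).
L, w, EI_true = 1.0, 5.0, 2.0
x = np.linspace(0.1, L, 20)
phi = w * x**2 * (6 * L**2 - 4 * L * x + x**2) / 24.0

rng = np.random.default_rng(0)
y_obs = phi / EI_true + rng.normal(0, 1e-4, x.size)   # 'captured' geometry

# Linear least squares in c = 1/EI: minimize ||y_obs - c * phi||^2
c_hat = float(phi @ y_obs) / float(phi @ phi)
print(f"EI recovered: {1.0 / c_hat:.4f} (true {EI_true})")
```

The project's actual setting (rods, plates and shells with frictional contact) leads to nonsmooth inverse problems far harder than this linear toy, which is exactly why points a/ to d/ above are needed.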
In addition to significant advances in fast image-based measurement of diverse mechanical materials stemming from physics, biology, or manufacturing, this research is expected in the long run to ease considerably the design of physically realistic virtual worlds, as well as to boost the creation of dynamic human doubles.
Max ERC Funding
1 498 570 €
Duration
Start date: 2015-09-01, End date: 2021-08-31
Project acronym GEODYCON
Project Geometry and dynamics via contact topology
Researcher (PI) Vincent Maurice Colin
Host Institution (HI) UNIVERSITE DE NANTES
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary I intend to cross ressources of holomorphic curves techniques and traditional topological methods to study some fundamental questions in symplectic and contact geometry such as:
- The Weinstein conjecture in dimension greater than 3.
- The construction of new invariants for both smooth manifolds and Legendrian/contact manifolds; in particular, trying to define an analogue of Heegaard Floer homology in dimension larger than 3.
- The link, in dimension 3, between the geometry of the ambient manifold (especially hyperbolicity) and the dynamical/topological properties of its Reeb vector fields and contact structures.
- The topological characterization of odd-dimensional manifolds admitting a contact structure.
A crucial ingredient of my program is to understand the key role played by open book decompositions in dimensions larger than three.
This program requires a huge amount of mathematical knowledge. My idea is to organize a team around Ghiggini, Laudenbach, Rollin, Sandon and myself, augmented by two post-docs and one PhD student funded by the project. This will give us the critical size to run a very active working seminar and to achieve worldwide attractiveness and recognition.
I also plan to invite one established researcher every year (for 1-2 months), and to organize one conference and one summer school, as well as several focused weeks.
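For readers outside the field, the basic notions named above, recalled schematically in standard notation (ours, not the proposal's):

```latex
% A contact structure on a (2n+1)-manifold is a maximally non-integrable
% hyperplane field, locally xi = ker(alpha) for a 1-form alpha with
\[
  \alpha \wedge (d\alpha)^{n} \neq 0 ,
\]
% and the Reeb vector field R_alpha of a contact form is characterized by
\[
  \alpha(R_\alpha) = 1, \qquad \iota_{R_\alpha}\, d\alpha = 0 .
\]
% The Weinstein conjecture asserts that every Reeb vector field on a closed
% contact manifold has a closed orbit; it is known in dimension 3 (Taubes),
% and open in the higher dimensions targeted by the project.
```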
Max ERC Funding
887 600 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym GEOPARDI
Project Numerical integration of Geometric Partial Differential Equations
Researcher (PI) Erwan Faou
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary "The goal of this project is to develop new numerical methods for the approximation of evolution equations possessing strong geometric properties such as Hamiltonian systems or stochastic differential equations. In such situations the exact solutions endow with many physical properties that are consequences of the geometric structure: Preservation of the total energy, momentum conservation or existence of ergodic invariant measures. However the preservation of such qualitative properties of the original system by numerical methods at a reasonable cost is not guaranteed at all, even for very precise (high order) methods.
The principal aim of geometric numerical integration is the understanding and analysis of such problems: how (and to what extent) can numerical methods reproduce the qualitative behavior of differential equations over long times? The extension of this theory to partial differential equations is a fundamental ongoing challenge, which requires the invention of a new mathematical framework bridging the most recent techniques used in the theory of nonlinear PDEs and of stochastic ordinary and partial differential equations. The development of new efficient numerical schemes for geometric PDEs has to go together with the most recent progress in analysis (stability phenomena, energy transfers, multiscale problems, etc.).
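A minimal illustration of the phenomenon described above (our sketch, not project code): for the harmonic oscillator, the explicit Euler method lets the energy drift exponentially, while the symplectic Euler method, which respects the geometric structure, keeps it bounded for all time.

```python
# Harmonic oscillator H(q, p) = (p^2 + q^2) / 2, step size h, both methods
# are first-order accurate, yet behave completely differently over long times.
h, steps = 0.1, 1000
q_e, p_e = 1.0, 0.0          # explicit Euler state
q_s, p_s = 1.0, 0.0          # symplectic Euler state
for _ in range(steps):
    q_e, p_e = q_e + h * p_e, p_e - h * q_e   # explicit Euler (no structure)
    p_s = p_s - h * q_s                       # symplectic Euler:
    q_s = q_s + h * p_s                       # update p first, then q

energy = lambda q, p: 0.5 * (q * q + p * p)
print(f"explicit Euler energy:   {energy(q_e, p_e):.3f}")  # grows like (1+h^2)^steps
print(f"symplectic Euler energy: {energy(q_s, p_s):.3f}")  # stays near 0.5
```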
The major challenges of the project are to derive new schemes by bridging the world of numerical simulation and the analysis community, and to consider deterministic and stochastic equations, with the general aim of deriving hybrid methods. We also aim to create a research platform devoted to extensive numerical simulations of difficult academic PDEs, in order to highlight new nonlinear phenomena and test numerical methods.
Max ERC Funding
971 772 €
Duration
Start date: 2011-09-01, End date: 2016-08-31
Project acronym GEOWAKI
Project The analysis of geometric non-linear wave and kinetic equations
Researcher (PI) Jacques, Alexandre SMULEVICI
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary The present proposal is concerned with the analysis of geometric non-linear wave equations, such as the Einstein equations, as well as coupled systems of wave and kinetic equations such as the Vlasov-Maxwell and Einstein-Vlasov equations. We intend to pursue three main lines of research, each of them concerning major open problems in the field.
I) The dynamics in a neighbourhood of the Anti-de-Sitter space with various boundary conditions.
This is a fundamental open problem of mathematical physics which aims at understanding the stability or instability properties of one of the simplest solutions to the Einstein equations. On top of its intrinsic mathematical interest, this question is also at the heart of an intense research activity in the theoretical physics community.
II) Non-linear systems of wave and kinetic equations. We have recently found out that the so-called vector field method of Klainerman, a fundamental tool in the study of quasilinear wave equations, in fact possesses a complete analogue in the case of kinetic transport equations. This opens the way to many new directions of research, with applications to several fundamental systems of kinetic theory, such as the Einstein-Vlasov or Vlasov-Maxwell systems, and creates a link between two areas of PDEs which have typically been studied via different methods. One of our objectives is to develop other potential links, such as a general analysis of null forms for relativistic kinetic equations.
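Schematically, and in our notation rather than the proposal's, the kinetic analogue rests on lifted vector fields that commute with the free transport operator and yield dispersive decay of velocity averages (the exact weights and exponents are stated here only indicatively):

```latex
% Free transport operator and lifted commuting fields
\[
  T = \partial_t + v\cdot\nabla_x, \qquad
  Z_i = t\,\partial_{x_i} + \partial_{v_i}, \qquad [T, Z_i] = 0 ,
\]
% so norms of Z^alpha f are conserved along free transport and control,
% via a Klainerman-Sobolev type inequality, the decay of velocity averages:
\[
  \Big| \int_{\mathbb{R}^n} f(t,x,v)\, dv \Big|
  \;\lesssim\; (1+t)^{-n} \sum_{|\alpha|\le n}
  \big\| Z^\alpha f(0)\big\|_{L^1_{x,v}} .
\]
```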
III) The Einstein equations with data on a compact manifold. The long time dynamics of solutions to the Einstein equations arising from initial data given on a compact manifold is still very poorly understood. In particular, there is still no known stable asymptotic regime for the Einstein equations with data given on a simple manifold such as the torus. We intend to establish the existence of such a stable asymptotic regime.
Max ERC Funding
1 071 008 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym GTMT
Project Group Theory and Model Theory
Researcher (PI) Eric Herve Jaligot
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary The project is located between logic and mathematics, more precisely between model theory and group theory. There are extremely difficult questions arising about the model theory of groups, notably the question of the construction of new groups with prescribed algebraic properties and at the same time good model-theoretic properties. In particular, it is an important question, both in model theory and in group theory, to build new stable groups and eventually new nonalgebraic groups with a good dimension notion.
The present project aims at filling these gaps. It is divided into three main directions. Firstly, the continuation of the classification of groups with a good dimension notion, notably groups of finite Morley rank and related notions. Secondly, a systematic inspection of the combinatorial and geometric group theory that can be applied to build new groups while keeping control of their first-order theory. Thirdly, and in connection with the previous difficult problem, a very systematic and general study of infinite permutation groups.
Max ERC Funding
366 598 €
Duration
Start date: 2011-10-01, End date: 2013-12-31
Project acronym HiChemSynPro
Project High-throughput combinatorial chemical protein synthesis as a novel research technology platform for chemical and synthetic biology
Researcher (PI) Vladimir TORBEEV
Host Institution (HI) CENTRE INTERNATIONAL DE RECHERCHE AUX FRONTIERES DE LA CHIMIE FONDATION
Call Details Starting Grant (StG), LS9, ERC-2016-STG
Summary Chemical protein synthesis is an indispensable method in chemical and synthetic biology. However, at the present moment, it is laborious and involves multiple optimization and purification steps. High-throughput approaches for total synthesis of combinatorial libraries of custom-modified protein variants are needed. To change the situation, the work will be carried out in two directions: (1) implementation of microfluidic techniques for automation, miniaturization and multiplexing of experimental steps involved in the total synthesis of proteins, and (2) design and synthesis of novel catalytic proteins for efficient enzyme-assisted peptide ligations under denatured conditions. This innovative research technology will allow robust chemical synthesis of protein libraries with (100–10,000)-compounds with natural and unnatural modifications, bearing variety of post-translational modifications and also protein-like biopolymers. In this project, the new methodology will be validated by chemical synthesis of library of phosphorylated analogues of high mobility group protein A (HMGA), which is involved in gene-transcription and cancer development. Other potential future applications include protein design, biological problems where post-translational modifications play a crucial role (ranging from the ‘histone code’ hypothesis to understanding long-term memory) and functional annotation of newly discovered genes.
Max ERC Funding
1 500 000 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym IRON
Project Robust Geometry Processing
Researcher (PI) Pierre Alliez
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2010-StG_20091028
Summary Digital Geometry Processing (DGP) started nearly ten years ago on the premise that geometry would soon become the fourth type of digital medium after sounds, images, and video. While recent research efforts have successfully established some theoretical and algorithmic foundations to deal with this very special signal that is geometry, DGP has not resulted in the expected societal and technological impacts that Digital Signal Processing has generated, mostly due to the lack of robustness and genericity of the geometry processing pipeline. We propose a research agenda to harness the full potential of Digital Geometry Processing and make it as robust and impactful as Digital Signal Processing. Specifically, we argue that streamlining the DGP pipeline cannot be achieved by direct adaptation of existing machinery: a new and focused research phase is required to address such fundamental issues as the reconstruction and approximation of complex shapes from heterogeneous data, in order to develop ironclad techniques that are robust to defect-laden inputs and offer strong guarantees on the outputs. Only then can DGP will be ready, as promised, to bring forth a technological revolution.
Max ERC Funding
1 370 198 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym LattAC
Project Lattices: algorithms and cryptography
Researcher (PI) Damien, Noel Stehle
Host Institution (HI) ECOLE NORMALE SUPERIEURE DE LYON
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary Contemporary cryptography, with security relying on the factorisation and discrete logarithm problems, is ill-prepared for the future: It will collapse with the rise of quantum computers, its costly algorithms require growing resources, and it is utterly ill-fitted for the fast-developing trend of externalising computations to the cloud. The emerging field of *lattice-based cryptography* (LBC) addresses these concerns: it resists would-be quantum computers, trades memory for drastic run-time savings, and enables computations on encrypted data, leading to the prospect of a privacy-preserving cloud economy. LBC could supersede contemporary cryptography within a decade. A major goal of this project is to enable this technology switch. I will strengthen the security foundations, improve its performance, and extend the range of its functionalities.
A lattice is the set of integer linear combinations of linearly independent real vectors, called lattice basis. The core computational problem on lattices is the Shortest Vector Problem (SVP): Given a basis, find a shortest non-zero point in the spanned lattice. The hardness of SVP is the security foundation of LBC. In fact, SVP and its variants arise in a great variety of areas, including computer algebra, communications (coding and cryptography), computer arithmetic and algorithmic number theory, further motivating the study of lattice algorithms. In the matter of *algorithm design*, the community is quickly nearing the limits of the classical paradigms. The usual approach, lattice reduction, consists in representing a lattice by a basis and steadily improving its quality. I will assess the full potential of this framework and, in the longer term, develop alternative approaches to go beyond the current limitations.
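For reference, the two definitions just used, in standard notation:

```latex
% Lattice spanned by a basis B = (b_1, ..., b_n), and its minimum distance
\[
  \mathcal{L}(B) = \Big\{ \sum_{i=1}^{n} z_i\, b_i \;:\; z_i \in \mathbb{Z} \Big\},
  \qquad
  \lambda_1(\mathcal{L}) = \min_{v \in \mathcal{L}\setminus\{0\}} \lVert v \rVert .
\]
% SVP: given B, find v in L(B) \ {0} with ||v|| = lambda_1(L(B)); the best
% known algorithms for exact SVP run in time exponential in n, which is
% what the security of lattice-based cryptography rests on.
```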
This project aims at studying all computational aspects of lattices, with cryptography as the driving motive. The strength of LattAC lies in its theory-to-practice and interdisciplinary methodological approach.
Max ERC Funding
1 414 402 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym LEAP
Project LEarning from our collective visual memory to Analyze its trends and Predict future events
Researcher (PI) Josef Sivic
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary People constantly draw on past visual experiences to anticipate future events and better understand, navigate, and interact with their environment, for example, when seeing an angry dog or a quickly approaching car. Currently there is no artificial system with a similar level of visual analysis and prediction capabilities. LEAP is a first step in that direction, leveraging the emerging collective visual memory formed by the unprecedented amount of visual data available in public archives, on the Internet and from surveillance or personal cameras - a complex evolving net of dynamic scenes, distributed across many different data sources, and equipped with plentiful but noisy and incomplete metadata. The goal of this project is to analyze dynamic patterns in this shared visual experience in order (i) to find and quantify their trends and (ii) to learn to predict future events in dynamic scenes.
With ever-expanding computational resources and this extraordinary data, the main scientific challenge is now to invent new and powerful models adapted to its scale and its spatio-temporal, distributed and dynamic nature. To address this challenge, we will first design new models that generalize across different data sources, where scenes are captured under vastly different imaging conditions. Next, we will develop a framework for finding, describing and quantifying trends, which involves measuring long-term changes in many related scenes. Finally, we will develop a methodology and tools for synthesizing complex future predictions from aligned past visual experiences.
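As a toy sketch of what quantifying a trend could mean at its very simplest (an illustration only; the data and attribute below are hypothetical, and the project's models are far richer), one could regress a scalar visual attribute extracted per image against capture time:

```python
import numpy as np

# Hypothetical data: capture years and a scalar visual attribute
# (e.g. a detector score) averaged over the images of each year.
years = np.array([2004, 2006, 2008, 2010, 2012, 2014], dtype=float)
attribute = np.array([0.31, 0.35, 0.38, 0.44, 0.47, 0.52])

# Ordinary least-squares linear trend: attribute ~ slope * year + b.
slope, intercept = np.polyfit(years, attribute, deg=1)
print(f"estimated long-term trend: {slope:+.4f} per year")
```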
Breakthrough progress on these problems would have profound implications on our everyday lives as well as on science and commerce: safer cars that anticipate the behavior of pedestrians on streets; tools that help doctors monitor, diagnose and predict patients’ health; and smart glasses, enabled by the advances from this project, that help people react in unfamiliar situations.
Max ERC Funding
1 496 736 €
Duration
Start date: 2014-11-01, End date: 2019-10-31
Project acronym LEASP
Project Learning spatiotemporal patterns in longitudinal image data sets of the aging brain
Researcher (PI) Stanley Durrleman
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Time-series of multimodal medical images offer a unique opportunity to track anatomical and functional alterations of the brain in aging individuals. A collection of such time series for several individuals forms a longitudinal data set, each observation being a rich iconic-geometric representation of the brain's anatomy and function. These data are already extraordinarily complex and variable across individuals. Taking the temporal component into account adds further difficulty, in that each individual follows a different trajectory of changes, and at a different pace. Furthermore, a disease is here a progressive departure from an otherwise normal scenario of aging, so that one cannot think of normal and pathologic brain aging as distinct categories, as in the standard case-control paradigm.
Bio-statisticians lack a suitable methodological framework to extract from these data the typical trajectories and dynamics of brain alterations, and the effects of a disease on these trajectories, thus limiting the investigation of essential clinical questions. To change this situation, we propose to construct virtual dynamical models of brain aging by learning typical spatiotemporal patterns of alteration propagation from longitudinal iconic-geometric data sets.
By incorporating concepts from Riemannian geometry into Bayesian mixed-effects models, the project will introduce general principles to average complex individual trajectories of iconic-geometric changes and align the pace at which these trajectories are followed. It will estimate a set of elementary spatiotemporal patterns, which combine to yield a personal aging scenario for each individual. Disease-specific patterns will be detected with increasing likelihood.
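One hedged way to picture the pace-alignment idea is a mixed-effects model in which every individual follows a common trajectory under a subject-specific affine time reparametrisation; the sketch below is a simplified illustration, not the project's exact formulation:

```latex
% Observation j of individual i, acquired at age t_{ij}:
\[
  y_{ij} \;=\; \gamma\bigl( \alpha_i \,(t_{ij} - \tau_i) + t_0 \bigr) \;+\; \varepsilon_{ij},
  \qquad \varepsilon_{ij} \sim \mathcal{N}(0, \sigma^2),
\]
% where \gamma is the average trajectory, \alpha_i > 0 an individual
% pace of progression, \tau_i an individual time shift, and the random
% effects (\alpha_i, \tau_i) carry population-level priors.
```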
This new generation of statistical and computational tools will unveil clusters of patients sharing similar lesion propagation profiles, paving the way to designing more specific treatments and to caring for patients when treatments have the highest chance of success.
Max ERC Funding
1 499 894 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym LENA
Project non-LinEar sigNal processing for solving data challenges in Astrophysics
Researcher (PI) Jérôme Bobin
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Astrophysics has arrived at a turning point where the scientific exploitation of data requires overcoming challenging analysis issues, which mandates the development of advanced signal processing methods. In this context, sparsity and sparse signal representations have played a prominent role in astrophysics. Indeed, thanks to sparsity, an extremely clean full-sky map of the Cosmic Microwave Background (CMB) has been derived [Bobin14] from the data of Planck, a European space mission that observes the sky at microwave wavelengths. This led to a noticeable breakthrough: we showed that large-scale statistical studies of the CMB can be performed without having to mask the galactic centre anymore, thanks to the high-quality component separation achieved [Rassat14].
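For readers unfamiliar with the sparsity principle invoked here, a generic toy sketch (not the Planck pipeline): a signal that is sparse in some basis can be denoised by soft-thresholding its coefficients in that basis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy signal that is sparse in the canonical basis: a few spikes,
# buried in Gaussian noise.
n = 256
signal = np.zeros(n)
signal[rng.choice(n, size=5, replace=False)] = rng.normal(0, 5, size=5)
noisy = signal + rng.normal(0, 0.5, size=n)

def soft_threshold(x, lam):
    """Proximal operator of the l1 norm: the basic sparsity tool."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

denoised = soft_threshold(noisy, lam=1.5)
print("error before:", np.linalg.norm(noisy - signal))
print("error after: ", np.linalg.norm(denoised - signal))
```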
Despite the undeniable success of sparsity, standard linear signal processing approaches are too simplistic to capture the intrinsically non-linear properties of physical data. For instance, the analysis of the Planck data in polarization requires new sparse representations to finely capture the properties of polarization vector fields (e.g. rotation invariance), which cannot be tackled by linear approaches. Shifting from the linear to the non-linear signal representation paradigm is an emerging area in signal processing, which builds upon new connections with fields such as deep learning [Mallat13].
Inspired by these active and fertile connections, the LENA project will: i) study a new non-linear signal representation framework to design non-linear models that can account for the underlying physics, and ii) develop new numerical methods that can exploit these models. We will further demonstrate the impact of the developed models and algorithms on data analysis challenges in the scope of the Planck mission and the European radio interferometer LOFAR. We expect the results of the LENA project to impact astrophysical data analysis as significantly as the introduction of sparsity to the field did.
Max ERC Funding
1 497 411 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym LIC
Project Loop models, integrability and combinatorics
Researcher (PI) Paul Georges Zinn-Justin
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary The purpose of this proposal is to investigate new connections which have emerged in recent years between problems from statistical mechanics, namely two-dimensional exactly solvable models, and a variety of combinatorial problems, among which: the enumeration of plane partitions, alternating sign matrices and related objects; combinatorial properties of certain algebro-geometric objects such as orbital varieties or the Brauer loop scheme; and certain problems in free probability. One of the key methods that has emerged in recent years is the use of quantum integrability, and more precisely the quantum Knizhnik--Zamolodchikov equation, which is itself related to many deep results in representation theory. The fruitful interaction between all these ideas has led to many advances in the last few years, including proofs of some old conjectures but also completely new results. More specifically, loop models are a class of statistical models where the PI has made significant progress, in particular in relation to the so-called Razumov--Stroganov conjecture (now the Cantini--Sportiello theorem).
New directions that should be pursued include: further applications to enumerative combinatorics, such as proofs of various open conjectures relating Alternating Sign Matrices, Plane Partitions and their symmetry classes; a full understanding of the quantum integrability of the Fully Packed Loop model, a specific loop model at the heart of the Razumov--Stroganov correspondence; a complete description of the Brauer loop scheme, including its defining equations, and of the underlying poset; the extension of the work of Di Francesco and Zinn-Justin on the loop model/6-vertex model relation to the case of the 8-vertex model (corresponding to elliptic solutions of the Yang--Baxter equation); and the study of solvable tiling models, in relation to generalizations of the Littlewood--Richardson rule and the determination of their limiting shapes.
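To convey the flavour of the enumerative questions above: the classical product formula counting n x n alternating sign matrices, conjectured by Mills, Robbins and Rumsey and proved by Zeilberger and (via the 6-vertex model) by Kuperberg, reads

```latex
\[
  A(n) \;=\; \prod_{k=0}^{n-1} \frac{(3k+1)!}{(n+k)!}
  \;=\; 1,\; 2,\; 7,\; 42,\; 429,\; \dots
\]
```

The open conjectures mentioned in the proposal concern refinements of such formulas and their symmetry classes.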
Max ERC Funding
840 120 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym LiKo
Project From Liouville to Kolmogorov: 2d quantum gravity, noise sensitivity and turbulent flows
Researcher (PI) Christophe Garban
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary This research project is organized along three seemingly unrelated directions:
(1) Mathematical Liouville gravity deals with the geometry of large random planar maps. Historically, conformal invariance was a key ingredient in the construction of Liouville gravity in the physics literature. Conformal invariance has recently been brought back into the picture through attempts to understand large random combinatorial planar maps once they are conformally embedded in the plane. The geometry induced by these embeddings is conjecturally described by the exponential of a highly oscillating distribution, the Gaussian Free Field. This conjecture is part of a broader program aimed at rigorously understanding the celebrated KPZ relation. The first major goal of my project is to make significant progress towards the completion of this program. For this I will combine several tools, such as Liouville Brownian motion, circle packings, QLE processes and Bouchaud trap models.
(2) Euclidean statistical physics is closely related to area (1) through the above KPZ relation. I plan to push further the analysis of critical statistical physics models successfully initiated by the works of Schramm and Smirnov. I will focus in particular on dynamics at and near critical points with a special emphasis on the so-called noise sensitivity of these systems.
(3) 3d turbulence. A more tractable ambition than solving the Navier-Stokes equations is to construct explicit stochastic vector fields which combine key features of experimentally observed velocity fields. I will make the mathematical framework precise by identifying four axioms that need to be satisfied. It has been observed recently that the exponential of a certain log-correlated field, as in (1), could be used to create such a realistic velocity field. I plan to construct and analyse this challenging object by relying on techniques from (1) and (2). This would be the first genuine stochastic model of turbulent flow in the spirit of what Kolmogorov was aiming at.
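To give a concrete, if crude, feel for the "exponential of a log-correlated field" appearing in (1) and (3), here is a toy spectral-synthesis sketch on a periodic 2d grid (an illustration only, not the construction the project pursues):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 256  # periodic n x n grid

# A Gaussian field with spectral density ~ 1/|k|^2 is log-correlated
# in two dimensions (a discrete stand-in for the Gaussian Free Field).
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
k2 = kx**2 + ky**2
k2[0, 0] = 1.0            # placeholder to avoid division by zero
amp = 1.0 / np.sqrt(k2)
amp[0, 0] = 0.0           # drop the zero mode

noise = rng.normal(size=(n, n))
field = np.fft.ifft2(np.fft.fft2(noise) * amp).real
field -= field.mean()

# Exponentiating yields a (discretised, unnormalised) multiplicative
# chaos measure with intermittency parameter gamma.
gamma = 0.5
measure = np.exp(gamma * field)
print("total mass of the chaos measure:", measure.sum())
```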
Max ERC Funding
935 000 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym MALIG
Project A mathematical approach to the liquid-glass transition: kinetically constrained models, cellular automata and mixed order phase transitions
Researcher (PI) Cristina Toninelli
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary This proposal focuses on the mathematics of three cross-disciplinary, very active and deeply interlaced research themes: interacting particle systems with kinetic constraints, bootstrap percolation cellular automata and mixed order phase transitions. These topics belong to the fertile area of mathematics at the intersection of probability and mathematical statistical mechanics. They are also extremely important in physics. Indeed they are intimately connected to the fundamental problem of understanding the liquid-glass transition, one of the longstanding open questions in condensed matter physics.
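As a hedged illustration of the second theme, here is a minimal simulation of classical 2-neighbour bootstrap percolation on an n x n grid (the standard toy model, not the project's specific variants): initially active sites are sprinkled with density p, and a site activates, irreversibly, once at least two of its nearest neighbours are active.

```python
import numpy as np

def bootstrap_percolation(n=100, p=0.06, threshold=2, seed=0):
    """Run 2-neighbour bootstrap percolation to its fixed point."""
    rng = np.random.default_rng(seed)
    active = rng.random((n, n)) < p  # initial active sites, density p
    while True:
        # Count active nearest neighbours (zero boundary conditions).
        nbrs = np.zeros((n, n), dtype=int)
        nbrs[1:, :] += active[:-1, :]
        nbrs[:-1, :] += active[1:, :]
        nbrs[:, 1:] += active[:, :-1]
        nbrs[:, :-1] += active[:, 1:]
        new = active | (nbrs >= threshold)
        if np.array_equal(new, active):
            return new  # monotone dynamics reached a fixed point
        active = new

final = bootstrap_percolation()
print("final active fraction:", final.mean())
```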
The funding of this project will allow the PI to lead a highly qualified team with complementary expertise. Such diversity will allow a novel, interdisciplinary and potentially groundbreaking approach. Although research on each of the above topics has lately been quite lively, very few exchanges and little cross-fertilization have occurred among them. One of our main goals is to overcome the barriers among the three different research communities and to explore the interfaces of these as yet unconnected fields. We will open two novel and challenging chapters in the mathematics of interacting particle systems and cellular automata: interacting particle glassy systems, and bootstrap percolation models with mixed-order critical and discontinuous transitions. In order to achieve our groundbreaking goals we will have to go well beyond the present mathematical knowledge. We believe that the novel concepts and the unconventional approaches that we will develop will also have a deep impact in other areas, including combinatorics, the theory of randomized algorithms and complex systems.
The scientific background and expertise of the PI, with original and groundbreaking contributions to each of the above topics and with a broad and clear-cut vision of the mathematics of the proposed research as well as of the fundamental physical questions, make the PI the ideal leader of this project.
Max ERC Funding
883 250 €
Duration
Start date: 2016-09-01, End date: 2021-08-31