Project acronym 3CBIOTECH
Project Cold Carbon Catabolism of Microbial Communities underpinning a Sustainable Bioenergy and Biorefinery Economy
Researcher (PI) Gavin James Collins
Host Institution (HI) NATIONAL UNIVERSITY OF IRELAND GALWAY
Call Details Starting Grant (StG), LS9, ERC-2010-StG_20091118
Summary The applicant will collaborate with Irish, European and U.S.-based colleagues to develop a sustainable biorefinery and bioenergy industry in Ireland and Europe. The focus of this ERC Starting Grant will be the application of classical microbiological, physiological and real-time polymerase chain reaction (PCR)-based assays to qualitatively and quantitatively characterize microbial communities underpinning novel and innovative, low-temperature, anaerobic waste (and other biomass) conversion technologies, including municipal wastewater treatment and demonstration- and full-scale biorefinery applications.
Anaerobic digestion (AD) is a naturally occurring process which is widely applied for the conversion of waste to methane-containing biogas. Low-temperature (<20 °C) AD has been applied by the applicant as a cost-effective alternative to mesophilic (c. 35 °C) AD for the treatment of several waste categories. However, the microbiology of low-temperature AD is poorly understood. The applicant will work with microbial consortia isolated from anaerobic bioreactors, which have been operated for long-term experiments (>3.5 years), and which include organic acid-oxidizing, hydrogen-producing syntrophic microbes and hydrogen-consuming methanogens. A major focus of the project will be the ecophysiology of psychrotolerant and psychrophilic methanogens already identified and cultivated by the applicant. The project will also investigate the role(s) of poorly understood Crenarchaeota populations and homoacetogenic bacteria in complex consortia. The host organization is a leading player in the microbiology of waste-to-energy applications. The applicant will train a team of scientists in all aspects of the microbiology and bioengineering of biomass conversion systems.
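For background on the quantitative real-time PCR component mentioned above, absolute quantification of microbial populations is commonly done by converting quantification-cycle (Cq) values to gene copy numbers via a dilution-series standard curve. The sketch below is purely illustrative; the function names and example numbers are hypothetical and not taken from the project.

import numpy as np

def fit_standard_curve(log10_copies, cq_values):
    # Fit a linear qPCR standard curve: Cq = slope * log10(copies) + intercept.
    slope, intercept = np.polyfit(log10_copies, cq_values, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0  # amplification efficiency (1.0 = 100 %)
    return slope, intercept, efficiency

def copies_from_cq(cq, slope, intercept):
    # Invert the standard curve to estimate target gene copies in an unknown sample.
    return 10 ** ((cq - intercept) / slope)

# Hypothetical dilution series of a methanogen 16S rRNA gene standard.
log10_copies = np.array([3.0, 4.0, 5.0, 6.0, 7.0])
cq_values = np.array([30.1, 26.8, 23.4, 20.0, 16.7])

slope, intercept, eff = fit_standard_curve(log10_copies, cq_values)
print(f"slope = {slope:.2f}, efficiency = {eff:.1%}")
print(f"estimated copies at Cq = 25: {copies_from_cq(25.0, slope, intercept):.2e}")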
Max ERC Funding
1 499 797 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym AAMOT
Project Arithmetic of automorphic motives
Researcher (PI) Michael Harris
Host Institution (HI) INSTITUT DES HAUTES ETUDES SCIENTIFIQUES
Call Details Advanced Grant (AdG), PE1, ERC-2011-ADG_20110209
Summary The primary purpose of this project is to build on recent spectacular progress in the Langlands program to study the arithmetic properties of automorphic motives constructed in the cohomology of Shimura varieties. Because automorphic methods are available to study the L-functions of these motives, which include elliptic curves and certain families of Calabi-Yau varieties over totally real fields (possibly after base change), they represent the most accessible class of varieties for which one can hope to verify fundamental conjectures on special values of L-functions, including Deligne's conjecture and the Main Conjecture of Iwasawa theory. Immediate goals include the proof of irreducibility of automorphic Galois representations; the establishment of period relations for automorphic and potentially automorphic realizations of motives in the cohomology of distinct Shimura varieties; the construction of p-adic L-functions for these and related motives, notably adjoint and tensor product L-functions in p-adic families; and the geometrization of the p-adic and mod p Langlands program. All four goals, as well as the others mentioned in the body of the proposal, are interconnected; the final goal provides a bridge to related work in geometric representation theory, algebraic geometry, and mathematical physics.
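As background for the special values of L-functions mentioned above, a rough form of Deligne's conjecture (stated here only for orientation; the precise formulation involves the Betti and de Rham realizations of the motive) reads:

% Deligne's conjecture on critical values, rough form: for a motive M over Q
% that is critical at s = 0, with Deligne period c^+(M),
\[
  \frac{L(M,0)}{c^{+}(M)} \in \mathbb{Q},
\]
% i.e. the critical L-value is a rational multiple of a period attached to the motive.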
Max ERC Funding
1 491 348 €
Duration
Start date: 2012-06-01, End date: 2018-05-31
Project acronym Active-DNA
Project Computationally Active DNA Nanostructures
Researcher (PI) Damien WOODS
Host Institution (HI) NATIONAL UNIVERSITY OF IRELAND MAYNOOTH
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary During the 20th century, computer technology evolved from bulky, slow, special-purpose mechanical engines to the now ubiquitous silicon chips and software that are one of the pinnacles of human ingenuity. The goal of the field of molecular programming is to take the next leap and build a new generation of matter-based computers using DNA, RNA and proteins. This will be accomplished by computer scientists, physicists and chemists designing molecules to execute "wet" nanoscale programs in test tubes. The workflow includes proposing theoretical models, mathematically proving their computational properties, physical modelling and implementation in the wet-lab.
The past decade has seen remarkable progress in building static 2D and 3D DNA nanostructures. However, unlike biological macromolecules and complexes, which are built via specified self-assembly pathways, execute robot-like movements and undergo evolution, human-engineered nanostructures show severely limited activity. We will need sophisticated algorithmic ideas to build structures that rival active living systems. Active-DNA aims to address this challenge by achieving a number of objectives on computation, DNA-based self-assembly and molecular robotics. Active-DNA research work will range from defining models and proving theorems that characterise the computational and expressive capabilities of such active programmable materials to experimental work implementing active DNA nanostructures in the wet-lab.
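As a hedged illustration of what a theoretical model of algorithmic self-assembly can look like in code, the sketch below simulates a toy version of the abstract Tile Assembly Model (aTAM): tiles carry glues on their four sides and attach to a growing seed assembly once their matching glue strengths reach a temperature threshold. The tile set and helper names are hypothetical and are not taken from the proposal.

from collections import namedtuple

# Toy abstract Tile Assembly Model (aTAM): a tile may attach at a site when the
# strengths of its glues matching already-placed neighbours sum to >= temperature.
Tile = namedtuple("Tile", "name north east south west")  # each side: (glue_label, strength) or None

def matching_strength(tile, pos, assembly):
    x, y = pos
    total = 0
    neighbours = {
        (x, y + 1): ("north", "south"),
        (x + 1, y): ("east", "west"),
        (x, y - 1): ("south", "north"),
        (x - 1, y): ("west", "east"),
    }
    for npos, (my_side, their_side) in neighbours.items():
        if npos in assembly:
            mine = getattr(tile, my_side)
            theirs = getattr(assembly[npos], their_side)
            if mine is not None and mine == theirs:
                total += mine[1]
    return total

def grow(tileset, seed, temperature=2, steps=100):
    assembly = dict(seed)  # {(x, y): Tile}
    for _ in range(steps):
        frontier = {(x + dx, y + dy)
                    for (x, y) in assembly
                    for dx, dy in ((0, 1), (1, 0), (0, -1), (-1, 0))} - set(assembly)
        attached = False
        for pos in sorted(frontier):
            for tile in tileset:
                if matching_strength(tile, pos, assembly) >= temperature:
                    assembly[pos] = tile
                    attached = True
                    break
            if attached:
                break
        if not attached:
            break
    return assembly

# Hypothetical tiles growing a one-dimensional row eastwards from a seed tile.
glue = ("a", 2)  # a single strength-2 glue label
seed_tile = Tile("S", None, glue, None, None)
row_tile = Tile("R", None, glue, None, glue)
result = grow([row_tile], {(0, 0): seed_tile}, temperature=2, steps=5)
print(sorted(result))  # five row tiles attached east of the seed: (0,0) .. (5,0)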
Max ERC Funding
2 349 603 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym ACTIVIA
Project Visual Recognition of Function and Intention
Researcher (PI) Ivan Laptev
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary Computer vision is concerned with the automated interpretation of images and video streams. Today's research is (mostly) aimed at answering queries such as "Is this a picture of a dog?" (classification) or sometimes "Find the dog in this photo" (detection). While categorisation and detection are useful for many tasks, inferring correct class labels is not the final answer to visual recognition. The categories and locations of objects do not provide direct understanding of their function, i.e., how things work, what they can be used for, or how they can act and react. Such an understanding, however, would be highly desirable to answer currently unsolvable queries such as "Am I in danger?" or "What can happen in this scene?". Solving such queries is the aim of this proposal.
My goal is to uncover the functional properties of objects and the purpose of actions by addressing visual recognition from a different and yet unexplored perspective. The main novelty of this proposal is to leverage observations of people, i.e., their actions and interactions, to automatically learn the use, the purpose and the function of objects and scenes from visual data. The project is timely as it builds upon two key recent technological advances: (a) the immense progress in visual recognition of objects, scenes and human actions achieved in the last ten years, and (b) the emergence of a massive amount of public image and video data now available to train visual models.
ACTIVIA addresses fundamental research issues in the automated interpretation of dynamic visual scenes, but its results are expected to serve as a basis for ground-breaking technological advances in practical applications. The recognition of functional properties and intentions as explored in this project will directly support high-impact applications such as detection of abnormal events, which are likely to revolutionise today's approaches to crime protection, hazard prevention, elderly care, and many others.
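To make the classification-versus-detection distinction above concrete, here is a minimal sketch using off-the-shelf torchvision models (assuming a torchvision version that accepts the weights argument shown); it only illustrates the two query types and is in no way the method proposed in ACTIVIA.

import torch
import torchvision
from torchvision import transforms

# "Is this a picture of a dog?"  -> image-level classification
# "Find the dog in this photo."  -> object detection (boxes + labels)
# Hedged sketch; assumes pretrained torchvision models are available for download.
classifier = torchvision.models.resnet50(weights="DEFAULT").eval()
detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def classify(pil_image):
    # Classification query: return the index of the most likely ImageNet class.
    with torch.no_grad():
        logits = classifier(preprocess(pil_image).unsqueeze(0))
    return int(logits.argmax(dim=1))

def detect(pil_image, score_threshold=0.8):
    # Detection query: return bounding boxes and labels of confidently detected objects.
    with torch.no_grad():
        out = detector([transforms.ToTensor()(pil_image)])[0]
    keep = out["scores"] > score_threshold
    return out["boxes"][keep], out["labels"][keep]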
Max ERC Funding
1 497 420 €
Duration
Start date: 2013-01-01, End date: 2018-12-31
Project acronym ADAPT
Project Theory and Algorithms for Adaptive Particle Simulation
Researcher (PI) Stephane Redon
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary During the twentieth century, the development of macroscopic engineering was largely stimulated by progress in digital prototyping: cars, planes, boats, etc. are nowadays designed and tested on computers. Digital prototypes have progressively replaced actual ones, and effective computer-aided engineering tools have helped cut costs and reduce production cycles of these macroscopic systems.
The twenty-first century is most likely to see a similar development at the atomic scale. Indeed, recent years have seen tremendous progress in nanotechnology, in particular in the ability to control matter at the atomic scale. Similar to what has happened with macroscopic engineering, powerful and generic computational tools will be needed to engineer complex nanosystems, through modeling and simulation. As a result, a major challenge is to develop efficient simulation methods and algorithms.
NANO-D, the INRIA research group I started in January 2008 in Grenoble, France, aims at developing efficient computational methods for modeling and simulating complex nanosystems, both natural and artificial. In particular, NANO-D develops SAMSON (Software for Adaptive Modeling and Simulation Of Nanosystems), a software application which gathers all algorithms designed by the group and its collaborators.
In this project, I propose to develop a unified theory, and associated algorithms, for adaptive particle simulation. The proposed theory will avoid problems that plague current popular multi-scale or hybrid simulation approaches by simulating a single potential throughout the system, while allowing users to finely trade precision for computational speed.
I believe the full development of the adaptive particle simulation theory will have an important impact on current modeling and simulation practices, and will enable practical design of complex nanosystems on desktop computers, which should significantly boost the emergence of generic nano-engineering.
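As a schematic, hedged illustration of the precision-for-speed trade-off described above (and emphatically not SAMSON's actual algorithm), the toy simulation below integrates only the most "active" particles at each step and temporarily freezes the rest; all parameters and helper names are arbitrary.

import numpy as np

def lj_forces(pos, eps=1.0, sigma=1.0):
    # Pairwise Lennard-Jones forces for a small particle set (O(N^2), single potential).
    n = len(pos)
    forces = np.zeros_like(pos)
    for i in range(n):
        for j in range(i + 1, n):
            r = pos[i] - pos[j]
            d2 = np.dot(r, r) + 1e-12
            inv6 = (sigma ** 2 / d2) ** 3
            f = 24 * eps * (2 * inv6 ** 2 - inv6) / d2 * r
            forces[i] += f
            forces[j] -= f
    return forces

def adaptive_step(pos, vel, dt=1e-3, active_fraction=0.5):
    # Adaptive update: only the K particles with the largest forces are integrated;
    # the remaining particles are frozen for this step (precision traded for speed).
    forces = lj_forces(pos)
    k = max(1, int(active_fraction * len(pos)))
    active = np.argsort(np.linalg.norm(forces, axis=1))[-k:]
    vel[active] += dt * forces[active]   # unit masses
    pos[active] += dt * vel[active]
    return pos, vel

# 27 particles on a loose cubic lattice (spacing > LJ equilibrium distance).
pos = 1.2 * np.array([[i, j, k] for i in range(3) for j in range(3) for k in range(3)], dtype=float)
vel = np.zeros_like(pos)
for _ in range(200):
    pos, vel = adaptive_step(pos, vel)
print("mean position after 200 adaptive steps:", pos.mean(axis=0))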
Max ERC Funding
1 476 882 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym ADDECCO
Project Adaptive Schemes for Deterministic and Stochastic Flow Problems
Researcher (PI) Remi Abgrall
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary The numerical simulation of complex compressible flow problems is still a challenge nowadays, even for simple models. In our opinion, the most important open issues that currently need to be tackled are how to obtain stable, scalable, very accurate, easy-to-code and easy-to-maintain schemes on complex geometries. The method should easily handle mesh refinement, even near the boundary, where the most interesting engineering quantities have to be evaluated. Unsteady uncertainties in the model, for example in the geometry or the boundary conditions, should be represented efficiently. The goal of this proposal is to design, develop and evaluate solutions to each of the above problems. Our work program will lead to significant breakthroughs for flow simulations. More specifically, we propose to work on 3 connected problems: 1- A class of very high order numerical schemes able to easily deal with the geometry of boundaries while still solving steep problems. The geometry is generally defined by CAD tools; their output is used to generate a mesh which is then used by the scheme. Hence, any mesh refinement process is disconnected from the CAD, a situation that prevents the spread of mesh adaptation techniques in industry! 2- A class of very high order numerical schemes which can use possibly solution-dependent basis functions in order to lower the number of degrees of freedom, for example to accurately compute boundary layers with low resolutions. 3- A general non-intrusive technique for handling uncertainties, in order to deal with irregular probability density functions (PDFs) and also to handle PDFs that may evolve in time, for example through an optimisation loop. The curse of dimensionality will be dealt with thanks to Harten's multiresolution method combined with sparse grid methods. Currently, and to our knowledge, no scheme has all of these properties. This research program will have an impact on numerical schemes and industrial applications.
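To fix ideas on point 3, a non-intrusive treatment of uncertainty evaluates the deterministic solver as a black box at sampled parameter values, without modifying the solver itself. The sketch below shows this in the simplest sampling form, far from the sparse-grid and multiresolution machinery of the proposal; the solver, parameter and quantity of interest are hypothetical.

import numpy as np

def black_box_solver(inflow_mach):
    # Hypothetical stand-in for a deterministic flow solve: returns a scalar
    # quantity of interest (e.g. a drag-like functional) for a given inflow Mach number.
    return 0.3 * inflow_mach ** 2 + 0.05 * np.sin(5 * inflow_mach)

def non_intrusive_uq(sampler, n_samples=2000, seed=0):
    # Non-intrusive propagation: call the solver at sampled parameter values only.
    rng = np.random.default_rng(seed)
    params = sampler(rng, n_samples)
    qoi = np.array([black_box_solver(p) for p in params])
    return qoi.mean(), qoi.std(), np.quantile(qoi, [0.05, 0.95])

def bimodal_mach(rng, n):
    # Irregular (bimodal) input PDF, the kind of case where plain polynomial
    # chaos struggles and sampling/multiresolution approaches remain applicable.
    comp = rng.random(n) < 0.5
    return np.where(comp, rng.normal(0.3, 0.02, n), rng.normal(0.7, 0.05, n))

mean, std, (q05, q95) = non_intrusive_uq(bimodal_mach)
print(f"QoI mean = {mean:.4f}, std = {std:.4f}, 5-95% range = [{q05:.4f}, {q95:.4f}]")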
Max ERC Funding
1 432 769 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym ADORA
Project Asymptotic approach to spatial and dynamical organizations
Researcher (PI) Benoit PERTHAME
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE1, ERC-2016-ADG
Summary The understanding of the spatial, social and dynamical organization of large numbers of agents is presently a fundamental issue in modern science. ADORA focuses on problems motivated by biology because, more than anywhere else, access to abundant and precise data has opened the route to novel and complex biomathematical models. The problems we address are written in terms of nonlinear partial differential equations. The flux-limited Keller-Segel system, the integrate-and-fire Fokker-Planck equation, kinetic equations with internal state, nonlocal parabolic equations and constrained Hamilton-Jacobi equations are among the equations under investigation.
The role of mathematics is not only to understand the analytical structure of these new problems, but it is also to explain the qualitative behavior of solutions and to quantify their properties. The challenge arises here because these goals should be achieved through a hierarchy of scales. Indeed, the problems under consideration share the common feature that the large scale behavior cannot be understood precisely without access to a hierarchy of finer scales, down to the individual behavior and sometimes its molecular determinants.
Major difficulties arise because the numerous scales present in these equations have to be discovered and singularities appear in the asymptotic process which yields deep compactness obstructions. Our vision is that the complexity inherent to models of biology can be enlightened by mathematical analysis and a classification of the possible asymptotic regimes.
However, an enormous effort is needed to uncover the equations' intimate mathematical structures and to bring them to the level of conceptual understanding they deserve, given the applications motivating these questions, which range from medical science and neuroscience to cell biology.
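For reference, the classical parabolic-elliptic Keller-Segel model of chemotaxis, of which the flux-limited system named above is a refinement, reads (n the cell density, c the chemoattractant concentration, chi the chemotactic sensitivity):

% Classical parabolic-elliptic Keller-Segel system in R^d; the flux-limited
% variant replaces the chemotactic flux n*chi*grad(c) by a bounded nonlinear flux.
\[
  \partial_t n \;=\; \Delta n \;-\; \nabla \cdot \big( n\,\chi\,\nabla c \big),
  \qquad
  -\Delta c \;=\; n .
\]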
Max ERC Funding
2 192 500 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym AlgTateGro
Project Constructing line bundles on algebraic varieties -- around conjectures of Tate and Grothendieck
Researcher (PI) François CHARLES
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary The goal of this project is to investigate two conjectures in arithmetic geometry pertaining to the geometry of projective varieties over finite and number fields. These two conjectures, formulated by Tate and Grothendieck in the 1960s, predict which cohomology classes are Chern classes of line bundles. They both form an arithmetic counterpart of a theorem of Lefschetz, proved in the 1940s, which itself is the only known case of the Hodge conjecture. These two long-standing conjectures are one aspect of a more general web of questions regarding the topology of algebraic varieties, which was emphasized by Grothendieck and has since had a central role in modern arithmetic geometry. Special cases of these conjectures, appearing for instance in the work of Tate, Deligne, Faltings, Schneider-Lang and Masser-Wüstholz, have all had important consequences.
My goal is to investigate different lines of attack towards these conjectures, building on recent work of myself and Jean-Benoît Bost on related problems. The two main directions of the proposal are as follows. Over finite fields, the Tate conjecture is related to finiteness results for certain cohomological objects. I want to understand how to relate these to hidden boundedness properties of algebraic varieties that have appeared in my recent geometric proof of the Tate conjecture for K3 surfaces. The existence and relevance of a theory of Donaldson invariants for moduli spaces of twisted sheaves over finite fields seems to be a promising and novel direction. Over number fields, I want to combine the geometric insight above with algebraization techniques developed by Bost. In a joint project, we want to investigate how these can be used first to understand geometrically major results in transcendence theory and then to attack the Grothendieck period conjecture for divisors via a number-theoretic and complex-analytic understanding of universal vector extensions of abelian schemes over curves.
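For background, a standard formulation of the Tate conjecture for divisors, one of the two conjectures referred to above, is the following (X smooth projective over a finitely generated field k, with absolute Galois group G_k and a prime l invertible in k; cohomology is l-adic etale cohomology):

% Tate conjecture for divisors: the l-adic cycle class map from line bundles is
% conjectured to surject onto the Galois-invariant classes in degree-2 cohomology.
\[
  c_1 \otimes \mathbb{Q}_\ell \colon \;
  \operatorname{Pic}(X) \otimes_{\mathbb{Z}} \mathbb{Q}_\ell
  \;\longrightarrow\;
  H^2\big(X_{\bar{k}}, \mathbb{Q}_\ell(1)\big)^{G_k}
  \quad\text{is surjective.}
\]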
Max ERC Funding
1 222 329 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym ALKAGE
Project Algebraic and Kähler geometry
Researcher (PI) Jean-Pierre, Raymond, Philippe Demailly
Host Institution (HI) UNIVERSITE GRENOBLE ALPES
Call Details Advanced Grant (AdG), PE1, ERC-2014-ADG
Summary The purpose of this project is to study basic questions in algebraic and Kähler geometry. It is well known that the structure of projective or Kähler manifolds is governed by positivity or negativity properties of the curvature tensor. However, many fundamental problems are still wide open. Since the mid-1980s, I have developed a large number of key concepts and results that have led to important progress in transcendental algebraic geometry. Let me mention the discovery of holomorphic Morse inequalities, systematic applications of L² estimates with singular hermitian metrics, and a much improved understanding of Monge-Ampère equations and of singularities of plurisubharmonic functions. My first goal will be to investigate the Green-Griffiths-Lang conjecture asserting that an entire curve drawn in a variety of general type is algebraically degenerate. The subject is intimately related to important questions concerning Diophantine equations, especially higher dimensional generalizations of Faltings' theorem - the so-called Vojta program. One can rely here on a breakthrough I made in 2010, showing that all such entire curves must satisfy algebraic differential equations. A second closely related area of research of this project is the analysis of the structure of projective or compact Kähler manifolds. It can be seen as a generalization of the classification theory of surfaces by Kodaira, and of the more recent results for dimension 3 (Kawamata, Kollár, Mori, Shokurov, ...) to other dimensions. My plan is to combine powerful recent results obtained on the duality of positive cohomology cones with an analysis of the instability of the tangent bundle, i.e. of the Harder-Narasimhan filtration. On these ground-breaking questions, I intend to go much further and to enhance my national and international collaborations. These subjects already attract many young researchers and postdocs throughout the world, and the grant could be used to create even stronger interactions.
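Stated formally (as background only), the Green-Griffiths-Lang conjecture referred to above asserts the following degeneracy property:

% Green-Griffiths-Lang conjecture (degeneracy form): entire curves in a
% projective variety of general type cannot be Zariski dense.
\[
  X \text{ projective of general type},\;
  f\colon \mathbb{C} \to X \text{ nonconstant holomorphic}
  \;\Longrightarrow\;
  f(\mathbb{C}) \subseteq Y \subsetneq X
  \text{ for some proper algebraic subvariety } Y .
\]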
Max ERC Funding
1 809 345 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym ALLEGRO
Project Active large-scale learning for visual recognition
Researcher (PI) Cordelia Schmid
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE6, ERC-2012-ADG_20120216
Summary A massive and ever-growing amount of digital image and video content is available today, on sites such as Flickr and YouTube, in audiovisual archives such as those of BBC and INA, and in personal collections. In most cases, it comes with additional information, such as text, audio or other metadata, that forms a rather sparse and noisy, yet rich and diverse source of annotation, ideally suited to emerging weakly supervised and active machine learning technology. The ALLEGRO project will take visual recognition to the next level by using this largely untapped source of data to automatically learn visual models. The main research objective of our project is the development of new algorithms and computer software capable of autonomously exploring evolving data collections, selecting the relevant information, and determining the visual models most appropriate for different object, scene, and activity categories. An emphasis will be put on learning visual models from video, a particularly rich source of information, and on the representation of human activities, one of today's most challenging problems in computer vision. Although this project addresses fundamental research issues, it is expected to result in significant advances in high-impact applications that range from visual mining of the Web and automated annotation and organization of family photo and video albums to large-scale information retrieval in television archives.
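As a toy illustration of the weak supervision described above (video-level tags but no frame-level labels), the sketch below scores frames with a shared linear model and aggregates them by max-pooling into a video-level prediction, a standard multiple-instance formulation; the data, dimensions and training loop are synthetic and hypothetical.

import numpy as np

# Toy weakly supervised (multiple-instance) learner: each video has a tag (0/1)
# but no frame-level labels. The video score is the max over its frame scores,
# and only the max-scoring frame receives the gradient.
rng = np.random.default_rng(0)
n_videos, frames_per_video, dim = 200, 12, 16

# Synthetic data: positive videos contain at least one "relevant" frame.
direction = rng.normal(size=dim)
videos = rng.normal(size=(n_videos, frames_per_video, dim))
tags = rng.integers(0, 2, n_videos)
for v in range(n_videos):
    if tags[v]:
        videos[v, rng.integers(frames_per_video)] += 3.0 * direction

w, b, lr = np.zeros(dim), 0.0, 0.1
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for epoch in range(30):
    for v in range(n_videos):
        frame_scores = videos[v] @ w + b
        top = int(np.argmax(frame_scores))   # max-pooling over frames
        p = sigmoid(frame_scores[top])       # video-level prediction
        grad = p - tags[v]                   # d(logistic loss)/d(score)
        w -= lr * grad * videos[v, top]
        b -= lr * grad

preds = [sigmoid(np.max(videos[v] @ w + b)) > 0.5 for v in range(n_videos)]
print("training accuracy with video-level tags only:", np.mean(np.array(preds) == tags))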
Max ERC Funding
2 493 322 €
Duration
Start date: 2013-04-01, End date: 2019-03-31
Project acronym AlmaCrypt
Project Algorithmic and Mathematical Cryptology
Researcher (PI) Antoine Joux
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE6, ERC-2014-ADG
Summary Cryptology is a foundation of information security in the digital world. Today's internet is protected by a form of cryptography based on complexity theoretic hardness assumptions. Ideally, they should be strong to ensure security and versatile to offer a wide range of functionalities and allow efficient implementations. However, these assumptions are largely untested and internet security could be built on sand.
The main ambition of Almacrypt is to remedy this issue by challenging the assumptions through an advanced algorithmic analysis.
In particular, this proposal questions the two pillars of public-key encryption: factoring and discrete logarithms. Recently, the PI contributed to show that in some cases, the discrete logarithm problem is considerably weaker than previously assumed. A main objective is to ponder the security of other cases of the discrete logarithm problem, including elliptic curves, and of factoring. We will study the generalization of the recent techniques and search for new algorithmic options with comparable or better efficiency.
We will also study hardness assumptions based on codes and subset-sum, two candidates for post-quantum cryptography. We will consider the applicability of recent algorithmic and mathematical techniques to the resolution of the corresponding putative hard problems, refine the analysis of the algorithms and design new algorithm tools.
Cryptology is not limited to the above assumptions: other hard problems have been proposed to aim at post-quantum security and/or to offer extra functionalities. Should the security of these other assumptions become critical, they would be added to Almacrypt's scope. They could also serve to demonstrate other applications of our algorithmic progress.
In addition to its scientific goal, Almacrypt also aims at seeding a strengthened research community dedicated to algorithmic and mathematical cryptology.
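To make the discrete-logarithm assumption concrete, the textbook baby-step giant-step algorithm below recovers x from h = g^x mod p in roughly sqrt(p) group operations; the exponential growth of this cost in the bit-size of p is precisely what deployed systems rely on. This is a classical generic algorithm, not one of the new techniques alluded to above; Python 3.8+ is assumed for the modular inverse via pow.

import math

def baby_step_giant_step(g, h, p):
    # Solve g^x = h (mod p) for x in O(sqrt(p)) time and memory (textbook method).
    m = math.isqrt(p) + 1
    baby = {pow(g, j, p): j for j in range(m)}   # baby steps g^j
    factor = pow(g, -m, p)                        # g^(-m) mod p (Python 3.8+)
    gamma = h % p
    for i in range(m):                            # giant steps h * g^(-i*m)
        if gamma in baby:
            return i * m + baby[gamma]
        gamma = (gamma * factor) % p
    return None                                   # h not in the subgroup generated by g

p, g = 101, 2          # tiny toy parameters; deployed systems use moduli of thousands of bits
x = 57
print(baby_step_giant_step(g, pow(g, x, p), p))   # -> 57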
Max ERC Funding
2 403 125 €
Duration
Start date: 2016-01-01, End date: 2021-12-31
Project acronym AltCheM
Project In vivo functional screens to decipher mechanisms of stochastically- and mutationally-induced chemoresistance in Acute Myeloid Leukemia
Researcher (PI) Alexandre PUISSANT
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Starting Grant (StG), LS4, ERC-2017-STG
Summary Acute Myeloid Leukemia (AML), the most common leukemia diagnosed in adults, represents the paradigm of resistance to front-line therapies in hematology. Indeed, AML is so genetically complex that only a few targeted therapies are currently being tested in this disease, and chemotherapy has remained the standard treatment for AML over the past four decades. Despite an initial sustained remission achieved by chemotherapeutic agents, almost all patients relapse with a chemoresistant minimal residual disease (MRD). The goal of my proposal is to characterize the still poorly understood biological mechanisms underlying the persistence and emergence of MRD.
MRD is the consequence of the re-expansion of leukemia-initiating cells that are intrinsically more resistant to chemotherapy. This cell fraction may be stochastically more prone to survive front-line therapy regardless of their mutational status (the stochastic model), or genetically predetermined to resist by virtue of a collection of chemoprotective mutations (the mutational model).
I have already generated in mice, by consecutive rounds of chemotherapy, a stochastic MLL-AF9-driven chemoresistance model that I examined by RNA-sequencing. I will pursue the comprehensive cell autonomous and cell non-autonomous characterization of this chemoresistant AML disease using whole-exome and ChIP-sequencing.
To establish a mutationally-induced chemoresistant mouse model, I will conduct an innovative in vivo screen using pooled mutant open reading frame and shRNA libraries in order to predict which combinations of mutations, among those already known in AML, actively promote chemoresistance.
Finally, by combining genomic profiling and in vivo shRNA screening experiments, I will decipher the molecular mechanisms and identify the functional effectors of these two modes of resistance. Ultimately, I will then be able to firmly establish the fundamental relevance of the stochastic and/or the mutational model of chemoresistance for MRD genesis.
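The readout of pooled shRNA screens of the kind proposed above is typically a comparison of construct abundances before and after treatment. The snippet below, with hypothetical barcode counts and construct names, illustrates the basic normalized log2 fold-change computation; it is not the project's actual analysis pipeline.

import numpy as np

# Hypothetical barcode counts for a pooled shRNA screen: reads per construct
# before chemotherapy (t0) and in the resistant/relapsed population (t1).
shrnas = ["shCtrl_1", "shCtrl_2", "shGeneA_1", "shGeneA_2", "shGeneB_1"]
counts_t0 = np.array([1000, 950, 1100, 980, 1020], dtype=float)
counts_t1 = np.array([990, 1010, 4200, 3900, 150], dtype=float)

def log2_fold_change(t0, t1, pseudocount=0.5):
    # Normalize each sample to its sequencing depth, then compare abundances.
    # Positive values: enrichment under treatment (candidate chemoprotective hit);
    # negative values: depletion (candidate sensitizer).
    f0 = (t0 + pseudocount) / (t0 + pseudocount).sum()
    f1 = (t1 + pseudocount) / (t1 + pseudocount).sum()
    return np.log2(f1 / f0)

for name, lfc in zip(shrnas, log2_fold_change(counts_t0, counts_t1)):
    print(f"{name:>10s}  log2FC = {lfc:+.2f}")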
Max ERC Funding
1 500 000 €
Duration
Start date: 2018-03-01, End date: 2023-02-28
Project acronym ANADEL
Project Analysis of Geometrical Effects on Dispersive Equations
Researcher (PI) Danela Oana IVANOVICI
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2017-STG
Summary We are concerned with localization properties of solutions to hyperbolic PDEs, especially problems with a geometric component: how do boundaries and heterogeneous media influence spreading and concentration of solutions. While our first focus is on wave and Schrödinger equations on manifolds with boundary, strong connections exist with phase space localization for (clusters of) eigenfunctions, which are of independent interest. Motivations come from nonlinear dispersive models (in physically relevant settings), properties of eigenfunctions in quantum chaos (related to both physics of optic fiber design as well as number theoretic questions), or harmonic analysis on manifolds.
Wave propagation in real-life physics occurs in media which are neither homogeneous nor spatially infinite. The birth of radar/sonar technologies (and the rise of computed tomography) greatly motivated numerous developments in microlocal analysis and the linear theory. Only recently have toy nonlinear models been studied on a curved background, sometimes compact or rough. Understanding how to extend such tools, dealing with wave dispersion or focusing, will allow us to significantly progress in our mathematical understanding of physically relevant models. There, boundaries appear naturally, and most earlier developments related to propagation of singularities in this context have limited scope with respect to crucial dispersive effects. Despite great progress over the last decade, driven by the study of quasilinear equations, our knowledge is still very limited. Going beyond this recent activity requires new tools whose development is at the heart of this proposal, including good approximate solutions (parametrices) going over arbitrarily large numbers of caustics, sharp pointwise bounds on Green functions, development of efficient wave packet methods, and quantitative refinements of propagation of singularities (with direct applications in control theory), to name only a few important ones.
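As background for the dispersive effects discussed above, the model bound on flat space, whose analogues in the presence of boundaries and caustics are at the heart of this proposal, is the classical dispersive estimate for the free Schrödinger flow:

% Free Schrodinger dispersive estimate on R^d: the solution spreads out and its
% sup-norm decays in time; boundaries and caustics can degrade such bounds.
\[
  \big\| e^{it\Delta} u_0 \big\|_{L^\infty(\mathbb{R}^d)}
  \;\lesssim\; |t|^{-d/2} \, \| u_0 \|_{L^1(\mathbb{R}^d)},
  \qquad t \neq 0 .
\]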
Max ERC Funding
1 293 763 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym analysisdirac
Project The analysis of the Dirac operator: the hypoelliptic Laplacian and its applications
Researcher (PI) Jean-Michel Philippe Marie-José Bismut
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Advanced Grant (AdG), PE1, ERC-2011-ADG_20110209
Summary This proposal is devoted to the applications of a new hypoelliptic Dirac operator,
whose analytic properties have been studied by Lebeau and myself. Its construction connects classical Hodge theory with the geodesic flow, and more generally any geometrically defined Hodge Laplacian with a dynamical system on the cotangent bundle. The proper description of this object can be given in analytic, index theoretic and probabilistic terms, which explains both its potential many applications, and also its complexity.
Max ERC Funding
1 112 400 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ANT
Project Automata in Number Theory
Researcher (PI) Boris Adamczewski
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE1, ERC-2014-CoG
Summary Finite automata are fundamental objects in Computer Science, of great importance on one hand for theoretical aspects (formal language theory, decidability, complexity) and on the other for practical applications (parsing). In number theory, finite automata are mainly used as simple devices for generating sequences of symbols over a finite set (e.g., digital representations of real numbers), and for recognizing some sets of integers or more generally of finitely generated abelian groups or monoids. One of the main features of these automatic structures comes from the fact that they are highly ordered without necessarily being trivial (i.e., periodic). With their rich fractal nature, they lie somewhere between order and chaos, even if, in most respects, their rigidity prevails. Over the last few years, several ground-breaking results have led to a great renewed interest in the study of automatic structures in arithmetics.
A primary objective of the ANT project is to exploit this opportunity by developing new directions and interactions between automata and number theory. In this proposal, we outline three lines of research concerning fundamental number theoretical problems that have baffled mathematicians for decades. They include the study of integer base expansions of classical constants, of arithmetical linear differential equations and their link with enumerative combinatorics, and of arithmetics in positive characteristic. At first glance, these topics may seem unrelated, but, surprisingly enough, the theory of finite automata will serve as a natural guideline. We stress that this new point of view on classical questions is a key part of our methodology: we aim at creating a powerful synergy between the different approaches we propose to develop, placing automata theory and related methods at the heart of the subject. This project provides a unique opportunity to create the first international team focusing on these different problems as a whole.
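To illustrate what "finite automata generating sequences of symbols" means in practice, the sketch below evaluates the Thue-Morse sequence, a standard example of a 2-automatic sequence, by feeding the binary digits of n to a two-state automaton; the code is illustrative only and not part of the project.

def automatic_sequence(n, transitions, output, start=0):
    # Evaluate a k-automatic sequence at index n: feed the base-k digits of n
    # (most significant first) to a deterministic finite automaton and read off
    # the output attached to the final state.
    k = len(transitions[start])
    digits = []
    while n:
        digits.append(n % k)
        n //= k
    state = start
    for d in reversed(digits):           # most significant digit first
        state = transitions[state][d]
    return output[state]

# Thue-Morse sequence t(n) = (sum of binary digits of n) mod 2, generated by a
# 2-state automaton over the digit alphabet {0, 1}.
thue_morse_transitions = [[0, 1], [1, 0]]   # state x digit -> state
thue_morse_output = [0, 1]

print([automatic_sequence(n, thue_morse_transitions, thue_morse_output) for n in range(16)])
# -> [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]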
Max ERC Funding
1 438 745 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym ANTICS
Project Algorithmic Number Theory in Computer Science
Researcher (PI) Andreas Enge
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "During the past twenty years, we have witnessed profound technological changes, summarised under the terms of digital revolution or entering the information age. It is evident that these technological changes will have a deep societal impact, and questions of privacy and security are primordial to ensure the survival of a free and open society.
Cryptology is a main building block of any security solution, and at the heart of projects such as electronic identity and health cards, access control, digital content distribution or electronic voting, to mention only a few important applications. During the past decades, public-key cryptology has established itself as a research topic in computer science; tools of theoretical computer science are employed to “prove” the security of cryptographic primitives such as encryption or digital signatures and of more complex protocols. It is often forgotten, however, that all practically relevant public-key cryptosystems are rooted in pure mathematics, in particular, number theory and arithmetic geometry. In fact, the socalled security “proofs” are all conditional to the algorithmic untractability of certain number theoretic problems, such as factorisation of large integers or discrete logarithms in algebraic curves. Unfortunately, there is a large cultural gap between computer scientists using a black-box security reduction to a supposedly hard problem in algorithmic number theory and number theorists, who are often interested in solving small and easy instances of the same problem. The theoretical grounds on which current algorithmic number theory operates are actually rather shaky, and cryptologists are generally unaware of this fact.
The central goal of ANTICS is to rebuild algorithmic number theory on the firm grounds of theoretical computer science."
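To illustrate the gap described above between "small and easy instances" and cryptographic hardness (an editorial example, not part of the abstract; the prime, generator and exponent are arbitrary toy values), the following Python sketch solves a discrete logarithm by exhaustive search, a computation that is trivial here but infeasible at cryptographic parameter sizes.

```python
# Toy discrete logarithm by exhaustive search (illustrative only).

def discrete_log(g: int, h: int, p: int):
    """Return the smallest x with g**x == h (mod p), or None if there is none."""
    value = 1
    for x in range(p):
        if value == h:
            return x
        value = (value * g) % p
    return None

if __name__ == "__main__":
    p = 101            # small prime; 2 generates the multiplicative group mod 101
    h = pow(2, 57, p)  # instance: h = 2^57 mod p
    print(discrete_log(2, h, p))   # 57 -- trivial here, infeasible for ~2048-bit p
```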
Max ERC Funding
1 453 507 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym aSCEND
Project Secure Computation on Encrypted Data
Researcher (PI) Hoe Teck Wee
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Recent trends in computing have prompted users and organizations to store an increasingly large amount of sensitive data at third-party locations in the cloud, outside of their direct control. Storing data remotely poses an acute security threat as these data are outside our control and could potentially be accessed by untrusted parties. Indeed, the reality of these threats has been borne out by the Snowden leaks and hundreds of data breaches each year. In order to protect our data, we will need to encrypt it.
Functional encryption is a novel paradigm for public-key encryption that enables both fine-grained access control and selective computation on encrypted data, as is necessary to protect big, complex data in the cloud. Functional encryption also enables searches on encrypted travel records and surveillance video as well as medical studies on encrypted medical records in a privacy-preserving manner; we can give out restricted secret keys that reveal only the outcome of specific searches and tests. These mechanisms allow us to maintain public safety without compromising on civil liberties, and to facilitate medical breakthroughs without compromising on individual privacy.
The goals of the aSCEND project are (i) to design pairing and lattice-based functional encryption that are more efficient and ultimately viable in practice; and (ii) to obtain a richer understanding of expressive functional encryption schemes and to push the boundaries from encrypting data to encrypting software. My long-term vision is the ubiquitous use of functional encryption to secure our data and our computation, just as public-key encryption is widely used today to secure our communication. Realizing this vision requires new advances in the foundations of functional encryption, which is the target of this project.
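The four-algorithm shape of functional encryption described above can be sketched as an interface (an editorial illustration, not part of the abstract; the instantiation below is deliberately trivial and offers no security, whereas the schemes targeted by aSCEND are built from pairings or lattices).

```python
# Schematic interface sketch of functional encryption (illustrative, insecure toy).

from dataclasses import dataclass
from typing import Callable

@dataclass
class FunctionalKey:
    f: Callable[[bytes], bytes]        # the only function this key may evaluate

def setup():
    """Generate a (public key, master secret key) pair."""
    return "pk", "msk"

def keygen(msk, f: Callable[[bytes], bytes]) -> FunctionalKey:
    """Derive a restricted secret key that reveals only f(x) about a ciphertext of x."""
    return FunctionalKey(f)

def encrypt(pk, x: bytes) -> bytes:
    """Encrypt x under pk (placeholder: a real scheme hides x)."""
    return x

def decrypt(sk_f: FunctionalKey, ct: bytes) -> bytes:
    """Functional decryption: the holder of sk_f learns f(x) and nothing else."""
    return sk_f.f(ct)

if __name__ == "__main__":
    pk, msk = setup()
    ct = encrypt(pk, b"medical record: diagnosis=flu")
    sk_test = keygen(msk, lambda x: b"match" if b"flu" in x else b"no match")
    print(decrypt(sk_test, ct))        # b'match' -- only the test outcome is revealed
```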
Max ERC Funding
1 253 893 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym BetaRegeneration
Project Induction of Insulin-producing beta-cells Regeneration in vivo
Researcher (PI) Patrick Collombat
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Starting Grant (StG), LS4, ERC-2011-StG_20101109
Summary Diabetes has become one of the most widespread metabolic disorders with epidemic dimensions affecting almost 6% of the world’s population. Despite modern treatments, the life expectancy of patients with Type 1 diabetes remains reduced as compared to healthy subjects. There is therefore a need for alternative therapies. Towards this aim, using the mouse, we recently demonstrated that the in vivo forced expression of a single factor in pancreatic alpha-cells is sufficient to induce a continuous regeneration of alpha-cells and their subsequent conversion into beta-like cells, such converted cells being capable of reversing the consequences of chemically-induced diabetes in vivo (Collombat et al. Cell, 2009).
The PI and his team therefore propose to further decipher the mechanisms involved in this alpha-cell-mediated beta-cell regeneration process and determine whether this approach may be applied to adult animals and whether it would efficiently reverse Type 1 diabetes. Furthermore, a major effort will be made to verify whether our findings could be translated to humans. Specifically, we will use a tripartite approach to address the following issues: (1) Can the in vivo alpha-cell-mediated beta-cell regeneration be induced in adult mice? What would be the genetic determinants involved? (2) Can alpha-cell-mediated beta-cell regeneration reverse diabetes in the NOD Type 1 diabetes mouse model? (3) Can adult human alpha-cells be converted into beta-like cells?
Together, these ambitious objectives will most certainly allow us to gain new insight into the mechanisms defining the identity and the reprogramming capabilities of mouse and human endocrine cells and may thereby open new avenues for the treatment of diabetes. Similarly, the determination of the molecular triggers implicated in the beta-cell regeneration observed in our diabetic mice may lead to exciting new findings, including the identification of “drugable” targets of importance for human diabetic patients.
Max ERC Funding
1 500 000 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym BigFastData
Project Charting a New Horizon of Big and Fast Data Analysis through Integrated Algorithm Design
Researcher (PI) Yanlei DIAO
Host Institution (HI) ECOLE POLYTECHNIQUE
Call Details Consolidator Grant (CoG), PE6, ERC-2016-COG
Summary This proposal addresses a pressing need from emerging big data applications such as genomics and data center monitoring: besides the scale of processing, big data systems must also enable perpetual, low-latency processing for a broad set of analytical tasks, referred to as big and fast data analysis. Today’s technology falls severely short of such needs due to the lack of support for complex analytics with scale, low latency, and strong guarantees of user performance requirements. To bridge the gap, this proposal tackles a grand challenge: “How do we design an algorithmic foundation that enables the development of all necessary pillars of big and fast data analysis?” This proposal considers three pillars:
1) Parallelism: There is a fundamental tension between data parallelism (for scale) and pipeline parallelism (for low latency). We propose new approaches based on intelligent use of memory and workload properties to integrate both forms of parallelism.
2) Analytics: The literature lacks a large body of algorithms for critical order-related analytics to be run under data and pipeline parallelism. We propose new algorithmic frameworks to enable such analytics.
3) Optimization: To run analytics, today's big data systems are best effort only. We transform such systems into a principled optimization framework that suits the new characteristics of big data infrastructure and adapts to meet user performance requirements.
The scale and complexity of the proposed algorithm design make this project high-risk but, at the same time, high-gain: it will lay a solid foundation for big and fast data analysis, enabling a new integrated parallel processing paradigm, algorithms for critical order-related analytics, and a principled optimizer with strong performance guarantees. It will also broadly enable accelerated information discovery in emerging domains such as genomics, as well as economic benefits of early, well-informed decisions and reduced user payments.
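As a toy illustration of the tension between data parallelism and pipeline parallelism discussed above (an editorial sketch with assumed record format and window size, not the proposed system), the snippet below parses independent records in a process pool while a downstream stage maintains a sliding-window order statistic.

```python
# Purely illustrative sketch: data parallelism (process pool over independent
# records) feeding a pipeline stage that maintains an order-related analytic,
# here a sliding-window 95th percentile of latencies.

from concurrent.futures import ProcessPoolExecutor
from collections import deque
import random

def parse(record: str) -> float:
    """Per-record work; records are independent, so this stage parallelises."""
    return float(record.split(",")[1])

def p95(window) -> float:
    """Order statistic over the current window (an 'order-related' analytic)."""
    ordered = sorted(window)
    return ordered[int(0.95 * (len(ordered) - 1))]

if __name__ == "__main__":
    records = [f"req{i},{random.expovariate(1 / 20):.2f}" for i in range(10_000)]
    window = deque(maxlen=1_000)      # the pipeline stage keeps bounded state
    running_p95 = float("nan")
    with ProcessPoolExecutor() as pool:
        # chunksize batches records to workers (data parallelism); results
        # stream back in order and feed the analytic stage (pipeline parallelism).
        for latency in pool.map(parse, records, chunksize=256):
            window.append(latency)
            if len(window) == window.maxlen:
                running_p95 = p95(window)
    print(f"windowed p95 latency: {running_p95:.2f} ms")
```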
Max ERC Funding
2 472 752 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BITCRUMBS
Project Towards a Reliable and Automated Analysis of Compromised Systems
Researcher (PI) Davide BALZAROTTI
Host Institution (HI) EURECOM
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary "The vast majority of research in computer security is dedicated to the design of detection, protection, and prevention solutions. While these techniques play a critical role to increase the security and privacy of our digital infrastructure, it is enough to look at the news to understand that it is not a matter of ""if"" a computer system will be compromised, but only a matter of ""when"". It is a well known fact that there is no 100% secure system, and that there is no practical way to prevent attackers with enough resources from breaking into sensitive targets. Therefore, it is extremely important to develop automated techniques to timely and precisely analyze computer security incidents and compromised systems. Unfortunately, the area of incident response received very little research attention, and it is still largely considered an art more than a science because of its lack of a proper theoretical and scientific background.
The objective of BITCRUMBS is to rethink the Incident Response (IR) field from its foundations by proposing a more scientific and comprehensive approach to the analysis of compromised systems. BITCRUMBS will achieve this goal in three steps: (1) by introducing a new systematic approach to precisely measure the effectiveness and accuracy of IR techniques and their resilience to evasion and forgery; (2) by designing and implementing new automated techniques to cope with advanced threats and the analysis of IoT devices; and (3) by proposing a novel forensics-by-design development methodology and a set of guidelines for the design of future systems and software.
To provide the right context for these new techniques and show the impact of the project in different fields and scenarios, BITCRUMBS plans to address its objectives using real case studies borrowed from two different domains: traditional computer software, and embedded systems.
Max ERC Funding
1 991 504 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym BLOC
Project Mathematical study of Boundary Layers in Oceanic Motions
Researcher (PI) Anne-Laure Perrine Dalibard
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Starting Grant (StG), PE1, ERC-2014-STG
Summary Boundary layer theory is a large component of fluid dynamics. It is ubiquitous in Oceanography, where boundary layer currents, such as the Gulf Stream, play an important role in the global circulation. Comprehending the underlying mechanisms in the formation of boundary layers is therefore crucial for applications. However, the treatment of boundary layers in ocean dynamics remains poorly understood at a theoretical level, due to the variety and complexity of the forces at stake.
The goal of this project is to develop several tools to bridge the gap between the mathematical state of the art and the physical reality of oceanic motion. There are four points on which we will mainly focus: degeneracy issues, including the treatment of Stewartson boundary layers near the equator; rough boundaries (meaning boundaries with small-amplitude, high-frequency variations); the inclusion of the advection term in the construction of stationary boundary layers; and the linear and nonlinear stability of the boundary layers. We will address Ekman layers and western boundary layers separately, since they are ruled by equations whose mathematical behaviour is very different.
This project will allow us to have a better understanding of small scale phenomena in fluid mechanics, and in particular of the inviscid limit of incompressible fluids.
The team will be composed of the PI, two PhD students and three two-year postdocs over the whole period. We will also rely on the historical expertise of the host institution on fluid mechanics and asymptotic methods.
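For concreteness, the Ekman layers mentioned above can be illustrated by the classical textbook problem (standard formulas added editorially, not part of the original abstract): a steady boundary layer over a flat wall in a rotating fluid, with geostrophic velocity (u_g, 0) far from the wall, Coriolis parameter f > 0 and viscosity nu.

```latex
% Classical Ekman layer (textbook model, added only as an illustration).
\[
  -f\,v = \nu\,\partial_z^2 u, \qquad
  f\,(u - u_g) = \nu\,\partial_z^2 v, \qquad
  u = v = 0 \ \text{at}\ z = 0, \quad (u,v) \to (u_g, 0) \ \text{as}\ z \to \infty,
\]
\[
  u(z) = u_g\left(1 - e^{-z/\delta}\cos(z/\delta)\right), \qquad
  v(z) = u_g\, e^{-z/\delta}\sin(z/\delta), \qquad
  \delta = \sqrt{2\nu/f}.
\]
```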
Max ERC Funding
1 267 500 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym BLOWDISOL
Project "BLOW UP, DISPERSION AND SOLITONS"
Researcher (PI) Franck Merle
Host Institution (HI) UNIVERSITE DE CERGY-PONTOISE
Call Details Advanced Grant (AdG), PE1, ERC-2011-ADG_20110209
Summary "Many physical models involve nonlinear dispersive problems, like wave
or laser propagation, plasmas, ferromagnetism, etc. So far, the mathematical under-
standing of these equations is rather poor. In particular, we know little about the
detailed qualitative behavior of their solutions. Our point is that an apparent com-
plexity hides universal properties of these models; investigating and uncovering such
properties has started only recently. More than the equations themselves, these univer-
sal properties are essential for physical modelisation.
By considering several standard models such as the nonlinear Schrodinger, nonlinear
wave, generalized KdV equations and related geometric problems, the goal of this pro-
posal is to describe the generic global behavior of the solutions and the profiles which
emerge either for large time or by concentration due to strong nonlinear effects, if pos-
sible through a few relevant solutions (sometimes explicit solutions, like solitons). In
order to do this, we have to elaborate different mathematical tools depending on the
context and the specificity of the problems. Particular emphasis will be placed on
- large time asymptotics for global solutions, decomposition of generic solutions into
sums of decoupled solitons in non integrable situations,
- description of critical phenomenon for blow up in the Hamiltonian situation, stable
or generic behavior for blow up on critical dynamics, various relevant regularisations of
the problem,
- global existence for defocusing supercritical problems and blow up dynamics in the
focusing cases.
We believe that the PI and his team have the ability to tackle these problems at present.
The proposal will open whole fields of investigation in Partial Differential Equations in
the future, clarify and simplify our knowledge on the dynamical behavior of solutions
of these problems and provide Physicists some new insight on these models."
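For concreteness (standard formulas added editorially, not part of the abstract), the generalized KdV equation referred to above and its explicit soliton family can be written as follows.

```latex
% Generalized KdV and its solitons (classical formulas, added as an illustration).
\[
  \partial_t u + \partial_x\!\left(\partial_x^2 u + u^p\right) = 0,
  \qquad u(t,x) \in \mathbb{R},\ x \in \mathbb{R},\ p \ge 2,
\]
\[
  u(t,x) = Q_c(x - ct), \qquad Q_c'' - c\,Q_c + Q_c^p = 0, \qquad
  Q_c(x) = c^{\frac{1}{p-1}}\, Q\!\left(\sqrt{c}\,x\right),
\]
\[
  Q(x) = \left(\frac{p+1}{2}\right)^{\frac{1}{p-1}}
         \operatorname{sech}^{\frac{2}{p-1}}\!\left(\frac{p-1}{2}\,x\right).
\]
```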
Max ERC Funding
2 079 798 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym BREAD
Project Breaking the curse of dimensionality: numerical challenges in high dimensional analysis and simulation
Researcher (PI) Albert Cohen
Host Institution (HI) UNIVERSITE PIERRE ET MARIE CURIE - PARIS 6
Call Details Advanced Grant (AdG), PE1, ERC-2013-ADG
Summary "This project is concerned with problems that involve a very large number of variables, and whose efficient numerical treatment is challenged by the so-called curse of dimensionality, meaning that computational complexity increases exponentially in the variable dimension.
The PI intend to establish in his host institution a scientific leadership on the mathematical understanding and numerical treatment of these problems, and to contribute to the development of this area of research through international collaborations, organization of workshops and research schools, and training of postdocs and PhD students.
High dimensional problems are ubiquitous in an increasing number of areas of scientific computing, among which statistical or active learning theory, parametric and stochastic partial differential equations, parameter optimization in numerical codes. There is a high demand from the industrial world of efficient numerical methods for treating such problems.
The practical success of various numerical algorithms, that have been developed in recent years in these application areas, is often limited to moderate dimensional setting.
In addition, these developments tend to be, as a rule, rather problem specific and not always founded on a solid mathematical analysis.
The central scientific objectives of this project are therefore: (i) to identify fundamental mathematical principles behind overcoming the curse of dimensionality, (ii) to understand how these principles enter in relevant instances of the above applications, and (iii) based on the these principles beyond particular problem classes, to develop broadly applicable numerical strategies that benefit from such mechanisms.
The performances of these strategies should be provably independent of the variable dimension, and in that sense break the curse of dimensionality. They will be tested on both synthetic benchmark tests and real world problems coming from the afore-mentioned applications."
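A back-of-the-envelope illustration of the curse of dimensionality mentioned above (an editorial example, not from the proposal): a tensor-product grid with spacing h on [0,1]^d has (1/h + 1)^d points, so the cost of naive discretisation grows exponentially in the dimension d.

```python
# Exponential growth of a tensor-product grid with the dimension d (illustration).

def grid_points(h: float, d: int) -> int:
    per_axis = int(round(1 / h)) + 1
    return per_axis ** d

if __name__ == "__main__":
    for d in (1, 2, 5, 10, 20):
        print(f"d = {d:2d}: {grid_points(0.01, d):.3e} points")
    # d = 10 already requires ~1e20 points; d = 20 about 1e40.
```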
Max ERC Funding
1 848 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym BreakingBarriers
Project Targeting endothelial barriers to combat disease
Researcher (PI) Anne Eichmann
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Advanced Grant (AdG), LS4, ERC-2018-ADG
Summary Tissue homeostasis requires coordinated barrier function in blood and lymphatic vessels. Opening of junctions between endothelial cells (ECs) lining blood vessels leads to tissue fluid accumulation that is drained by lymphatic vessels. A pathological increase in blood vessel permeability or a lack or malfunction of lymphatic vessels leads to edema and associated defects in macromolecule and immune cell clearance. Unbalanced barrier function between blood and lymphatic vessels contributes to neurodegeneration, chronic inflammation, and cardiovascular disease. In this proposal, we seek to gain mechanistic understanding of how barrier function is coordinated between blood and lymphatic vessels, how this process is altered in disease models and how it can be manipulated for therapeutic purposes. We will focus on two critical barriers with diametrically opposing functions, the blood-brain barrier (BBB) and the lymphatic capillary barrier (LCB). ECs of the BBB form very tight junctions that restrict paracellular access to the brain. In contrast, open junctions of the LCB ensure uptake of extravasated fluid, macromolecules and immune cells, as well as lipid in the gut. We have identified novel effectors of BBB and LCB junctions and will determine their role in adult homeostasis and in disease models. Mouse genetic gain- and loss-of-function approaches in combination with histological, ultrastructural, functional and molecular analysis will determine mechanisms underlying the formation of tissue-specific EC barriers. Deliverables include in vivo validated targets that could be used i) to open the BBB on demand for drug delivery into the brain, and ii) to lower plasma lipid uptake by interfering with the LCB, with implications for the prevention of obesity, cardiovascular disease and inflammation. These pioneering studies promise to open up new opportunities for research and treatment of neurovascular and cardiovascular disease.
Max ERC Funding
2 499 969 €
Duration
Start date: 2019-07-01, End date: 2024-06-30
Project acronym CHRiSHarMa
Project Commutators, Hilbert and Riesz transforms,Shifts, Harmonic extensions and Martingales
Researcher (PI) Stefanie Petermichl
Host Institution (HI) UNIVERSITE PAUL SABATIER TOULOUSE III
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary This project aims to develop two arrays of questions at the heart of harmonic analysis, probability and operator theory:
Multi-parameter harmonic analysis.
Through the use of wavelet methods in harmonic analysis, we plan to shed new light on characterizations for boundedness of multi-parameter versions of classical Hankel operators in a variety of settings. The classical Nehari's theorem on the disk (1957) has found an important generalization to Hilbert-space-valued functions, known as Page's theorem. A relevant extension of Nehari's theorem to the bi-disk had been a long-standing problem, finally solved in 2000, through novel harmonic analysis methods. Its operator analog remains unknown and constitutes part of this proposal.
Sharp estimates for Calderon-Zygmund operators and martingale inequalities.
We make use of the interplay between objects central to harmonic analysis, such as the Hilbert transform, and objects central to probability theory, martingales. This connection has seen many faces, such as in the UMD space classification by Bourgain and Burkholder or in the formula of Gundy-Varopoulos, which uses orthogonal martingales to model the behavior of the Hilbert transform. Martingale methods in combination with optimal control have advanced an array of questions in harmonic analysis in recent years. In this proposal we wish to continue this direction as well as exploit advances in dyadic harmonic analysis for use in questions central to probability. There is some focus on weighted estimates in non-commutative and scalar settings, on the understanding of discretizations of classical operators, such as the Hilbert transform, and on the role they play when acting on functions defined on discrete groups. From a martingale standpoint, jump processes come into play. Another direction is the use of numerical methods in combination with harmonic analysis achievements for martingale estimates.
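For reference (standard definitions added editorially, not part of the abstract), the Hilbert transform and the commutator with a symbol b that recur in the text above are:

```latex
% Standard definitions, added only as a reference for the objects named above.
\[
  Hf(x) \;=\; \mathrm{p.v.}\,\frac{1}{\pi}\int_{\mathbb{R}} \frac{f(y)}{x - y}\,dy,
  \qquad
  [b, H]f \;=\; b\,(Hf) \;-\; H(bf).
\]
```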
Max ERC Funding
1 523 963 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym CIRCUS
Project An end-to-end verification architecture for building Certified Implementations of Robust, Cryptographically Secure web applications
Researcher (PI) Karthikeyan Bhargavan
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary The security of modern web applications depends on a variety of critical components including cryptographic libraries, Transport Layer Security (TLS), browser security mechanisms, and single sign-on protocols. Although these components are widely used, their security guarantees remain poorly understood, leading to subtle bugs and frequent attacks.
Rather than fixing one attack at a time, we advocate the use of formal security verification to identify and eliminate entire classes of vulnerabilities in one go. With the aid of my ERC starting grant, I have built a team that has already achieved landmark results in this direction. We built the first TLS implementation with a cryptographic proof of security. We discovered high-profile vulnerabilities such as the recent Triple Handshake and FREAK attacks, both of which triggered critical security updates to all major web browsers and TLS libraries.
So far, our security theorems only apply to carefully-written standalone reference implementations. CIRCUS proposes to take on the next great challenge: verifying the end-to-end security of web applications running in mainstream software. The key idea is to identify the core security components of web browsers and servers and replace them by rigorously verified components that offer the same functionality but with robust security guarantees.
Our goal is ambitious and there are many challenges to overcome, but we believe this is an opportune time for this proposal. In response to the Snowden reports, many cryptographic libraries and protocols are currently being audited and redesigned. Standards bodies and software developers are inviting researchers to help analyse their designs and code. Responding to their call requires a team of researchers who are willing to deal with the messy details of nascent standards and legacy code, and at the same time prove strong security theorems based on precise cryptographic assumptions. We are able, we are willing, and the time is now.
Max ERC Funding
1 885 248 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym CoBCoM
Project Computational Brain Connectivity Mapping
Researcher (PI) Rachid DERICHE
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE6, ERC-2015-AdG
Summary One third of the burden of all the diseases in Europe is due to problems caused by diseases affecting the brain. Although exceptional progress has been made in exploring it during the past decades, the brain is still terra incognita and calls for specific research efforts to better understand its architecture and functioning.
CoBCoM is our response to this great challenge of modern science with the overall goal to develop a joint Dynamical Structural-Functional Brain Connectivity Network (DSF-BCN) solidly grounded on advanced and integrated methods for diffusion Magnetic Resonance Imaging (dMRI) and Electro & Magneto-Encephalography (EEG & MEG).
To take up this grand challenge and achieve new frontiers for brain connectivity mapping, we will develop a new generation of computational models and methods for identifying and characterizing the structural and functional connectivities that will be at the heart of the DSF-BCN. Our strategy is to break with the tradition of contributing incrementally and separately to structure or function and develop a global approach involving strong interactions between structural and functional connectivities. To overcome the limited view of the brain provided by just one imaging modality, our models will be developed under a rigorous computational framework integrating complementary non-invasive imaging modalities: dMRI, EEG and MEG.
CoBCoM will push far forward the state of the art in these modalities, developing innovative models and ground-breaking processing tools to provide, in fine, a joint DSF-BCN solidly grounded on a detailed mapping of the brain connectivity, both in space and time.
Capitalizing on the strengths of dMRI, MEG & EEG methodologies and building on the biophysical and mathematical foundations of our new generation of computational models, CoBCoM will be applied to high-impact diseases, and its ground-breaking computational nature and added clinical value will open new perspectives in neuroimaging.
Max ERC Funding
2 469 123 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym COMBINEPIC
Project Elliptic Combinatorics: Solving famous models from combinatorics, probability and statistical mechanics, via a transversal approach of special functions
Researcher (PI) Kilian RASCHEL
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2017-STG
Summary I aim to solve several well-known models from combinatorics, probability theory and statistical mechanics: the Ising model on isoradial graphs, dimer models, spanning forests, random walks in cones, occupation time problems. Although completely unrelated a priori, these models have the common feature of being presumed “exactly solvable” models, for which surprising and spectacular formulas should exist for quantities of interest. This is captured by the title “Elliptic Combinatorics”, the term elliptic referring to the use of special functions, in a broad sense: algebraic/differentially finite (or holonomic)/diagonals/(hyper)elliptic/hypergeometric/etc.
Besides the exciting nature of the models which we aim at solving, one main strength of our project lies in the variety of modern methods and fields that we cover: combinatorics, probability, algebra (representation theory), computer algebra, algebraic geometry, with a spectrum going from applied to pure mathematics.
We propose in addition two major applications, in finance (Markovian order books) and in population biology (evolution of multitype populations). We plan to work in close collaborations with researchers from these fields, to eventually apply our results (study of extinction probabilities for self-incompatible flower populations, for instance).
Max ERC Funding
1 242 400 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym CombiTop
Project New Interactions of Combinatorics through Topological Expansions, at the crossroads of Probability, Graph theory, and Mathematical Physics
Researcher (PI) Guillaume CHAPUY
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary "The purpose of this project is to use the ubiquitous nature of certain combinatorial topological objects called maps in order to unveil deep connections between several areas of mathematics. Maps, that describe the embedding of a graph into a surface, appear in probability theory, mathematical physics, enumerative geometry or graph theory, and different combinatorial viewpoints on these objects have been developed in connection with each topic. The originality of our project will be to study these approaches together and to unify them.
The outcome will be triple, as we will:
1. build a new, well structured branch of combinatorics of which many existing results in different areas of enumerative and algebraic combinatorics are only first fruits;
2. connect and unify several aspects of the domains related to it, most importantly between probability and integrable hierarchies thus proposing new directions, new tools and new results for each of them;
3. export the tools of this unified framework to reach new applications, especially in random graph theory and in a rising domain of algebraic combinatorics related to Tamari lattices.
The methodology to reach the unification will be the study of some strategic interactions at different places involving topological expansions, that is to say, places where enumerative problems dealing with maps appear and their genus invariant plays a natural role, in particular: 1. the combinatorial theory of maps developed by the "French school" of combinatorics, and the study of random maps; 2. the combinatorics of Fermions underlying the theory of KP and 2-Toda hierarchies; 3. the Eynard-Orantin "topological recursion" coming from mathematical physics.
We present a key set of tasks aimed at relating these different topics together. The pertinence of the approach is demonstrated by recent research of the principal investigator.
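To make the role of the genus invariant concrete (a classical fact added editorially, not part of the abstract), a map, that is, a connected graph cellularly embedded in a closed orientable surface of genus g, satisfies Euler's relation:

```latex
% Euler's relation for maps (classical; added as an illustration of the genus invariant).
\[
  V - E + F \;=\; 2 - 2g,
\]
% where V, E, F are the numbers of vertices, edges and faces of the map
% and g is the genus of the underlying surface.
```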
Max ERC Funding
1 086 125 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym CONSTRAINTS
Project Ecophysiological and biophysical constraints on domestication in crop plants
Researcher (PI) Cyrille (Fabrice) Violle
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS9, ERC-2014-STG
Summary A fundamental question in biology is how constraints drive phenotypic changes and the diversification of life. We know little about the role of these constraints on crop domestication, nor how artificial selection can escape them. CONSTRAINTS questions whether crop domestication has shifted ecophysiological and biophysical traits related to resource acquisition, use and partitioning, and how trade-offs between them have constrained domestication and can limit future improvements in both optimal and sub-optimal conditions.
The project is based on three objectives: 1. revealing the existence (or lack) of a generic resource-use domestication syndrome in crop science; 2. elucidating ecophysiological and biophysical trade-offs within crop science and delineating the envelope of constraints for artificial selection; 3. examining the shape of ecophysiological and biophysical trade-offs in crop species when grown in sub-optimal environmental conditions. This project will be investigated within and across crop species thanks to a core panel of 12 studied species (maize, sunflower, Japanese rice, sorghum, durum wheat, bread wheat, alfalfa, orchardgrass, silvergrass, pea, colza, vine) for which data and collections (ca. 1,300 genotypes in total) are already available to the PI, plus additional high-throughput phenotyping using automatons. Additional species will be used for specific tasks: (i) a panel of 30 species for a comparative analysis of crop species and their wild progenitors; (ii) 400 worldwide accessions of Arabidopsis thaliana for a genome-wide association study of resource-use traits. Collectively, we will use a multiple-tool approach combining field measurements, high-throughput phenotyping, common-garden experiments, comparative analyses using databases, modelling, and genomics.
The ground-breaking nature of the project lies in the nature of the questions asked and in the unique opportunity to transfer knowledge from ecology and evolutionary biology to crop species.
Max ERC Funding
1 499 979 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym CONTACTMATH
Project Legendrian contact homology and generating families
Researcher (PI) Frédéric Bourgeois
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Starting Grant (StG), PE1, ERC-2009-StG
Summary A contact structure on an odd-dimensional manifold is a maximally non-integrable hyperplane field. It is the odd-dimensional counterpart of a symplectic structure. Contact and symplectic topology is a recent and very active area that studies intrinsic questions about existence, (non-)uniqueness and rigidity of contact and symplectic structures. It is intimately related to many other important disciplines, such as dynamical systems, singularity theory, knot theory, Morse theory, complex analysis, ... Legendrian submanifolds are a distinguished class of submanifolds in a contact manifold, which are tangent to the contact distribution. These manifolds are of particular interest in contact topology. Important classes of Legendrian submanifolds can be described using generating families, and this description can be used to define Legendrian invariants via Morse theory. On the other hand, Legendrian contact homology is an invariant for Legendrian submanifolds based on holomorphic curves. The goal of this research proposal is to study the relationship between these two approaches. More precisely, we plan to show that the generating family homology and the linearized Legendrian contact homology can be defined for the same class of Legendrian submanifolds, and are isomorphic. This correspondence should be established using a parametrized version of symplectic homology, being developed by the Principal Investigator in collaboration with Oancea. Such a result would give an entirely new type of information about holomorphic curve invariants. Moreover, it can be used to obtain more general structural results on linearized Legendrian contact homology, to extend recent results on the existence of Reeb chords, and to gain a much better understanding of the geography of Legendrian submanifolds.
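For readers outside the field, a standard local model may help (our illustration, not part of the proposal): on R^{2n+1} with coordinates (x_1, ..., x_n, y_1, ..., y_n, z), the standard contact structure is
\[ \xi = \ker\alpha, \qquad \alpha = dz - \sum_{i=1}^{n} y_i\,dx_i, \qquad \alpha \wedge (d\alpha)^{n} \neq 0, \]
where the last condition expresses maximal non-integrability; a Legendrian submanifold is an n-dimensional submanifold L with TL contained in \xi.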
Max ERC Funding
710 000 €
Duration
Start date: 2009-11-01, End date: 2014-10-31
Project acronym CoqHoTT
Project Coq for Homotopy Type Theory
Researcher (PI) Nicolas Tabareau
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Every year, software bugs cost hundreds of millions of euros to companies and administrations. Hence, software quality is a prevalent concern, and interactive theorem provers based on type theory have shown their efficiency in proving the correctness of important pieces of software, such as the C compiler of the CompCert project. One main interest of such theorem provers is the ability to extract code directly from the proof. Unfortunately, their democratization suffers from a major drawback: the mismatch between equality in mathematics and in type theory. Thus, significant Coq developments have only been carried out by virtuosos playing with advanced concepts of computer science and mathematics. Recently, an extension of type theory with homotopical concepts such as univalence has been gaining traction because it makes it possible, for the first time, to reconcile the expected principles of equality. But the univalence principle has so far been treated as a new axiom, which breaks one fundamental property of mechanized proofs: the ability to compute with programs that make use of this axiom. The main goal of the CoqHoTT project is to provide a new generation of proof assistants with a computational version of univalence and to use them as a base to implement effective logical model transformations, so that the power of the internal logic of the proof assistant needed to prove the correctness of a program can be decided and changed at compile time, according to a trade-off between efficiency and logical expressivity. Our approach is based on a radically new compilation-phase technique into a core type theory, which modularizes the difficulty of finding a decidable type-checking algorithm for homotopy type theory.
The impact of the CoqHoTT project will be very strong. Even if Coq is already a success, this project will promote it as a major proof assistant, for both computer scientists and mathematicians. CoqHoTT will become an essential tool for program certification and formalization of mathematics.
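As a hedged aside (our formulation, not the proposal's), the univalence principle mentioned above can be stated as follows: for any types A and B in a universe U, the canonical map from proofs of equality to equivalences is itself an equivalence,
\[ (A =_{\mathcal{U}} B) \;\simeq\; (A \simeq B). \]
Postulating this statement as an axiom is precisely what blocks computation, which is the obstruction the project aims to remove.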
Max ERC Funding
1 498 290 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym CORFRONMAT
Project Correlated frontiers of many-body quantum mathematics and condensed matter physics
Researcher (PI) Nicolas ROUGERIE
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2017-STG
Summary One of the main challenges in condensed matter physics is to understand strongly correlated quantum systems. Our purpose is to approach this issue from the point of view of rigorous mathematical analysis. The goals are twofold: develop a mathematical framework applicable to physically relevant scenarios, and take inspiration from the physics to introduce new topics in mathematics. The scope of the proposal thus goes from physically oriented questions (theoretical description and modelling of physical systems) to analytical ones (rigorous derivation and analysis of reduced models) in several cases where strong correlations play the key role.
In a first part, we aim to develop mathematical methods of general applicability to go beyond mean-field theory in different contexts. Our long-term goal is to forge new tools to attack important open problems in the field. Particular emphasis will be put on the structural properties of large quantum states as a general tool.
A second part is concerned with so-called fractional quantum Hall states, which host the fractional quantum Hall effect. Despite the appealing structure of their built-in correlations, their mathematical study is in its infancy. They nevertheless constitute an excellent testing ground to develop ideas of possibly wider applicability. In particular, we introduce and study a new class of many-body variational problems.
In the third part we discuss so-called anyons, exotic quasi-particles thought to emerge as excitations of highly correlated quantum systems. Their modelling gives rise to rather unusual, strongly interacting many-body Hamiltonians with topological content. Mathematical analysis will help us shed light on these, clarifying the characteristic properties that could ultimately be tested experimentally.
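As background (our illustration, not drawn from the proposal), the archetypal strongly correlated fractional quantum Hall state is Laughlin's trial wavefunction at filling factor 1/m (m odd), written in complex coordinates and magnetic-length units as
\[ \Psi_m(z_1,\dots,z_N) \;=\; \prod_{1 \le i < j \le N} (z_i - z_j)^{m} \, \exp\Big(-\tfrac{1}{4}\sum_{k=1}^{N} |z_k|^{2}\Big), \]
whose built-in correlations are of the kind the variational problems of the second part are designed to capture.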
Max ERC Funding
1 056 664 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym CORTEXSELFCONTROL
Project Self-Modulating Neurons in the Cerebral Cortex: From Molecular Mechanisms to Cortical Network Activities
Researcher (PI) Alberto Bacci
Host Institution (HI) INSTITUT DU CERVEAU ET DE LA MOELLE EPINIERE
Call Details Starting Grant (StG), LS4, ERC-2007-StG
Summary In the mammalian brain, the neocortex is the site where sensory information is integrated into complex cognitive functions. This is accomplished by the activity of both principal glutamatergic neurons and locally-projecting inhibitory GABAergic interneurons, interconnected in complex networks. Inhibitory neurons play several key roles in neocortical function. For example, they shape sensory receptive fields and drive several high-frequency network oscillations. On the other hand, defects in their function can lead to devastating diseases, such as epilepsy and schizophrenia. Cortical interneurons represent a highly heterogeneous cell population. Understanding the specific role of each interneuron subtype within cortical microcircuits is still a crucial open question. We have examined properties of two major functional interneuron subclasses in neocortical layer V: fast-spiking (FS) and low-threshold spiking (LTS) cells. Our previous data indicate that each group expresses a novel form of self-inhibition, namely autaptic inhibitory transmission in FS cells and an endocannabinoid-mediated slow self-inhibition in LTS interneurons. In this proposal we will address three major questions relevant to self-inhibition of neocortical interneurons: 1) What is the role of FS cell autapses in coordinating fast network synchrony? 2) What are the molecular mechanisms underlying autaptic asynchronous release, prolonging FS cell self-inhibition by several seconds, and what is its relevance during physiological and pathological network activities? 3) What are the induction mechanisms, the molecular players involved and the functional roles within cortical microcircuits of the endocannabinoid-mediated long-lasting self-inhibition in LTS interneurons? Results of these experiments will lead to a better understanding of GABAergic interneuron regulation of neocortical excitability, relevant to both normal and pathological cortical function.
Max ERC Funding
996 000 €
Duration
Start date: 2008-10-01, End date: 2014-03-31
Project acronym CoVeCe
Project Coinduction for Verification and Certification
Researcher (PI) Damien Gabriel Jacques Pous
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Software and hardware bugs cost companies and administrations hundreds of millions of euros every year. Formal methods like verification provide automatic means of finding some of these bugs. Certification, using proof assistants like Coq or Isabelle/HOL, makes it possible to guarantee the absence of bugs (up to a certain point).
These two kinds of tools are crucial in order to design safer programs and machines. Unfortunately, state-of-the-art tools are not yet satisfactory. Verification tools often face state-explosion problems and require more efficient algorithms; certification tools need more automation: they currently require too much time and expertise, even for basic tasks that could be handled easily through verification.
In recent work with Bonchi, we have shown that an extremely simple idea from concurrency theory could give rise to algorithms that are often exponentially faster than the algorithms currently used in verification tools.
My claim is that this idea could scale to richer models, revolutionising existing verification tools and providing algorithms for problems whose decidability is still open.
Moreover, the expected simplicity of those algorithms will make it possible to implement them inside certification tools such as Coq, to provide powerful automation techniques based on verification techniques. In the end, we will thus provide efficient and certified verification tools going beyond the state-of-the-art, but also the ability to use such tools inside the Coq proof assistant, to alleviate the cost of certification tasks.
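To give a flavour of the coinductive algorithms referred to above (a hedged sketch of the textbook pairwise check for deterministic automata, without the "up to" optimisations of the Bonchi-Pous line of work; all names are ours), language equivalence of two DFA states can be decided by exploring pairs of states and stopping as soon as a pair has already been assumed equivalent:

def dfa_equivalent(delta, accepting, s1, s2, alphabet):
    """Decide whether DFA states s1 and s2 accept the same language.

    delta: dict mapping (state, letter) -> state (complete DFA)
    accepting: set of accepting states
    """
    visited = set()           # the candidate bisimulation, built on the fly
    todo = [(s1, s2)]
    while todo:
        p, q = todo.pop()
        if (p, q) in visited:
            continue          # pair already assumed equivalent: coinduction step
        if (p in accepting) != (q in accepting):
            return False      # counterexample found
        visited.add((p, q))
        for a in alphabet:
            todo.append((delta[(p, a)], delta[(q, a)]))
    return True

# Toy usage: two states recognising words with an even number of 'a'
delta = {("e1", "a"): "o1", ("o1", "a"): "e1",
         ("e2", "a"): "o2", ("o2", "a"): "e2"}
print(dfa_equivalent(delta, {"e1", "e2"}, "e1", "e2", ["a"]))  # True

The project's algorithms refine this scheme with up-to techniques so that far fewer pairs need to be explored.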
Max ERC Funding
1 407 413 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym CPDENL
Project Control of partial differential equations and nonlinearity
Researcher (PI) Jean-Michel Coron
Host Institution (HI) UNIVERSITE PIERRE ET MARIE CURIE - PARIS 6
Call Details Advanced Grant (AdG), PE1, ERC-2010-AdG_20100224
Summary The aim of this 5.5-year project is to create, around the PI, a research group on the control of systems modeled by partial differential equations at the Jacques-Louis Lions Laboratory of UPMC, and to develop with this group an intensive research activity focused on nonlinear phenomena.
With the ERC grant, the PI plans to hire post-doc fellows and PhD students, to offer 1- to 3-month positions to confirmed researchers, and to run a regular seminar and workshops.
A lot is known on finite dimensional control systems and linear control systems modeled by partial differential equations. Much less is known for nonlinear control systems modeled by partial differential equations. In particular, in many important cases, one does not know how to use the classical iterated Lie brackets which are so useful to deal with nonlinear control systems in finite dimension.
In this project, the PI plans to develop, with the research group, methods to deal with the problems of controllability and of stabilization for nonlinear systems modeled by partial differential equations, in the case where the nonlinearity plays a crucial role. This is for example the case where the linearized control system around the equilibrium of interest is not controllable or not stabilizable. This is also the case when the nonlinearity is too big at infinity and one looks for global results. This is also the case if the nonlinearity contains too many derivatives. The PI has already introduced some methods to deal with these cases, but a lot remains to be done. Indeed, many natural important and challenging problems are still open. Precise examples, often coming from physics, are given in this proposal.
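For context (our illustration, not taken from the proposal), the iterated Lie brackets mentioned above are the standard tool in finite dimension: for a control-affine system
\[ \dot{x} = f_0(x) + \sum_{i=1}^{m} u_i f_i(x), \qquad [f, g](x) = g'(x)\,f(x) - f'(x)\,g(x) \]
(up to a sign convention), the Lie algebra rank condition asks that iterated brackets of f_0, ..., f_m span the tangent space at the point of interest; it is this machinery that has no general analogue for nonlinear systems modeled by partial differential equations.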
Max ERC Funding
1 403 100 €
Duration
Start date: 2011-05-01, End date: 2016-09-30
Project acronym CREATIV
Project Creating Co-Adaptive Human-Computer Partnerships
Researcher (PI) Wendy Mackay
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE6, ERC-2012-ADG_20120216
Summary "CREATIV explores how the concept of co-adaptation can revolutionize the design and use of interactive software. Co-adaptation is the
parallel phenomenon in which users both adapt their behavior to the system’s constraints, learning its power and idiosyncrasies, and
appropriate the system for their own needs, often using it in ways unintended by the system designer.
A key insight in designing for co-adaptation is that we can encapsulate interactions and treat them as first class objects, called interaction
instruments This lets us focus on the specific characteristics of how human users express their intentions, both learning from and
controlling the system. By making instruments co-adaptive, we can radically change how people use interactive systems, providing
incrementally learnable paths that offer users greater expressive power and mastery of their technology.
The project offers theoretical, technical and empirical contributions. CREATIV will develop a novel architecture and generative principles for
creating co-adaptive instruments. The multi-disciplinary design team includes computer scientists, social scientists and designers as well
as ‘extreme users’, creative professionals who push the limits of their technology. Using participatory design techniques, we will articulate
the design space for co-adaptive instruments and build a series of prototypes. Evaluation activities include qualitative and quantitative
studies, in the lab and in the field, to test hypotheses and assess the success of the prototypes.
The initial goal of the CREATIV project is to fundamentally improve the learning and expressive capabilities of advanced users of creative
software, offering significantly enhanced methods for expressing and exploring their ideas. The ultimate goal is to radically transform
interactive systems for everyone by creating a powerful and flexible partnership between human users and interactive technology."
Max ERC Funding
2 458 996 €
Duration
Start date: 2013-06-01, End date: 2018-05-31
Project acronym CriBLaM
Project Critical behavior of lattice models
Researcher (PI) Hugo DUMINIL-COPIN
Host Institution (HI) INSTITUT DES HAUTES ETUDES SCIENTIFIQUES
Call Details Starting Grant (StG), PE1, ERC-2017-STG
Summary Statistical physics is a theory allowing the derivation of the statistical behavior of macroscopic systems from the description of the interactions of their microscopic constituents. For more than a century, lattice models (i.e. random systems defined on lattices) have been introduced as discrete models describing the phase transition for a large variety of phenomena, ranging from ferroelectrics to lattice gas.
In the last decades, our understanding of percolation and the Ising model, two classical examples of lattice models, has progressed greatly. Nonetheless, major questions remain open on these two models.
The goal of this project is to break new ground in the understanding of phase transitions in statistical physics by using and aggregating in a pioneering way multiple techniques from probability, combinatorics, analysis and integrable systems. In this project, we will focus on three main goals:
Objective A. Provide a solid mathematical framework for the study of universality for Bernoulli percolation and the Ising model in two dimensions.
Objective B. Advance the understanding of the critical behavior of Bernoulli percolation and the Ising model in dimensions greater than or equal to 3.
Objective C. Greatly improve the understanding of planar lattice models obtained as generalizations of percolation and the Ising model, through the design of an innovative mathematical theory of phase transition dedicated to graphical representations of classical lattice models, such as Fortuin-Kasteleyn percolation, Ashkin-Teller models and loop models.
Most of the questions that we propose to tackle are notoriously difficult open problems. We believe that breakthroughs in these fundamental questions would significantly reshape our mathematical understanding of phase transition.
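For readers outside the area (our one-line definitions, not part of the proposal): in Bernoulli percolation each edge of a lattice is kept independently with probability p, while the Ising model weighs spin configurations sigma in {-1, +1}^V by the Gibbs measure
\[ \mu_{\beta}(\sigma) \;=\; \frac{1}{Z_{\beta}} \exp\Big(\beta \sum_{\{i,j\} \in E} \sigma_i \sigma_j\Big), \]
and both models undergo a phase transition as p or the inverse temperature beta crosses a critical value; the objectives above concern the behavior at and near that critical point.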
Max ERC Funding
1 499 912 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym CryptoCloud
Project Cryptography for the Cloud
Researcher (PI) David Daniel Rene Pointcheval
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary Many companies have already started the migration to the Cloud and many individuals share their personal information on social networks. Unfortunately, in the current access mode, the provider first authenticates the client and grants or denies access according to the client's rights in the access-control list. Therefore, the provider itself not only has total access to the data, but also knows which data are accessed, by whom, and how: privacy, which includes secrecy of data (confidentiality), identities (anonymity), and requests (obliviousness), should be enforced.
The industry of the Cloud introduces a new implicit trust requirement: nobody has any idea at all of where and how their data are stored and manipulated, but everybody should blindly trust the providers. Privacy-compliant procedures cannot be left to the responsibility of the provider: however strong the trustfulness of the provider may be, any system or human vulnerability can be exploited against privacy. This is too great a threat to tolerate. The distribution of the data and the secrecy of the actions must be given back to the users. This requires promoting privacy as a global security notion.
A new generation of secure multi-party computation protocols is required to protect everybody in an appropriate way, with privacy and efficiency: interactive protocols will be the core approach to provide privacy in practical systems.
Privacy for the Cloud will have a huge societal impact since it will revolutionize the trust model: users will be able to make safe use of outsourced storage, namely for personal, financial and medical data, without having to worry about failures or attacks of the server. It will also have a strong economic impact, conferring a competitive advantage on Cloud providers implementing these tools.
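As a toy illustration of the multi-party flavour referred to above (a hedged sketch of additive secret sharing, not one of the protocols the project will design; the modulus and names are ours), a value can be split among three servers so that no single server learns anything, while sums can still be computed share by share:

import secrets

P = 2 ** 61 - 1  # public prime modulus (illustrative choice)

def share(x, n=3):
    # Split x into n additive shares modulo P; any n-1 shares look uniformly random.
    parts = [secrets.randbelow(P) for _ in range(n - 1)]
    parts.append((x - sum(parts)) % P)
    return parts

def reconstruct(parts):
    return sum(parts) % P

a, b = 1234, 5678
sa, sb = share(a), share(b)
# Each server adds its own two shares locally; no server ever sees a or b.
sum_shares = [(x + y) % P for x, y in zip(sa, sb)]
assert reconstruct(sum_shares) == (a + b) % P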
Max ERC Funding
2 168 261 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym CRYSP
Project CRYSP: A Novel Framework for Collaboratively Building Cryptographically Secure Programs and their Proofs
Researcher (PI) Karthikeyan Bhargavan
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2010-StG_20091028
Summary The field of software security analysis stands at a critical juncture. Applications have become too large for security experts to examine by hand, automated verification tools do not scale, and the risks of deploying insecure software are too great to tolerate anything less than mathematical proof. A radical shift of strategy is needed if programming and analysis techniques are to keep up in a networked world where increasing amounts of governmental and individual information are generated, manipulated, and accessed through web-based software applications.
The basic tenet of this proposal is that the main roadblock to the security verification of a large program is not its size, but rather the lack of precise security specifications for the underlying libraries and security-critical application code. Since large-scale software is often a collaborative effort, no single programmer knows all the design goals. Hence, this proposal advocates a collaborative specification and verification framework that helps teams of programmers write detailed security specifications incrementally and then verify that they are satisfied by the source program.
The main scientific challenge is to develop new program verification techniques that can be applied collaboratively, incrementally, and modularly to application and library code written in mainstream programming languages. The validation of this approach will be through substantial case studies. Our aim is to produce the first verified open source cryptographic protocol library and the first web applications with formal proofs of security.
The proposed project is bold and ambitious, but it is certainly feasible, and has the potential to change how software security is analyzed for years to come.
Max ERC Funding
1 406 726 €
Duration
Start date: 2010-11-01, End date: 2015-10-31
Project acronym D3
Project Interpreting Drawings for 3D Design
Researcher (PI) Adrien BOUSSEAU
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2016-STG
Summary Designers draw extensively to externalize their ideas and communicate with others. However, drawings are currently not directly interpretable by computers. To test their ideas against physical reality, designers have to create 3D models suitable for simulation and 3D printing. However, the visceral and approximate nature of drawing clashes with the tediousness and rigidity of 3D modeling. As a result, designers only model finalized concepts, and have no feedback on feasibility during creative exploration.
Our ambition is to bring the power of 3D engineering tools to the creative phase of design by automatically estimating 3D models from drawings. However, this problem is ill-posed: a point in the drawing can lie anywhere in depth. Existing solutions are limited to simple shapes, or require user input to “explain” to the computer how to interpret the drawing. Our originality is to exploit professional drawing techniques that designers developed to communicate shape most efficiently. Each technique provides geometric constraints that help viewers understand drawings, and that we shall leverage for 3D reconstruction.
Our first challenge is to formalize common drawing techniques and derive how they constrain 3D shape. Our second challenge is to identify which techniques are used in a drawing. We cast this problem as the joint optimization of discrete variables indicating which constraints apply, and continuous variables representing the 3D model that best satisfies these constraints. But evaluating all constraint configurations is impractical. To solve this inverse problem, we will first develop forward algorithms that synthesize drawings from 3D models. Our idea is to use this synthetic data to train machine learning algorithms that predict the likelihood that constraints apply in a given drawing.
In addition to tackling the long-standing problem of single-image 3D reconstruction, our research will significantly tighten design and engineering for rapid prototyping.
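One way to read the joint optimization described above (a schematic of our own, not the proposal's actual formulation; all symbols are hypothetical) is as a mixed discrete-continuous problem:
\[ \min_{\mathbf{c} \in \{0,1\}^{K},\; X} \;\; \sum_{k=1}^{K} c_k\, d\!\left(X, \mathcal{C}_k\right)^{2} \;-\; \lambda \sum_{k=1}^{K} c_k \log \hat{p}_k(\text{drawing}), \]
where c_k indicates whether the k-th drawing-technique constraint applies, d measures how far the 3D model X is from satisfying it, and \hat{p}_k is the learned likelihood that the corresponding technique is present in the drawing.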
Max ERC Funding
1 482 761 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym DAL
Project DAL: Defying Amdahl's Law
Researcher (PI) Andre Seznec
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE6, ERC-2010-AdG_20100224
Summary Multicore processors have now become mainstream for both general-purpose and embedded computing. Instead of working on improving the architecture of the next-generation multicore, the DAL project deliberately anticipates the next few generations of multicores.
While multicores featuring thousands of cores might become feasible around 2020, there are strong indications that the sequential programming style will continue to be dominant. Even future mainstream parallel applications will exhibit large sequential sections. Amdahl's law indicates that high performance on these sequential sections is needed to enable overall high performance on the whole application. On many (most) applications, the effective performance of future computer systems using a 1000-core processor chip will significantly depend on their performance on both sequential code sections and single threads.
We envision that, around 2020, processor chips will feature a few complex cores and many (perhaps thousands of) simpler, more silicon- and power-efficient cores.
In the DAL research project, we will explore the microarchitecture techniques that will be needed to enable high performance on such heterogeneous processor chips. Very high performance will be required on both sequential sections (legacy sequential codes, sequential sections of parallel applications) and critical threads of parallel applications (e.g. the main thread controlling the application). Our research will focus on enhancing single-process performance. On the microarchitecture side, we will explore both a radically new approach, the sequential accelerator, and more conventional processor architectures. We will also study how to exploit heterogeneous multicore architectures to enhance sequential thread performance.
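For reference (our statement of the classical formula), Amdahl's law bounds the speed-up of a program whose fraction f of execution time is parallelizable when run on N cores:
\[ S(N) \;=\; \frac{1}{(1-f) + f/N} \;\le\; \frac{1}{1-f}, \]
so even with thousands of cores a program that is 95% parallel cannot run more than 20 times faster; this is why the sequential sections and critical threads targeted by DAL dominate overall performance.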
Max ERC Funding
2 398 542 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym Damocles
Project Modelling brain aneurysm to elucidate the role of platelets
Researcher (PI) Yacine BOULAFTALI
Host Institution (HI) INSTITUT NATIONAL DE LA SANTE ET DE LA RECHERCHE MEDICALE
Call Details Starting Grant (StG), LS4, ERC-2017-STG
Summary In the European Union, 15 million people have an unruptured intracranial aneurysm (IA) that may rupture one day and lead to subarachnoid haemorrhage (SAH). The IA rupture event is ominous and lingers as a clinical quandary. No safe and effective non-invasive therapies have, as of yet, been identified and implemented in clinical practice, mainly because of a lack of knowledge of the underlying mechanisms. Increasing evidence points to inflammation as one of the leading factors in the pathogenesis of IA. Intrasaccular clot formation is a common feature of both unruptured and ruptured IA. In addition to forming clots, activated platelets support leukocyte recruitment. Interestingly, platelets also prevent local hemorrhage in inflammatory situations independently of their ability to form a platelet plug.
We hypothesize that the role of platelets may evolve throughout the development of IA: initially playing a protective role in the maintenance of vascular integrity in response to inflammation, and later contributing to intrasaccular thrombus formation. What are the platelet signaling pathways and responses involved, and to what extent do they contribute to the disease and the rupture event?
To answer these questions, we designed an interdisciplinary proposal, which gathers biophysical, pharmacological, and in-vivo approaches, with the following objectives: I) To investigate platelet functions in patients diagnosed with intracranial aneurysm, at the site of the aneurysm sac. II) To delineate platelet mechanisms and responses in a cutting-edge 3D reconstruction of IA that takes into account hemodynamic shear stress. III) To test efficient anti-platelet therapies in a preclinical mouse model of IA and define a therapeutic window for intervening on platelet activation. The proposed project will yield new insights into IA disease and into life science, from cell biology to the discovery of potential new targets in cardiovascular medicine.
Max ERC Funding
1 498 618 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym DANSEINCELL
Project Modeling cytoplasmic trafficking and molecular delivery in cellular microdomains
Researcher (PI) David Holcman
Host Institution (HI) ECOLE NORMALE SUPERIEURE
Call Details Starting Grant (StG), PE1, ERC-2007-StG
Summary Cytoplasmic motion is a key determinant of organelle transport, protein-protein interactions, RNA transport and drug delivery, to name but a few cellular phenomena. Nucleic acid trafficking is important in antisense and gene therapy based on viral and synthetic vectors. This proposal is dedicated to the theoretical study of the intracellular transport of proteins, organelles and DNA particles. We propose to construct a mathematical model to quantify and predict the spatiotemporal dynamics of complex structures in the cytosol and the nucleus, based on the physical characteristics and the micro-rheology (viscosity) of the environment. We model the passive motion of proteins or DNA as free or confined diffusion, while for organelle and virus motion we will include active, cytoskeleton-dependent transport. The proposed mathematical model of cellular trafficking is based on physical principles. We propose to estimate the mean arrival time of viruses and plasmid DNA at a nuclear pore, and the probability of arriving there. The motion will be described by stochastic dynamics containing both a drift component (along microtubules) and a Brownian (free diffusion) component. The analysis of the equations requires the development of new asymptotic methods for calculating the probability and the mean arrival time of a particle to a small hole on the nucleus surface. We will extend the analysis to DNA movement in the nucleus after cellular irradiation, when the nucleus contains single- and double-strand DNA breaks (DSBs). The number of remaining DNA breaks determines the activation of the repair machinery and the cell's decision to enter apoptosis. We will study the DSB repair machinery engaged in the task of locating the DNA damage. We will formulate and analyze, both numerically and analytically, the equations that link the level of irradiation to apoptosis. The present project belongs to a new class of initiatives toward a quantitative analysis of intracellular trafficking.
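As a minimal numerical sketch of the modelling framework described above, assuming a strongly simplified geometry and hypothetical parameter values (not the project's calibrated model), the drift-plus-Brownian dynamics can be simulated directly to estimate the mean arrival time at a small absorbing patch standing in for a nuclear pore:

# Illustrative sketch only: drift-diffusion of a particle from the cell
# periphery to a small absorbing patch ("nuclear pore") on a spherical nuclear
# envelope. Geometry and parameters are assumptions chosen so the demo runs
# quickly; they are not biologically calibrated.
import numpy as np

rng = np.random.default_rng(0)

D = 0.2           # diffusion coefficient (um^2/s), hypothetical
v = 0.2           # drift speed toward the nucleus centre (um/s), hypothetical
R_nucleus = 3.0   # nuclear radius (um)
R_cell = 10.0     # starting distance from the centre (um)
eps = 0.4         # angular half-width of the absorbing patch (rad)
dt = 1e-2         # time step (s)

def arrival_time(max_steps=500_000):
    x = np.array([R_cell, 0.0, 0.0])        # start at the cell membrane
    pore = np.array([0.0, 0.0, 1.0])        # pore at the "north pole"
    for step in range(1, max_steps + 1):
        r = np.linalg.norm(x)
        x = x - v * (x / r) * dt + np.sqrt(2 * D * dt) * rng.standard_normal(3)
        r = np.linalg.norm(x)
        if r <= R_nucleus:
            if np.arccos(np.clip(x @ pore / r, -1.0, 1.0)) <= eps:
                return step * dt             # absorbed at the pore
            x = x * (R_nucleus + 1e-3) / r   # otherwise reflected off the envelope
    return np.nan                            # not absorbed within max_steps

times = np.array([arrival_time() for _ in range(10)])
print("empirical mean arrival time (s):", np.nanmean(times))

The analytical part of the proposal concerns precisely the asymptotics of this mean arrival time as the patch shrinks (the narrow escape regime), where brute-force simulation becomes inefficient and asymptotic formulas are needed.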
Max ERC Funding
750 000 €
Duration
Start date: 2009-01-01, End date: 2014-06-30
Project acronym DBA
Project Distributed Biological Algorithms
Researcher (PI) Amos Korman
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE6, ERC-2014-CoG
Summary This project proposes a new application for computational reasoning. More specifically, the purpose of this interdisciplinary project is to demonstrate the usefulness of an algorithmic perspective in studies of complex biological systems. We focus on the domain of collective behavior, and demonstrate the benefits of using techniques from the field of theoretical distributed computing in order to establish algorithmic insights regarding the behavior of biological ensembles. The project includes three related tasks, for which we have already obtained promising preliminary results. Each task contains a purely theoretical algorithmic component as well as one which integrates theoretical algorithmic studies with experiments. Most experiments are strategically designed by the PI based on computational insights, and are physically conducted by experimental biologists that have been carefully chosen by the PI. In turn, experimental outcomes will be theoretically analyzed via an algorithmic perspective. By this integration, we aim at deciphering how a biological individual (such as an ant) “thinks”, without having direct access to the neurological process within its brain, and how such limited individuals assemble into ensembles that appear to be far greater than the sum of their parts. The ultimate vision behind this project is to enable the formation of a new scientific field, called algorithmic biology, that bases biological studies on theoretical algorithmic insights.
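As a toy illustration of the algorithmic viewpoint, and not one of the project's actual tasks, the sketch below compares how quickly a single random walker and an ensemble of independent walkers locate a target site; the ensemble's hitting time is the minimum over individuals, which is the simplest quantitative sense in which a collective outperforms its members:

# Toy illustration (hypothetical setting, not a project task): N independent
# +/-1 random walkers start at the origin on a line; we measure how long it
# takes until the first of them reaches a target site, for growing N.
import numpy as np

rng = np.random.default_rng(1)

def first_hitting_time(n_walkers, target=20, max_t=200_000):
    """Steps until any of n_walkers first reaches +target (capped at max_t)."""
    pos = np.zeros(n_walkers, dtype=int)
    for t in range(1, max_t + 1):
        pos += rng.choice((-1, 1), size=n_walkers)
        if (pos >= target).any():
            return t
    return max_t

for n in (1, 10, 50):
    trials = [first_hitting_time(n) for _ in range(20)]
    print(f"N = {n:3d} walkers: median time to reach the target "
          f"~ {int(np.median(trials))} steps")

The project's harder, converse question is inferring from such collective statistics what limited computation each individual performs, which is where the distributed-computing formalism comes in.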
Max ERC Funding
1 894 947 €
Duration
Start date: 2015-05-01, End date: 2021-04-30
Project acronym DEEPSEA
Project Parallelism and Beyond: Dynamic Parallel Computation for Efficiency and High Performance
Researcher (PI) Umut Acar
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary We propose to radically extend the frontiers of two major themes in computing, parallelism and dynamism, and develop a novel paradigm of computing: dynamic-parallelism. To this end, we will follow two lines of research. First, we will develop techniques for extracting efficiency and high performance from parallel programs written in high-level programming languages. Second, we will develop the dynamic-parallelism model, where computations can respond to a wide variety of dynamic changes to their data automatically and efficiently, by developing novel abstractions (calculi), high-level programming-language constructs, and compilation techniques. The research will culminate in a language that extends the C programming language with support for parallel and dynamic-parallel programming.
The proposal is motivated by urgent needs driven by the advent of multicore chips, which is making parallelism mainstream, and the increasing ubiquity of software, which requires applications to operate on highly dynamic data. These advances demand parallel and highly dynamic software, which remains too difficult and labor intensive to develop. The urgency is further underlined by the increasing data and problem sizes (online data grows exponentially, doubling every few years) that require similarly powerful advances in performance.
The proposal will achieve profound impact by dramatically simplifying the development of high-performing dynamic and dynamic-parallel software. As a result, programmer productivity and software quality, including correctness, reliability, performance, and resource (e.g., time and energy) consumption, will improve significantly. The proposal will not only open new research opportunities in parallel computing, programming languages, and compilers, but also in other fields where parallel and dynamic problems abound, e.g., algorithms, computational biology, geometry, graphics, machine learning, and software systems.
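As a minimal sketch of the dynamic-computation idea, assuming a hand-written toy data structure rather than the proposed language support, the example below organises a computation as a tree of partial results so that it responds to a single-element change in logarithmic work instead of recomputing from scratch; the project aims to obtain this kind of change propagation automatically, for general parallel programs, via language and compiler support:

# Minimal sketch of change propagation (toy, hand-written; the proposal targets
# automatic, compiler-supported propagation for parallel programs): a tree of
# partial sums responds to one changed input in O(log n) work.
class SumTree:
    def __init__(self, values):
        self.n = len(values)
        self.tree = [0] * (2 * self.n)
        self.tree[self.n:] = list(values)
        for i in range(self.n - 1, 0, -1):       # build internal nodes bottom-up
            self.tree[i] = self.tree[2 * i] + self.tree[2 * i + 1]

    def update(self, i, value):
        """Change one input value; only its O(log n) ancestors are recomputed."""
        i += self.n
        self.tree[i] = value
        while i > 1:
            i //= 2
            self.tree[i] = self.tree[2 * i] + self.tree[2 * i + 1]

    def total(self):
        return self.tree[1]

data = list(range(100_000))
t = SumTree(data)          # initial run: O(n) work
print(t.total())
t.update(123, -5)          # a dynamic change to the input...
print(t.total())           # ...propagates without recomputing everything

The same partial-result tree also exposes parallelism, since disjoint subtrees can be combined independently, which hints at how the two themes, dynamism and parallelism, interact.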
Max ERC Funding
1 076 570 €
Duration
Start date: 2013-06-01, End date: 2018-05-31
Project acronym DEPREC
Project The Dependence Receptors notion: from a cell biology paradigm to anti-cancer targeted therapy
Researcher (PI) Patrick Mehlen
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Advanced Grant (AdG), LS4, ERC-2011-ADG_20110310
Summary "While it is assumed that transmembrane receptors are active only in the presence of ligand, we have proposed that some receptors may also be active in the absence of ligand stimulation. These receptors, named “dependence receptors” (DRs) share the ability to transmit two opposite signals: in the presence of ligand, these receptors transduce various classical “positive” signals, whereas in the absence of ligand, they trigger apoptosis. The expression of dependence receptors thus creates cellular states of dependence for survival on their respective ligands. To date, more than fifteen such receptors have been identified, including the netrin-1 receptors DCC (Deleted in Colorectal Cancer) and UNC5H1-4, some integrins, RET, EPHA4, TrkA, TrkC and the Sonic Hedgehog receptor Patched (Ptc). Even though the interest in this notion is increasing, two main questions remain poorly understood: (i) how very different receptors, with only modest homology, are able to trigger apoptosis when unengaged by their respective ligand, and (ii) what are the respective biological roles of this pro-apoptotic activity in vivo. We have hypothesized that the DRs pro-apoptotic activity is a mechanism that determines and regulates the territories of migration/localization of cells during embryonic development. We also demonstrated that this may be a mechanism that limits tumor growth and metastasis. The goal of the present project is, based on the study of a relatively small number of these receptors –i.e., DCC, UNC5H, RET, TrkC, Ptc- with a specifically larger emphasis on netrin-1 receptors, to address (i) the common and divergent cell signaling mechanisms triggering apoptosis downstream of these receptors and (ii) the physiological and pathological roles of these DRs on development of neoplasia in vivo. This latter goal will allow us to investigate how this pro-apoptotic activity can be of use to improve and diversify alternative anti-cancer therapeutic approaches."
Max ERC Funding
2 485 037 €
Duration
Start date: 2012-05-01, End date: 2017-04-30
Project acronym DerSympApp
Project Derived Symplectic Geometry and Applications
Researcher (PI) Damien CALAQUE
Host Institution (HI) UNIVERSITE DE MONTPELLIER
Call Details Consolidator Grant (CoG), PE1, ERC-2017-COG
Summary We propose a program that aims at providing new developments and new applications of shifted symplectic and Poisson structures. It is formulated in the language and framework of derived algebraic geometry after Toën–Vezzosi and Lurie.
On the foundational side, we will introduce the new notion of shifted symplectic groupoids and prove that they provide an alternative approach to shifted Poisson structures (as they were defined by the PI together with Tony Pantev, Bertrand Toën, Michel Vaquié and Gabriele Vezzosi). Along the way, we shall be able to prove several conjectures that have recently been formulated by the PI and other people.
Applications are related to mathematical physics. For instance:
- We will provide an interpretation of the Batalin–Vilkovisky formalism in terms of derived symplectic reduction.
- We will show that the semi-classical topological field theories with values in derived Lagrangian correspondences that were previously introduced by the PI are actually fully extended topological field theories in the sense of Baez–Dolan and Lurie.
- We will explain how one may use this formalism to rigorously construct a 2D topological field theory that has been discovered by Moore and Tachikawa.
Quantization problems will also be discussed at the end of the proposal.
This project proposal lies at the crossroads of algebraic geometry, mathematical physics (in its algebraic and geometric aspects) and higher algebra.
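For orientation, the central definition underlying the proposal can be stated informally as follows (after Pantev, Toën, Vaquié and Vezzosi; notation and conventions assumed):

% Informal statement; conventions follow Pantev-Toen-Vaquie-Vezzosi.
% Let $X$ be a derived Artin stack locally of finite presentation, with
% cotangent complex $\mathbb{L}_X$ and tangent complex $\mathbb{T}_X$.
% An $n$-shifted symplectic structure on $X$ is a closed 2-form
% $\omega \in \mathcal{A}^{2,\mathrm{cl}}(X, n)$ of degree $n$ whose underlying
% 2-form is non-degenerate, i.e. the induced map
\[
  \omega^{\flat} \colon \mathbb{T}_X \xrightarrow{\ \sim\ } \mathbb{L}_X[n]
\]
% is an equivalence. For $n = 0$ and $X$ a smooth scheme this recovers the
% classical notion of a symplectic form.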
Max ERC Funding
1 385 247 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym DiGGeS
Project Discrete Groups and Geometric Structures
Researcher (PI) Fanny Solveig KASSEL
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary Discrete subgroups of Lie groups, whose study originated in Fuchsian differential equations and crystallography at the end of the 19th century, are the basis of a large aspect of modern geometry. They are the object of fundamental theories such as Teichmüller theory, Kleinian groups, rigidity theories for lattices, homogeneous dynamics, and most recently Higher Teichmüller theory. They are closely related to the notion of a geometric structure on a manifold, which has played a crucial role in geometry since Thurston. In summary, discrete subgroups are a meeting point of geometry with Lie theory, differential equations, complex analysis, ergodic theory, representation theory, algebraic geometry, number theory, and mathematical physics, and these fascinating interactions make the subject extremely rich.
In real rank one, important classes of discrete subgroups of semisimple Lie groups are known for their good geometric, topological, and dynamical properties, such as convex cocompact or geometrically finite subgroups. In higher real rank, discrete groups beyond lattices remain quite mysterious. The goal of the project is to work towards a classification of discrete subgroups of semisimple Lie groups in higher real rank, from two complementary points of view. The first is actions on Riemannian symmetric spaces and their boundaries: important recent developments, in particular in the theory of Anosov representations, give hope to identify a number of meaningful classes of discrete groups which generalise in various ways the notions of convex cocompactness and geometric finiteness. The second point of view is actions on pseudo-Riemannian symmetric spaces: some very interesting geometric examples are now well understood, and recent links with the first point of view give hope to transfer progress from one side to the other. We expect powerful applications, both to the construction of proper actions on affine spaces and to the spectral theory of pseudo-Riemannian manifolds.
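For context, and with the caveat that conventions vary across the literature, the rank-one notion and one higher-rank generalisation mentioned above can be recalled informally:

% Rank one: a discrete subgroup $\Gamma < G$ is convex cocompact if it acts
% cocompactly on the convex hull of its limit set in the symmetric space $G/K$.
% Higher rank (informal; after Kapovich-Leeb-Porti and
% Gueritaud-Guichard-Kassel-Wienhard, conventions assumed): a representation
% $\rho \colon \Gamma \to G$ of a word-hyperbolic group $\Gamma$ is
% $P_\theta$-Anosov if and only if there exist $c, C > 0$ such that
\[
  \langle \alpha, \mu(\rho(\gamma)) \rangle \;\geq\; c\,|\gamma|_\Gamma - C
  \qquad \text{for all } \gamma \in \Gamma \text{ and all } \alpha \in \theta,
\]
% where $\mu$ denotes the Cartan projection of $G$ and $|\cdot|_\Gamma$ the
% word length on $\Gamma$.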
Max ERC Funding
1 049 182 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym DIOCLES
Project Discrete bIOimaging perCeption for Longitudinal Organ modElling and computEr-aided diagnosiS
Researcher (PI) Nikolaos Paragyios
Host Institution (HI) ECOLE CENTRALE DES ARTS ET MANUFACTURES
Call Details Starting Grant (StG), PE6, ERC-2010-StG_20091028
Summary Recent hardware developments from medical device manufacturers have made possible the non-invasive, in-vivo acquisition of anatomical and physiological measurements. One can cite numerous emerging modalities (e.g. PET, fMRI, DTI). The nature (3D/multi-phase/vectorial) and the volume of these data make their interpretation by humans impossible in practice. On the other hand, these modalities can be used for early screening, for evaluating therapeutic strategies, and for evaluating bio-markers for drug development. Despite enormous progress in the field of biomedical image analysis, a huge gap still exists between clinical research and clinical use. The aim of this proposal is three-fold. First, we would like to introduce a novel biomedical image perception framework for clinical use, towards disease screening and drug evaluation. Such a framework is expected to be modular (usable in various clinical settings), computationally efficient (requiring no specialized hardware), and able to provide quantitative and qualitative anatomo-pathological indices. Second, we will leverage progress in the field of machine learning, along with novel, efficient and compact representations of clinical bio-markers, toward computer-aided diagnosis. Last, using these emerging multi-dimensional signals, we would like to perform longitudinal modelling and to understand the effects of aging on a number of organs and diseases that do not present pre-disease indicators, such as neurological brain diseases, muscular diseases, certain forms of cancer, etc.
Such a challenging and pioneering effort lies at the interface of medicine (clinical context), biomedical imaging (choice of signals/modalities), machine learning (manifold representations of heterogeneous multivariate variables), discrete optimization (computationally efficient inference of higher-order models), and biomedical image inference (measurement extraction and multi-modal fusion of heterogeneous information sources).
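To fix the vocabulary of the discrete-optimization component with a minimal sketch, and assuming only a toy pairwise model rather than the higher-order models targeted by the proposal, the example below performs MAP inference in a binary Markov random field by iterated conditional modes, the simplest baseline for label-based image segmentation:

# Illustrative sketch only (toy pairwise model with synthetic data, not the
# proposal's higher-order framework): greedy MAP inference in a binary MRF by
# iterated conditional modes (ICM), minimising unary data terms plus a Potts
# smoothness prior.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "image": a bright disc on a dark background, corrupted by noise.
H = W = 64
yy, xx = np.mgrid[0:H, 0:W]
truth = ((yy - 32) ** 2 + (xx - 32) ** 2 < 15 ** 2).astype(int)
obs = truth + rng.normal(0, 0.6, size=(H, W))

def icm(obs, beta=1.5, n_iter=10):
    """Coordinate-wise minimisation of unary + pairwise (Potts) energy."""
    labels = (obs > 0.5).astype(int)            # initial guess by thresholding
    for _ in range(n_iter):
        for i in range(H):
            for j in range(W):
                best_lab, best_e = 0, None
                for lab in (0, 1):
                    unary = (obs[i, j] - lab) ** 2
                    pairwise = beta * sum(
                        lab != labels[i2, j2]
                        for i2, j2 in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                        if 0 <= i2 < H and 0 <= j2 < W)
                    e = unary + pairwise
                    if best_e is None or e < best_e:
                        best_e, best_lab = e, lab
                labels[i, j] = best_lab
    return labels

seg = icm(obs)
print("pixel agreement with the ground truth:", (seg == truth).mean())

Graph-cut or message-passing solvers, and in particular efficient inference for higher-order potentials, replace this greedy sweep in the setting the proposal addresses.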
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-09-01, End date: 2016-08-31