Project acronym 100 Archaic Genomes
Project Genome sequences from extinct hominins
Researcher (PI) Svante Pääbo
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Advanced Grant (AdG), LS2, ERC-2015-AdG
Summary Neandertals and Denisovans, an Asian group distantly related to Neandertals, are the closest evolutionary relatives of present-day humans. They are thus of direct relevance for understanding the origin of modern humans and how modern humans differ from their closest relatives. We will generate genome-wide data from a large number of Neandertal and Denisovan individuals from across their geographical and temporal range, as well as from other extinct hominin groups that we may discover. This will be possible by automating the highly sensitive approaches to ancient DNA extraction and DNA library construction that we have developed, so that they can be applied to many specimens from many sites in order to identify those that contain retrievable DNA. Whenever possible we will sequence whole genomes, and in other cases use DNA capture methods to generate high-quality data from representative parts of the genome. This will allow us to study the population history of Neandertals and Denisovans, elucidate how many times and where these extinct hominins contributed genes to present-day people, and determine the extent to which modern humans and archaic groups contributed genetically to Neandertals and Denisovans. By retrieving DNA from specimens that go back to the Middle Pleistocene we will furthermore shed light on the early history and origins of Neandertals and Denisovans.
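For orientation, admixture questions of this kind are commonly addressed with allele-sharing statistics such as Patterson's D (the "ABBA-BABA" test); the following is a minimal sketch with hypothetical inputs and population labels, not the project's actual pipeline.

    # Minimal sketch of Patterson's D ("ABBA-BABA") statistic, a standard
    # test for archaic gene flow. Inputs and labels are hypothetical.
    import numpy as np

    def d_statistic(p1, p2, p3, p4):
        """p1..p4: per-site derived-allele frequencies in populations
        (P1, P2, P3, outgroup), e.g. (African, non-African, Neandertal,
        chimpanzee)."""
        abba = (1 - p1) * p2 * p3 * (1 - p4)  # allele sharing P2-P3
        baba = p1 * (1 - p2) * p3 * (1 - p4)  # allele sharing P1-P3
        return (abba.sum() - baba.sum()) / (abba.sum() + baba.sum())

    # Toy usage: with independent random frequencies D should be near 0;
    # a significantly positive D would suggest P3 -> P2 gene flow.
    rng = np.random.default_rng(0)
    print(d_statistic(*[rng.uniform(0, 1, 10_000) for _ in range(4)]))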
Max ERC Funding
2 350 000 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym 15CBOOKTRADE
Project The 15th-century Book Trade: An Evidence-based Assessment and Visualization of the Distribution, Sale, and Reception of Books in the Renaissance
Researcher (PI) Cristina Dondi
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), SH6, ERC-2013-CoG
Summary The idea that underpins this project is to use the material evidence from thousands of surviving 15th-c. books, as well as unique documentary evidence — the unpublished ledger of a Venetian bookseller in the 1480s which records the sale of 25,000 printed books with their prices — to address four fundamental questions relating to the introduction of printing in the West which have so far eluded scholarship, partly because of a lack of evidence, partly because of a lack of effective tools to deal with the evidence that exists. The book trade differs from other trades operating in the medieval and early modern periods in that the goods traded survive in considerable numbers. Not only do they survive, but many of them bear stratified evidence of their history in the form of marks of ownership, prices, manuscript annotations, binding and decoration styles. A British Academy pilot project conceived by the PI produced a now internationally used database which gathers together this kind of evidence for thousands of surviving 15th-c. printed books. For the first time, this makes it possible to track the circulation of books, their trade routes and later collecting, across Europe and the USA, and throughout the centuries. The objectives of this project are to examine (1) the distribution and trade routes, national and international, of 15th-c. printed books, along with the identity of the buyers and users (private, institutional, religious, lay, female, male, and by profession) and their reading practices; (2) the books' contemporary market value; (3) the transmission and dissemination of the texts they contain, their survival and their loss (rebalancing potentially skewed scholarship); and (4) the circulation and re-use of the illustrations they contain. Finally, the project will experiment with the application of scientific visualization techniques to represent, geographically and chronologically, the movement of 15th-c. printed books and of the texts they contain.
Max ERC Funding
1 999 172 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym 1st-principles-discs
Project A First Principles Approach to Accretion Discs
Researcher (PI) Martin Elias Pessah
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Starting Grant (StG), PE9, ERC-2012-StG_20111012
Summary Most celestial bodies, from planets to stars to black holes, gain mass during their lives by means of an accretion disc. Understanding the physical processes that determine the rate at which matter accretes and energy is radiated in these discs is vital for unraveling the formation, evolution, and fate of almost every type of object in the Universe. Despite the fact that magnetic fields have been known to be crucial in accretion discs since the early 1990s, the majority of astrophysical questions that depend on the details of how disc accretion proceeds are still being addressed using the “standard” accretion disc model (developed in the early 1970s), in which magnetic fields play no explicit role. This has prevented us from fully exploring the astrophysical consequences and observational signatures of realistic accretion disc models, leading to a profound disconnect between observations (usually interpreted with the standard paradigm) and modern accretion disc theory and numerical simulations (where magnetic turbulence is crucial). The goal of this proposal is to use several complementary approaches to finally move beyond the standard paradigm. This program has two main objectives: 1) Develop the theoretical framework to incorporate magnetic fields, and the ensuing turbulence, into self-consistent accretion disc models, and investigate their observational implications. 2) Investigate transport and radiative processes in collisionless disc regions, where non-thermal radiation originates, by employing a kinetic particle description of the plasma. To achieve these goals, we will use, and build upon, state-of-the-art magnetohydrodynamic and particle-in-cell codes in conjunction with theoretical modeling. This framework will make it possible to address fundamental questions on stellar and planet formation, binary systems with a compact object, and supermassive black hole feedback in a way that has no counterpart within the standard paradigm.
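For context, the "standard" early-1970s disc model referred to above is the alpha-prescription of Shakura and Sunyaev, which hides the unknown (in hindsight, magnetic) turbulent physics in a single dimensionless parameter rather than modeling the fields explicitly; schematically,

    \[ \nu = \alpha\, c_s H, \qquad T_{r\phi} \simeq \alpha\, P, \]

where \(\nu\) is the effective turbulent viscosity, \(c_s\) the sound speed, \(H\) the disc scale height, \(T_{r\phi}\) the turbulent stress, \(P\) the pressure, and \(\alpha \lesssim 1\) a free parameter. Replacing this closure with self-consistent magnetized-turbulence models is the substance of objective 1 above.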
Max ERC Funding
1 793 697 €
Duration
Start date: 2013-02-01, End date: 2018-01-31
Project acronym 1stProposal
Project An alternative development of analytic number theory and applications
Researcher (PI) Andrew Granville
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Advanced Grant (AdG), PE1, ERC-2014-ADG
Summary The traditional (Riemann) approach to analytic number theory uses the zeros of zeta functions. This requires the associated multiplicative function, say f(n), to have special enough properties that the associated Dirichlet series may be analytically continued. In this proposal we continue to develop an approach which requires less of the multiplicative function, linking the original question with the mean value of f. Such techniques have been around for a long time but have generally been regarded as “ad hoc”. In this project we aim to show that one can develop a coherent approach to the whole subject, not only reproving all of the old results but also proving many new ones that appear inaccessible to traditional methods.
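For orientation, the central object in this alternative ("pretentious") approach is the mean value of a multiplicative function,

    \[ M_f(x) := \frac{1}{x}\sum_{n\le x} f(n), \]

and the key structural result is Halász's theorem: for multiplicative f with \(|f(n)| \le 1\), the mean value \(M_f(x)\) is small unless f "pretends" to be some \(n^{it}\), in the sense that the distance

    \[ \mathbb{D}\big(f(n), n^{it}; x\big)^2 = \sum_{p\le x} \frac{1-\mathrm{Re}\, f(p)\,p^{-it}}{p} \]

is bounded. This is the standard formulation from the literature, stated here only to fix notation for what follows.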
Our first goal is to complete a monograph that reworks all of the classical theory using these new methods, and then to push forward in new directions. The most important is to extend these techniques to GL(n) L-functions, which we hope will now be feasible having found the correct framework in which to proceed. Since we rarely know how to analytically continue such L-functions this could be of great benefit to the subject.
We are developing the large sieve so that it can be used for individual moduli, and will determine a strong form of it. We will also develop a new method to give asymptotics for mean values when they are not too small.
We wish to incorporate techniques of analytic number theory into our theory, for example recent advances on mean values of Dirichlet polynomials. The recent breakthroughs in sieve theory also suggest strong links that need further exploration.
Additive combinatorics yields important results in many areas. There are strong analogies between its results and those for multiplicative functions, especially in the theory of the large value spectrum and its applications. We hope to develop these analogies further.
Much of this is joint work with K. Soundararajan of Stanford University.
Max ERC Funding
2 011 742 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym 2-3-AUT
Project Surfaces, 3-manifolds and automorphism groups
Researcher (PI) Nathalie Wahl
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Starting Grant (StG), PE1, ERC-2009-StG
Summary The scientific goal of the proposal is to answer central questions related to diffeomorphism groups of manifolds of dimension 2 and 3, and to their deformation invariant analogs, the mapping class groups. While the classification of surfaces has been known for more than a century, their automorphism groups have yet to be fully understood. Even less is known about diffeomorphisms of 3-manifolds despite much interest, and the objects here have only been classified recently, through the breakthrough work of Perelman on the Poincaré and geometrization conjectures. In dimension 2, I will focus on the relationship between mapping class groups and topological conformal field theories, with applications to Hochschild homology. In dimension 3, I propose to compute the stable homology of classifying spaces of diffeomorphism groups and mapping class groups, as well as to study the homotopy type of the space of diffeomorphisms. I propose moreover to establish homological stability theorems in the wider context of automorphism groups and more general families of groups. The project combines breakthrough methods from homotopy theory with methods from differential and geometric topology. The research team, which I will lead, will consist of 3 PhD students and 4 postdocs.
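For readers outside the field, a homological stability theorem of the kind proposed asserts that, for a suitable family of groups and inclusions \(G_n \hookrightarrow G_{n+1}\), the induced maps

    \[ H_i(G_n;\mathbb{Z}) \longrightarrow H_i(G_{n+1};\mathbb{Z}) \]

are isomorphisms once n is sufficiently large relative to i. For mapping class groups of surfaces this is Harer's classical stability theorem (with ranges later improved by several authors), and it is what makes the stable homology targeted above a well-defined object.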
Max ERC Funding
724 992 €
Duration
Start date: 2009-11-01, End date: 2014-10-31
Project acronym 2-HIT
Project Genetic interaction networks: From C. elegans to human disease
Researcher (PI) Ben Lehner
Host Institution (HI) FUNDACIO CENTRE DE REGULACIO GENOMICA
Call Details Starting Grant (StG), LS2, ERC-2007-StG
Summary Most hereditary diseases in humans are genetically complex, resulting from combinations of mutations in multiple genes. However, synthetic interactions between genes are very difficult to identify in population studies because of a lack of statistical power, and we fundamentally do not understand how mutations interact to produce phenotypes. C. elegans is a unique animal in which genetic interactions can be rapidly identified in vivo using RNA interference, and we recently used this system to construct the first genetic interaction network for any animal, focused on signal transduction genes. The first objective of this proposal is to extend this work and map a comprehensive genetic interaction network for this model metazoan. This project will provide the first insights into the global properties of animal genetic interaction networks, and a comprehensive view of the functional relationships between genes in an animal. The second objective of the proposal is to use C. elegans to develop and validate experimentally integrated gene networks that connect genes to phenotypes and predict genetic interactions on a genome-wide scale. The methods that we develop and validate in C. elegans will then be applied to predict phenotypes and interactions for human genes. The final objective is to dissect the molecular mechanisms underlying genetic interactions, and to understand how these interactions evolve. The combined aim of these three objectives is to generate a framework for understanding and predicting how mutations interact to produce phenotypes, including in human disease.
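One common way to quantify a genetic interaction, which may differ from the scoring used in this project, is to compare the double mutant's fitness with the multiplicative expectation from the single mutants; a minimal sketch:

    # Minimal sketch of a multiplicative-model genetic interaction
    # (epistasis) score; the project's actual scoring may differ.
    def interaction_score(w_a: float, w_b: float, w_ab: float) -> float:
        """epsilon = W_ab - W_a * W_b, with fitnesses relative to wild
        type (wild type = 1.0). epsilon < 0: aggravating/synthetic
        interaction; epsilon > 0: alleviating interaction."""
        return w_ab - w_a * w_b

    # Toy example: two mildly deleterious knockdowns whose combination
    # is lethal give a strongly negative (synthetic) score.
    print(interaction_score(0.9, 0.9, 0.0))  # -0.81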
Max ERC Funding
1 100 000 €
Duration
Start date: 2008-09-01, End date: 2014-04-30
Project acronym 3D Reloaded
Project 3D Reloaded: Novel Algorithms for 3D Shape Inference and Analysis
Researcher (PI) Daniel Cremers
Host Institution (HI) TECHNISCHE UNIVERSITAET MUENCHEN
Call Details Consolidator Grant (CoG), PE6, ERC-2014-CoG
Summary We believe that, despite their amazing success, computer vision algorithms have only scratched the surface of what can be done in terms of modeling and understanding our world from images. We believe that novel image analysis techniques will be a major enabler and driving force behind next-generation technologies, enhancing everyday life and opening up radically new possibilities. And we believe that the key to achieving this is to develop algorithms for reconstructing and analyzing the 3D structure of our world.
In this project, we will focus on three lines of research:
A) We will develop algorithms for 3D reconstruction from standard color cameras and from RGB-D cameras. In particular, we will promote real-time-capable direct and dense methods. In contrast to the classical two-stage approach of sparse feature-point based motion estimation and subsequent dense reconstruction, these methods optimally exploit all color information to jointly estimate dense geometry and camera motion.
B) We will develop algorithms for 3D shape analysis, including rigid and non-rigid matching, decomposition and interpretation of 3D shapes. We will focus on algorithms which are optimal or near-optimal. One of the major computational challenges lies in generalizing existing 2D shape analysis techniques to shapes in 3D and 4D (temporal evolutions of 3D shape).
C) We will develop shape priors for 3D reconstruction. These can be learned from sample shapes or acquired during the reconstruction process. For example, when reconstructing a large office, algorithms may exploit the geometric self-similarity of the scene, storing the model of a chair that occurs in multiple instances only once rather than multiple times.
Advancing the state of the art in geometric reconstruction and geometric analysis will have a profound impact well beyond computer vision. We strongly believe that we have the necessary competence to pursue this project. Preliminary results have been well received by the community.
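As a concrete illustration of the direct, dense formulation favored in research line A (a schematic textbook form, not necessarily the exact cost used in this project): given images I_1, I_2 and a depth map D_1, direct methods estimate the camera motion xi by minimizing the photometric error over all pixels,

    \[ E(\xi) = \sum_{\mathbf{x}\in\Omega} \Big( I_1(\mathbf{x}) - I_2\big(\pi\big(T(\xi)\,\pi^{-1}(\mathbf{x}, D_1(\mathbf{x}))\big)\big) \Big)^{2}, \]

where \(\pi\) is the camera projection and \(T(\xi)\) the rigid-body transform. Every pixel contributes to the joint estimate of geometry and motion, in contrast to the sparse two-stage pipeline described above.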
Max ERC Funding
2 000 000 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym 3D-REPAIR
Project Spatial organization of DNA repair within the nucleus
Researcher (PI) Evanthia Soutoglou
Host Institution (HI) CENTRE EUROPEEN DE RECHERCHE EN BIOLOGIE ET MEDECINE
Call Details Consolidator Grant (CoG), LS2, ERC-2015-CoG
Summary Faithful repair of double-stranded DNA breaks (DSBs) is essential, as they are at the origin of genome instability, chromosomal translocations and cancer. Cells repair DSBs through different pathways, which can be faithful or mutagenic, and the balance between them at a given locus must be tightly regulated to preserve genome integrity. Although much is known about DSB repair factors, how the choice between pathways is controlled within the nuclear environment is not understood. We have shown that nuclear architecture and non-random genome organization determine the frequency of chromosomal translocations and that pathway choice is dictated by the spatial organization of DNA in the nucleus. Nevertheless, what determines which pathway is activated in response to DSBs at specific genomic locations is not understood. Furthermore, the impact of 3D-genome folding on the kinetics and efficiency of DSB repair is completely unknown.
Here we aim to understand how nuclear compartmentalization, chromatin structure and genome organization impact on the efficiency of detection, signaling and repair of DSBs. We will unravel what determines the DNA repair specificity within distinct nuclear compartments using protein tethering, promiscuous biotinylation and quantitative proteomics. We will determine how DNA repair is orchestrated at different heterochromatin structures using a CRISPR/Cas9-based system that allows, for the first time, robust induction of DSBs at specific heterochromatin compartments. Finally, we will investigate the role of 3D-genome folding in the kinetics of DNA repair and pathway choice using single-nucleotide-resolution DSB mapping coupled to 3D topological maps.
This proposal has significant implications for understanding the mechanisms controlling DNA repair within the nuclear environment. It will reveal the regions of the genome that are susceptible to genomic instability and help us understand why certain mutations and translocations are recurrent in cancer.
Max ERC Funding
1 999 750 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym 3D_Tryps
Project The role of three-dimensional genome architecture in antigenic variation
Researcher (PI) Tim Nicolai Siegel
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Starting Grant (StG), LS6, ERC-2016-STG
Summary Antigenic variation is a widely employed strategy to evade the host immune response. It has similar functional requirements even in evolutionarily divergent pathogens. These include the mutually exclusive expression of antigens and the periodic, nonrandom switching in the expression of different antigens during the course of an infection. Despite decades of research the mechanisms of antigenic variation are not fully understood in any organism.
The recent development of high-throughput sequencing-based assays to probe the 3D genome architecture (Hi-C) has revealed the importance of the spatial organization of DNA inside the nucleus. 3D genome architecture plays a critical role in the regulation of mutually exclusive gene expression and the frequency of translocation between different genomic loci in many eukaryotes. Thus, genome architecture may also be a key regulator of antigenic variation, yet the causal links between genome architecture and the expression of antigens have not been studied systematically. In addition, the development of CRISPR-Cas9-based approaches to perform nucleotide-specific genome editing has opened unprecedented opportunities to study the influence of DNA sequence elements on the spatial organization of DNA and how this impacts antigen expression.
I have adapted both Hi-C and CRISPR-Cas9 technology to the protozoan parasite Trypanosoma brucei, one of the most important model organisms to study antigenic variation. These techniques will enable me to bridge the field of antigenic variation research with that of genome architecture. I will perform the first systematic analysis of the role of genome architecture in the mutually exclusive and hierarchical expression of antigens in any pathogen.
The experiments outlined in this proposal will provide new insight, facilitating a new view of antigenic variation, and may eventually help medical intervention in T. brucei and in other pathogens relying on antigenic variation for their survival.
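To make the proposed Hi-C analysis concrete: contact matrices are typically normalized against the genomic-distance decay of contact frequency, so that specific long-range contacts, e.g. between antigen-gene loci, stand out. Below is a generic sketch of that observed/expected step, with hypothetical inputs, not the project's actual pipeline.

    # Generic sketch: observed/expected normalization of an
    # intra-chromosomal Hi-C contact matrix. Hypothetical input; a real
    # pipeline would first balance/correct the raw matrix.
    import numpy as np

    def observed_over_expected(contacts: np.ndarray) -> np.ndarray:
        """contacts: square, symmetric matrix of contact counts per bin."""
        n = contacts.shape[0]
        obs_exp = np.zeros_like(contacts, dtype=float)
        for d in range(n):                    # genomic distance in bins
            diag = np.diagonal(contacts, d)
            expected = diag.mean() or 1.0     # mean contact at distance d
            idx = np.arange(n - d)
            obs_exp[idx, idx + d] = obs_exp[idx + d, idx] = diag / expected
        return obs_exp

    m = np.random.poisson(5, (100, 100)).astype(float)
    m = (m + m.T) / 2
    print(observed_over_expected(m)[0, 50])   # >1 means enriched contact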
Max ERC Funding
1 498 175 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym 3DEpi
Project Transgenerational epigenetic inheritance of chromatin states: the role of Polycomb and 3D chromosome architecture
Researcher (PI) Giacomo Cavalli
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS2, ERC-2017-ADG
Summary Epigenetic inheritance entails transmission of phenotypic traits not encoded in the DNA sequence and, in the most extreme case, Transgenerational Epigenetic Inheritance (TEI) involves transmission of memory through multiple generations. Very little is known about the mechanisms governing TEI, and they are the subject of the present proposal. By transiently enhancing long-range chromatin interactions, we recently established isogenic Drosophila epilines that carry stable alternative epialleles, defined by differential levels of the Polycomb-dependent H3K27me3 mark. Furthermore, we extended our paradigm to natural phenotypes. These are ideal systems in which to study the role of Polycomb group (PcG) proteins and other components in regulating nuclear organization and epigenetic inheritance of chromatin states. The present project combines genetics, epigenomics, imaging and molecular biology to reach three critical aims.
Aim 1: Analysis of the molecular mechanisms regulating Polycomb-mediated TEI. We will identify the DNA, protein and RNA components that trigger and maintain transgenerational chromatin inheritance as well as their mechanisms of action.
Aim 2: Role of 3D genome organization in the regulation of TEI. We will analyze the developmental dynamics of TEI-inducing long-range chromatin interactions, identify chromatin components mediating 3D chromatin contacts and characterize their function in the TEI process.
Aim 3: Identification of a broader role of TEI during development. TEI might reflect a normal role of PcG components in the transmission of parental chromatin onto the next embryonic generation. We will explore this possibility by establishing other TEI paradigms and by relating TEI to the normal PcG function in these systems and in normal development.
This research program will unravel the biological significance and the molecular underpinnings of TEI and lead the way towards establishing this area of research into a consolidated scientific discipline.
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym 3DICE
Project 3D Interstellar Chemo-physical Evolution
Researcher (PI) Valentine Wakelam
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2013-StG
Summary At the end of their lives, stars spread their inner material into the diffuse interstellar medium. This diffuse medium gets locally denser and forms dark clouds (also called dense or molecular clouds) whose innermost parts are shielded from the external UV field by the dust, allowing molecules to grow and become more complex. Gravitational collapse occurs inside these dense clouds, forming protostars and their surrounding disks, and eventually planetary systems like (or unlike) our solar system. The formation and evolution of molecules, minerals, ices and organics from the diffuse medium to planetary bodies, and their alteration or preservation throughout this cosmic chemical history, set the initial conditions for building planets, atmospheres and possibly the first bricks of life. The current view of interstellar chemistry is based on fragmentary studies of the key steps of the sequence that can be observed. The objective of this proposal is to follow the fractionation of the elements between the gas phase and the interstellar grains, from the most diffuse medium to protoplanetary disks, in order to constrain the chemical composition of the material in which planets are formed. The potential outcome of this project is a consistent and more accurate description of the chemical evolution of interstellar matter. To achieve this objective, I will improve our chemical model by adding new processes on grain surfaces relevant under diffuse medium conditions. This upgraded gas-grain model will be coupled to 3D dynamical models of the formation of dense clouds from the diffuse medium and of protoplanetary disks from dense clouds. The computed chemical composition will also be used with 3D radiative transfer codes to study the chemical tracers of the physics of protoplanetary disk formation. The robustness of the model predictions will be studied with sensitivity analyses. Finally, model results will be confronted with observations to address some of the current challenges.
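For readers outside astrochemistry, gas-grain models of the kind being extended here integrate coupled rate equations for species abundances, with formation and destruction terms per reaction. A toy sketch with a hypothetical two-reaction network (illustrative rates, not the project's network):

    # Toy sketch of the rate-equation approach in gas-grain chemical
    # models: A + B -> AB (formation) and AB + photon -> A + B
    # (destruction). Rates and network are hypothetical.
    from scipy.integrate import solve_ivp

    K_FORM = 1e-10   # cm^3 s^-1
    K_DEST = 1e-9    # s^-1

    def rates(t, n):
        n_a, n_b, n_ab = n
        form = K_FORM * n_a * n_b          # cm^-3 s^-1
        dest = K_DEST * n_ab
        return [dest - form, dest - form, form - dest]

    n0 = [1e4, 1e4, 0.0]                   # initial densities, cm^-3
    sol = solve_ivp(rates, (0.0, 1e9), n0, method="LSODA")  # ~30 yr
    print(sol.y[:, -1])                    # late-time abundances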
Max ERC Funding
1 166 231 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym 3DWATERWAVES
Project Mathematical aspects of three-dimensional water waves with vorticity
Researcher (PI) Erik Torsten Wahlén
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary The goal of this project is to develop a mathematical theory for steady three-dimensional water waves with vorticity. The mathematical model consists of the incompressible Euler equations with a free surface, and vorticity is important for modelling the interaction of surface waves with non-uniform currents. In the two-dimensional case, there has been a lot of progress on water waves with vorticity in the last decade. This progress has mainly been based on the stream function formulation, in which the problem is reformulated as a nonlinear elliptic free boundary problem. An analogue of this formulation is not available in three dimensions, and the theory has therefore so far been restricted to irrotational flow. In this project we seek to go beyond this restriction using two different approaches. In the first approach we will adapt methods which have been used to construct three-dimensional ideal flows with vorticity in domains with a fixed boundary to the free boundary context (for example Beltrami flows). In the second approach we will develop methods which are new even in the case of a fixed boundary, by performing a detailed study of the structure of the equations close to a given shear flow using ideas from infinite-dimensional bifurcation theory. This involves handling infinitely many resonances.
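For concreteness, the two-dimensional stream function formulation referred to above casts the steady problem, in one common normalization, as a nonlinear elliptic free boundary problem for the stream function psi:

    \[ \begin{aligned} \Delta\psi &= -\gamma(\psi) && \text{in the fluid domain},\\ \psi &= 0 && \text{on the free surface } y=\eta(x),\\ \psi &= -m && \text{on the flat bed } y=-d,\\ \tfrac12|\nabla\psi|^2 + g\,(y+d) &= Q && \text{on } y=\eta(x), \end{aligned} \]

where \(\gamma\) is the vorticity function, m the relative mass flux, g gravity and Q the Bernoulli constant (sign conventions vary across the literature). It is this scalar structure that has no known three-dimensional analogue, which motivates the two approaches described above.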
Max ERC Funding
1 203 627 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym 4C
Project 4C technology: uncovering the multi-dimensional structure of the genome
Researcher (PI) Wouter Leonard De Laat
Host Institution (HI) KONINKLIJKE NEDERLANDSE AKADEMIE VAN WETENSCHAPPEN - KNAW
Call Details Starting Grant (StG), LS2, ERC-2007-StG
Summary The architecture of DNA in the cell nucleus is an emerging key epigenetic contributor to genome function. We recently developed 4C technology, a high-throughput technique that combines state-of-the-art 3C technology with tailored micro-arrays to uniquely allow for an unbiased genome-wide search for DNA loci that interact in the nuclear space. Based on 4C technology, we were the first to provide a comprehensive overview of long-range DNA contacts of selected loci. The data showed that active and inactive chromatin domains contact many distinct regions within and between chromosomes and that genes switch long-range DNA contacts in relation to their expression status. 4C technology not only allows one to investigate the three-dimensional structure of DNA in the nucleus; it also accurately reconstructs at least 10 megabases of the one-dimensional chromosome sequence map around the target sequence. Changes in this physical map as a result of genomic rearrangements are therefore identified by 4C technology. We recently demonstrated that 4C detects deletions, balanced inversions and translocations in patient samples at a resolution (~7kb) that allowed immediate sequencing of the breakpoints. Excitingly, 4C technology therefore offers the first high-resolution genomic approach that can identify both balanced and unbalanced genomic rearrangements. 4C is expected to become an important tool in clinical diagnosis and prognosis. Key objectives of this proposal are: 1. Explore the functional significance of DNA folding in the nucleus by systematically applying 4C technology to differentially expressed gene loci. 2. Adapt 4C technology such that it allows for massively parallel analysis of DNA interactions between regulatory elements and gene promoters. This method would greatly facilitate the identification of functionally relevant DNA elements in the genome. 3. Develop 4C technology into a clinical diagnostic tool for the accurate detection of balanced and unbalanced rearrangements.
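As a generic illustration of how 4C-type viewpoint data are read out (not the authors' actual analysis): captured ligation frequencies along the chromosome are noisy per restriction fragment, so interacting regions are typically called from a running-window statistic. A minimal sketch with hypothetical data:

    # Generic sketch: smoothing a 4C viewpoint profile with a running
    # mean and flagging candidate interacting regions. Hypothetical data.
    import numpy as np

    def running_mean(signal: np.ndarray, window: int = 29) -> np.ndarray:
        """Centered running mean over `window` restriction fragments."""
        return np.convolve(signal, np.ones(window) / window, mode="same")

    rng = np.random.default_rng(1)
    profile = rng.poisson(2.0, 5000).astype(float)  # background counts
    profile[3000:3050] += 20                        # one enriched region
    smooth = running_mean(profile)
    cutoff = smooth.mean() + 3 * smooth.std()       # crude cutoff
    print(np.flatnonzero(smooth > cutoff)[:5])      # fragments near 3000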
Max ERC Funding
1 225 000 €
Duration
Start date: 2008-09-01, End date: 2013-08-31
Project acronym 4D-GenEx
Project Spatio-temporal Organization and Expression of the Genome
Researcher (PI) Antoine Coulon
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), LS2, ERC-2017-STG
Summary This project investigates the two-way relationship between spatio-temporal genome organization and coordinated gene regulation, through an approach at the interface between physics, computer science and biology.
In the nucleus, preferred positions are observed from chromosomes to single genes, in relation to normal and pathological cellular states. Evidence indicates a complex spatio-temporal coupling between co-regulated genes: e.g. certain genes cluster spatially when responding to similar factors and transcriptional noise patterns suggest domain-wide mechanisms. Yet, no individual experiment allows probing transcriptional coordination in 4 dimensions (FISH, live locus tracking, Hi-C...). Interpreting such data also critically requires theory (stochastic processes, statistical physics…). A lack of appropriate experimental/analytical approaches is impairing our understanding of the 4D genome.
Our proposal combines cutting-edge single-molecule imaging, signal-theory data analysis and physical modeling to study how genes coordinate in space and time in a single nucleus. Our objectives are to understand (a) competition/recycling of shared resources between genes within subnuclear compartments, (b) how enhancers communicate with genes domain-wide, and (c) the role of local conformational dynamics and supercoiling in gene co-regulation. Our organizing hypothesis is that, by acting on their microenvironment, genes shape their co-expression with other genes.
Building upon my expertise, we will use dual-color MS2/PP7 RNA labeling to visualize for the first time transcription and motion of pairs of hormone-responsive genes in real time. With our innovative signal analysis tools, we will extract spatio-temporal signatures of underlying processes, which we will investigate with stochastic modeling and validate through experimental perturbations. We expect to uncover how the functional organization of the linear genome relates to its physical properties and dynamics in 4D.
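A minimal sketch of the kind of signal-theory readout proposed, with synthetic data rather than the project's tools: the temporal coordination of two genes' transcriptional traces, e.g. from dual MS2/PP7 labeling, can be summarized by their normalized cross-correlation as a function of time lag.

    # Minimal sketch: normalized cross-correlation between two
    # transcriptional time traces (synthetic stand-ins for MS2/PP7 data).
    import numpy as np

    def cross_correlation(x, y, max_lag):
        """Pearson correlation of x(t) with y(t + lag), per lag."""
        x = (x - x.mean()) / x.std()
        y = (y - y.mean()) / y.std()
        n = len(x)
        return [(lag, np.mean(x[max(0, -lag):n - max(lag, 0)]
                              * y[max(0, lag):n - max(-lag, 0)]))
                for lag in range(-max_lag, max_lag + 1)]

    # Toy traces: gene B follows gene A with a 5-frame delay.
    rng = np.random.default_rng(2)
    a = rng.poisson(3, 500).astype(float)
    b = np.roll(a, 5) + rng.normal(0, 0.5, 500)
    print(max(cross_correlation(a, b, 20), key=lambda p: p[1]))  # lag ~ +5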
Max ERC Funding
1 499 750 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym 4DRepLy
Project Closing the 4D Real World Reconstruction Loop
Researcher (PI) Christian Theobalt
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary 4D reconstruction, i.e., camera-based dense reconstruction of dynamic scenes, is a grand challenge in computer graphics and computer vision. Despite great progress, 4D capture of the complex, diverse real world outside a studio is still far from feasible. 4DRepLy builds a new generation of high-fidelity 4D reconstruction (4DRecon) methods. They will be the first to efficiently capture all types of deformable objects (humans and other types) in crowded real-world scenes with a single color or depth camera. They capture space-time coherent deforming geometry, motion, high-frequency reflectance and illumination at unprecedented detail, and will be the first to handle difficult occlusions, topology changes and large groups of interacting objects. They automatically adapt to new scene types, yet deliver models with meaningful, interpretable parameters. This requires far-reaching contributions: First, we develop groundbreaking new plasticity-enhanced model-based 4D reconstruction methods that automatically adapt to new scenes. Second, we develop radically new machine-learning-based dense 4D reconstruction methods. Third, these model- and learning-based methods are combined in two revolutionary new classes of 4DRecon methods: 1) advanced fusion-based methods and 2) methods with deep architectural integration. Both 1) and 2) are automatically designed in the 4D Real World Reconstruction Loop, a revolutionary new design paradigm in which 4DRecon methods refine and adapt themselves while continuously processing unlabeled real-world input. This overcomes the previously unbreakable scalability barrier to real-world scene diversity, complexity and generality. This paradigm shift opens up a new research direction in graphics and vision and has far-reaching relevance across many scientific fields. It enables new applications of profound social pervasion and significant economic impact, e.g., for visual media and virtual/augmented reality, and for future autonomous and robotic systems.
Max ERC Funding
1 977 000 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym 4PI-SKY
Project 4 pi sky: Extreme Astrophysics with Revolutionary Radio Telescopes
Researcher (PI) Robert Philip Fender
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE9, ERC-2010-AdG_20100224
Summary Extreme astrophysical events such as relativistic flows, cataclysmic explosions and black hole accretion are among the key areas for astrophysics in the 21st century. The extremes of physics experienced in these environments are beyond anything achievable in any laboratory on Earth, and provide a unique glimpse of the laws of physics operating in extraordinary regimes. All of these events are associated with transient radio emission, a tracer both of the acceleration of particles to relativistic energies and of coherent emitting regions with huge effective temperatures. By studying radio bursts from these phenomena we can pinpoint the sources of explosive events, understand the budget of kinetic feedback by explosive events in the ambient medium, and probe the physical state of the universe back to the epoch of reionisation, less than a billion years after the big bang. In seeking to push back the frontiers of extreme astrophysics, I will use a trio of revolutionary new radio telescopes, LOFAR, ASKAP and MeerKAT, pathfinders for the Square Kilometre Array, and all facilities in which I have a major role in the search for transients. I will build an infrastructure which transforms their combined operations for the discovery, classification and reporting of transient astrophysical events, over the whole sky, making them much more than the sum of their parts. This will include the development of environments for the coordinated handling of extreme astrophysical events, in real time, via automated systems, as well as novel techniques for the detection of these events in a sea of noise. I will furthermore augment this programme by buying in as a major partner to a rapid-response robotic optical telescope, and by cementing my relationship with an orbiting X-ray facility. This multiwavelength dimension will secure the astrophysical interpretation of our observational results and help to revolutionise high-energy astrophysics via a strong scientific exploitation programme.
Max ERC Funding
2 999 847 €
Duration
Start date: 2011-07-01, End date: 2017-06-30
Project acronym 5COFM
Project Five Centuries of Marriages
Researcher (PI) Anna Cabré
Host Institution (HI) UNIVERSITAT AUTONOMA DE BARCELONA
Call Details Advanced Grant (AdG), SH6, ERC-2010-AdG_20100407
Summary This long-term research project is based on the data-mining of the Llibres d'Esposalles conserved at the Archives of the Barcelona Cathedral, an extraordinary data source comprising 244 books of marriage license records. It covers about 550,000 unions from over 250 parishes of the Diocese between 1451 and 1905. Its impeccable conservation is a miracle in a region where parish archives have undergone massive destruction. The books include data on the tax imposed on each couple depending on their social class, on an eight-tiered scale. These data allow for research on multiple aspects of demography, especially over the very long run, such as: population estimates, marriage dynamics, cycles, and indirect estimations of fertility, migration and survival, as well as socio-economic studies related to social homogamy, social mobility, and the transmission of social and occupational position. Being continuous over five centuries, the source constitutes a unique instrument to study the dynamics of population distribution, the expansion of the city of Barcelona and the constitution of its metropolitan area, as well as the chronology and geography of the formation of new social classes.
To this end, a digital library and a database, the Barcelona Historical Marriages Database (BHiMaD), are to be created and completed. An ERC-AG will make this possible while the research analysis of the database is undertaken in parallel.
The research team, at the U. Autònoma de Barcelona, involves researchers from the Center for Demographic Studies and the Computer Vision Center, experts in historical databases and computer-aided recognition of ancient manuscripts. 5CofM will serve the preservation of the original “Llibres d’Esposalles” and unlock the full potential embedded in the collection.
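As a purely hypothetical illustration of the kind of record such a database might hold, the sketch below defines a toy marriage-licence entry carrying the eight-tier tax field and tabulates tiers by decade; the schema, field names and sample rows are invented, not BHiMaD's actual design.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class MarriageRecord:          # hypothetical layout, not BHiMaD's schema
    year: int
    parish: str
    groom_surname: str
    bride_surname: str
    tax_tier: int              # 1..8, the eight-tiered licence tax scale

records = [
    MarriageRecord(1587, "Santa Maria del Pi", "Vila", "Ferrer", 2),
    MarriageRecord(1587, "Sant Just", "Puig", "Soler", 5),
    MarriageRecord(1588, "Santa Maria del Mar", "Roca", "Vila", 2),
]

# Tax tier as a proxy for social class: distribution per decade.
by_decade = Counter((r.year // 10 * 10, r.tax_tier) for r in records)
for (decade, tier), n in sorted(by_decade.items()):
    print(f"{decade}s  tier {tier}: {n} marriage(s)")
```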
Max ERC Funding
1 847 400 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym A-BINGOS
Project Accreting binary populations in Nearby Galaxies: Observations and Simulations
Researcher (PI) Andreas Zezas
Host Institution (HI) IDRYMA TECHNOLOGIAS KAI EREVNAS
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary "High-energy observations of our Galaxy offer a good, albeit not complete, picture of the X-ray source populations, in particular the accreting binary sources. Recent ability to study accreting binaries in nearby galaxies has shown that we would be short-sighted if we restricted ourselves to our Galaxy or to a few nearby ones. I propose an ambitious project that involves a comprehensive study of all the galaxies within 10 Mpc for which we can study in detail their X-ray sources and stellar populations. The study will combine data from a unique suite of observatories (Chandra, XMM-Newton, HST, Spitzer) with state-of-the-art theoretical modelling of binary systems. I propose a novel approach that links the accreting binary populations to their parent stellar populations and surpasses any current studies of X-ray binary populations, both in scale and in scope, by: (a) combining methods and results from several different areas of astrophysics (compact objects, binary systems, stellar populations, galaxy evolution); (b) using data from almost the whole electromagnetic spectrum (infrared to X-ray bands); (c) identifying and studying the different sub-populations of accreting binaries; and (d) performing direct comparison between observations and theoretical predictions, over a broad parameter space. The project: (a) will answer the long-standing question of the formation efficiency of accreting binaries in different environments; and (b) will constrain their evolutionary paths. As by-products the project will provide eagerly awaited input to the fields of gravitational-wave sources, γ-ray bursts, and X-ray emitting galaxies at cosmological distances and it will produce a heritage multi-wavelength dataset and library of models for future studies of galaxies and accreting binaries."
Max ERC Funding
1 242 000 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym AAMOT
Project Arithmetic of automorphic motives
Researcher (PI) Michael Harris
Host Institution (HI) INSTITUT DES HAUTES ETUDES SCIENTIFIQUES
Call Details Advanced Grant (AdG), PE1, ERC-2011-ADG_20110209
Summary The primary purpose of this project is to build on recent spectacular progress in the Langlands program to study the arithmetic properties of automorphic motives constructed in the cohomology of Shimura varieties. Because automorphic methods are available to study the L-functions of these motives, which include elliptic curves and certain families of Calabi-Yau varieties over totally real fields (possibly after base change), they represent the most accessible class of varieties for which one can hope to verify fundamental conjectures on special values of L-functions, including Deligne's conjecture and the Main Conjecture of Iwasawa theory. Immediate goals include the proof of irreducibility of automorphic Galois representations; the establishment of period relations for automorphic and potentially automorphic realizations of motives in the cohomology of distinct Shimura varieties; the construction of p-adic L-functions for these and related motives, notably adjoint and tensor product L-functions in p-adic families; and the geometrization of the p-adic and mod p Langlands program. All four goals, as well as the others mentioned in the body of the proposal, are interconnected; the final goal provides a bridge to related work in geometric representation theory, algebraic geometry, and mathematical physics.
Max ERC Funding
1 491 348 €
Duration
Start date: 2012-06-01, End date: 2018-05-31
Project acronym AAREA
Project The Archaeology of Agricultural Resilience in Eastern Africa
Researcher (PI) Daryl Stump
Host Institution (HI) UNIVERSITY OF YORK
Call Details Starting Grant (StG), SH6, ERC-2013-StG
Summary "The twin concepts of sustainability and conservation that are so pivotal within current debates regarding economic development and biodiversity protection both contain an inherent temporal dimension, since both refer to the need to balance short-term gains with long-term resource maintenance. Proponents of resilience theory and of development based on ‘indigenous knowledge’ have thus argued for the necessity of including archaeological, historical and palaeoenvironmental components within development project design. Indeed, some have argued that archaeology should lead these interdisciplinary projects on the grounds that it provides the necessary time depth and bridges the social and natural sciences. The project proposed here accepts this logic and endorses this renewed contemporary relevance of archaeological research. However, it also needs to be admitted that moving beyond critiques of the misuse of historical data presents significant hurdles. When presenting results outside the discipline, for example, archaeological projects tend to downplay the poor archaeological visibility of certain agricultural practices, and computer models designed to test sustainability struggle to adequately account for local cultural preferences. This field will therefore not progress unless there is a frank appraisal of archaeology’s strengths and weaknesses. This project will provide this assessment by employing a range of established and groundbreaking archaeological and modelling techniques to examine the development of two east Africa agricultural systems: one at the abandoned site of Engaruka in Tanzania, commonly seen as an example of resource mismanagement and ecological collapse; and another at the current agricultural landscape in Konso, Ethiopia, described by the UN FAO as one of a select few African “lessons from the past”. The project thus aims to assess the sustainability of these systems, but will also assess the role archaeology can play in such debates worldwide."
Max ERC Funding
1 196 701 €
Duration
Start date: 2014-02-01, End date: 2018-01-31
Project acronym AARTFAAC
Project Amsterdam-ASTRON Radio Transient Facility And Analysis Centre: Probing the Extremes of Astrophysics
Researcher (PI) Ralph Antoine Marie Joseph Wijers
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Advanced Grant (AdG), PE9, ERC-2009-AdG
Summary Some of the most extreme tests of physical law come from its manifestations in the behaviour of black holes and neutron stars, and as such these objects should be used as fundamental physics labs. Due to advances in both theoretical work and observational techniques, I have a major opportunity now to significantly push this agenda forward and get better answers to questions like: How are black holes born? How can energy be extracted from black holes? What is the origin of magnetic fields and cosmic rays in jets and shocks? Is their primary energy stream hadronic or magnetic? I propose to do this by exploiting the advent of wide-field radio astronomy: extreme objects are very rare and usually transient, so not only must one survey large areas of sky, but one must also do this often. I propose to form and shape a group that will use the LOFAR wide-field radio telescope to hunt for these extreme transients and systematically collect enough well-documented examples of the behaviour of each type of transient. Furthermore, I propose to expand LOFAR with a true 24/7 all-sky monitor to catch and study even the rarest of events. Next, I will use my experience in gamma-ray burst follow-up to conduct a vigorous multi-wavelength programme of study of these objects, to constrain their physics from as many angles as possible. This will eventually include results from multi-messenger astrophysics, in which we use neutrinos, gravitational waves, and other non-electromagnetic messengers as extra diagnostics of the physics of these sources. Finally, I will build on my experience in modelling accretion phenomena and relativistic explosions to develop a theoretical framework for these phenomena and constrain the resulting models with the rich data sets we obtain.
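As a hedged illustration of detecting transients "in a sea of noise", the sketch below injects a flare into a synthetic light curve, subtracts a running median and flags samples above a robust 5-sigma threshold; the window size, threshold and data are illustrative assumptions, not the project's pipeline.

```python
import numpy as np

rng = np.random.default_rng(42)

n = 2000
flux = rng.normal(0.0, 1.0, n)              # Gaussian-noise light curve
flux[1200:1210] += 6.0                      # one injected transient flare

window, pad = 101, 50                       # robust running-median baseline
padded = np.pad(flux, pad, mode="edge")
baseline = np.array([np.median(padded[i:i + window]) for i in range(n)])

resid = flux - baseline
mad = np.median(np.abs(resid - np.median(resid)))
snr = resid / (1.4826 * mad)                # MAD -> Gaussian sigma estimate

candidates = np.flatnonzero(snr > 5.0)      # 5-sigma detection threshold
print("candidate samples:", candidates)     # clusters around index 1200
```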
Max ERC Funding
3 499 128 €
Duration
Start date: 2010-10-01, End date: 2016-09-30
Project acronym AAS
Project Approximate algebraic structure and applications
Researcher (PI) Ben Green
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary This project studies several mathematical topics with a related theme, all of them part of the relatively new discipline known as additive combinatorics.
We look at approximate, or rough, variants of familiar mathematical notions such as group, polynomial or homomorphism. In each case we seek to describe the structure of these approximate objects, and then to give applications of the resulting theorems. This endeavour has already led to groundbreaking results in the theory of prime numbers, group theory and combinatorial number theory.
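For intuition, the toy computation below contrasts the sumset growth of a structured set (an arithmetic progression, the prototype of an approximate group) with that of a random set; the particular sets are arbitrary choices for illustration.

```python
import random

def doubling(A):
    """|A + A| / |A|: small values signal approximate additive structure."""
    sumset = {a + b for a in A for b in A}
    return len(sumset) / len(A)

progression = set(range(0, 1000, 7))               # 143-element progression
random_set = set(random.sample(range(10**6), 143))

print(doubling(progression))   # ~2: closed under addition "up to a factor"
print(doubling(random_set))    # ~|A|/2: no additive structure at all
```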
Max ERC Funding
1 000 000 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym ACCOMPLI
Project Assembly and maintenance of a co-regulated chromosomal compartment
Researcher (PI) Peter Burkhard Becker
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Advanced Grant (AdG), LS2, ERC-2011-ADG_20110310
Summary "Eukaryotic nuclei are organised into functional compartments, – local microenvironments that are enriched in certain molecules or biochemical activities and therefore specify localised functional outputs. Our study seeks to unveil fundamental principles of co-regulation of genes in a chromo¬somal compartment and the preconditions for homeostasis of such a compartment in the dynamic nuclear environment.
The dosage-compensated X chromosome of male Drosophila flies satisfies the criteria for a functional compartment. It is rendered structurally distinct from all other chromosomes by association of a regulatory ribonucleoprotein ‘Dosage Compensation Complex’ (DCC), enrichment of histone modifications and global decondensation. As a result, most genes on the X chromosome are co-ordinately activated. Autosomal genes inserted into the X acquire X-chromosomal features and are subject to the X-specific regulation.
We seek to uncover the molecular principles that initiate, establish and maintain the dosage-compensated chromosome. We will follow the kinetics of DCC assembly and the timing of association with different types of chromosomal targets in nuclei with high spatial resolution afforded by sub-wavelength microscopy and deep sequencing of DNA binding sites. We will characterise DCC sub-complexes with respect to their roles as kinetic assembly intermediates or as representations of local, functional heterogeneity. We will evaluate the role of a novel DCC-associated ubiquitin ligase activity in homeostasis.
Crucial to the recruitment of the DCC and its distribution to target genes are non-coding roX RNAs that are transcribed from the X. We will determine the secondary structure ‘signatures’ of roX RNAs in vitro and determine the binding sites of the protein subunits in vivo. By biochemical and cellular reconstitution we will test the hypothesis that roX-encoded RNA aptamers orchestrate the assembly of the DCC and contribute to the exquisite targeting of the complex."
Max ERC Funding
2 482 770 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ACCOPT
Project ACelerated COnvex OPTimization
Researcher (PI) Yurii NESTEROV
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE1, ERC-2017-ADG
Summary The amazing rate of progress in computer technologies and telecommunications presents many new challenges for Optimization Theory. New problems are usually very big in size, very special in structure and possibly have a distributed data support. This makes them unsolvable by standard optimization methods. In these situations, old theoretical models, based on hidden Black-Box information, cannot work. New theoretical and algorithmic solutions are urgently needed. In this project we will concentrate on the development of fast optimization methods for problems of big and very big size. All the new methods will be endowed with provable efficiency guarantees for large classes of optimization problems arising in practical applications. Our main tool is the acceleration technique developed for the standard Black-Box methods as applied to smooth convex functions. However, we will have to adapt it to deal with different situations.
The first line of development will be based on the smoothing technique as applied to non-smooth functions. We propose to substantially extend this approach to generate approximate solutions in relative scale. The second line of research will be related to applying acceleration techniques to second-order methods minimizing functions with sparse Hessians. Finally, we aim to develop fast gradient methods for huge-scale problems. The size of these problems is so big that even the usual vector operations are extremely expensive. Thus, we propose to develop new methods with sublinear iteration costs. In our approach, the main source for achieving improvements will be the proper use of problem structure.
Our overall aim is to be able to solve in a routine way many important problems which currently look unsolvable. Moreover, the theoretical development of Convex Optimization will reach the state where there is no gap between theory and practice: the theoretically most efficient methods will definitely outperform any homebred heuristics.
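A minimal sketch of the acceleration phenomenon the project builds on, comparing plain gradient descent with Nesterov's accelerated method on an ill-conditioned quadratic; the problem instance, dimensions and iteration counts are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
d = 50
Q = np.diag(np.linspace(1.0, 1000.0, d))   # condition number 1000
grad = lambda x: Q @ x
L = 1000.0                                 # Lipschitz constant of the gradient
x0 = rng.normal(size=d)

x = x0.copy()
for _ in range(500):                       # plain gradient descent, step 1/L
    x = x - grad(x) / L

y, x_acc, t = x0.copy(), x0.copy(), 1.0
for _ in range(500):                       # Nesterov's accelerated method
    x_new = y - grad(y) / L
    t_new = (1 + np.sqrt(1 + 4 * t * t)) / 2
    y = x_new + ((t - 1) / t_new) * (x_new - x_acc)   # momentum extrapolation
    x_acc, t = x_new, t_new

f = lambda v: 0.5 * v @ Q @ v
print(f"gradient descent: {f(x):.3e}")     # accelerated run ends well below
print(f"Nesterov:         {f(x_acc):.3e}") # plain descent on this instance
```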
Max ERC Funding
2 090 038 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ACCORD
Project Algorithms for Complex Collective Decisions on Structured Domains
Researcher (PI) Edith Elkind
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Algorithms for Complex Collective Decisions on Structured Domains.
The aim of this proposal is to substantially advance the field of Computational Social Choice, by developing new tools and methodologies that can be used for making complex group decisions in rich and structured environments. We consider settings where each member of a decision-making body has preferences over a finite set of alternatives, and the goal is to synthesise a collective preference over these alternatives, which may take the form of a partial order over the set of alternatives with a predefined structure: examples include selecting a fixed-size set of alternatives, a ranking of the alternatives, a winner and up to two runners-up, etc. We will formulate desiderata that apply to such preference aggregation procedures, design specific procedures that satisfy as many of these desiderata as possible, and develop efficient algorithms for computing them. As the latter step may be infeasible on general preference domains, we will focus on identifying the least restrictive domains that enable efficient computation, and use real-life preference data to verify whether the associated restrictions are likely to be satisfied in realistic preference aggregation scenarios. Also, we will determine whether our preference aggregation procedures are computationally resistant to malicious behavior. To lower the cognitive burden on the decision-makers, we will extend our procedures to accept partial rankings as inputs. Finally, to further contribute towards bridging the gap between theory and practice of collective decision making, we will provide open-source software implementations of our procedures, and reach out to the potential users to obtain feedback on their practical applicability.
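As a toy instance of such aggregation, the sketch below computes Borda scores from complete rankings and reads off a fixed-size "committee" of top alternatives; the ballots, and the choice of Borda as the rule, are illustrative assumptions rather than the procedures the project will design.

```python
ballots = [                     # each voter ranks all alternatives, best first
    ["a", "b", "c", "d"],
    ["b", "a", "d", "c"],
    ["a", "c", "b", "d"],
    ["d", "b", "a", "c"],
]

alternatives = ballots[0]
m = len(alternatives)
scores = {alt: 0 for alt in alternatives}
for ranking in ballots:
    for pos, alt in enumerate(ranking):
        scores[alt] += m - 1 - pos          # Borda: top place earns m-1 points

committee_size = 2
committee = sorted(scores, key=scores.get, reverse=True)[:committee_size]
print(scores)      # {'a': 9, 'b': 8, 'c': 3, 'd': 4}
print(committee)   # ['a', 'b']
```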
Max ERC Funding
1 395 933 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym ACDC
Project Algorithms and Complexity of Highly Decentralized Computations
Researcher (PI) Fabian Daniel Kuhn
Host Institution (HI) ALBERT-LUDWIGS-UNIVERSITAET FREIBURG
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary "Many of today's and tomorrow's computer systems are built on top of large-scale networks such as, e.g., the Internet, the world wide web, wireless ad hoc and sensor networks, or peer-to-peer networks. Driven by technological advances, new kinds of networks and applications have become possible and we can safely assume that this trend is going to continue. Often modern systems are envisioned to consist of a potentially large number of individual components that are organized in a completely decentralized way. There is no central authority that controls the topology of the network, how nodes join or leave the system, or in which way nodes communicate with each other. Also, many future distributed applications will be built using wireless devices that communicate via radio.
The general objective of the proposed project is to improve our understanding of the algorithmic and theoretical foundations of decentralized distributed systems. From an algorithmic point of view, decentralized networks and computations pose a number of fascinating and unique challenges that are not present in sequential or more standard distributed systems. As communication is limited and mostly between nearby nodes, each node of a large network can only maintain a very restricted view of the global state of the system. This is particularly true if the network can change dynamically, either by nodes joining or leaving the system or if the topology changes over time, e.g., because of the mobility of the devices in case of a wireless network. Nevertheless, the nodes of a network need to coordinate in order to achieve some global goal.
In particular, we plan to study algorithms and lower bounds for basic computation and information dissemination tasks in such systems. In addition, we are particularly interested in the complexity of distributed computations in dynamic and wireless networks."
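A minimal sketch of the synchronous message-passing setting described above, assuming an arbitrary small example network: every node repeatedly exchanges its distance estimate with its neighbours, and the cost measure is the number of communication rounds rather than local computation.

```python
graph = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
INF = float("inf")
dist = {v: (0 if v == 0 else INF) for v in graph}   # node 0 starts a broadcast

rounds = 0
while True:
    # one synchronous round: all messages are delivered simultaneously
    inbox = {v: [dist[u] for u in graph[v]] for v in graph}
    new = {v: min(dist[v], min(inbox[v]) + 1) for v in graph}
    rounds += 1
    if new == dist:            # no estimate changed: flooding has converged
        break
    dist = new

print(dist)      # {0: 0, 1: 1, 2: 1, 3: 2, 4: 3} -- BFS distances from node 0
print(rounds)    # converges in O(network diameter) rounds
```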
Max ERC Funding
1 148 000 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym ACROSS
Project 3D Reconstruction and Modeling across Different Levels of Abstraction
Researcher (PI) Leif Kobbelt
Host Institution (HI) RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Digital 3D models are gaining more and more importance in diverse application fields ranging from computer graphics, multimedia and simulation sciences to engineering, architecture, and medicine. Powerful technologies to digitize the 3D shape of real objects and scenes are becoming available even to consumers. However, the raw geometric data emerging from, e.g., 3D scanning or multi-view stereo often lacks a consistent structure and meta-information which are necessary for the effective deployment of such models in sophisticated down-stream applications like animation, simulation, or CAD/CAM that go beyond mere visualization. Our goal is to develop new fundamental algorithms which transform raw geometric input data into augmented 3D models that are equipped with structural meta information such as feature aligned meshes, patch segmentations, local and global geometric constraints, statistical shape variation data, or even procedural descriptions. Our methodological approach is inspired by the human perceptual system that integrates bottom-up (data-driven) and top-down (model-driven) mechanisms in its hierarchical processing. Similarly we combine algorithms operating on different levels of abstraction into reconstruction and modeling networks. Instead of developing an individual solution for each specific application scenario, we create an eco-system of algorithms for automatic processing and interactive design of highly complex 3D models. A key concept is the information flow across all levels of abstraction in a bottom-up as well as top-down fashion. We not only aim at optimizing geometric representations but in fact at bridging the gap between reconstruction and recognition of geometric objects. The results from this project will make it possible to bring 3D models of real world objects into many highly relevant applications in science, industry, and entertainment, greatly reducing the excessive manual effort that is still necessary today."
Max ERC Funding
2 482 000 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ACROSS
Project Australasian Colonization Research: Origins of Seafaring to Sahul
Researcher (PI) Rosemary Helen FARR
Host Institution (HI) UNIVERSITY OF SOUTHAMPTON
Call Details Starting Grant (StG), SH6, ERC-2017-STG
Summary One of the most exciting research questions within archaeology is that of the peopling of Australasia by at least c.50,000 years ago. This represents some of the earliest evidence of modern human colonization outside Africa, yet, even at the greatest sea-level lowstand, this migration would have involved seafaring. It is the maritime nature of this dispersal which makes it so important to questions of technological, cognitive and social human development. These issues have traditionally been the preserve of archaeologists, but with a multidisciplinary approach that embraces cutting-edge marine geophysical, hydrodynamic and archaeogenetic analyses, we now have the opportunity to examine the When, Where, Who and How of the earliest seafaring in world history.
The voyage from Sunda (South East Asia) to Sahul (Australasia) provides evidence for the earliest ‘open water’ crossing in the world. The combination of the small number of early archaeological finds and the significant changes in the palaeolandscape, including the submergence of the broad north-western Australian continental shelf, means that little is known about the routes taken and what these crossings may have entailed.
This project will combine research of the submerged palaeolandscape of the continental shelf to refine our knowledge of the onshore/offshore environment, identify potential submerged prehistoric sites and enhance our understanding of the palaeoshoreline and tidal regime. This will be combined with archaeogenetic research targeting mtDNA and Y-chromosome data to resolve questions of demography and dating.
For the first time this project takes a truly multidisciplinary approach to address the colonization of Sahul, providing a unique opportunity to tackle some of the most important questions about human origins, the relationship between humans and the changing environment, population dynamics and migration, seafaring technology, social organisation and cognition.
Max ERC Funding
1 134 928 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym ACROSSBORDERS
Project Across ancient borders and cultures: An Egyptian microcosm in Sudan during the 2nd millennium BC
Researcher (PI) Julia Budka
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Starting Grant (StG), SH6, ERC-2012-StG_20111124
Summary Pharaonic Egypt is commonly known for its pyramids and tomb treasures. The present knowledge of Egyptian everyday life and social structures derives mostly from mortuary records associated with the upper classes, whereas traces of ordinary life from domestic sites are generally disregarded. Settlement archaeology in Egypt and Nubia (Ancient North Sudan) is still in its infancy; it is timely to strengthen this field. Responsible for the pottery at three major settlement sites (Abydos and Elephantine in Egypt; Sai Island in Sudan), the PI is in a unique position to co-ordinate a research project on settlement patterns in Northeast Africa of the 2nd millennium BC based on the detailed analysis of material remains. The selected case studies, situated across ancient and modern borders and in diverse environmental and cultural settings, show very similar archaeological remains. Up to now, no attempt has been made to explain this situation in detail.
The focus of the project is the well-preserved, only partially explored site of Sai Island, seemingly an Egyptian microcosm in New Kingdom Upper Nubia. Little time is left to conduct the requisite large-scale archaeology as Sai is endangered by the planned high dam of Dal. With the application of microarchaeology we will introduce an approach that is new in Egyptian settlement archaeology. Our interdisciplinary research will result in novel insights into (a) multifaceted lives on Sai at a micro-spatial level and (b) domestic life in 2nd millennium BC Egypt and Nubia from a macroscopic view. The present understanding of the political situation in Upper Nubia during the New Kingdom as based on written records will be significantly enlarged by the envisaged approach. Furthermore, in reconstructing Sai Island as “home away from home”, the project presents a showcase study of what we can learn about acculturation and adaptation from ancient cultures, in this case from the coexistence of Egyptians and Nubians.
Max ERC Funding
1 497 460 €
Duration
Start date: 2012-12-01, End date: 2018-04-30
Project acronym ACTIVATION OF XCI
Project Molecular mechanisms controlling X chromosome inactivation
Researcher (PI) Joost Henk Gribnau
Host Institution (HI) ERASMUS UNIVERSITAIR MEDISCH CENTRUM ROTTERDAM
Call Details Starting Grant (StG), LS2, ERC-2010-StG_20091118
Summary In mammals, gene dosage of X-chromosomal genes is equalized between sexes by random inactivation of either one of the two X chromosomes in female cells. In the initial phase of X chromosome inactivation (XCI), a counting and initiation process determines the number of X chromosomes per nucleus, and elects the future inactive X chromosome (Xi). Xist is an X-encoded gene that plays a crucial role in the XCI process. At the start of XCI Xist expression is up-regulated and Xist RNA accumulates on the future Xi thereby initiating silencing in cis. Recent work performed in my laboratory indicates that the counting and initiation process is directed by a stochastic mechanism, in which each X chromosome has an independent probability to be inactivated. We also found that this probability is determined by the X:ploidy ratio. These results indicated the presence of at least one X-linked activator of XCI. With a BAC screen we recently identified X-encoded RNF12 to be a dose-dependent activator of XCI. Expression of RNF12 correlates with Xist expression, and a heterozygous deletion of Rnf12 results in a marked loss of XCI in female cells. The presence of a small proportion of cells that still initiate XCI, in Rnf12+/- cells, also indicated that more XCI-activators are involved in XCI. Here, we propose to investigate the molecular mechanism by which RNF12 activates XCI in mouse and human, and to search for additional XCI-activators. We will also attempt to establish the role of different inhibitors of XCI, including CTCF and the pluripotency factors OCT4, SOX2 and NANOG. We anticipate that these studies will significantly advance our understanding of XCI mechanisms, which is highly relevant for better insight into the manifestation of X-linked diseases that are affected by XCI.
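As a toy illustration of the stochastic counting-and-initiation model, the simulation below lets each X chromosome initiate XCI independently, with a per-chromosome probability that scales with the X:ploidy ratio (the dose of an X-encoded activator such as RNF12); the dose-response form and all numbers are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(3)

def xci_outcomes(n_x, ploidy, cells=100_000, base=0.4):
    """Fraction of cells inactivating 0, 1, ..., n_x X chromosomes."""
    p = base * n_x / ploidy                   # illustrative dose-response
    inactivated = rng.binomial(n_x, p, size=cells)
    return np.bincount(inactivated, minlength=n_x + 1) / cells

print("male   (1X:2A):", xci_outcomes(1, 2))  # mostly no XCI
print("female (2X:2A):", xci_outcomes(2, 2))  # most cells initiate XCI,
                                              # a few on zero or both Xs
```

The residual female cells with zero or two inactive X chromosomes in this toy model mirror the idea that initiation is probabilistic per chromosome rather than deterministically counted.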
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym Active-DNA
Project Computationally Active DNA Nanostructures
Researcher (PI) Damien WOODS
Host Institution (HI) NATIONAL UNIVERSITY OF IRELAND MAYNOOTH
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary During the 20th century computer technology evolved from bulky, slow, special-purpose mechanical engines to the now ubiquitous silicon chips and software that are one of the pinnacles of human ingenuity. The goal of the field of molecular programming is to take the next leap and build a new generation of matter-based computers using DNA, RNA and proteins. This will be accomplished by computer scientists, physicists and chemists designing molecules to execute "wet" nanoscale programs in test tubes. The workflow includes proposing theoretical models, mathematically proving their computational properties, physical modelling and implementation in the wet-lab.
The past decade has seen remarkable progress in building static 2D and 3D DNA nanostructures. However, unlike biological macromolecules and complexes that are built via specified self-assembly pathways, that execute robotic-like movements, and that undergo evolution, the activity of human-engineered nanostructures is severely limited. We will need sophisticated algorithmic ideas to build structures that rival active living systems. Active-DNA aims to address this challenge by achieving a number of objectives on computation, DNA-based self-assembly and molecular robotics. Active-DNA research work will range from defining models and proving theorems that characterise the computational and expressive capabilities of such active programmable materials to experimental work implementing active DNA nanostructures in the wet-lab.
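As background for readers unfamiliar with algorithmic self-assembly, the sketch below simulates the abstract Tile Assembly Model, a standard theoretical model in this field; the tile set, glue labels and temperature threshold are invented for the example, and this is not code or a model from the project itself.

```python
# Minimal abstract Tile Assembly Model (aTAM) sketch: square tiles attach to
# a growing assembly whenever the total strength of their matching glues
# meets the temperature threshold. The tile set below is invented.
from dataclasses import dataclass

@dataclass
class Tile:
    name: str
    glues: dict  # side -> (glue_label, strength); sides are 'N','E','S','W'

OPPOSITE = {'N': 'S', 'S': 'N', 'E': 'W', 'W': 'E'}
STEPS = {'N': (0, 1), 'S': (0, -1), 'E': (1, 0), 'W': (-1, 0)}

def binding_strength(tile, pos, assembly):
    """Total strength of glues matching already-placed neighbouring tiles."""
    total = 0
    for side, (dx, dy) in STEPS.items():
        nbr = assembly.get((pos[0] + dx, pos[1] + dy))
        if nbr is None:
            continue
        label, strength = tile.glues.get(side, (None, 0))
        nbr_label, nbr_strength = nbr.glues.get(OPPOSITE[side], (None, 0))
        if label is not None and label == nbr_label:
            total += min(strength, nbr_strength)
    return total

def grow(assembly, tile_types, temperature=2, max_steps=100):
    """Attach one tile per step wherever the binding threshold is met."""
    for _ in range(max_steps):
        frontier = sorted({(x + dx, y + dy)
                           for (x, y) in assembly
                           for dx, dy in STEPS.values()} - assembly.keys())
        placed = next(((pos, t) for pos in frontier for t in tile_types
                       if binding_strength(t, pos, assembly) >= temperature),
                      None)
        if placed is None:
            return assembly          # terminal assembly: nothing can attach
        assembly[placed[0]] = placed[1]
    return assembly

# A seed plus one tile type that extends a row eastwards via a strength-2 glue.
seed = Tile('seed', {'E': ('a', 2)})
row = Tile('row', {'W': ('a', 2), 'E': ('a', 2)})
final = grow({(0, 0): seed}, [row], temperature=2, max_steps=5)
print(sorted(final))   # six positions: (0,0) through (5,0)
```

Computation enters because a cleverly chosen tile set can force assemblies to, for example, count in binary or simulate a Turing machine; the active nanostructures targeted above go beyond this static model by letting assembled components reconfigure.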
Max ERC Funding
2 349 603 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym ACTIVIA
Project Visual Recognition of Function and Intention
Researcher (PI) Ivan Laptev
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "Computer vision is concerned with the automated interpretation of images and video streams. Today's research is (mostly) aimed at answering queries such as ""Is this a picture of a dog?"", (classification) or sometimes ""Find the dog in this photo"" (detection). While categorisation and detection are useful for many tasks, inferring correct class labels is not the final answer to visual recognition. The categories and locations of objects do not provide direct understanding of their function i.e., how things work, what they can be used for, or how they can act and react. Such an understanding, however, would be highly desirable to answer currently unsolvable queries such as ""Am I in danger?"" or ""What can happen in this scene?"". Solving such queries is the aim of this proposal.
My goal is to uncover the functional properties of objects and the purpose of actions by addressing visual recognition from a different and yet unexplored perspective. The main novelty of this proposal is to leverage observations of people, i.e., their actions and interactions to automatically learn the use, the purpose and the function of objects and scenes from visual data. The project is timely as it builds upon the two key recent technological advances: (a) the immense progress in visual recognition of objects, scenes and human actions achieved in the last ten years, as well as (b) the emergence of a massive amount of public image and video data now available to train visual models.
ACTIVIA addresses fundamental research issues in automated interpretation of dynamic visual scenes, but its results are expected to serve as a basis for ground-breaking technological advances in practical applications. The recognition of functional properties and intentions as explored in this project will directly support high-impact applications such as detection of abnormal events, which are likely to revolutionise today's approaches to crime protection, hazard prevention, elderly care, and many others."
Max ERC Funding
1 497 420 €
Duration
Start date: 2013-01-01, End date: 2018-12-31
Project acronym ACUITY
Project Algorithms for coping with uncertainty and intractability
Researcher (PI) Nikhil Bansal
Host Institution (HI) TECHNISCHE UNIVERSITEIT EINDHOVEN
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary The two biggest challenges in solving practical optimization problems are computational intractability and the presence of uncertainty: most problems are either NP-hard, or have incomplete input data which makes an exact computation impossible.
Recently, there has been huge progress in our understanding of intractability, based on spectacular algorithmic and lower-bound techniques. For several problems, especially those with only local constraints, we can design optimum approximation algorithms that are provably the best possible.
However, typical optimization problems usually involve complex global constraints and are much less understood. The situation is even worse for coping with uncertainty. Most of the algorithms are based on ad-hoc techniques and there is no deeper understanding of what makes various problems easy or hard.
This proposal describes several new directions, together with concrete intermediate goals, that will break important new ground in the theory of approximation and online algorithms. The particular directions we consider are (i) extend the primal-dual method to systematically design online algorithms, (ii) build a structural theory of online problems based on work functions, (iii) develop new tools to use the power of strong convex relaxations and (iv) design new algorithmic approaches based on non-constructive proof techniques.
The proposed research is at the cutting edge of algorithm design, and builds upon the recent success of the PI in resolving several longstanding questions in these areas. Any progress is likely to be a significant contribution to theoretical computer science and combinatorial optimization.
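As a concrete, textbook-level instance of the online problems targeted here (this is the classic ski-rental example, not an algorithm from the proposal): an online algorithm must commit to decisions before the input is fully revealed, and its quality is measured by the competitive ratio against the optimal offline cost.

```python
# Ski rental: rent at cost 1/day or buy once at cost B, with the number of
# ski days unknown in advance. The break-even strategy (rent for B - 1 days,
# then buy) pays at most twice the optimal offline cost -- the standard
# 2-competitive guarantee for deterministic online algorithms.

def break_even_cost(days: int, buy_price: int) -> int:
    """Online cost: rent until day buy_price, then buy on that day."""
    if days < buy_price:
        return days                      # rented every day, never bought
    return (buy_price - 1) + buy_price   # rented B-1 days, then bought

def offline_cost(days: int, buy_price: int) -> int:
    """Optimal cost with hindsight: rent throughout or buy on day one."""
    return min(days, buy_price)

# The competitive ratio never exceeds 2 - 1/B:
B = 10
worst = max(break_even_cost(d, B) / offline_cost(d, B) for d in range(1, 100))
print(worst)  # 1.9 == 2 - 1/B, attained once the season reaches d = B
```

The primal-dual method mentioned in direction (i) recovers and generalizes such guarantees systematically, for instance yielding the randomized e/(e-1)-competitive algorithm for the same problem.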
Max ERC Funding
1 519 285 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym ADAPT
Project Theory and Algorithms for Adaptive Particle Simulation
Researcher (PI) Stephane Redon
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "During the twentieth century, the development of macroscopic engineering has been largely stimulated by progress in digital prototyping: cars, planes, boats, etc. are nowadays designed and tested on computers. Digital prototypes have progressively replaced actual ones, and effective computer-aided engineering tools have helped cut costs and reduce production cycles of these macroscopic systems.
The twenty-first century is most likely to see a similar development at the atomic scale. Indeed, the recent years have seen tremendous progress in nanotechnology - in particular in the ability to control matter at the atomic scale. Similar to what has happened with macroscopic engineering, powerful and generic computational tools will be needed to engineer complex nanosystems, through modeling and simulation. As a result, a major challenge is to develop efficient simulation methods and algorithms.
NANO-D, the INRIA research group I started in January 2008 in Grenoble, France, aims at developing
efficient computational methods for modeling and simulating complex nanosystems, both natural and artificial. In particular, NANO-D develops SAMSON, a software application which gathers all algorithms designed by the group and its collaborators (SAMSON: Software for Adaptive Modeling and Simulation Of Nanosystems).
In this project, I propose to develop a unified theory, and associated algorithms, for adaptive particle simulation. The proposed theory will avoid problems that plague current popular multi-scale or hybrid simulation approaches by simulating a single potential throughout the system, while allowing users to finely trade precision for computational speed.
I believe the full development of the adaptive particle simulation theory will have an important impact on current modeling and simulation practices, and will enable practical design of complex nanosystems on desktop computers, which should significantly boost the emergence of generic nano-engineering."
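To illustrate the general idea of trading precision for computational speed in a particle simulation, here is a deliberately simple sketch that freezes quiescent particles for a time step; it is schematic only and does not reflect the actual adaptive algorithms or the SAMSON software (the force law and threshold are invented for the example).

```python
# Toy "adaptive" particle simulation: a 1D chain of particles coupled by
# nearest-neighbour springs. Particles on which the net force is below a
# threshold are frozen for the step, so computational effort concentrates
# where the dynamics actually happen, at the cost of a controlled error.
import numpy as np

def spring_forces(x, k=1.0, rest=1.0):
    """Net force on each particle from nearest-neighbour springs."""
    f = np.zeros_like(x)
    ext = np.diff(x) - rest        # extension of each spring
    f[:-1] += k * ext              # springs pull left endpoints rightwards
    f[1:] -= k * ext               # and right endpoints leftwards
    return f

def adaptive_step(x, v, dt=0.01, threshold=1e-3):
    """Explicit Euler step that updates only 'active' particles."""
    f = spring_forces(x)
    active = np.abs(f) > threshold  # the adaptivity: freeze quiescent ones
    v = v.copy()
    x = x.copy()
    v[active] += dt * f[active]
    x[active] += dt * v[active]
    return x, v, int(active.sum())

# Only the first spring starts stretched, so only its endpoints are active.
x = np.array([0.0, 1.5, 2.5, 3.5, 4.5])
v = np.zeros_like(x)
for _ in range(10):
    x, v, n_active = adaptive_step(x, v)
print(f"{n_active} of {x.size} particles updated on the last step")
```

Setting the threshold to zero recovers the exact, non-adaptive integrator; exposing such a precision/speed knob in a principled, single-potential form is the kind of capability the proposed theory targets.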
Max ERC Funding
1 476 882 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym ADAPT
Project Life in a cold climate: the adaptation of cereals to new environments and the establishment of agriculture in Europe
Researcher (PI) Terence Austen Brown
Host Institution (HI) THE UNIVERSITY OF MANCHESTER
Call Details Advanced Grant (AdG), SH6, ERC-2013-ADG
Summary "This project explores the concept of agricultural spread as analogous to enforced climate change and asks how cereals adapted to new environments when agriculture was introduced into Europe. Archaeologists have long recognized that the ecological pressures placed on crops would have had an impact on the spread and subsequent development of agriculture, but previously there has been no means of directly assessing the scale and nature of this impact. Recent work that I have directed has shown how such a study could be carried out, and the purpose of this project is to exploit these breakthroughs with the goal of assessing the influence of environmental adaptation on the spread of agriculture, its adoption as the primary subsistence strategy, and the subsequent establishment of farming in different parts of Europe. This will correct the current imbalance between our understanding of the human and environmental dimensions to the domestication of Europe. I will use methods from population genomics to identify loci within the barley and wheat genomes that have undergone selection since the beginning of cereal cultivation in Europe. I will then use ecological modelling to identify those loci whose patterns of selection are associated with ecogeographical variables and hence represent adaptations to local environmental conditions. I will assign dates to the periods when adaptations occurred by sequencing ancient DNA from archaeobotanical assemblages and by computer methods that enable the temporal order of adaptations to be deduced. I will then synthesise the information on environmental adaptations with dating evidence for the spread of agriculture in Europe, which reveals pauses that might be linked to environmental adaptation, with demographic data that indicate regions where Neolithic populations declined, possibly due to inadequate crop productivity, and with an archaeobotanical database showing changes in the prevalence of individual cereals in different regions."
Max ERC Funding
2 492 964 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ADaPt
Project Adaptation, Dispersals and Phenotype: understanding the roles of climate, natural selection and energetics in shaping global hunter-gatherer adaptability
Researcher (PI) Jay Stock
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Consolidator Grant (CoG), SH6, ERC-2013-CoG
Summary Relative to other species, humans are characterised by considerable biological diversity despite genetic homogeneity. This diversity is reflected in skeletal variation, but we lack sufficient understanding of the underlying mechanisms to adequately interpret the archaeological record. The proposed research will address problems in our current understanding of the origins of human variation in the past by: 1) documenting and interpreting the pattern of global hunter-gatherer variation relative to genetic phylogenies and climatic variation; 2) testing the relationship between environmental and skeletal variation among genetically related hunter-gatherers from different environments; 3) examining the adaptability of living humans to different environments, through the study of energetic expenditure and life history trade-offs associated with locomotion; and 4) investigating the relationship between muscle and skeletal variation associated with locomotion in diverse environments. This will be achieved by linking: a) detailed study of the global pattern of hunter-gatherer variation in the Late Pleistocene and Holocene with; b) ground-breaking experimental research which tests the relationship between energetic stress, muscle function, and bone variation in living humans. The first component tests the correspondence between skeletal variation and both genetic and climatic history, to infer mechanisms driving variation. The second component integrates this skeletal variation with experimental studies of living humans to, for the first time, directly test adaptive implications of skeletal variation observed in the past. ADaPt will provide the first links between prehistoric hunter-gatherer variation and the evolutionary parameters of life history and energetics that may have shaped our success as a species. It will lead to breakthroughs necessary to interpret variation in the archaeological record, relative to human dispersals and adaptation in the past.
Max ERC Funding
1 911 485 €
Duration
Start date: 2014-07-01, End date: 2019-06-30
Project acronym ADAPTIVES
Project Algorithmic Development and Analysis of Pioneer Techniques for Imaging with waVES
Researcher (PI) Chrysoula Tsogka
Host Institution (HI) IDRYMA TECHNOLOGIAS KAI EREVNAS
Call Details Starting Grant (StG), PE1, ERC-2009-StG
Summary The proposed work concerns the theoretical and numerical development of robust and adaptive methodologies for broadband imaging in clutter. The word clutter expresses our uncertainty about the wave speed of the propagation medium. Our results are expected to have a strong impact in a wide range of applications, including underwater acoustics, exploration geophysics and ultrasound non-destructive testing. Our machinery is coherent interferometry (CINT), a state-of-the-art statistically stable imaging methodology, highly suitable for the development of imaging methods in clutter. We aim to extend CINT along two complementary directions: novel types of applications, and further mathematical and numerical development so as to assess and extend its range of applicability. CINT is designed for imaging with partially coherent array data recorded in richly scattering media. It uses statistical smoothing techniques to obtain results that are independent of the clutter realization. Quantifying the amount of smoothing needed is difficult, especially when there is no a priori knowledge about the propagation medium. We intend to address this question by coupling the imaging process with the estimation of the medium's large-scale features. Our algorithms rely on the residual coherence in the data. When the coherent signal is too weak, the CINT results are unsatisfactory. We propose two ways for enhancing the resolution of CINT: filtering the data prior to imaging (noise reduction) and waveform design (optimizing the source distribution). Finally, we propose to extend the applicability of our imaging-in-clutter methodologies by investigating the possibility of utilizing ambient noise sources to perform passive sensor imaging, as well as by studying the imaging problem in random waveguides.
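Schematically, and in notation chosen here for illustration rather than quoted from the proposal, the CINT imaging function migrates cross-correlations of nearby traces over nearby frequencies to the image point:

```latex
I^{\mathrm{CINT}}(\vec y) \;=\;
\sum_{\substack{r,\,r'\\ |\vec x_r - \vec x_{r'}| \le X_d}}
\;\iint\limits_{|\omega - \omega'| \le \Omega_d}
\hat P(\vec x_r, \omega)\,\overline{\hat P(\vec x_{r'}, \omega')}\,
e^{-i\,[\,\omega\,\tau(\vec x_r, \vec y) - \omega'\,\tau(\vec x_{r'}, \vec y)\,]}
\, d\omega \, d\omega' ,
```

where \(\hat P(\vec x_r,\omega)\) is the trace recorded at sensor \(\vec x_r\) in the frequency domain and \(\tau(\vec x_r,\vec y)\) the travel time to the image point \(\vec y\). The decoherence parameters \(X_d\) and \(\Omega_d\) set the amount of statistical smoothing: shrinking them stabilizes the image in clutter at the price of resolution, and estimating them adaptively is precisely the smoothing-quantification problem described above.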
Max ERC Funding
690 000 €
Duration
Start date: 2010-06-01, End date: 2015-11-30
Project acronym ADDECCO
Project Adaptive Schemes for Deterministic and Stochastic Flow Problems
Researcher (PI) Remi Abgrall
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary The numerical simulation of complex compressible flow problems is still a challenge nowadays, even for simple models. In our opinion, the most important hard points that currently need to be tackled and solved are how to obtain stable, scalable, very accurate schemes that are easy to code and to maintain on complex geometries. The method should easily handle mesh refinement, even near the boundary where the most interesting engineering quantities have to be evaluated. Unsteady uncertainties in the model, for example in the geometry or the boundary conditions, should be represented efficiently. The goal of this proposal is to design, develop and evaluate solutions to each of the above problems. Our work program will lead to significant breakthroughs for flow simulations. More specifically, we propose to work on 3 connected problems: 1-A class of very high order numerical schemes able to easily deal with the geometry of boundaries and still solve steep problems. The geometry is generally defined by CAD tools. The output is used to generate a mesh which is then used by the scheme. Hence, any mesh refinement process is disconnected from the CAD, a situation that prevents the spread of mesh adaptation techniques in industry! 2-A class of very high order numerical schemes which can utilize possibly solution-dependent basis functions in order to lower the number of degrees of freedom, for example to compute accurately boundary layers with low resolutions. 3-A general non-intrusive technique for handling uncertainties in order to deal with irregular probability density functions (pdf) and also to handle pdfs that may evolve in time, for example through an optimisation loop. The curse of dimensionality will be dealt with thanks to Harten's multiresolution method combined with sparse grid methods. Currently, and to our knowledge, no scheme has all of these properties. This research program will have an impact on numerical schemes and industrial applications.
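For problem 3, "non-intrusive" means the uncertainty propagation only evaluates an existing deterministic solver at sampled parameter values. In its simplest stochastic-collocation form, with notation introduced here for illustration:

```latex
\mathbb{E}\big[u(x,t;\xi)\big] \;\approx\; \sum_{k=1}^{K} w_k\, u(x,t;\xi_k),
```

where \(\xi\) collects the uncertain inputs with density \(\rho\), and the nodes \(\xi_k\) and weights \(w_k\) come from a quadrature rule adapted to \(\rho\). Combining Harten-type multiresolution with sparse grids, as proposed above, is then a way to keep \(K\) tractable when \(\xi\) is high-dimensional or its density is irregular or evolves in time.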
Max ERC Funding
1 432 769 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym ADIMMUNE
Project Decoding interactions between adipose tissue immune cells, metabolic function, and the intestinal microbiome in obesity
Researcher (PI) Eran Elinav
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), LS6, ERC-2018-COG
Summary Obesity and its metabolic co-morbidities have given rise to a rapidly expanding ‘metabolic syndrome’ pandemic affecting hundreds of millions of individuals worldwide. The integrative genetic and environmental causes of the obesity pandemic remain elusive. White adipose tissue (WAT)-resident immune cells have recently been highlighted as important factors contributing to metabolic complications. However, a comprehensive understanding of the regulatory circuits governing their function and the cell type-specific mechanisms by which they contribute to the development of metabolic syndrome is lacking. Likewise, the gut microbiome has been suggested as a critical regulator of obesity, but the bacterial species and metabolites that influence WAT inflammation are entirely unknown.
We propose to use our recently developed high-throughput genomic and gnotobiotic tools, integrated with CRISPR-mediated interrogation of gene function, microbial culturomics, and in-vivo metabolic analysis in newly generated mouse models, in order to achieve a new level of molecular understanding of how WAT immune cells integrate environmental cues into their crosstalk with organismal metabolism, and to explore the microbial contributions to the molecular etiology of WAT inflammation in the pathogenesis of diet-induced obesity. Specifically, we aim to (a) decipher the global regulatory landscape and interaction networks of WAT hematopoietic cells at the single-cell level, (b) identify new mediators of WAT immune cell contributions to metabolic homeostasis, and (c) decode how host-microbiome communication shapes the development of WAT inflammation and obesity.
Unraveling the principles of WAT immune cell regulation and their amenability to change by host-microbiota interactions may lead to a conceptual leap forward in our understanding of metabolic physiology and disease. Concomitantly, it may generate a platform for microbiome-based personalized therapy against obesity and its complications.
Max ERC Funding
2 000 000 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym ADIPODIF
Project Adipocyte Differentiation and Metabolic Functions in Obesity and Type 2 Diabetes
Researcher (PI) Christian Wolfrum
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), LS6, ERC-2007-StG
Summary Obesity-associated disorders such as T2D, hypertension and CVD, commonly referred to as the “metabolic syndrome”, are prevalent diseases of industrialized societies. Deranged adipose tissue proliferation and differentiation contribute significantly to the development of these metabolic disorders. Comparatively little, however, is known about how these processes influence the development of metabolic disorders. Using a multidisciplinary approach, I plan to elucidate molecular mechanisms underlying the altered adipocyte differentiation and maturation in different models of obesity-associated metabolic disorders. Special emphasis will be given to the analysis of gene expression, post-translational modifications and lipid molecular species composition. To achieve this goal, I am establishing several novel methods to isolate pure primary preadipocytes, including a new animal model that will allow me to monitor preadipocytes in vivo and track their cellular fate in the context of a complete organism. These systems will allow, for the first time, the study of preadipocyte biology in an in vivo setting. By monitoring preadipocyte differentiation in vivo, I will also be able to answer the key questions regarding the development of preadipocytes and examine signals that induce or inhibit their differentiation. Using transplantation techniques, I will elucidate the genetic and environmental contributions to the progression of obesity and its associated metabolic disorders. Furthermore, these studies will integrate a lipidomics approach to systematically analyze lipid molecular species composition in different models of metabolic disorders. My studies will provide new insights into the mechanisms and dynamics underlying adipocyte differentiation and maturation, and relate them to metabolic disorders. Detailed knowledge of these mechanisms will facilitate development of novel therapeutic approaches for the treatment of obesity and associated metabolic disorders.
Max ERC Funding
1 607 105 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym ADNABIOARC
Project From the earliest modern humans to the onset of farming (45,000-4,500 BP): the role of climate, life-style, health, migration and selection in shaping European population history
Researcher (PI) Ron Pinhasi
Host Institution (HI) UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN
Call Details Starting Grant (StG), SH6, ERC-2010-StG_20091209
Summary The colonisation of Europe by anatomically modern humans (AMHs) ca. 45,000 years before present (BP) and the transition to farming ca. 8,000 BP are two major events in human prehistory. Both events involved certain cultural and biological adaptations, technological innovations, and behavioural plasticity which are unique to our species. The reconstruction of these processes and the causality between them has so far remained elusive due to technological, methodological and logistical complexities. Major developments in our understanding of the anthropology of the Upper Palaeolithic, Mesolithic and Neolithic, and advances in ancient DNA (aDNA) technology and chronometric methods now allow us to assess in sufficient resolution the interface between these evolutionary processes, and changes in human culture and behaviour.
The proposed research will investigate the complex interface between the morphological, genetic, behavioural, and cultural factors that shaped the population history of European AMHs. The PI's interdisciplinary expertise in these areas, his access to and experience of relevant skeletal collections, and his ongoing European collaborations will allow significant progress in addressing these fundamental questions. The approach taken will include (a) the collection of bioarchaeological, aDNA, stable isotope (for the analysis of ancient diet) and radiometric data on 500 skeletons from key sites/phases in Europe and western Anatolia, and (b) the application of existing and novel aDNA, bioarchaeological and simulation methodologies. This research will yield results that transform our current understanding of major demographic and evolutionary processes and will place Europe at the forefront of anthropological, biological and genetic research.
Max ERC Funding
1 088 386 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym ADORA
Project Asymptotic approach to spatial and dynamical organizations
Researcher (PI) Benoit PERTHAME
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE1, ERC-2016-ADG
Summary The understanding of spatial, social and dynamical organization of large numbers of agents is presently a fundamental issue in modern science. ADORA focuses on problems motivated by biology because, more than anywhere else, access to abundant and precise data has opened the route to novel and complex biomathematical models. The problems we address are written in terms of nonlinear partial differential equations. The flux-limited Keller-Segel system, the integrate-and-fire Fokker-Planck equation, kinetic equations with internal state, nonlocal parabolic equations and constrained Hamilton-Jacobi equations are among examples of the equations under investigation.
The role of mathematics is not only to understand the analytical structure of these new problems, but it is also to explain the qualitative behavior of solutions and to quantify their properties. The challenge arises here because these goals should be achieved through a hierarchy of scales. Indeed, the problems under consideration share the common feature that the large scale behavior cannot be understood precisely without access to a hierarchy of finer scales, down to the individual behavior and sometimes its molecular determinants.
Major difficulties arise because the numerous scales present in these equations have to be discovered and singularities appear in the asymptotic process which yields deep compactness obstructions. Our vision is that the complexity inherent to models of biology can be enlightened by mathematical analysis and a classification of the possible asymptotic regimes.
However, an enormous effort is needed to uncover the equations' intimate mathematical structures and to bring them to the level of conceptual understanding they deserve, given the applications motivating these questions, which range from medical science and neuroscience to cell biology.
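For orientation, the simplest relative of the first system named above is the classical parabolic-elliptic Keller-Segel model of chemotaxis, quoted here in its standard (unlimited-flux) form rather than the flux-limited variant the project studies:

```latex
\partial_t u \;=\; \Delta u \;-\; \nabla \cdot \big( \chi\, u\, \nabla v \big),
\qquad
-\Delta v \;=\; u,
```

for the cell density \(u\) and chemoattractant concentration \(v\), with sensitivity \(\chi > 0\). In two space dimensions, solutions exist globally for initial mass below a critical threshold and blow up in finite time above it, a prototype of the singularity formation and the hierarchy of scales discussed above.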
Max ERC Funding
2 192 500 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ADULT
Project Analysis of the Dark Universe through Lensing Tomography
Researcher (PI) Hendrik Hoekstra
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary The discoveries that the expansion of the universe is accelerating due to an unknown “dark energy” and that most of the matter is invisible highlight our lack of understanding of the major constituents of the universe. These surprising findings set the stage for research in cosmology at the start of the 21st century. The objective of this proposal is to advance observational constraints to a level where we can distinguish between physical mechanisms that aim to explain the properties of dark energy and the observed distribution of dark matter throughout the universe. We use a relatively new technique called weak gravitational lensing: the accurate measurement of correlations in the orientations of distant galaxies enables us to map the dark matter distribution directly and to extract the cosmological information that is encoded by the large-scale structure.
To study the dark universe we will analyse data from a new state-of-the-art imaging survey: the Kilo-Degree Survey (KiDS) will cover 1500 square degrees in 9 filters. The combination of its large survey area and the availability of exquisite photometric redshifts for the sources makes KiDS the first project that can place interesting constraints on the dark energy equation-of-state using lensing data alone. Combined with complementary results from Planck, our measurements will provide one of the best views of the dark side of the universe before much larger space-based projects commence.
To reach the desired accuracy we need to carefully measure the shapes of distant background galaxies. We also need to account for any intrinsic alignments that arise due to tidal interactions, rather than through lensing. Reducing these observational and physical biases to negligible levels is a necessary step to ensure the success of KiDS and an important part of our preparation for more challenging projects such as the European-led space mission Euclid.
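In outline, and in standard lensing notation not specific to KiDS, the measurement rests on the fact that averaging observed galaxy ellipticities estimates the reduced shear:

```latex
\langle \epsilon_{\mathrm{obs}} \rangle \;\approx\; g \;=\; \frac{\gamma}{1 - \kappa},
```

since intrinsic ellipticities average to zero for randomly oriented sources; here \(\gamma\) is the shear and \(\kappa\) the convergence, i.e. the weighted projected mass density along the line of sight. Intrinsic alignments violate the random-orientation assumption, which is why they must be modelled and removed as described above.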
Max ERC Funding
1 316 880 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym AEROSOL
Project Astrochemistry of old stars:direct probing of unique chemical laboratories
Researcher (PI) Leen Katrien Els Decin
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Consolidator Grant (CoG), PE9, ERC-2014-CoG
Summary The gas and dust in the interstellar medium (ISM) drive the chemical evolution of galaxies, the formation of stars and planets, and the synthesis of complex prebiotic molecules. The prime birth places for this interstellar material are the winds of evolved (super)giant stars. These winds are unique chemical laboratories, in which a large variety of gas and dust species radially expand away from the star.
Recent progress on the observations of these winds has been impressive thanks to Herschel and ALMA. The next challenge is to unravel the wealth of chemical information contained in these data. This is an ambitious task since (1) a plethora of physical and chemical processes interact in a complex way, (2) laboratory data to interpret these interactions are lacking, and (3) theoretical tools to analyse the data do not meet current needs.
To boost the knowledge of the physics and chemistry characterizing these winds, I propose a world-leading multi-disciplinary project combining (1) high-quality data, (2) novel theoretical wind models, and (3) targeted laboratory experiments. The aim is to pinpoint the dominant chemical pathways, unravel the transition from gas-phase to dust species, elucidate the role of clumps on the overall wind structure, and study the reciprocal effect between various dynamical and chemical phenomena.
Now is the right time for this ambitious project thanks to the availability of (1) high-quality multi-wavelength data, including ALMA and Herschel data of the PI, (2) supercomputers enabling a homogeneous analysis of the data using sophisticated theoretical wind models, and (3) novel laboratory equipment to measure the gas-phase reaction rates of key species.
This project will have far-reaching impact on (1) the field of evolved stars, (2) the understanding of the chemical lifecycle of the ISM, (3) chemical studies of dynamically more complex systems, such as exoplanets, protostars, supernovae etc., and (4) it will guide new instrument development.
Max ERC Funding
2 605 897 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym AF and MSOGR
Project Automorphic Forms and Moduli Spaces of Galois Representations
Researcher (PI) Toby Gee
Host Institution (HI) IMPERIAL COLLEGE OF SCIENCE TECHNOLOGY AND MEDICINE
Call Details Starting Grant (StG), PE1, ERC-2012-StG_20111012
Summary I propose to establish a research group to develop completely new tools in order to solve three important problems on the relationships between automorphic forms and Galois representations, which lie at the heart of the Langlands program. The first is to prove Serre’s conjecture for real quadratic fields. I will use automorphic induction to transfer the problem to U(4) over the rational numbers, where I will use automorphy lifting theorems and results on the weight part of Serre’s conjecture that I established in my earlier work to reduce the problem to proving results in small weight and level. I will prove these base cases via integral p-adic Hodge theory and discriminant bounds.
The second is to develop a geometric theory of moduli spaces of mod p and p-adic Galois representations, and to use it to establish the Breuil–Mézard conjecture in arbitrary dimension, by reinterpreting the conjecture in geometric terms. This will transform the subject by building the first connections between the p-adic Langlands program and the geometric Langlands program, providing an entirely new world of techniques for number theorists. As a consequence of the Breuil–Mézard conjecture, I will be able to deduce far stronger automorphy lifting theorems (in arbitrary dimension) than those currently available.
The third is to completely determine the reduction mod p of certain two-dimensional crystalline representations, and as an application prove a strengthened version of the Gouvêa–Mazur conjecture. I will do this by means of explicit computations with the p-adic local Langlands correspondence for GL_2(Q_p), as well as by improving existing arguments which prove multiplicity one theorems via automorphy lifting theorems. This work will show that the existence of counterexamples to the Gouvêa–Mazur conjecture is due to a purely local phenomenon, and that when this local obstruction vanishes, far stronger conjectures of Buzzard on the slopes of the U_p operator hold.
Max ERC Funding
1 131 339 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym AFMIDMOA
Project "Applying Fundamental Mathematics in Discrete Mathematics, Optimization, and Algorithmics"
Researcher (PI) Alexander Schrijver
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Advanced Grant (AdG), PE1, ERC-2013-ADG
Summary "This proposal aims at strengthening the connections between more fundamentally oriented areas of mathematics like algebra, geometry, analysis, and topology, and the more applied oriented and more recently emerging disciplines of discrete mathematics, optimization, and algorithmics.
The overall goal of the project is to obtain, with methods from fundamental mathematics, new effective tools to unravel the complexity of structures like graphs, networks, codes, knots, polynomials, and tensors, and to get a grip on such complex structures by new efficient characterizations, sharper bounds, and faster algorithms.
In the last few years, there have been several new developments where methods from representation theory, invariant theory, algebraic geometry, measure theory, functional analysis, and topology found new applications in discrete mathematics and optimization, both theoretically and algorithmically. Among the typical application areas are networks, coding, routing, timetabling, statistical and quantum physics, and computer science.
The project focuses in particular on:
A. Understanding partition functions with invariant theory and algebraic geometry
B. Graph limits, regularity, Hilbert spaces, and low rank approximation of polynomials
C. Reducing complexity in optimization by exploiting symmetry with representation theory
D. Reducing complexity in discrete optimization by homotopy and cohomology
These research modules are interconnected by themes like symmetry, regularity, and complexity, and by common methods from algebra, analysis, geometry, and topology.
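As a hedged illustration of the central object in module A (a textbook definition, not taken from the proposal text): for a fixed symmetric weight matrix \(A \in \mathbb{R}^{q \times q}\), the partition function of a graph \(G = (V, E)\) is

\[
Z_A(G) \;=\; \sum_{\phi\colon V \to \{1,\dots,q\}} \; \prod_{uv \in E} A_{\phi(u)\phi(v)},
\]

so that, for example, taking \(A = J - I\) (ones off the diagonal, zeros on it) makes \(Z_A(G)\) count the proper \(q\)-colourings of \(G\). Understanding which graph parameters arise this way is where invariant theory and algebraic geometry enter.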
Max ERC Funding
2 001 598 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym AfricanNeo
Project The African Neolithic: A genetic perspective
Researcher (PI) Carina SCHLEBUSCH
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), SH6, ERC-2017-STG
Summary The spread of farming practices in various parts of the world had a marked influence on how humans live today and how we are distributed around the globe. Around 10,000 years ago, warmer conditions led to population increases, coinciding with the invention of farming in several places around the world. Archaeological evidence attests to the spread of these practices to neighboring regions. In many cases this led to whole continents being converted from hunter-gatherer to farming societies. It is however difficult to see from archaeological records whether only the farming culture spread to other places or whether the farming people themselves migrated. Investigating patterns of genetic variation for farming populations and for remaining hunter-gatherer groups can help to resolve questions on population movements co-occurring with the spread of farming practices. It can further shed light on the routes of migration and the dates when migrants arrived.
The spread of farming to Europe has been thoroughly investigated in the fields of archaeology, linguistics and genetics, while on other continents these events have been less investigated. In Africa, mainly linguistic and archaeological studies have attempted to elucidate the spread of farming and herding practices. I propose to investigate the movement of farmer and pastoral groups in Africa, by typing densely spaced genome-wide variant positions in a large number of African populations. The data will be used to infer how farming and pastoralism were introduced to various regions, where the incoming people originated from and when these (potential) population movements occurred. Through this study, the Holocene history of Africa will be revealed and placed into a global context of migration, mobility and cultural transitions. Additionally, the study will give due credence to one of the largest Neolithic expansion events, the Bantu expansion, which caused a pronounced change in the demographic landscape of the African continent.
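As a hedged sketch of a standard first analysis on such genotype data (the numbers are simulated, not project data, and the actual study will use far richer methods): principal component analysis of an individuals-by-variants allele-count matrix typically places admixed populations between their source groups, which is one way population movements leave a detectable genetic signature.

```python
import numpy as np

# Toy illustration: PCA of a genotype matrix (individuals x variants),
# with two simulated source populations and one 50/50 admixed group.

rng = np.random.default_rng(2)
n_snps = 5000
p_a = rng.uniform(0.05, 0.95, n_snps)                 # allele freqs, group A
p_b = np.clip(p_a + rng.normal(0, 0.15, n_snps), 0.01, 0.99)  # drifted group B

def genotypes(freqs, n_ind):
    # Diploid genotypes: 0, 1, or 2 copies of the allele per individual.
    return rng.binomial(2, freqs, size=(n_ind, n_snps)).astype(float)

G = np.vstack([genotypes(p_a, 30),                    # source group A
               genotypes(p_b, 30),                    # source group B
               genotypes(0.5 * (p_a + p_b), 30)])     # admixed group

G -= G.mean(axis=0)                                   # center each variant
G /= G.std(axis=0) + 1e-9                             # normalize each variant
U, S, _ = np.linalg.svd(G, full_matrices=False)
pc = U[:, :2] * S[:2]                                 # top two principal components
print(pc[:5])  # admixed individuals fall between the two source clusters on PC1
```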
Max ERC Funding
1 500 000 €
Duration
Start date: 2017-11-01, End date: 2022-10-31
Project acronym Aftermath
Project THE AFTERMATH OF THE EAST ASIAN WAR OF 1592-1598.
Researcher (PI) Rebekah CLEMENTS
Host Institution (HI) UNIVERSITAT AUTONOMA DE BARCELONA
Call Details Starting Grant (StG), SH6, ERC-2017-STG
Summary Aftermath seeks to understand the legacy of the East Asian War of 1592-1598. This conflict involved over 500,000 combatants from Japan, China, and Korea; up to 100,000 Korean civilians were abducted to Japan. The war caused momentous demographic upheaval and widespread destruction, but also had long-lasting cultural impact as a result of the removal to Japan of Korean technology and skilled labourers. The conflict and its aftermath bear striking parallels to events in East Asia during World War II, and memories of the 16th-century war remain deeply resonant in the region. However, the war and its immediate aftermath are also significant because they occurred at the juncture of periods often characterized as “medieval” and “early modern” in the East Asian case. What were the implications for the social, economic, and cultural contours of early modern East Asia? What can this conflict tell us about war “aftermath” across historical periods, and about such periodization itself? There is little Western scholarship on the war, and few studies in any language cross linguistic, disciplinary, and national boundaries to achieve a regional perspective that reflects the interconnected history of East Asia. Aftermath will radically alter our understanding of the region’s history by providing the first analysis of the state of East Asia as a result of the war. The focus will be on the period up to the middle of the 17th century, but not precluding ongoing effects. The team, with expertise covering Japan, Korea, and China, will investigate three themes: the movement of people and demographic change, the impact on the natural environment, and technological diffusion. The project will be the first large-scale investigation to use Japanese, Korean, and Chinese sources to understand the war’s aftermath. It will broaden understandings of the early modern world, and push the boundaries of war legacy studies by exploring the meanings of “aftermath” in the early modern East Asian context.
Max ERC Funding
1 444 980 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym AGALT
Project Asymptotic Geometric Analysis and Learning Theory
Researcher (PI) Shahar Mendelson
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE1, ERC-2007-StG
Summary In a typical learning problem one tries to approximate an unknown function by a function from a given class using random data, sampled according to an unknown measure. In this project we will be interested in parameters that govern the complexity of a learning problem. It turns out that this complexity is determined by the geometry of certain sets in high dimension that are connected to the given class (random coordinate projections of the class). Thus, one has to understand the structure of these sets as a function of the dimension - which is given by the cardinality of the random sample. The resulting analysis leads to many theoretical questions in Asymptotic Geometric Analysis, Probability (most notably, Empirical Processes Theory) and Combinatorics, which are of independent interest beyond the application to Learning Theory. Our main goal is to describe the role of various complexity parameters involved in a learning problem, to analyze the connections between them and to investigate the way they determine the geometry of the relevant high dimensional sets. Some of the questions we intend to tackle are well known open problems and making progress towards their solution will have a significant theoretical impact. Moreover, this project should lead to a more complete theory of learning and is likely to have some practical impact, for example, in the design of more efficient learning algorithms.
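A minimal sketch of the learning setup described above, under invented parameters: the unknown function is drawn from the class of linear functions, the empirical minimizer is computed from n random samples, and the decay of the excess risk with n is exactly the kind of behaviour the complexity parameters in this project govern.

```python
import numpy as np

# Toy illustration of empirical risk minimization: learn an unknown
# linear function from n random noisy samples and measure how far the
# empirical minimizer is from the target as n grows.

rng = np.random.default_rng(1)
d = 20
w_star = rng.standard_normal(d) / np.sqrt(d)       # unknown target function

def excess_risk(n):
    X = rng.standard_normal((n, d))                # random sample points
    y = X @ w_star + 0.1 * rng.standard_normal(n)  # noisy observations
    w_hat, *_ = np.linalg.lstsq(X, y, rcond=None)  # empirical risk minimizer
    # For isotropic x, the squared parameter distance equals the excess
    # prediction risk of w_hat over w_star.
    return float(np.sum((w_hat - w_star) ** 2))

for n in (50, 200, 800, 3200):
    print(n, excess_risk(n))                       # decays roughly like d/n
```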
Max ERC Funding
750 000 €
Duration
Start date: 2009-03-01, End date: 2014-02-28
Project acronym AGELESS
Project Comparative genomics / ‘wildlife’ transcriptomics uncovers the mechanisms of halted ageing in mammals
Researcher (PI) Emma Teeling
Host Institution (HI) UNIVERSITY COLLEGE DUBLIN, NATIONAL UNIVERSITY OF IRELAND, DUBLIN
Call Details Starting Grant (StG), LS2, ERC-2012-StG_20111109
Summary "Ageing is the gradual and irreversible breakdown of living systems associated with the advancement of time, which leads to an increase in vulnerability and eventual mortality. Despite recent advances in ageing research, the intrinsic complexity of the ageing process has prevented a full understanding of this process, therefore, ageing remains a grand challenge in contemporary biology. In AGELESS, we will tackle this challenge by uncovering the molecular mechanisms of halted ageing in a unique model system, the bats. Bats are the longest-lived mammals relative to their body size, and defy the ‘rate-of-living’ theories as they use twice as much the energy as other species of considerable size, but live far longer. This suggests that bats have some underlying mechanisms that may explain their exceptional longevity. In AGELESS, we will identify the molecular mechanisms that enable mammals to achieve extraordinary longevity, using state-of-the-art comparative genomic methodologies focused on bats. We will identify, using population transcriptomics and telomere/mtDNA genomics, the molecular changes that occur in an ageing wild population of bats to uncover how bats ‘age’ so slowly compared with other mammals. In silico whole genome analyses, field based ageing transcriptomic data, mtDNA and telomeric studies will be integrated and analysed using a networks approach, to ascertain how these systems interact to halt ageing. For the first time, we will be able to utilize the diversity seen within nature to identify key molecular targets and regions that regulate and control ageing in mammals. AGELESS will provide a deeper understanding of the causal mechanisms of ageing, potentially uncovering the crucial molecular pathways that can be modified to halt, alleviate and perhaps even reverse this process in man."
Max ERC Funding
1 499 768 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym Agglomerates
Project Infinite Protein Self-Assembly in Health and Disease
Researcher (PI) Emmanuel Doram LEVY
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), LS2, ERC-2018-COG
Summary Understanding how proteins respond to mutations is of paramount importance to biology and disease. While protein stability and misfolding have been instrumental in rationalizing the impact of mutations, we recently discovered that an alternative route is also frequent, where mutations at the surface of symmetric proteins trigger novel self-interactions that lead to infinite self-assembly. This mechanism can be involved in disease, as in sickle-cell anemia, but may also serve in adaptation. Importantly, it differs fundamentally from aggregation, because misfolding does not drive it. Thus, we term it “agglomeration”. The ease with which agglomeration can occur, even by single point mutations, shifts the paradigm of how quickly new protein assemblies can emerge, both in health and disease. This prompts us to determine the basic principles of protein agglomeration and explore its implications in cell physiology and human disease.
We propose an interdisciplinary research program bridging atomic and cellular scales to explore agglomeration in three aims: (i) Map the landscape of protein agglomeration in response to mutation in endogenous yeast proteins; (ii) Characterize how yeast physiology impacts agglomeration by changes in gene expression or cell state, and, conversely, how protein agglomerates impact yeast fitness. (iii) Analyze agglomeration in relation to human disease via two approaches. First, by predicting single nucleotide polymorphisms that trigger agglomeration, prioritizing them using knowledge from Aims 1 & 2, and characterizing them experimentally. Second, by providing a proof-of-concept that agglomeration can be exploited in drug design, whereby drugs induce agglomerate formation, much as mutations do.
Overall, through this research, we aim to establish agglomeration as a paradigm for protein assembly, with implications for our understanding of evolution, physiology, and disease.
Max ERC Funding
2 574 819 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym AgricUrb
Project The Agricultural Origins of Urban Civilization
Researcher (PI) Amy Marie Bogaard
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Starting Grant (StG), SH6, ERC-2012-StG_20111124
Summary The establishment of farming is a pivotal moment in human history, setting the stage for the emergence of class-based society and urbanization. Monolithic views of the nature and development of early agriculture, however, have prevented clear understanding of how exactly farming fuelled, shaped and sustained the emergence of complex societies. A breakthrough in archaeological approach is needed to determine the actual roles of farming in the emergence of social complexity. The methodology required must push beyond conventional interpretation of the most direct farming evidence – archaeobotanical remains of crops and associated arable weeds – to reconstruct not only what crops were grown, but also how, where and why farming was practised. Addressing these related aspects, in contexts ranging from early agricultural villages to some of the world’s earliest cities, would provide the key to unraveling the contribution of farming to the development of lasting social inequalities. The research proposed here takes a new interdisciplinary approach combining archaeobotany, plant stable isotope chemistry and functional plant ecology, building on groundwork laid in previous research by the applicant. These approaches will be applied to two relatively well researched areas, western Asia and Europe, where a series of sites that chart multiple pathways to early complex societies offer rich plant and other bioarchaeological assemblages. The proposed project will set a wholly new standard of insight into early farming and its relationship with early civilization, facilitating similar approaches in other parts of the world and the construction of comparative perspectives on the global significance of early agriculture in social development.
Max ERC Funding
1 199 647 €
Duration
Start date: 2013-02-01, End date: 2017-01-31
Project acronym AGRIWESTMED
Project Origins and spread of agriculture in the south-western Mediterranean region
Researcher (PI) Maria Leonor Peña Chocarro
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Advanced Grant (AdG), SH6, ERC-2008-AdG
Summary This project focuses on one of the most fascinating events in the long history of the human species: the origins and spread of agriculture. Research over the past 40 years has provided an invaluable dataset on crop domestication and the spread of agriculture into Europe. However, despite the enormous advances in research there are important areas that remain almost unexplored, some of immense interest. This is the case of the western Mediterranean region, where our knowledge is still limited (Iberian Peninsula) or almost nonexistent (northern Morocco). The last few years have witnessed a considerable increase in archaeobotanical research, and the efforts of a group of Spanish researchers working together on different aspects of agriculture have started to produce the first results. My proposal will approach the study of the arrival of agriculture in the western Mediterranean by exploring different interrelated research areas. The project involves the application of different techniques (analysis of charred plant remains, pollen and non-pollen microfossils, phytoliths, micro-wear analyses, isotopes, soil micromorphology, genetics, and ethnoarchaeology) which will help to define the emergence and spread of agriculture in the area, its likely place of origin, its main technological attributes, as well as the range of crop husbandry practices carried out. The interaction between the different approaches and methodologies involved will allow a greater understanding of the type of agriculture that characterized the first farming communities in the most south-western part of Europe.
Max ERC Funding
1 545 169 €
Duration
Start date: 2009-04-01, End date: 2013-03-31
Project acronym AHRIMMUNITY
Project The influence of Aryl hydrocarbon receptor ligands on protective and pathological immune responses
Researcher (PI) Brigitta Stockinger
Host Institution (HI) MEDICAL RESEARCH COUNCIL
Call Details Advanced Grant (AdG), LS6, ERC-2008-AdG
Summary The Aryl hydrocarbon receptor (AhR) is an evolutionarily conserved, widely expressed transcription factor that mediates the toxicity of a substantial variety of exogenous toxins, but is also stimulated by endogenous physiological ligands. While it is known that this receptor mediates the toxicity of dioxin, this is unlikely to be its physiological function. We have recently identified selective expression of AhR in the Th17 subset of effector CD4 T cells. Ligation of AhR by a candidate endogenous ligand (FICZ), a UV metabolite of tryptophan, causes expansion of Th17 cells and the induction of IL-22 production. As a consequence, AhR ligation will exacerbate autoimmune diseases such as experimental autoimmune encephalomyelitis. Little is known so far about the impact of AhR ligands on IL-17/IL-22-mediated immune defense functions. IL-22 is considered a pro-inflammatory Th17 cytokine, which is involved in the etiology of psoriasis, but it has also been shown to be a survival factor for epithelial cells. AhR is polymorphic and defined as a high- or low-affinity receptor for dioxin, leading to the classification of high- and low-responder mouse strains based on defined mutations. Similar polymorphisms exist in humans, and although human AhR is on the whole thought to be of low affinity, mutations that confer high-responder status have been identified. No correlations have yet been made with Th17-mediated immune responses in mice and humans. This study aims to investigate the role of AhR ligands and polymorphisms in autoimmunity as well as in protective immune responses, using both mouse models and human samples from normal controls as well as psoriasis patients.
Max ERC Funding
1 242 352 €
Duration
Start date: 2009-02-01, End date: 2014-01-31
Project acronym AI4REASON
Project Artificial Intelligence for Large-Scale Computer-Assisted Reasoning
Researcher (PI) Josef Urban
Host Institution (HI) CESKE VYSOKE UCENI TECHNICKE V PRAZE
Call Details Consolidator Grant (CoG), PE6, ERC-2014-CoG
Summary The goal of the AI4REASON project is a breakthrough in what is considered a very hard problem in AI and automation of reasoning, namely the problem of automatically proving theorems in large and complex theories. Such complex formal theories arise in projects aimed at verification of today's advanced mathematics such as the Formal Proof of the Kepler Conjecture (Flyspeck), verification of software and hardware designs such as the seL4 operating system kernel, and verification of other advanced systems and technologies on which today's information society critically depends.
Designing an explicitly programmed solution to this problem appears prohibitively complex and unlikely to succeed. However, we have recently demonstrated that the performance of existing approaches can be multiplied by data-driven AI methods that learn reasoning guidance from large proof corpora. The breakthrough will be achieved by developing such novel AI methods. First, we will devise suitable Automated Reasoning and Machine Learning methods that learn reasoning knowledge and steer the reasoning processes at various levels of granularity. Second, we will combine them into autonomous self-improving AI systems that interleave deduction and learning in positive feedback loops. Third, we will develop approaches that aggregate reasoning knowledge across many formal, semi-formal and informal corpora and deploy the methods as strong automation services for the formal proof community.
The expected outcome is our ability to prove automatically at least 50% more theorems in high-assurance projects such as Flyspeck and seL4, bringing a major breakthrough in formal reasoning and verification. As an AI effort, the project offers a unique path to large-scale semantic AI. The formal corpora concentrate centuries of deep human thinking in a computer-understandable form on which deductive and inductive AI can be combined and co-evolved, providing new insights into how humans do mathematics and science.
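As a hedged illustration of what learning reasoning guidance from proof corpora can mean in its simplest form (a generic baseline, not the project's method; the formula strings are invented placeholders), one can rank known facts by TF-IDF similarity of their symbols to a new conjecture, a step usually called premise selection.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy premise selection: rank known premises by the overlap of their
# symbols with a new conjecture. The formulas below are invented.

premises = [
    "add_comm : a + b = b + a",
    "mul_comm : a * b = b * a",
    "add_assoc : (a + b) + c = a + (b + c)",
    "le_trans : a <= b -> b <= c -> a <= c",
]
conjecture = "b + (a + c) = (c + b) + a"

vec = TfidfVectorizer(token_pattern=r"\S+")      # treat each symbol as a token
P = vec.fit_transform(premises)
q = vec.transform([conjecture])
scores = cosine_similarity(q, P).ravel()

for s, p in sorted(zip(scores, premises), reverse=True):
    print(f"{s:.2f}  {p}")   # commutativity/associativity lemmas rank first
```

Real systems replace this bag-of-symbols ranking with learned guidance inside the prover's search loop, but the input-output contract (conjecture in, ranked relevant facts out) is the same.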
Max ERC Funding
1 499 500 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym AIDA
Project An Illumination of the Dark Ages: modeling reionization and interpreting observations
Researcher (PI) Andrei Albert Mesinger
Host Institution (HI) SCUOLA NORMALE SUPERIORE
Call Details Starting Grant (StG), PE9, ERC-2014-STG
Summary "Understanding the dawn of the first galaxies and how their light permeated the early Universe is at the very frontier of modern astrophysical cosmology. Generous resources, including ambitions observational programs, are being devoted to studying these epochs of Cosmic Dawn (CD) and Reionization (EoR). In order to interpret these observations, we propose to build on our widely-used, semi-numeric simulation tool, 21cmFAST, and apply it to observations. Using sub-grid, semi-analytic models, we will incorporate additional physical processes governing the evolution of sources and sinks of ionizing photons. The resulting state-of-the-art simulations will be well poised to interpret topical observations of quasar spectra and the cosmic 21cm signal. They would be both physically-motivated and fast, allowing us to rapidly explore astrophysical parameter space. We will statistically quantify the resulting degeneracies and constraints, providing a robust answer to the question, ""What can we learn from EoR/CD observations?"" As an end goal, these investigations will help us understand when the first generations of galaxies formed, how they drove the EoR, and what are the associated large-scale observational signatures."
Max ERC Funding
1 468 750 €
Duration
Start date: 2015-05-01, End date: 2021-01-31
Project acronym AIM2 INFLAMMASOME
Project Cytosolic recognition of foreign nucleic acids: Molecular and functional characterization of AIM2, a central player in DNA-triggered inflammasome activation
Researcher (PI) Veit Hornung
Host Institution (HI) UNIVERSITAETSKLINIKUM BONN
Call Details Starting Grant (StG), LS6, ERC-2009-StG
Summary Host cytokines, chemokines and type I IFNs are critical effectors of the innate immune response to viral and bacterial pathogens. Several classes of germ-line encoded pattern recognition receptors have been identified, which sense non-self nucleic acids and trigger these responses. Recently NLRP-3, a member of the NOD-like receptor (NLR) family, has been shown to sense endogenous danger signals, environmental insults and the DNA viruses adenovirus and HSV. Activation of NLRP-3 induces the formation of a large multiprotein complex in cells termed the inflammasome, which controls the activity of pro-caspase-1 and the maturation of pro-IL-1β and pro-IL-18 into their active forms. NLRP-3, however, does not regulate these responses to double-stranded cytosolic DNA. We identified the cytosolic protein AIM2 as the missing receptor for cytosolic DNA. AIM2 contains a HIN200 domain, which binds to DNA, and a pyrin domain, which associates with the adapter molecule ASC to activate both NF-κB and caspase-1. Knockdown of AIM2 downregulates caspase-1-mediated IL-1β responses following DNA stimulation or vaccinia virus infection. Collectively, these observations demonstrate that AIM2 forms an inflammasome with the DNA ligand and ASC to activate caspase-1. Our underlying hypothesis for this proposal is that AIM2 plays a central role in host defence against cytosolic microbial pathogens and also in DNA-triggered autoimmunity. The goals of this research proposal are to further characterize the DNA ligand for AIM2, to explore the molecular mechanisms of AIM2 activation, to define the contribution of AIM2 to host defence against viral and bacterial pathogens, and to assess its function in nucleic acid-triggered autoimmune disease. The characterization of AIM2 and its role in innate immunity could open new avenues in the advancement of immunotherapy and treatment of autoimmune disease.
Max ERC Funding
1 727 920 €
Duration
Start date: 2009-12-01, End date: 2014-11-30
Project acronym ALBUGON
Project Genomics and effectoromics to understand defence suppression and disease resistance in Arabidopsis-Albugo candida interactions
Researcher (PI) Jonathan Jones
Host Institution (HI) THE SAINSBURY LABORATORY
Call Details Advanced Grant (AdG), LS6, ERC-2008-AdG
Summary This project focuses on two questions about host/parasite interactions: how do biotrophic plant pathogens suppress host defence? And what is the basis for pathogen specialization on specific host species? A broadly accepted model explains resistance and susceptibility to plant pathogens. First, pathogens make conserved molecules ("PAMPs") such as flagellin, which plants detect via cell surface receptors, leading to PAMP-Triggered Immunity (PTI). Second, pathogens make effectors that suppress PTI. Third, plants carry hundreds of Resistance (R) genes that detect an effector and activate Effector-Triggered Immunity (ETI). One effector is sufficient to trigger resistance. Albugo candida (Ac) (white rust) strongly suppresses host defence; Ac-infected Arabidopsis are susceptible to pathogen races to which they are otherwise resistant. Ac is an oomycete, not a fungus. Arabidopsis is resistant to races of Ac that infect brassicas. The proposed project involves three programs. First (genomics, transcriptomics and bioinformatics), we will use next-generation sequencing (NGS) methods (Solexa and GS-Flex) and novel transcriptomics methods to define the genome sequence and effector set of three Ac strains, as well as carrying out >40-deep resequencing of 7 additional Ac strains. Second (effectoromics), we will carry out functional assays using Effector Detector Vectors (Sohn, Plant Cell 19:4077 [2007]) with the set of Ac effectors, screening for enhanced virulence, for suppression of defence, for effectors that are recognized by R genes in disease-resistant Arabidopsis, and for host effector targets. Third (resistance diversity), we will characterize Arabidopsis germplasm for R genes to Ac, both for recognition of Arabidopsis strains of Ac and for recognition in Arabidopsis of effectors from Ac strains that infect brassica. This proposal focuses on Ac, but will establish methods that could discover new R genes in non-hosts against many plant diseases.
Max ERC Funding
2 498 923 €
Duration
Start date: 2009-01-01, End date: 2014-06-30
Project acronym ALERT
Project ALERT - The Apertif-LOFAR Exploration of the Radio Transient Sky
Researcher (PI) Albert Van Leeuwen
Host Institution (HI) STICHTING ASTRON, NETHERLANDS INSTITUTE FOR RADIO ASTRONOMY
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary "In our largely unchanging radio Universe, a highly dynamic component was recently discovered: flashes of bright radio emission that last only milliseconds but appear all over the sky. Some of these radio bursts can be traced to intermittently pulsating neutron stars. Other bursts however, apparently originate far outside our Galaxy. Due to great observational challenges, the evolution of the neutron stars is not understood, while more importantly, the nature of the extragalactic bursts remains an outright mystery.
My overall aim is to understand the physics that drives both kinds of brief and luminous bursts.
My primary goal is to identify the highly compact astrophysical explosions powering the extragalactic bursts. My previous surveys are the state of the art in fast-transient detection; I will now increase this exploration volume by a factor of 10. In real time, I will provide arcsecond positions, 10,000-fold more accurate than currently possible, to localize such extragalactic bursts for the first time and understand their origin.
My secondary goal is to unravel the unexplained evolution of intermittently pulsating neutron stars (building on e.g., my recent papers in Science, 2013), by doubling their number and modeling their population.
To achieve these goals, I will carry out a highly innovative survey: the Apertif-LOFAR Exploration of the Radio Transient Sky. ALERT is over an order of magnitude more sensitive than all current state-of-the-art fast-transient surveys.
Through its novel, extremely wide field of view, Westerbork/Apertif will detect many tens of extragalactic bursts. Through real-time triggers to LOFAR, I will then provide the precise localisation that is essential for radio, optical and high-energy follow-up to shed light, for the first time, on the physics and objects driving these bursts: evaporating primordial black holes, explosions in host galaxies, or the unknown?
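Why millisecond bursts are an observational challenge is worth one formula: interstellar dispersion smears a burst's arrival time across the observing band, and any fast-transient survey must undo this delay before detection. A minimal sketch of the standard cold-plasma delay (the dispersion measure and band edges below are illustrative assumptions, not ALERT parameters):

```python
K_DM = 4.149  # ms GHz^2 pc^-1 cm^3, standard cold-plasma dispersion constant

def dispersion_delay_ms(dm, f_low_ghz, f_high_ghz):
    """Extra arrival delay of the low-frequency band edge relative to the
    high edge, for a burst with dispersion measure `dm` in pc cm^-3."""
    return K_DM * dm * (f_low_ghz**-2 - f_high_ghz**-2)

# An arbitrary extragalactic-scale DM across an assumed 1.22-1.52 GHz band:
print(dispersion_delay_ms(dm=500, f_low_ghz=1.22, f_high_ghz=1.52))  # ~496 ms
```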
Max ERC Funding
1 999 823 €
Duration
Start date: 2014-12-01, End date: 2019-11-30
Project acronym ALEXANDRIA
Project "Foundations for Temporal Retrieval, Exploration and Analytics in Web Archives"
Researcher (PI) Wolfgang Nejdl
Host Institution (HI) GOTTFRIED WILHELM LEIBNIZ UNIVERSITAET HANNOVER
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Significant parts of our cultural heritage are produced on the Web, yet only insufficient opportunities exist for accessing and exploring the past of the Web. The ALEXANDRIA project aims to develop models, tools and techniques necessary to archive and index relevant parts of the Web, and to retrieve and explore this information in a meaningful way. While the easy accessibility to the current Web is a good baseline, optimal access to Web archives requires new models and algorithms for retrieval, exploration, and analytics which go far beyond what is needed to access the current state of the Web. This includes taking into account the unique temporal dimension of Web archives, structured semantic information already available on the Web, as well as social media and network information.
Within ALEXANDRIA, we will significantly advance semantic and time-based indexing for Web archives using human-compiled knowledge available on the Web, to efficiently index, retrieve and explore information about entities and events from the past. In doing so, we will focus on the concurrent evolution of this knowledge and the Web content to be indexed, and take into account the diversity and incompleteness of this knowledge. We will further investigate mixed crowd- and machine-based Web analytics to support long-running and collaborative retrieval and analysis processes on Web archives. Use of implicit human feedback will be essential to provide better indexing through insights gained during the analysis process and to better focus the harvesting of content.
The ALEXANDRIA Testbed will provide an important context for research, exploration and evaluation of the concepts, methods and algorithms developed in this project, and will provide both relevant collections and algorithms that enable further research on, and practical application of, our research results to existing archives like the Internet Archive, the Internet Memory Foundation, and Web archives maintained by European national libraries.
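To make the unique temporal dimension of Web archives concrete, here is a minimal sketch, assuming a toy data model of my own rather than ALEXANDRIA's design, of a time-aware inverted index in which each posting carries the interval of crawls during which the page contained the term:

```python
from collections import defaultdict

class TemporalIndex:
    """Toy time-aware inverted index: each posting is (doc, start, end),
    the crawl interval during which `doc` contained `term`."""
    def __init__(self):
        self.postings = defaultdict(list)

    def add(self, term, doc, start, end):
        self.postings[term].append((doc, start, end))

    def search(self, term, as_of):
        """Documents containing `term` in the archive snapshot at `as_of`."""
        return [doc for doc, s, e in self.postings[term] if s <= as_of <= e]

idx = TemporalIndex()
idx.add("olympics", "example.org/news", 2004, 2005)
idx.add("olympics", "example.org/news", 2008, 2009)
print(idx.search("olympics", as_of=2008))   # ['example.org/news']
```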
Max ERC Funding
2 493 600 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ALEXANDRIA
Project Large-Scale Formal Proof for the Working Mathematician
Researcher (PI) Lawrence PAULSON
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Mathematical proofs have always been prone to error. Today, proofs can be hundreds of pages long and combine results from many specialisms, making them almost impossible to check. One solution is to deploy modern verification technology. Interactive theorem provers have demonstrated their potential as vehicles for formalising mathematics through achievements such as the verification of the Kepler Conjecture. Proofs done using such tools reach a high standard of correctness.
However, existing theorem provers are unsuitable for mathematics. Their formal proofs are unreadable. They struggle to do simple tasks, such as evaluating limits. They lack much basic mathematics, and the material they do have is difficult to locate and apply.
ALEXANDRIA will create a proof development environment attractive to working mathematicians, utilising the best technology available across computer science. Its focus will be the management and use of large-scale mathematical knowledge, both theorems and algorithms. The project will employ mathematicians to investigate the formalisation of mathematics in practice. Our already substantial formalised libraries will serve as the starting point. They will be extended and annotated to support sophisticated searches. Techniques will be borrowed from machine learning, information retrieval and natural language processing. Algorithms will be treated similarly: ALEXANDRIA will help users find and invoke the proof methods and algorithms appropriate for the task.
ALEXANDRIA will provide (1) comprehensive formal mathematical libraries; (2) search within libraries, and the mining of libraries for proof patterns; (3) automated support for the construction of large formal proofs; (4) sound and practical computer algebra tools.
ALEXANDRIA will be based on legible structured proofs. Formal proofs should be not mere code, but a machine-checkable form of communication between mathematicians.
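The closing claim, that a formal proof can serve as machine-checkable communication, is easiest to appreciate on a toy example. The snippet below is written in Lean purely as an illustration of the genre, not as the project's tooling; the proof assistant accepts the script only if every step is formally justified.

```lean
-- A tiny machine-checked proof: the sum of two even naturals is even.
theorem even_add_even (m n : Nat)
    (hm : ∃ a, m = 2 * a) (hn : ∃ b, n = 2 * b) :
    ∃ c, m + n = 2 * c :=
  match hm, hn with
  | ⟨a, ha⟩, ⟨b, hb⟩ => ⟨a + b, by rw [ha, hb, Nat.mul_add]⟩
```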
Max ERC Funding
2 430 140 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ALFA
Project Shaping a European Scientific Scene : Alfonsine Astronomy
Researcher (PI) Matthieu Husson
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), SH6, ERC-2016-COG
Summary Alfonsine astronomy is arguably among the first European scientific achievements. It shaped a scene for actors like Regiomontanus or Copernicus. There is however little detailed historical analysis encompassing its development in its full breadth. ALFA addresses this issue by studying tables, instruments, mathematical and theoretical texts in a methodologically innovative way relying on approaches from the history of manuscript cultures, history of mathematics, and history of astronomy.
ALFA integrates these approaches not only to benefit from different perspectives but also to build new questions from their interactions. For instance the analysis of mathematical practices in astral sciences manuscripts induces new ways to analyse the documents and to think about astronomical questions.
Relying on these approaches the main objectives of ALFA are thus to:
- Retrace the development of the corpus of Alfonsine texts from its origin in the second half of the 13th century to the end of the 15th century by following, on the manuscript level, the milieus fostering it;
- Analyse the Alfonsine astronomers’ practices, their relations to mathematics, to the natural world, to proofs and justification, their intellectual context and audiences;
- Build a meaningful narrative showing how astronomers in different milieus with diverse practices shaped, also from Arabic materials, an original scientific scene in Europe.
ALFA will shed new light on the intellectual history of the late medieval period as a whole and produce a better understanding of its relations to related scientific periods in Europe and beyond. It will also produce methodological breakthroughs affecting the way the history of knowledge is practiced outside the field of ancient and medieval sciences. Efforts will be devoted to bringing these results not only to the relevant scholarly communities but also to a wider audience, as a resource in public debates around science, knowledge and culture.
Max ERC Funding
1 871 250 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ALGAME
Project Algorithms, Games, Mechanisms, and the Price of Anarchy
Researcher (PI) Elias Koutsoupias
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE6, ERC-2012-ADG_20120216
Summary The objective of this proposal is to bring together a local team of young researchers who will work closely with international collaborators to advance the state of the art of Algorithmic Game Theory and open new venues of research at the interface of Computer Science, Game Theory, and Economics. The proposal consists mainly of three intertwined research strands: algorithmic mechanism design, price of anarchy, and online algorithms.
Specifically, we will attempt to resolve some outstanding open problems in algorithmic mechanism design: characterizing the incentive-compatible mechanisms for important domains, such as the domain of combinatorial auctions, and resolving the approximation ratio of mechanisms for scheduling unrelated machines. More generally, we will study centralized and distributed algorithms whose inputs are controlled by selfish agents that are interested in the outcome of the computation. We will investigate new notions of mechanisms with strong truthfulness and limited susceptibility to externalities that can facilitate the modular design of mechanisms for complex domains.
We will expand the current research on the price of anarchy to time-dependent games where the players can select not only how to act but also when to act. We also plan to resolve outstanding questions on the price of stability and to build a robust approach to these questions, similar to smooth analysis. For repeated games, we will investigate convergence of simple strategies (e.g., fictitious play), online fairness, and strategic considerations (e.g., metagames). More generally, our aim is to find a productive formulation of playing unknown games by drawing on the fields of online algorithms and machine learning.
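The price of anarchy admits a two-line worked example, Pigou's network: unit traffic over two parallel links whose costs are x (the link's own load) and 1. A minimal Python check, included here only to fix the concept, not as project code:

```python
# Pigou's two-link network: unit traffic; link 1 costs x (its load), link 2
# costs 1. Selfish traffic all takes link 1; the social optimum splits it.
def social_cost(x):                 # x = fraction of traffic on link 1
    return x * x + (1 - x) * 1.0    # load-weighted total latency

nash = social_cost(1.0)             # equilibrium: link 1 never costs more than 1
opt = min(social_cost(i / 1000) for i in range(1001))   # ~0.75 at x = 0.5
print(f"price of anarchy = {nash / opt:.3f}")           # ~1.333 = 4/3
```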
Max ERC Funding
2 461 000 €
Duration
Start date: 2013-04-01, End date: 2019-03-31
Project acronym ALGILE
Project Foundations of Algebraic and Dynamic Data Management Systems
Researcher (PI) Christoph Koch
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "Contemporary database query languages are ultimately founded on logic and feature an additive operation – usually a form of (multi)set union or disjunction – that is asymmetric in that additions or updates do not always have an inverse. This asymmetry puts a greater part of the machinery of abstract algebra for equation solving outside the reach of databases. However, such equation solving would be a key functionality that problems such as query equivalence testing and data integration could be reduced to: In the current scenario of the presence of an asymmetric additive operation they are undecidable. Moreover, query languages with a symmetric additive operation (i.e., which has an inverse and is thus based on ring theory) would open up databases for a large range of new scientific and mathematical applications.
The goal of the proposed project is to reinvent database management systems with a foundation in abstract algebra and specifically in ring theory. The presence of an additive inverse allows us to cleanly define differences between queries. This gives rise to a database analog of differential calculus that leads to radically new incremental and adaptive query evaluation algorithms which substantially outperform state-of-the-art techniques. These algorithms enable a new class of systems which I call Dynamic Data Management Systems. Such systems can maintain continuously fresh query views at extremely high update rates and have important applications in interactive large-scale data analysis. There is a natural connection between differences and updates, motivating the group-theoretic study of updates that will lead to better ways of creating out-of-core data processing algorithms for new storage devices. Basing queries on ring theory leads to a new class of systems, Algebraic Data Management Systems, which herald a convergence of database systems and computer algebra systems.
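The role of the additive inverse can be made concrete: if tuple multiplicities may be negative, a deletion is just an update of weight -1, and a join view can be maintained by the delta rule d(R ⋈ S) = dR ⋈ S + R ⋈ dS + dR ⋈ dS. A minimal sketch of this idea over integer-weighted relations, using a toy encoding of my own rather than the project's system:

```python
from collections import Counter

def join(R, S):
    """Equi-join of weighted relations keyed on the first field; weights multiply."""
    out = Counter()
    for (k, a), m in R.items():
        for (k2, b), n in S.items():
            if k == k2:
                out[(k, a, b)] += m * n
    return out

def apply_delta(V, dV):
    """Ring-style update: add weights, drop exact zeros (deletes have weight -1)."""
    for t, m in dV.items():
        V[t] += m
        if V[t] == 0:
            del V[t]

R = Counter({(1, "a"): 1})
S = Counter({(1, "x"): 1})
V = join(R, S)                                 # materialized view of R join S

dR = Counter({(1, "a"): -1, (1, "b"): 1})      # delete (1,a), insert (1,b)
dS = Counter()                                 # S unchanged
dV = Counter()                                 # delta rule, summed termwise
for term in (join(dR, S), join(R, dS), join(dR, dS)):
    for t, m in term.items():
        dV[t] += m
apply_delta(V, dV)
print(V)                                       # Counter({(1, 'b', 'x'): 1})
```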
Max ERC Funding
1 480 548 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym ALGOCom
Project Novel Algorithmic Techniques through the Lens of Combinatorics
Researcher (PI) Parinya Chalermsook
Host Institution (HI) AALTO KORKEAKOULUSAATIO SR
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary Real-world optimization problems pose major challenges to algorithmic research. For instance, (i) many important problems are believed to be intractable (i.e. NP-hard) and (ii) with the growth of data size, modern applications often require a decision making under {\em incomplete and dynamically changing input data}. After several decades of research, central problems in these domains have remained poorly understood (e.g. Is there an asymptotically most efficient binary search trees?) Existing algorithmic techniques either reach their limitation or are inherently tailored to special cases.
This project attempts to close this gap in the state of the art and seeks new interplay across multiple areas of algorithms, such as approximation algorithms, online algorithms, fixed-parameter tractable (FPT) algorithms, exponential-time algorithms, and data structures. We propose new directions from the structural perspective, connecting the aforementioned algorithmic problems to basic questions in combinatorics.
Our approaches fall into three broad schemes: (i) new structural theory, (ii) intermediate problems, and (iii) transfer of techniques. These directions partially build on the PI's successes in resolving more than ten classical problems in this context.
Resolving the proposed problems will likely revolutionize our understanding of algorithms and data structures and potentially unify techniques across multiple algorithmic regimes. Any progress is, in fact, already a significant contribution to the algorithms community. We suggest concrete intermediate goals that are of independent interest and carry lower risk, making them suitable for Ph.D. students.
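For orientation, the parenthetical question above concerns dynamic optimality; its classical static counterpart, the optimal binary search tree for known access frequencies, is computable by a textbook O(n³) dynamic program, sketched below with illustrative frequencies (not project code):

```python
def optimal_bst_cost(freq):
    """Minimal expected comparisons for a static BST over keys 0..n-1,
    where freq[i] is the access probability of key i (textbook DP)."""
    n = len(freq)
    pref = [0.0] * (n + 1)                     # prefix sums of frequencies
    for i, f in enumerate(freq):
        pref[i + 1] = pref[i] + f
    # cost[i][j] = best cost for the half-open key range i..j-1
    cost = [[0.0] * (n + 1) for _ in range(n + 1)]
    for length in range(1, n + 1):
        for i in range(n - length + 1):
            j = i + length
            w = pref[j] - pref[i]              # every key in range adds one level
            cost[i][j] = w + min(cost[i][r] + cost[r + 1][j] for r in range(i, j))
    return cost[0][n]

print(optimal_bst_cost([0.5, 0.3, 0.2]))       # expected depth of the best tree
```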
Max ERC Funding
1 411 258 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym AlgoRNN
Project Recurrent Neural Networks and Related Machines That Learn Algorithms
Researcher (PI) Juergen Schmidhuber
Host Institution (HI) UNIVERSITA DELLA SVIZZERA ITALIANA
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Recurrent neural networks (RNNs) are general parallel-sequential computers. Some learn their programs or weights. Our supervised Long Short-Term Memory (LSTM) RNNs were the first to win pattern recognition contests, and recently enabled best known results in speech and handwriting recognition, machine translation, etc. They are now available to billions of users through the world's most valuable public companies including Google and Apple. Nevertheless, in lots of real-world tasks RNNs do not yet live up to their full potential. Although universal in theory, in practice they fail to learn important types of algorithms. This ERC project will go far beyond today's best RNNs through novel RNN-like systems that address some of the biggest open RNN problems and hottest RNN research topics: (1) How can RNNs learn to control (through internal spotlights of attention) separate large short-memory structures such as sub-networks with fast weights, to improve performance on many natural short-term memory-intensive tasks which are currently hard to learn by RNNs, such as answering detailed questions on recently observed videos? (2) How can such RNN-like systems metalearn entire learning algorithms that outperform the original learning algorithms? (3) How to achieve efficient transfer learning from one RNN-learned set of problem-solving programs to new RNN programs solving new tasks? In other words, how can one RNN-like system actively learn to exploit algorithmic information contained in the programs running on another? We will test our systems existing benchmarks, and create new, more challenging multi-task benchmarks. This will be supported by a rather cheap, GPU-based mini-brain for implementing large RNNs.
Max ERC Funding
2 500 000 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ALGSTRONGCRYPTO
Project Algebraic Methods for Stronger Crypto
Researcher (PI) Ronald John Fitzgerald CRAMER
Host Institution (HI) STICHTING NEDERLANDSE WETENSCHAPPELIJK ONDERZOEK INSTITUTEN
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Our field is cryptology. Our overarching objective is to advance significantly the frontiers in
design and analysis of high-security cryptography for the future generation.
Particularly, we wish to enhance the efficiency, functionality, and, last-but-not-least, fundamental understanding of cryptographic security against very powerful adversaries.
Our approach here is to develop completely novel methods by
deepening, strengthening and broadening the
algebraic foundations of the field.
Concretely, our lens builds on
the arithmetic codex. This is a general, abstract cryptographic primitive whose basic theory we recently developed and whose asymptotic part, which relies on algebraic geometry, enjoys crucial applications in surprising foundational results on constant communication-rate two-party cryptography. A codex is a linear (error correcting) code that, when endowing its ambient vector space just with coordinate-wise multiplication, can be viewed as simulating, up to some degree, richer arithmetical structures such as finite fields (or products thereof), or generally, finite-dimensional algebras over finite fields. Besides this degree, coordinate-localities for which simulation holds and for which it does not at all are also captured.
Our method is based on novel perspectives on codices which significantly
widen their scope and strengthen their utility. Particularly, we bring
symmetries, computational- and complexity theoretic aspects, and connections with algebraic number theory, -geometry, and -combinatorics into play in novel ways. Our applications range from public-key cryptography to secure multi-party computation.
Our proposal is subdivided into 3 interconnected modules:
(1) Algebraic- and Number Theoretical Cryptanalysis
(2) Construction of Algebraic Crypto Primitives
(3) Advanced Theory of Arithmetic Codices
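The coordinate-wise multiplication at the heart of the codex definition is already visible in Reed–Solomon codes: the pointwise product of two codewords encoding polynomials of degree < t is a codeword encoding their product, of degree < 2t-1, which is what lets a linear code "simulate" field multiplication. A toy check over GF(97), my own illustration rather than the proposal's construction:

```python
P = 97                       # a small prime field GF(97)
XS = list(range(1, 8))       # evaluation points (the code's coordinates)

def encode(coeffs):
    """Reed-Solomon-style codeword: evaluations of the polynomial at XS."""
    return [sum(c * pow(x, k, P) for k, c in enumerate(coeffs)) % P
            for x in XS]

f = [3, 5]                                   # f(X) = 3 + 5X
g = [2, 7]                                   # g(X) = 2 + 7X
fg = [6, 31, 35]                             # f*g = 6 + 31X + 35X^2

pointwise = [(a * b) % P for a, b in zip(encode(f), encode(g))]
assert pointwise == encode(fg)               # codeword product encodes f*g
```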
Max ERC Funding
2 447 439 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym AlgTateGro
Project Constructing line bundles on algebraic varieties --around conjectures of Tate and Grothendieck
Researcher (PI) François CHARLES
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary The goal of this project is to investigate two conjectures in arithmetic geometry pertaining to the geometry of projective varieties over finite and number fields. These two conjectures, formulated by Tate and Grothendieck in the 1960s, predict which cohomology classes are chern classes of line bundles. They both form an arithmetic counterpart of a theorem of Lefschetz, proved in the 1940s, which itself is the only known case of the Hodge conjecture. These two long-standing conjectures are one of the aspects of a more general web of questions regarding the topology of algebraic varieties which have been emphasized by Grothendieck and have since had a central role in modern arithmetic geometry. Special cases of these conjectures, appearing for instance in the work of Tate, Deligne, Faltings, Schneider-Lang, Masser-Wüstholz, have all had important consequences.
My goal is to investigate different lines of attack on these conjectures, building on recent work of myself and Jean-Benoît Bost on related problems. The two main directions of the proposal are as follows. Over finite fields, the Tate conjecture is related to finiteness results for certain cohomological objects. I want to understand how to relate these to hidden boundedness properties of algebraic varieties that have appeared in my recent geometric proof of the Tate conjecture for K3 surfaces. The existence and relevance of a theory of Donaldson invariants for moduli spaces of twisted sheaves over finite fields seems a promising and novel direction. Over number fields, I want to combine the geometric insight above with algebraization techniques developed by Bost. In a joint project, we want to investigate how these can be used first to understand geometrically major results in transcendence theory, and then to attack the Grothendieck period conjecture for divisors via a number-theoretic and complex-analytic understanding of universal vector extensions of abelian schemes over curves.
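For reference, the divisor form of the Tate conjecture targeted here can be stated in one line; the formulation below is the standard one from the literature, not quoted from the proposal. For X smooth projective over a field k finitely generated over its prime field, with absolute Galois group G_k and ℓ invertible in k, the cycle class map

```latex
\[
  c_1 \colon \operatorname{Pic}(X) \otimes_{\mathbb{Z}} \mathbb{Q}_\ell
  \longrightarrow H^2\!\left(X_{\bar{k}}, \mathbb{Q}_\ell(1)\right)^{G_k}
\]
```

is conjectured to be surjective, i.e. every Galois-invariant class in degree-2 cohomology should come from a line bundle.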
Max ERC Funding
1 222 329 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym ALKAGE
Project Algebraic and Kähler geometry
Researcher (PI) Jean-Pierre, Raymond, Philippe Demailly
Host Institution (HI) UNIVERSITE GRENOBLE ALPES
Call Details Advanced Grant (AdG), PE1, ERC-2014-ADG
Summary The purpose of this project is to study basic questions in algebraic and Kähler geometry. It is well known that the structure of projective or Kähler manifolds is governed by positivity or negativity properties of the curvature tensor. However, many fundamental problems are still wide open. Since the mid 1980's, I have developed a large number of key concepts and results that have led to important progress in transcendental algebraic geometry. Let me mention the discovery of holomorphic Morse inequalities, systematic applications of L² estimates with singular hermitian metrics, and a much improved understanding of Monge-Ampère equations and of singularities of plurisuharmonic functions. My first goal will be to investigate the Green-Griffiths-Lang conjecture asserting that an entire curve drawn in a variety of general type is algebraically degenerate. The subject is intimately related to important questions concerning Diophantine equations, especially higher dimensional generalizations of Faltings' theorem - the so-called Vojta program. One can rely here on a breakthrough I made in 2010, showing that all such entire curves must satisfy algebraic differential equations. A second closely related area of research of this project is the analysis of the structure of projective or compact Kähler manifolds. It can be seen as a generalization of the classification theory of surfaces by Kodaira, and of the more recent results for dimension 3 (Kawamata, Kollár, Mori, Shokurov, ...) to other dimensions. My plan is to combine powerful recent results obtained on the duality of positive cohomology cones with an analysis of the instability of the tangent bundle, i.e. of the Harder-Narasimhan filtration. On these ground-breaking questions, I intend to go much further and to enhance my national and international collaborations. These subjects already attract many young researchers and postdocs throughout the world, and the grant could be used to create even stronger interactions.
Max ERC Funding
1 809 345 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym ALLEGRO
Project Active large-scale learning for visual recognition
Researcher (PI) Cordelia Schmid
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE6, ERC-2012-ADG_20120216
Summary A massive and ever growing amount of digital image and video content
is available today, on sites such as
Flickr and YouTube, in audiovisual archives such as those of BBC and
INA, and in personal collections. In most cases, it comes with
additional information, such as text, audio or other metadata, that forms a
rather sparse and noisy, yet rich and diverse source of annotation,
ideally suited to emerging weakly supervised and active machine
learning technology. The ALLEGRO project will take visual recognition
to the next level by using this largely untapped source of data to
automatically learn visual models. The main research objective of
our project is the development of new algorithms and computer software
capable of autonomously exploring evolving data collections, selecting
the relevant information, and determining the visual models most
appropriate for different object, scene, and activity categories. An
emphasis will be put on learning visual models from video, a
particularly rich source of information, and on the representation of
human activities, one of today's most challenging problems in computer
vision. Although this project addresses fundamental research
issues, it is expected to result in significant advances in
high-impact applications that range from visual mining of the Web and
automated annotation and organization of family photo and video albums
to large-scale information retrieval in television archives.
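One concrete reading of learning from sparse, noisy annotation is multiple-instance learning: a video inherits a tag, but only some frames actually depict the tagged concept. The toy trainer below, generic numpy and entirely my own illustration rather than an ALLEGRO algorithm, scores a bag of frames by its best frame and updates on that frame alone:

```python
import numpy as np

def bag_score(w, bag):
    """A video 'bag' is an (n_frames, d) feature matrix; score = best frame."""
    return float(np.max(bag @ w))

def train_mil(bags, labels, d, lr=0.1, epochs=300):
    """Logistic multiple-instance learning with max-pooling over frames."""
    w = np.zeros(d)
    for _ in range(epochs):
        for bag, y in zip(bags, labels):
            i = int(np.argmax(bag @ w))          # frame driving the bag score
            p = 1.0 / (1.0 + np.exp(-bag[i] @ w))
            w += lr * (y - p) * bag[i]           # push that frame toward the tag
    return w

rng = np.random.default_rng(1)
# Positive bags: some frames shifted along the first axis ("show the concept").
pos = [rng.normal(0, 1, (5, 2)) + np.array([2.0, 0.0]) * (rng.random((5, 1)) > 0.5)
       for _ in range(20)]
neg = [rng.normal(0, 1, (5, 2)) for _ in range(20)]
w = train_mil(pos + neg, [1] * 20 + [0] * 20, d=2)
```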
Max ERC Funding
2 493 322 €
Duration
Start date: 2013-04-01, End date: 2019-03-31
Project acronym Allelic Regulation
Project Revealing Allele-level Regulation and Dynamics using Single-cell Gene Expression Analyses
Researcher (PI) Thore Rickard Hakan Sandberg
Host Institution (HI) KAROLINSKA INSTITUTET
Call Details Consolidator Grant (CoG), LS2, ERC-2014-CoG
Summary As diploid organisms inherit one gene copy from each parent, a gene can be expressed from both alleles (biallelic) or from only one allele (monoallelic). Although transcription from both alleles is detected for most genes in cell population experiments, little is known about allele-specific expression in single cells and its phenotypic consequences. To answer fundamental questions about allelic transcription heterogeneity in single cells, this research program will focus on single-cell transcriptome analyses with allelic-origin resolution. To this end, we will investigate both clonally stable and dynamic random monoallelic expression across a large number of cell types, including cells from embryonic and adult stages. This research program will be accomplished with the novel single-cell RNA-seq method developed within my lab to obtain quantitative, genome-wide gene expression measurement. To distinguish between mitotically stable and dynamic patterns of allelic expression, we will analyze large numbers a clonally related cells per cell type, from both primary cultures (in vitro) and using transgenic models to obtain clonally related cells in vivo.
The biological significance of the research program is first an understanding of allelic transcription, including the nature and extent of random monoallelic expression across in vivo tissues and cell types. These novel insights into allelic transcription will be important for an improved understanding of how variable phenotypes (e.g. incomplete penetrance and variable expressivity) can arise in genetically identical individuals. Additionally, the single-cell transcriptome analyses of clonally related cells in vivo will provide unique insights into the clonality of gene expression per se.
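At the level of a single gene in a single cell, allelic-origin resolution reduces to a pair of read counts over the two parental alleles, from which a crude call can be thresholded. The sketch below uses hypothetical thresholds of my own choosing, not the lab's pipeline:

```python
def allelic_call(maternal_reads, paternal_reads, min_total=10, mono_frac=0.98):
    """Classify one gene in one cell from allele-resolved read counts.
    The thresholds are illustrative assumptions, not calibrated values."""
    total = maternal_reads + paternal_reads
    if total < min_total:
        return "not detected"
    frac = maternal_reads / total
    if frac >= mono_frac:
        return "monoallelic (maternal)"
    if frac <= 1.0 - mono_frac:
        return "monoallelic (paternal)"
    return "biallelic"

print(allelic_call(42, 0))    # monoallelic (maternal)
print(allelic_call(18, 15))   # biallelic
```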
Max ERC Funding
1 923 060 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym ALLERGUT
Project Mucosal Tolerance and Allergic Predisposition: Does it all start in the gut?
Researcher (PI) Caspar OHNMACHT
Host Institution (HI) HELMHOLTZ ZENTRUM MUENCHEN DEUTSCHES FORSCHUNGSZENTRUM FUER GESUNDHEIT UND UMWELT GMBH
Call Details Starting Grant (StG), LS6, ERC-2016-STG
Summary Currently, more than 30% of all Europeans suffer from one or more allergic disorder but treatment is still mostly symptomatic due to a lack of understanding the underlying causality. Allergies are caused by type 2 immune responses triggered by recognition of harmless antigens. Both genetic and environmental factors have been proposed to favour allergic predisposition and both factors have a huge impact on the symbiotic microbiota and the intestinal immune system. Recently we and others showed that the transcription factor ROR(γt) seems to play a key role in mucosal tolerance in the gut and also regulates intestinal type 2 immune responses.
Based on these results, I postulate two major events in the gut for the development of an allergy in the lifetime of an individual. First, a failure to establish mucosal tolerance or anergy is a prerequisite for the outbreak of allergic symptoms and allergic disease. Second, a certain 'core' microbiome or metabolic pathway of the intestinal microbiota predisposes certain individuals to the later development of allergic disorders. Therefore, I will address the following aims:
1) Influence of ROR(γt) on mucosal tolerance induction and allergic disorders
2) Elucidate the T cell receptor repertoire of intestinal Th2 and ROR(γt)+ Tregs and assess the role of the alternative NFκB pathway in the induction of mucosal tolerance
3) Identification of ‘core’ microbiome signatures or metabolic pathways that favour allergic predisposition
ALLERGUT will provide ground-breaking knowledge on the molecular mechanisms behind failures of mucosal tolerance in the gut and will test whether resident ROR(γt)+ T(reg) cells can serve as a mechanistic starting point for molecular intervention strategies against the background of the hygiene hypothesis. The vision of ALLERGUT is to diagnose mucosal imbalance, and to prevent and treat allergic disorders even before their outbreak, thereby promoting a Public Health initiative for better living.
Max ERC Funding
1 498 175 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym AlmaCrypt
Project Algorithmic and Mathematical Cryptology
Researcher (PI) Antoine Joux
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE6, ERC-2014-ADG
Summary Cryptology is a foundation of information security in the digital world. Today's internet is protected by a form of cryptography based on complexity-theoretic hardness assumptions. Ideally, these assumptions should be strong, to ensure security, and versatile, to offer a wide range of functionalities and allow efficient implementations. However, they are largely untested, and internet security could be built on sand.
The main ambition of Almacrypt is to remedy this issue by challenging the assumptions through an advanced algorithmic analysis.
In particular, this proposal questions the two pillars of public-key encryption: factoring and discrete logarithms. Recently, the PI contributed to showing that in some cases the discrete logarithm problem is considerably weaker than previously assumed. A main objective is to assess the security of the remaining cases of the discrete logarithm problem, including elliptic curves, and of factoring. We will study how far the recent techniques generalize and search for new algorithmic options with comparable or better efficiency.
We will also study hardness assumptions based on codes and subset-sum, two candidates for post-quantum cryptography. We will consider the applicability of recent algorithmic and mathematical techniques to the resolution of the corresponding putative hard problems, refine the analysis of the algorithms, and design new algorithmic tools.
Cryptology is not limited to the above assumptions: other hard problems have been proposed to aim at post-quantum security and/or to offer extra functionalities. Should the security of these other assumptions become critical, they would be added to Almacrypt's scope. They could also serve to demonstrate other applications of our algorithmic progress.
In addition to its scientific goal, Almacrypt also aims at seeding a strengthened research community dedicated to algorithmic and mathematical cryptology.
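To make the central object concrete: the discrete logarithm problem asks, given a group element g and h = g^x, to recover x. Below is a minimal sketch of the generic baby-step giant-step algorithm, whose square-root running time is the baseline that structured-group techniques of the kind studied here aim to beat (the prime and generator are illustrative):

    import math

    def bsgs(g, h, p):
        """Baby-step giant-step: find x with g^x = h (mod p), in O(sqrt(p)) time and space."""
        m = math.isqrt(p) + 1
        baby = {pow(g, j, p): j for j in range(m)}   # baby steps: g^j for j < m
        step = pow(g, -m, p)                         # g^(-m) mod p (Python 3.8+)
        gamma = h % p
        for i in range(m):                           # giant steps: h * g^(-i*m)
            if gamma in baby:
                return i * m + baby[gamma]
            gamma = (gamma * step) % p
        return None                                  # no logarithm exists

    # Illustrative toy instance: 2 is a primitive root modulo the prime 101.
    print(bsgs(2, pow(2, 57, 101), 101))             # -> 57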
Max ERC Funding
2 403 125 €
Duration
Start date: 2016-01-01, End date: 2021-12-31
Project acronym ALPHA
Project Alpha Shape Theory Extended
Researcher (PI) Herbert Edelsbrunner
Host Institution (HI) INSTITUTE OF SCIENCE AND TECHNOLOGY AUSTRIA
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Alpha shapes were invented in the early 1980s, and their implementation in three dimensions in the early 1990s was at the forefront of the exact arithmetic paradigm that enabled fast and correct geometric software. In the late 1990s, alpha shapes motivated the development of the wrap algorithm for surface reconstruction and of persistent homology, which was the starting point of rapidly expanding interest in topological algorithms aimed at data analysis questions.
We now see alpha shapes, wrap complexes, and persistent homology as three aspects of a larger theory, which we propose to fully develop. This viewpoint was a long time coming and finds its clear expression within a generalized version of discrete Morse theory. This unified framework offers new opportunities, including
(I) the adaptive reconstruction of shapes driven by the cavity structure;
(II) the stochastic analysis of all aspects of the theory;
(III) the computation of persistence of dense data, both in scale and in depth;
(IV) the study of long-range order in periodic and near-periodic point configurations.
These capabilities will significantly deepen as well as widen the theory and enable new applications in the sciences. To gain focus, we concentrate on low-dimensional applications in structural molecular biology and particle systems.
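For intuition, a minimal sketch of the simplest, 0-dimensional case of persistent homology: connected components of a point cloud are born at filtration value 0 and die as a growing distance threshold merges them. This uses a plain distance filtration rather than the alpha filtration itself, and the point set is illustrative:

    import itertools, math

    def zero_dim_persistence(points):
        """0-dimensional persistent homology under the distance filtration:
        run Kruskal-style union-find over edges sorted by length; every merge
        kills one connected component and records its death value."""
        n = len(points)
        parent = list(range(n))
        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]        # path halving
                i = parent[i]
            return i
        edges = sorted((math.dist(points[i], points[j]), i, j)
                       for i, j in itertools.combinations(range(n), 2))
        deaths = []
        for d, i, j in edges:
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[ri] = rj
                deaths.append(d)                     # one component dies here
        return [(0.0, d) for d in deaths] + [(0.0, math.inf)]

    print(zero_dim_persistence([(0, 0), (1, 0), (5, 0)]))
    # [(0.0, 1.0), (0.0, 4.0), (0.0, inf)]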
Max ERC Funding
1 678 432 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym ALUNIF
Project Algorithms and Lower Bounds: A Unified Approach
Researcher (PI) Rahul Santhanam
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary One of the fundamental goals of theoretical computer science is to understand the possibilities and limits of efficient computation. This quest has two dimensions. The theory of algorithms focuses on finding efficient solutions to problems, while computational complexity theory aims to understand when and why problems are hard to solve. These two areas have different philosophies and use different sets of techniques. However, in recent years there have been indications of deep and mysterious connections between them.
In this project, we propose to explore and develop the connections between algorithmic analysis and complexity lower bounds in a systematic way. On the one hand, we plan to use complexity lower bound techniques as inspiration to design new and improved algorithms for Satisfiability and other NP-complete problems, as well as to analyze existing algorithms better. On the other hand, we plan to strengthen implications yielding circuit lower bounds from non-trivial algorithms for Satisfiability, and to derive new circuit lower bounds using these stronger implications.
This project has potential for massive impact in both the areas of algorithms and computational complexity. Improved algorithms for Satisfiability could lead to improved SAT solvers, and the new analytical tools would lead to a better understanding of existing heuristics. Complexity lower bound questions are fundamental but notoriously difficult, and new lower bounds would open the way to unconditionally secure cryptographic protocols and derandomization of probabilistic algorithms. More broadly, this project aims to initiate greater dialogue between the two areas, with an exchange of ideas and techniques which leads to accelerated progress in both, as well as a deeper understanding of the nature of efficient computation.
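As a concrete anchor for the algorithmic side, a minimal sketch of the DPLL backtracking procedure underlying modern SAT solvers, combining unit propagation with branching (the clause encoding is illustrative):

    def dpll(clauses):
        """Minimal DPLL sketch. Clauses are lists of nonzero ints (-v means NOT v).
        Returns a satisfying set of literals, or None if unsatisfiable."""
        if not clauses:
            return set()                              # all clauses satisfied
        if any(len(c) == 0 for c in clauses):
            return None                               # empty clause: conflict
        unit = next((c[0] for c in clauses if len(c) == 1), None)
        for lit in ([unit] if unit is not None else [clauses[0][0], -clauses[0][0]]):
            # Satisfy lit: drop clauses containing it, delete its negation elsewhere.
            reduced = [[l for l in c if l != -lit] for c in clauses if lit not in c]
            model = dpll(reduced)
            if model is not None:
                return model | {lit}
        return None

    # (x1 or x2) and (not x1 or x3) and (not x3)
    print(dpll([[1, 2], [-1, 3], [-3]]))   # e.g. {2, -1, -3}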
Max ERC Funding
1 274 496 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym AMAREC
Project Amenability, Approximation and Reconstruction
Researcher (PI) Wilhelm WINTER
Host Institution (HI) WESTFAELISCHE WILHELMS-UNIVERSITAET MUENSTER
Call Details Advanced Grant (AdG), PE1, ERC-2018-ADG
Summary Algebras of operators on Hilbert spaces were originally introduced as the right framework for the mathematical description of quantum mechanics. In modern mathematics the scope has broadened considerably, owing to the highly versatile nature of operator algebras. They are particularly useful in the analysis of groups and their actions. Amenability is a finiteness property which occurs in many different contexts and which can be characterised in many different ways. We will analyse amenability in terms of approximation properties, in the frameworks of abstract C*-algebras, of topological dynamical systems, and of discrete groups. Such approximation properties will serve as bridging devices between these setups, and they will be used to systematically recover geometric information about the underlying structures. When passing from groups, and more generally from dynamical systems, to operator algebras, one loses information, but one gains new tools to isolate and analyse pertinent properties of the underlying structure. We will mostly be interested in the topological setting, and in the associated C*-algebras. Amenability of groups or of dynamical systems then translates into the completely positive approximation property. Systems of completely positive approximations store all the essential data about a C*-algebra, and sometimes one can arrange the systems so that one can directly read off such information. For transformation group C*-algebras, one can achieve this by using approximation properties of the underlying dynamics. To some extent one can even go back, and extract dynamical approximation properties from completely positive approximations of the C*-algebra. This interplay between approximation properties in topological dynamics and in noncommutative topology carries a surprisingly rich structure. It connects directly to the heart of the classification problem for nuclear C*-algebras on the one hand, and to central open questions on amenable dynamics on the other.
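As standard background for the amenability-as-approximation theme (quoted here for orientation, not as the proposal's own formulation): a discrete group G is amenable iff it admits a Følner sequence of finite sets F_n ⊆ G with

\[
\frac{|g F_n \,\triangle\, F_n|}{|F_n|} \;\longrightarrow\; 0 \qquad \text{for every } g \in G,
\]

while for a C*-algebra \(A\) the completely positive approximation property (equivalently, nuclearity) asks for completely positive contractions \( \varphi_n : A \to M_{k_n}(\mathbb{C}) \) and \( \psi_n : M_{k_n}(\mathbb{C}) \to A \) with \( \| \psi_n(\varphi_n(a)) - a \| \to 0 \) for all \( a \in A \).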
Max ERC Funding
1 596 017 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym AMD
Project Algorithmic Mechanism Design: Beyond Truthful Mechanisms
Researcher (PI) Michal Feldman
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary "The first decade of Algorithmic Mechanism Design (AMD) concentrated, very successfully, on the design of truthful mechanisms for the allocation of resources among agents with private preferences.
Truthful mechanisms are ones that incentivize rational users to report their preferences truthfully.
Truthfulness, however, for all its theoretical appeal, suffers from several inherent limitations, mainly its high communication and computation complexities.
It is not surprising, therefore, that practical applications forego truthfulness and use simpler mechanisms instead.
Simplicity in itself, however, is not sufficient, as any meaningful mechanism should also have some notion of fairness; otherwise agents will stop using it over time.
In this project I plan to develop an innovative AMD theoretical framework that will go beyond truthfulness and focus instead on the natural themes of simplicity and fairness, in addition to computational tractability.
One of my primary goals will be the design of simple and fair poly-time mechanisms that perform at near optimal levels with respect to important economic objectives such as social welfare and revenue.
To this end, I will work toward providing precise definitions of simplicity and fairness and quantifying the effects of these restrictions on the performance levels that can be obtained.
A major challenge in the evaluation of non-truthful mechanisms is defining a reasonable behavior model that will enable their evaluation.
The success of this project could have a broad impact on Europe and beyond, as it would guide the design of natural mechanisms for markets of tens of billions of dollars in revenue, such as online advertising, or sales of wireless frequencies.
The timing of this project is ideal, as the AMD field is now sufficiently mature to lead to a breakthrough and at the same time young enough to be receptive to new approaches and themes."
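To ground the notion of truthfulness, a minimal sketch of the classic second-price (Vickrey) sealed-bid auction, the textbook truthful mechanism that this proposal aims to move beyond (bidder names and values are illustrative):

    def vickrey_auction(bids):
        """Second-price sealed-bid auction: the highest bidder wins but pays the
        second-highest bid, which makes truthful bidding a dominant strategy."""
        ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
        winner, _ = ranked[0]
        price = ranked[1][1] if len(ranked) > 1 else 0.0
        return winner, price

    print(vickrey_auction({"alice": 10.0, "bob": 7.0, "carol": 9.5}))  # ('alice', 9.5)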
Max ERC Funding
1 394 600 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym AMDROMA
Project Algorithmic and Mechanism Design Research in Online MArkets
Researcher (PI) Stefano LEONARDI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Online markets currently form an important share of the global economy. The Internet hosts classical markets (real estate, stocks, e-commerce) as well as enabling new markets with previously unknown features (web-based advertisement, viral marketing, digital goods, crowdsourcing, sharing economy). Algorithms play a central role in many decision processes involved in online markets. For example, algorithms run electronic auctions, trade stocks, adjust prices dynamically, and harvest big data to provide economic information. Thus, it is of paramount importance to understand the algorithmic and mechanism design foundations of online markets.
The algorithmic research issues that we consider involve algorithmic mechanism design, online and approximation algorithms, modelling uncertainty in online market design, large-scale optimization and data mining. The aim of this research project is to combine these fields to consider research questions that are central for today's Internet economy. We plan to apply these techniques so as to solve fundamental algorithmic problems motivated by Internet advertisement, the sharing economy, and online labour marketplaces. While my planned research is centered on foundational work with rigorous design and analysis of algorithms and mechanisms, it will also include, as an important component, empirical validation on large-scale real-life datasets.
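One representative problem from this space, sketched minimally: online bipartite matching, the core abstraction behind ad allocation. The greedy rule below is 1/2-competitive; the RANKING algorithm of Karp, Vazirani and Vazirani improves this to 1 - 1/e (request and advertiser names are illustrative):

    def greedy_online_matching(arrivals):
        """Greedy online bipartite matching: each arriving request is matched to
        the first still-free advertiser it lists."""
        matched, used = {}, set()
        for request, candidates in arrivals:
            for adv in candidates:
                if adv not in used:
                    matched[request] = adv
                    used.add(adv)
                    break
        return matched

    print(greedy_online_matching([("q1", ["A", "B"]), ("q2", ["A"]), ("q3", ["B", "C"])]))
    # {'q1': 'A', 'q3': 'B'} -- q2 goes unmatched because A was taken greedily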
Max ERC Funding
1 780 150 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym AMPLify
Project Allocation Made PracticaL
Researcher (PI) Toby Walsh
Host Institution (HI) TECHNISCHE UNIVERSITAT BERLIN
Call Details Advanced Grant (AdG), PE6, ERC-2014-ADG
Summary The AMPLify project will lay the foundations of a new field, computational behavioural game theory, which brings a computational perspective, computational implementation, and behavioural insights to game theory. These foundations will be laid by tackling a pressing problem facing society today: the efficient and fair allocation of resources and costs. Research in allocation has previously considered simple, abstract models like cake cutting. We propose to develop richer models that capture important new features, like asynchronicity, which occur in many markets being developed in our highly connected and online world. The mechanisms currently used to allocate resources and costs are limited to these simple, abstract models and also do not take into account how people actually behave in practice. We will therefore design new mechanisms for these richer allocation problems that exploit insights gained from behavioural game theory, like loss aversion. We will also tackle the complexity of these rich models and mechanisms with computational tools. Finally, we will use computation to increase both the efficiency and fairness of allocations. As a result, we will be able to do more with fewer resources and greater fairness. Our initial case studies in resource and cost allocation demonstrate that we can improve efficiency greatly, offering one company alone savings of up to 10% (worth tens of millions of dollars every year). We predict even greater impact with the more sophisticated mechanisms to be developed during the course of this project.
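As one small concrete instance of a fair-allocation mechanism, a sketch of round-robin picking for indivisible items; the resulting allocation is envy-free up to one good (EF1), a standard fairness notion (agents, items and valuations are illustrative):

    def round_robin(agents, items, value):
        """Round-robin allocation of indivisible items: agents take turns picking
        their favourite remaining item; the result satisfies EF1."""
        remaining, bundles = set(items), {a: [] for a in agents}
        for turn in range(len(items)):
            agent = agents[turn % len(agents)]
            pick = max(remaining, key=lambda it: value[agent][it])
            bundles[agent].append(pick)
            remaining.remove(pick)
        return bundles

    value = {"ann": {"car": 9, "bike": 4, "book": 1},
             "bob": {"car": 8, "bike": 7, "book": 2}}
    print(round_robin(["ann", "bob"], ["car", "bike", "book"], value))
    # {'ann': ['car', 'book'], 'bob': ['bike']}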
Max ERC Funding
2 499 681 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym AMPLIFY
Project Amplifying Human Perception Through Interactive Digital Technologies
Researcher (PI) Albrecht Schmidt
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary Current technical sensor systems offer capabilities that are superior to human perception. Cameras can capture a spectrum that is wider than visible light, high-speed cameras can show movements that are invisible to the human eye, and directional microphones can pick up sounds at long distances. The vision of this project is to lay a foundation for the creation of digital technologies that provide novel sensory experiences and new perceptual capabilities for humans that are natural and intuitive to use. In a first step, the project will assess the feasibility of creating artificial human senses that provide new perceptual channels to the human mind, without increasing the experienced cognitive load. A particular focus is on creating intuitive and natural control mechanisms for amplified senses using eye gaze, muscle activity, and brain signals. Through the creation of a prototype that provides mildly unpleasant stimulations in response to perceived information, the feasibility of implementing an artificial reflex will be experimentally explored. The project will quantify the effectiveness of new senses and artificial perceptual aids compared to the baseline of unaugmented perception. The overall objective is to systematically research, explore, and model new means for increasing the human intake of information in order to lay the foundation for new and improved human senses enabled through digital technologies and to enable artificial reflexes. The ground-breaking contributions of this project are (1) to demonstrate the feasibility of reliably implementing amplified senses and new perceptual capabilities, (2) to prove the possibility of creating an artificial reflex, (3) to provide an example implementation of amplified cognition that is empirically validated, and (4) to develop models, concepts, components, and platforms that will enable and ease the creation of interactive systems that measurably increase human perceptual capabilities.
Max ERC Funding
1 925 250 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym AMSTAT
Project Problems at the Applied Mathematics-Statistics Interface
Researcher (PI) Andrew Stuart
Host Institution (HI) THE UNIVERSITY OF WARWICK
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary Applied mathematics is concerned with developing models with predictive capability, and with probing those models to obtain qualitative and quantitative insight into the phenomena being modelled. Statistics is data-driven and is aimed at the development of methodologies to optimize the information derived from data. The increasing complexity of phenomena that scientists and engineers wish to model, together with our increased ability to gather, store and interrogate data, mean that the subjects of applied mathematics and statistics are increasingly required to work in conjunction. This research proposal is concerned with a research program at the interface between these two disciplines, aimed at problems in differential equations where the profusion of data and the sophistication of the models combine to produce the mathematical problem of obtaining information from a probability measure on function space. Applications are far-reaching and include the atmospheric sciences, geophysics, chemistry, econometrics and signal processing. The objectives of the research are: (i) to create the systematic foundations for a range of problems at the applied mathematics and statistics interface which share the common mathematical structure underpinning the range of applications described above; (ii) to exploit this common mathematical structure to design efficient algorithms to sample probability measures on function space; (iii) to apply these algorithms to attack a range of significant problems arising in molecular dynamics and in the atmospheric sciences.
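A minimal sketch of one algorithm of exactly this flavour, the preconditioned Crank-Nicolson (pCN) sampler for probability measures on function space, here discretised on a grid (the prior covariance, the single observation and the step size beta are illustrative assumptions):

    import numpy as np

    def pcn_sample(log_likelihood, prior_cov_sqrt, n_steps=5000, beta=0.2):
        """Preconditioned Crank-Nicolson MCMC: proposals
        v = sqrt(1 - beta^2) * u + beta * xi with xi ~ N(0, C) preserve the
        Gaussian prior N(0, C), so the accept/reject step involves only the
        likelihood -- the property that makes pCN robust under mesh refinement."""
        rng = np.random.default_rng(0)
        d = prior_cov_sqrt.shape[0]
        u = prior_cov_sqrt @ rng.standard_normal(d)       # draw from the prior
        samples = []
        for _ in range(n_steps):
            xi = prior_cov_sqrt @ rng.standard_normal(d)
            v = np.sqrt(1.0 - beta**2) * u + beta * xi    # pCN proposal
            if np.log(rng.uniform()) < log_likelihood(v) - log_likelihood(u):
                u = v                                     # accept
            samples.append(u)
        return np.array(samples)

    # Toy posterior: smooth Gaussian prior on a grid, one noisy point observation.
    d = 50
    x = np.linspace(0.0, 1.0, d)
    C = np.exp(-np.abs(x[:, None] - x[None, :]) / 0.2)    # illustrative covariance
    L = np.linalg.cholesky(C + 1e-10 * np.eye(d))
    log_lik = lambda u: -0.5 * ((u[d // 2] - 1.0) / 0.1) ** 2   # observe u(0.5) = 1
    print(pcn_sample(log_lik, L).mean(axis=0)[d // 2])    # posterior mean near 1.0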
Max ERC Funding
1 693 501 €
Duration
Start date: 2008-12-01, End date: 2014-11-30
Project acronym ANADEL
Project Analysis of Geometrical Effects on Dispersive Equations
Researcher (PI) Danela Oana IVANOVICI
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE1, ERC-2017-STG
Summary We are concerned with localization properties of solutions to hyperbolic PDEs, especially problems with a geometric component: how do boundaries and heterogeneous media influence spreading and concentration of solutions. While our first focus is on wave and Schrödinger equations on manifolds with boundary, strong connections exist with phase space localization for (clusters of) eigenfunctions, which are of independent interest. Motivations come from nonlinear dispersive models (in physically relevant settings), properties of eigenfunctions in quantum chaos (related to both physics of optic fiber design as well as number theoretic questions), or harmonic analysis on manifolds.
Wave propagation in real-life physics occurs in media which are neither homogeneous nor spatially infinite. The birth of radar/sonar technologies (and the rise of computed tomography) greatly motivated numerous developments in microlocal analysis and the linear theory. Only recently have toy nonlinear models been studied on a curved background, sometimes compact or rough. Understanding how to extend such tools, dealing with wave dispersion or focusing, will allow us to progress significantly in our mathematical understanding of physically relevant models. There, boundaries appear naturally, and most earlier developments related to propagation of singularities in this context have limited scope with respect to crucial dispersive effects. Despite great progress over the last decade, driven by the study of quasilinear equations, our knowledge is still very limited. Going beyond this recent activity requires new tools whose development is at the heart of this proposal, including good approximate solutions (parametrices) going over arbitrarily large numbers of caustics, sharp pointwise bounds on Green functions, development of efficient wave packet methods, and quantitative refinements of propagation of singularities (with direct applications in control theory), to name only a few important ones.
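For orientation, the flat-space model estimates that geometry and boundaries perturb (standard facts, quoted here as background): the free Schrödinger flow on \( \mathbb{R}^d \) obeys the dispersive decay

\[
\| e^{it\Delta} f \|_{L^\infty(\mathbb{R}^d)} \;\lesssim\; |t|^{-d/2}\, \| f \|_{L^1(\mathbb{R}^d)},
\]

which yields the Strichartz estimates \( \| e^{it\Delta} f \|_{L^q_t L^r_x} \lesssim \| f \|_{L^2} \) for admissible exponents \( \tfrac{2}{q} + \tfrac{d}{r} = \tfrac{d}{2} \), \( q \ge 2 \); losses in these estimates near boundaries and caustics are the kind of phenomenon the project aims to quantify.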
Max ERC Funding
1 293 763 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym analysisdirac
Project The analysis of the Dirac operator: the hypoelliptic Laplacian and its applications
Researcher (PI) Jean-Michel Philippe Marie-José Bismut
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Advanced Grant (AdG), PE1, ERC-2011-ADG_20110209
Summary This proposal is devoted to the applications of a new hypoelliptic Dirac operator, whose analytic properties have been studied by Lebeau and myself. Its construction connects classical Hodge theory with the geodesic flow, and more generally any geometrically defined Hodge Laplacian with a dynamical system on the cotangent bundle. The proper description of this object can be given in analytic, index-theoretic and probabilistic terms, which explains both its many potential applications and its complexity.
Max ERC Funding
1 112 400 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ANALYTIC
Project ANALYTIC PROPERTIES OF INFINITE GROUPS: limits, curvature, and randomness
Researcher (PI) Gulnara Arzhantseva
Host Institution (HI) UNIVERSITAT WIEN
Call Details Starting Grant (StG), PE1, ERC-2010-StG_20091028
Summary The overall goal of this project is to develop new concepts and techniques in geometric and asymptotic group theory for a systematic study of the analytic properties of discrete groups. These are properties depending on the unitary representation theory of the group. The fundamental examples are amenability, discovered by von Neumann in 1929, and property (T), introduced by Kazhdan in 1967.
My main objective is to establish the precise relations between groups that have recently appeared in K-theory and topology, such as C*-exact groups and groups coarsely embeddable into a Hilbert space, and those discovered in ergodic theory and operator algebras, for example sofic and hyperlinear groups. This is a first-ever attempt to confront analytic behaviors of such different natures. I plan to work on crucial open questions: Is every coarsely embeddable group C*-exact? Is every group sofic? Is every hyperlinear group sofic?
My motivation is two-fold:
- Many outstanding conjectures were recently solved for these groups, e.g. the Novikov conjecture (1965) for coarsely embeddable groups by Yu in 2000 and the Gottschalk surjunctivity conjecture (1973) for sofic groups by Gromov in 1999. However, their group-theoretical structure remains mysterious.
- In recent years, geometric group theory has undergone significant changes, mainly due to the growing impact of this theory on other branches of mathematics. However, the interplay between geometric, asymptotic, and analytic group properties has not yet been fully understood.
The main innovative contribution of this proposal lies in the interaction between 3 axes: (i) limits of groups, in the space of marked groups or metric ultralimits; (ii) analytic properties of groups with curvature, of lacunary or relatively hyperbolic groups; (iii) random groups, in a topological or statistical meaning. As a result, I will describe the above apparently unrelated classes of groups in a unified way and will detail their algebraic behavior.
Max ERC Funding
1 065 500 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym ANAMULTISCALE
Project Analysis of Multiscale Systems Driven by Functionals
Researcher (PI) Alexander Mielke
Host Institution (HI) FORSCHUNGSVERBUND BERLIN EV
Call Details Advanced Grant (AdG), PE1, ERC-2010-AdG_20100224
Summary Many complex phenomena in the sciences are described by nonlinear partial differential equations, the solutions of which exhibit oscillations and concentration effects on multiple temporal or spatial scales. Our aim is to use methods from applied analysis to contribute to the understanding of the interplay of effects on different scales. The central question is to determine those quantities on the microscale which are needed for the correct description of the macroscopic evolution.
We aim to develop a mathematical framework for analyzing and modeling coupled systems with multiple scales. This will include Hamiltonian dynamics as well as different types of dissipation like gradient flows or rate-independent dynamics. The choice of models will be guided by specific applications in material modeling (e.g., thermoplasticity, pattern formation, porous media) and optoelectronics (pulse interaction, Maxwell-Bloch systems, semiconductors, quantum mechanics). The research will address mathematically fundamental issues like existence and stability of solutions but will mainly be devoted to the modeling of multiscale phenomena in evolution systems. We will focus on systems with geometric structures, where the dynamics is driven by functionals. Thus, we can go much beyond the classical theory of homogenization and singular perturbations. The novel features of our approach are
- the combination of different dynamical effects in one framework,
- the use of geometric and metric structures for coupled partial differential equations,
- the exploitation of Gamma-convergence for evolution systems driven by functionals (the notion is recalled schematically below).
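Schematically, and as standard background rather than the proposal's own wording: a family of energies \(E_\varepsilon\) Gamma-converges to \(E\) when

\[
E(u) \;\le\; \liminf_{\varepsilon \to 0} E_\varepsilon(u_\varepsilon) \quad \text{whenever } u_\varepsilon \to u,
\qquad \text{and} \qquad
\forall u \;\, \exists\, u_\varepsilon \to u : \; E_\varepsilon(u_\varepsilon) \to E(u),
\]

and the corresponding multiscale question for gradient flows is when \( \dot{u}_\varepsilon = -\nabla E_\varepsilon(u_\varepsilon) \) passes to the limit \( \dot{u} = -\nabla E(u) \); in general this requires convergence of the dissipation structures as well, not just of the energies.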
Max ERC Funding
1 390 000 €
Duration
Start date: 2011-04-01, End date: 2017-03-31
Project acronym AncientAdhesives
Project Ancient Adhesives - A window on prehistoric technological complexity
Researcher (PI) Geeske LANGEJANS
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Starting Grant (StG), SH6, ERC-2018-STG
Summary AncientAdhesives addresses the most crucial problem in Palaeolithic archaeology: how to reliably infer cognitively complex behaviour in the deep past. To study the evolution of Neandertal and modern human cognitive capacities, certain find categories are taken to reflect behavioural, and thus cognitive, complexity; among these are art objects, personal ornaments and complex technology. Of these, technology is best suited to trace changing behavioural complexity, because 1) it is the least vulnerable to differential preservation, and 2) technological behaviours are present throughout the history of our genus. Adhesives are the oldest examples of highly complex technology. They are also known earlier from Neandertal than from modern human contexts. Understanding their technological complexity is thus essential to resolve debates on differences in the cognitive complexity of both species. However, there is currently no agreed-upon method to measure technological complexity.
The aim of AncientAdhesives is to create the first reliable method to compare the complexity of Neandertal and modern human technologies. This is achieved through three main objectives:
1. Collate the first comprehensive body of knowledge on adhesives, including ethnography, archaeology and (experimental) material properties (e.g. preservation, production).
2. Develop a new archaeological methodology by modifying industrial process modelling for archaeological applications.
3. Evaluate the development of adhesive technological complexity through time and across species using a range of explicit complexity measures.
By analysing adhesives, it is possible to measure technological complexity, to identify idiosyncratic behaviours and to track adoption and loss of complex technological know-how. This represents a step-change in debates about the development of behavioural complexity and differences/similarities between Neanderthals and modern humans.
Max ERC Funding
1 499 926 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym ANGEOM
Project Geometric analysis in the Euclidean space
Researcher (PI) Xavier Tolsa Domenech
Host Institution (HI) UNIVERSITAT AUTONOMA DE BARCELONA
Call Details Advanced Grant (AdG), PE1, ERC-2012-ADG_20120216
Summary "We propose to study different questions in the area of the so called geometric analysis. Most of the topics we are interested in deal with the connection between the behavior of singular integrals and the geometry of sets and measures. The study of this connection has been shown to be extremely helpful in the solution of certain long standing problems in the last years, such as the solution of the Painlev\'e problem or the obtaining of the optimal distortion bounds for quasiconformal mappings by Astala.
More specifically, we would like to study the relationship between the L^2 boundedness of singular integrals associated with Riesz and other related kernels, and rectifiability and other geometric notions. The so called David-Semmes problem is probably the main open problem in this area. Up to now, the techniques used to deal with this problem come from multiscale analysis and involve ideas from Littlewood-Paley theory and quantitative techniques of rectifiability. We propose to apply new ideas that combine variational arguments with other techniques which have connections with mass transportation. Further, we think that it is worth to explore in more detail the connection among mass transportation, singular integrals, and uniform rectifiability.
We are also interested in the field of quasiconformal mappings. We plan to study a problem regarding the quasiconformal distortion of quasicircles. This problem consists in proving that the bounds obtained recently by S. Smirnov on the dimension of K-quasicircles are optimal. We want to apply techniques from quantitative geometric measure theory to deal with this question.
Another question that we intend to explore lies in the interplay of harmonic analysis, geometric measure theory and partial differential equations. This concerns an old problem on the unique continuation of harmonic functions at the boundary open C^1 or Lipschitz domain. All the results known by now deal with smoother Dini domains."
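As concrete background (not the proposal's wording): the s-Riesz transform of a measure \(\mu\) on \( \mathbb{R}^d \) is

\[
R_s\mu(x) \;=\; \int \frac{x-y}{|x-y|^{s+1}} \, d\mu(y), \qquad 0 < s < d,
\]

and the David-Semmes problem asks whether boundedness of \(R_s\) on \( L^2(\mu) \) for an s-dimensional AD-regular measure \(\mu\) forces \(\mu\) to be uniformly rectifiable; the case \( s = 1 \) is known via Menger curvature, and the codimension-one case \( s = d-1 \) was later settled by Nazarov, Tolsa and Volberg.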
Max ERC Funding
1 105 930 €
Duration
Start date: 2013-05-01, End date: 2018-04-30
Project acronym ANIMETRICS
Project Measurement-Based Modeling and Animation of Complex Mechanical Phenomena
Researcher (PI) Miguel Angel Otaduy Tristan
Host Institution (HI) UNIVERSIDAD REY JUAN CARLOS
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary Computer animation has traditionally been associated with applications in virtual-reality-based training, video games or feature films. However, interactive animation is gaining relevance in a more general scope, as a tool for early-stage analysis, design and planning in many applications in science and engineering. The user can get quick and visual feedback of the results, and then proceed by refining the experiments or designs. Potential applications include nanodesign, e-commerce or tactile telecommunication, but they also reach as far as, e.g., the analysis of ecological, climate, biological or physiological processes.
The application of computer animation is extremely limited in comparison to its potential outreach due to a trade-off between accuracy and computational efficiency. Such a trade-off is induced by inherent sources of complexity such as nonlinear or anisotropic behaviors, heterogeneous properties, or high dynamic ranges of effects.
The Animetrics project proposes a modeling and animation methodology consisting of a multi-scale decomposition of complex processes, the description of the process at each scale through a combination of simple local models, and the fitting of those local models' parameters using large amounts of data from example effects. The modeling and animation methodology will be explored on specific problems arising in complex mechanical phenomena, including viscoelasticity of solids and thin shells, multi-body contact, granular and liquid flow, and fracture of solids.
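As an illustrative sketch of the data-driven fitting step (a simplified, assumed setting: a linear spring-damper as the "simple local model", with synthetic example data; this is not the project's actual pipeline):

    import numpy as np

    # Synthetic "measured" data: displacements x, velocities v, and forces f
    # generated from an unknown spring-damper law f = k*x + c*v plus noise.
    rng = np.random.default_rng(0)
    x = rng.uniform(-1.0, 1.0, 1000)
    v = rng.uniform(-1.0, 1.0, 1000)
    k_true, c_true = 4.2, 0.7
    f = k_true * x + c_true * v + 0.01 * rng.standard_normal(1000)

    # Fit the local model parameters (k, c) by linear least squares.
    A = np.column_stack([x, v])
    (k_fit, c_fit), *_ = np.linalg.lstsq(A, f, rcond=None)
    print(f"fitted k = {k_fit:.3f}, c = {c_fit:.3f}")

The same least-squares pattern extends to richer local models whenever they are linear in their parameters.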
Max ERC Funding
1 277 969 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym ANISOTROPIC UNIVERSE
Project The anisotropic universe -- a reality or fluke?
Researcher (PI) Hans Kristian Kamfjord Eriksen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE9, ERC-2010-StG_20091028
Summary "During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few ""blemishes"" have turned up, apparently not consistent with the standard model: While the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies considering nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints towards unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of the reports can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is currently about to change, as the next generation of new and far more powerful experiments are coming online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys, and in the future Euclid, a proposed galaxy survey satellite also funded by ESA. By combining the world s best data from both CMB and LSS measurements, I will in the proposed project attempt to settle this question: Is our universe really anisotropic? Or are these recent claims only the results of systematic errors or statistical flukes? If the claims turn out to hold against this tide of new and high-quality data, then cosmology as a whole may need to be re-written."
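For reference, the statistical isotropy being tested can be phrased in the standard CMB notation: expanding the temperature fluctuations in spherical harmonics,
\[
\frac{\Delta T}{T}(\hat n) = \sum_{\ell, m} a_{\ell m} Y_{\ell m}(\hat n), \qquad \langle a_{\ell m} a^{*}_{\ell' m'} \rangle = C_{\ell}\, \delta_{\ell \ell'}\, \delta_{m m'},
\]
so any statistically significant departure of the harmonic-coefficient covariance from this diagonal, direction-independent form would constitute a violation of isotropy.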
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym AnoPath
Project Genetics of mosquito resistance to pathogens
Researcher (PI) Kenneth Du Souchet Vernick
Host Institution (HI) INSTITUT PASTEUR
Call Details Advanced Grant (AdG), LS2, ERC-2012-ADG_20120314
Summary Malaria parasite infection in humans has been called “the strongest known force for evolutionary selection in the recent history of the human genome”, and I hypothesize that a similar statement may apply to the mosquito vector, which is the definitive host of the malaria parasite. We previously discovered efficient malaria-resistance mechanisms in natural populations of the African malaria vector, Anopheles gambiae. Aim 1 of the proposed project will implement a novel genetic mapping design to systematically survey the mosquito population for common and rare genetic variants of strong effect against the human malaria parasite, Plasmodium falciparum. A product of the mapping design will be living mosquito families carrying the resistance loci. Aim 2 will use the segregating families to functionally dissect the underlying molecular mechanisms controlled by the loci, including determination of the pathogen specificity spectra of the host-defense traits. Aim 3 targets arbovirus transmission, where Anopheles mosquitoes transmit human malaria but not arboviruses such as Dengue and Chikungunya, even though the two kinds of mosquito bite the same people and are exposed to the same pathogens, often in malaria-arbovirus co-infections. We will use deep sequencing to detect processing, by the mosquitoes' RNAi pathway, of the dsRNA intermediates produced during arbovirus replication. The results will reveal important new information about differences in the efficiency and quality of the RNAi response between mosquitoes, which is likely to underlie at least part of the host specificity of arbovirus transmission. The three Aims will make significant contributions to understanding malaria and arbovirus transmission, both major global public health problems, will aid the development of a next generation of vector surveillance and control tools, and will produce a definitive description of the major genetic factors influencing host-pathogen interactions in mosquito immunity.
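As a schematic illustration of the deep-sequencing readout in Aim 3 (illustrative only, with assumed toy input; not the project's analysis code): virus-derived small-RNA reads can be tallied by length, a sharp peak at 21 nt being the classical signature of siRNA-type processing of viral dsRNA.

    from collections import Counter

    def sirna_signature(read_lengths):
        """Tally virus-mapped small-RNA reads by length; a sharp 21-nt
        peak is consistent with Dicer-mediated processing of viral dsRNA."""
        hist = Counter(read_lengths)
        total = sum(hist.values())
        frac21 = hist.get(21, 0) / total if total else 0.0
        return hist, frac21

    # Toy input: lengths of reads that mapped to an arbovirus genome.
    lengths = [21, 21, 22, 21, 20, 21, 25, 21, 19, 21]
    hist, frac21 = sirna_signature(lengths)
    print(dict(sorted(hist.items())), f"fraction at 21 nt: {frac21:.2f}")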
Max ERC Funding
2 307 800 €
Duration
Start date: 2013-03-01, End date: 2018-02-28
Project acronym ANOPTSETCON
Project Analysis of optimal sets and optimal constants: old questions and new results
Researcher (PI) Aldo Pratelli
Host Institution (HI) FRIEDRICH-ALEXANDER-UNIVERSITAET ERLANGEN NUERNBERG
Call Details Starting Grant (StG), PE1, ERC-2010-StG_20091028
Summary The analysis of geometric and functional inequalities naturally leads one to consider the extremal cases, thus looking for optimal sets, or optimal functions, or optimal constants. The most classical examples are the (different versions of the) isoperimetric inequality and the Sobolev-like inequalities. Much is known about equality cases and best constants, but there are still many questions which seem quite natural and yet have no answer. For instance, the answer to a question of Brezis is not known even in 2-dimensional space: which set, among those with a given volume, has the biggest Sobolev-Poincaré constant for p=1? This is a very natural problem, and it appears reasonable that the optimal set should be the ball, but this has never been proved. The interest in problems like this lies not only in the extreme simplicity of the questions and in their classical flavour, but also in the new ideas and techniques which are needed to provide the answers.
The main techniques that we aim to use are fine arguments of symmetrization, geometric constructions and tools from mass transportation (which is well known to be deeply connected with functional inequalities). These are the basic tools that we have already used in recent years to reach many results in a specific direction, namely the search for sharp quantitative inequalities. Our first result, together with Fusco and Maggi, showed the following. Everybody knows that the set which minimizes the perimeter with given volume is the ball. But is it true that a set which almost minimizes the perimeter must be close to a ball? The question had been posed in the 1920s, and many partial results appeared over the years. In our paper (Ann. of Math., 2007) we proved the sharp result. Many other results of this kind were obtained in the last two years.
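The sharp quantitative isoperimetric inequality proved there can be stated, in the now-standard notation, as follows: for every set E of finite perimeter with |E| = |B|, B a ball,
\[
A(E)^2 \le C(n)\, D(E), \qquad A(E) = \min_{x \in \mathbb{R}^n} \frac{|E \,\triangle\, (x+B)|}{|B|}, \qquad D(E) = \frac{P(E) - P(B)}{P(B)},
\]
where the exponent 2 on the Fraenkel asymmetry A(E) is optimal.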
Max ERC Funding
540 000 €
Duration
Start date: 2010-08-01, End date: 2015-07-31
Project acronym ANOREP
Project Targeting the reproductive biology of the malaria mosquito Anopheles gambiae: from laboratory studies to field applications
Researcher (PI) Flaminia Catteruccia
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PERUGIA
Call Details Starting Grant (StG), LS2, ERC-2010-StG_20091118
Summary Anopheles gambiae mosquitoes are the major vectors of malaria, a disease with devastating consequences for
human health. Novel methods for controlling the natural vector populations are urgently needed, given the
evolution of insecticide resistance in mosquitoes and the lack of novel insecticides. Understanding the
processes at the basis of mosquito biology may help to roll back malaria. In this proposal, we will target
mosquito reproduction, a major determinant of the An. gambiae vectorial capacity. This will be achieved at
two levels: (i) fundamental research, to provide a deeper knowledge of the processes regulating reproduction
in this species, and (ii) applied research, to identify novel targets and to develop innovative approaches for
the control of natural populations. We will focus our analysis on three major players of mosquito
reproduction: male accessory glands (MAGs), sperm, and spermatheca, in both laboratory and field settings.
We will then translate this information into the identification of inhibitors of mosquito fertility. The
experimental activities will be divided across three objectives. In Objective 1, we will unravel the role of the
MAGs in shaping mosquito fertility and behaviour, by performing a combination of transcriptional and
functional studies that will reveal the multifaceted activities of these tissues. In Objective 2 we will instead
focus on the identification of the male and female factors responsible for sperm viability and function.
Results obtained in both objectives will be validated in field mosquitoes. In Objective 3, we will perform
screens aimed at the identification of inhibitors of mosquito reproductive success. This study will reveal as
yet unknown molecular mechanisms underlying reproductive success in mosquitoes, considerably increasing
our knowledge beyond the state-of-the-art and critically contributing with innovative tools and ideas to the
fight against malaria.
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym ANPROB
Project Analytic-probabilistic methods for borderline singular integrals
Researcher (PI) Tuomas Pentinpoika Hytönen
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary The proposal consists of an extensive research program to advance the understanding of singular integral operators of Harmonic Analysis in various situations on the borderline of the existing theory. This is to be achieved by a creative combination of techniques from Analysis and Probability. On top of the standard arsenal of modern Harmonic Analysis, the main probabilistic tools are the martingale transform inequalities of Burkholder, and random geometric constructions in the spirit of the random dyadic cubes introduced to Nonhomogeneous Analysis by Nazarov, Treil and Volberg.
The problems to be addressed fall under the following subtitles, with many interconnections and overlap: (i) sharp weighted inequalities; (ii) nonhomogeneous singular integrals on metric spaces; (iii) local Tb theorems with borderline assumptions; (iv) functional calculus of rough differential operators; and (v) vector-valued singular integrals.
Topic (i) is a part of Classical Analysis, where new methods have led to substantial recent progress, culminating in my solution in July 2010 of a celebrated problem on the linear dependence of the weighted operator norm on the Muckenhoupt norm of the weight. The proof should be extendible to several related questions, and the aim is to also address some outstanding open problems in the area.
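The July 2010 result referred to under topic (i) is the sharp weighted bound now known as the A_2 theorem; in standard notation, for a Calderón-Zygmund operator T and a weight w,
\[
\|T\|_{L^2(w) \to L^2(w)} \le c_T\, [w]_{A_2}, \qquad [w]_{A_2} = \sup_Q \left( \frac{1}{|Q|} \int_Q w \right) \left( \frac{1}{|Q|} \int_Q w^{-1} \right),
\]
with the supremum taken over all cubes Q, and the linear dependence on [w]_{A_2} is best possible.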
Topics (ii) and (v) deal with extensions of the theory of singular integrals to functions with more general domain and range spaces, allowing them to be abstract metric and Banach spaces, respectively. In case (ii), I have recently been able to relax the requirements on the space compared to the established theories, opening a new research direction here. Topics (iii) and (iv) are concerned with weakening the assumptions on singular integrals in the usual Euclidean space, to allow certain applications in the theory of Partial Differential Equations. The goal is to maintain a close contact and exchange of ideas between such abstract and concrete questions.
Max ERC Funding
1 100 000 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym ANT
Project Automata in Number Theory
Researcher (PI) Boris Adamczewski
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE1, ERC-2014-CoG
Summary Finite automata are fundamental objects in Computer Science, of great importance on one hand for theoretical aspects (formal language theory, decidability, complexity) and on the other for practical applications (parsing). In number theory, finite automata are mainly used as simple devices for generating sequences of symbols over a finite set (e.g., digital representations of real numbers), and for recognizing some sets of integers or more generally of finitely generated abelian groups or monoids. One of the main features of these automatic structures comes from the fact that they are highly ordered without necessarily being trivial (i.e., periodic). With their rich fractal nature, they lie somewhere between order and chaos, even if, in most respects, their rigidity prevails. Over the last few years, several ground-breaking results have led to a greatly renewed interest in the study of automatic structures in arithmetic.
A primary objective of the ANT project is to exploit this opportunity by developing new directions and interactions between automata and number theory. In this proposal, we outline three lines of research concerning fundamental number theoretical problems that have baffled mathematicians for decades. They include the study of integer base expansions of classical constants, of arithmetical linear differential equations and their link with enumerative combinatorics, and of arithmetic in positive characteristic. At first glance, these topics may seem unrelated, but, surprisingly enough, the theory of finite automata will serve as a natural guideline. We stress that this new point of view on classical questions is a key part of our methodology: we aim at creating a powerful synergy between the different approaches we propose to develop, placing automata theory and related methods at the heart of the subject. This project provides a unique opportunity to create the first international team focusing on these different problems as a whole.
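To make the notion of an automaton-generated sequence concrete (a textbook illustration, not taken from the proposal): the Thue-Morse sequence is produced by a two-state automaton reading the binary digits of n, equivalently by the parity of the binary digit sum.

    def thue_morse(n: int) -> int:
        """n-th Thue-Morse term: the parity of the number of 1-bits of n,
        i.e. the output of a 2-state automaton reading n in base 2."""
        return bin(n).count("1") % 2

    print([thue_morse(n) for n in range(16)])
    # [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]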
Max ERC Funding
1 438 745 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym ANTEGEFI
Project Analytic Techniques for Geometric and Functional Inequalities
Researcher (PI) Nicola Fusco
Host Institution (HI) UNIVERSITA DEGLI STUDI DI NAPOLI FEDERICO II
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary Isoperimetric and Sobolev inequalities are the best known examples of geometric-functional inequalities. In recent years the PI and collaborators have obtained new and sharp quantitative versions of these and other important related inequalities. These results have been obtained by the combined use of classical symmetrization methods, new tools coming from mass transportation theory, deep geometric measure tools and ad hoc symmetrizations. The objective of this project is to further develop these techniques in order to obtain: sharp quantitative versions of the Faber-Krahn inequality, the Gaussian isoperimetric inequality, the Brunn-Minkowski inequality, and Poincaré and logarithmic Sobolev inequalities; and sharp decay rates for the quantitative Sobolev inequalities and the Polya-Szegö inequality.
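One concrete instance among the listed targets, stated here for reference: the Faber-Krahn inequality asserts that among open sets of prescribed volume the ball minimizes the first Dirichlet eigenvalue,
\[
\lambda_1(\Omega) \ge \lambda_1(B) \qquad \text{whenever } |\Omega| = |B|,
\]
and a sharp quantitative version would bound the deficit \lambda_1(\Omega) - \lambda_1(B) from below by a suitable power of the asymmetry of \Omega.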
Max ERC Funding
600 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym ANTHOS
Project Analytic Number Theory: Higher Order Structures
Researcher (PI) Valentin Blomer
Host Institution (HI) GEORG-AUGUST-UNIVERSITAT GOTTINGENSTIFTUNG OFFENTLICHEN RECHTS
Call Details Starting Grant (StG), PE1, ERC-2010-StG_20091028
Summary This is a proposal for research at the interface of analytic number theory, automorphic forms and algebraic geometry. Motivated by fundamental conjectures in number theory, classical problems will be investigated in higher order situations: general number fields, automorphic forms on higher rank groups, the arithmetic of algebraic varieties of higher degree. In particular, I want to focus on
- computation of moments of L-functions of degree 3 and higher with applications to subconvexity and/or non-vanishing, as well as subconvexity for multiple L-functions;
- bounds for sup-norms of cusp forms on various spaces and equidistribution of Hecke correspondences;
- automorphic forms on higher rank groups and general number fields, in particular new bounds towards the Ramanujan conjecture;
- a proof of Manin's conjecture for a certain class of singular algebraic varieties.
The underlying methods are closely related; for example, rational points on algebraic varieties
will be counted by a multiple L-series technique.
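For reference, the subconvexity problem in the first item above can be phrased as follows: writing C(f) for the analytic conductor of an L-function L(s, f), the functional equation together with the Phragmén-Lindelöf principle yields the convexity bound
\[
L(\tfrac{1}{2}, f) \ll_{\varepsilon} C(f)^{1/4 + \varepsilon},
\]
and subconvexity means replacing the exponent 1/4 by 1/4 - \delta for some fixed \delta > 0.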
Max ERC Funding
1 004 000 €
Duration
Start date: 2010-10-01, End date: 2015-09-30
Project acronym ANTHROPOID
Project Great ape organoids to reconstruct uniquely human development
Researcher (PI) Jarrett CAMP
Host Institution (HI) INSTITUT FUR MOLEKULARE UND KLINISCHE OPHTHALMOLOGIE BASEL
Call Details Starting Grant (StG), LS2, ERC-2018-STG
Summary Humans diverged from our closest living relatives, chimpanzees and other great apes, 6-10 million years ago. Since this divergence, our ancestors acquired genetic changes that enhanced cognition, altered metabolism, and endowed our species with an adaptive capacity to colonize the entire planet and reshape the biosphere. Through genome comparisons between modern humans, Neandertals, chimpanzees and other apes we have identified genetic changes that likely contribute to innovations in human metabolic and cognitive physiology. However, it has been difficult to assess the functional effects of these genetic changes due to the lack of cell culture systems that recapitulate great ape organ complexity. Human and chimpanzee pluripotent stem cells (PSCs) can self-organize into three-dimensional (3D) tissues that recapitulate the morphology, function, and genetic programs controlling organ development. Our vision is to use organoids to study the changes that set modern humans apart from our closest evolutionary relatives as well as all other organisms on the planet. In ANTHROPOID we will generate a great ape developmental cell atlas using cortex, liver, and small intestine organoids. We will use single-cell transcriptomics and chromatin accessibility to identify cell type-specific features of transcriptome divergence at cellular resolution. We will dissect enhancer evolution using single-cell genomic screens and ancestralize human cells to resurrect pre-human cellular phenotypes. ANTHROPOID utilizes quantitative and state-of-the-art methods to explore exciting high-risk questions at multiple branches of the modern human lineage. This project is a groundbreaking starting point from which to replay evolution and tackle the ancient question of what makes us uniquely human.
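A minimal sketch of the kind of cell-type-resolved, cross-species comparison described above (the matrices, normalization and divergence score are illustrative assumptions, not the project's pipeline):

    import numpy as np

    # Toy expression matrices (cells x genes) for one matched organoid cell
    # type; rows are cells, columns are shared orthologous genes (assumed
    # already normalized for sequencing depth).
    rng = np.random.default_rng(1)
    human = rng.poisson(5.0, size=(200, 100)).astype(float)
    chimp = rng.poisson(4.0, size=(180, 100)).astype(float)

    # Per-gene mean expression within the cell type, per species.
    mu_h = human.mean(axis=0)
    mu_c = chimp.mean(axis=0)

    # A simple divergence score: absolute log2 fold change per gene.
    lfc = np.abs(np.log2((mu_h + 1.0) / (mu_c + 1.0)))
    top = np.argsort(lfc)[::-1][:5]
    print("most diverged gene indices:", top, "log2FC:", lfc[top].round(2))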
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym Anti-Virome
Project A combined evolutionary and proteomics approach to the discovery, induction and application of antiviral immunity factors
Researcher (PI) Frank Kirchhoff
Host Institution (HI) UNIVERSITAET ULM
Call Details Advanced Grant (AdG), LS6, ERC-2012-ADG_20120314
Summary "Humans are equipped with a variety of intrinsic immunity or host restriction factors. These evolved under positive selection pressure for diversification and represent a first line of defence against invading viruses. Unfortunately, however, many pathogens have evolved effective antagonists against our defences. For example, the capability of HIV-1 to counteract human restriction factors that interfere with reverse transcription, uncoating and virion release has been a prerequisite for the global spread of AIDS. We are just beginning to understand the diversity and induction of antiretroviral factors and how pandemic HIV-1 group M (major) strains evolved to counteract all of them. Here, I propose to use a genetics, proteomics and evolutionary approach to discover and define as-yet-unknown antiviral effectors and their inducers. To identify novel antiviral factors, we will examine the capability of all primate genes that are under strong positive selection pressure to inhibit HIV and its simian (SIV) precursors. This examination from the evolutionary perspective of the invading pathogen will also reveal which adaptations allowed HIV-1 to cause the AIDS pandemic. Furthermore, complex peptide-protein libraries representing essentially the entire human peptidome, will be utilized to identify novel specific inducers of antiviral restriction factors. My ultimate aim is to unravel the network of inducers and effectors of antiviral immunity - the ""Anti-Virome"" - and to use this knowledge to develop novel effective preventive and therapeutic approaches based on the induction of combinations of antiviral factors targeting different steps of the viral life cycle. The results of this innovative and interdisciplinary program will provide fundamental new insights into intrinsic immunity and may offer alternatives to conventional vaccine and therapeutic approaches because most restriction factors have broad antiviral activity and are thus effective against various pathogens."
Max ERC Funding
1 915 200 €
Duration
Start date: 2013-04-01, End date: 2018-03-31
Project acronym ANTICIPATE
Project Anticipatory Human-Computer Interaction
Researcher (PI) Andreas BULLING
Host Institution (HI) UNIVERSITAET STUTTGART
Call Details Starting Grant (StG), PE6, ERC-2018-STG
Summary Even after three decades of research on human-computer interaction (HCI), current general-purpose user interfaces (UI) still lack the ability to attribute mental states to their users, i.e. they fail to understand users' intentions and needs and to anticipate their actions. This drastically restricts their interactive capabilities.
ANTICIPATE aims to establish the scientific foundations for a new generation of user interfaces that pro-actively adapt to users' future input actions by monitoring their attention and predicting their interaction intentions - thereby significantly improving the naturalness, efficiency, and user experience of the interactions. Realising this vision of anticipatory human-computer interaction requires groundbreaking advances in everyday sensing of user attention from eye and brain activity. We will further pioneer methods to predict entangled user intentions and forecast interactive behaviour with fine temporal granularity during interactions in everyday stationary and mobile settings. Finally, we will develop fundamental interaction paradigms that enable anticipatory UIs to pro-actively adapt to users' attention and intentions in a mindful way. The new capabilities will be demonstrated in four challenging cases: 1) mobile information retrieval, 2) intelligent notification management, 3) Autism diagnosis and monitoring, and 4) computer-based training.
Anticipatory human-computer interaction offers a strong complement to existing UI paradigms that only react to user input post-hoc. If successful, ANTICIPATE will deliver the first important building blocks for implementing Theory of Mind in general-purpose UIs. As such, the project has the potential to drastically improve the billions of interactions we perform with computers every day, to trigger a wide range of follow-up research in HCI as well as adjacent areas within and outside computer science, and to act as a key technical enabler for new applications, e.g. in healthcare and education.
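As a schematic illustration of attention-based intention prediction (purely hypothetical features, labels and data, not the project's models): per-trial gaze dwell times on interface regions could feed a simple classifier that predicts the user's next action.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical training data: gaze dwell times (seconds) on three UI
    # regions per trial, labeled with the action the user performed next.
    X = np.array([[2.1, 0.3, 0.1], [0.2, 1.8, 0.4], [0.1, 0.2, 2.5],
                  [1.9, 0.4, 0.2], [0.3, 2.2, 0.1], [0.2, 0.1, 2.0]])
    y = np.array([0, 1, 2, 0, 1, 2])  # 0=search, 1=notifications, 2=editor

    clf = LogisticRegression(max_iter=1000).fit(X, y)
    print(clf.predict([[1.7, 0.5, 0.2]]))  # long dwell on region 0 -> action 0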
Max ERC Funding
1 499 625 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym ANTICS
Project Algorithmic Number Theory in Computer Science
Researcher (PI) Andreas Enge
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "During the past twenty years, we have witnessed profound technological changes, summarised under the terms of digital revolution or entering the information age. It is evident that these technological changes will have a deep societal impact, and questions of privacy and security are primordial to ensure the survival of a free and open society.
Cryptology is a main building block of any security solution, and at the heart of projects such as electronic identity and health cards, access control, digital content distribution or electronic voting, to mention only a few important applications. During the past decades, public-key cryptology has established itself as a research topic in computer science; tools of theoretical computer science are employed to “prove” the security of cryptographic primitives such as encryption or digital signatures and of more complex protocols. It is often forgotten, however, that all practically relevant public-key cryptosystems are rooted in pure mathematics, in particular, number theory and arithmetic geometry. In fact, the so-called security “proofs” are all conditional on the algorithmic intractability of certain number theoretic problems, such as factorisation of large integers or discrete logarithms in algebraic curves. Unfortunately, there is a large cultural gap between computer scientists using a black-box security reduction to a supposedly hard problem in algorithmic number theory and number theorists, who are often interested in solving small and easy instances of the same problem. The theoretical grounds on which current algorithmic number theory operates are actually rather shaky, and cryptologists are generally unaware of this fact.
The central goal of ANTICS is to rebuild algorithmic number theory on the firm grounds of theoretical computer science."
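To illustrate how public-key cryptography rests on number-theoretic hardness assumptions (a textbook toy example with deliberately simple parameters, unrelated to the project's code):

    import secrets

    # Toy Diffie-Hellman key exchange: its security rests entirely on the
    # presumed hardness of discrete logarithms in the group (Z/pZ)*.
    p = 2**127 - 1   # a Mersenne prime (far too small for real use)
    g = 3            # a candidate generator; fixed here for illustration

    a = secrets.randbelow(p - 2) + 1    # Alice's secret exponent
    b = secrets.randbelow(p - 2) + 1    # Bob's secret exponent
    A = pow(g, a, p)                    # Alice publishes A
    B = pow(g, b, p)                    # Bob publishes B

    # Both parties derive the same shared key; an eavesdropper who sees
    # only (p, g, A, B) must solve a discrete logarithm to recover it.
    assert pow(B, a, p) == pow(A, b, p)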
Max ERC Funding
1 453 507 €
Duration
Start date: 2012-01-01, End date: 2016-12-31