Project acronym 4DRepLy
Project Closing the 4D Real World Reconstruction Loop
Researcher (PI) Christian THEOBALT
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary 4D reconstruction, i.e. camera-based dense reconstruction of dynamic scenes, is a grand challenge in computer graphics and computer vision. Despite great progress, 4D capture of the complex, diverse real world outside a studio is still far from feasible. 4DRepLy builds a new generation of high-fidelity 4D reconstruction (4DRecon) methods. They will be the first to efficiently capture all types of deformable objects (humans and other types) in crowded real-world scenes with a single color or depth camera. They capture space-time coherent deforming geometry, motion, high-frequency reflectance and illumination at unprecedented detail, and will be the first to handle difficult occlusions, topology changes and large groups of interacting objects. They automatically adapt to new scene types, yet deliver models with meaningful, interpretable parameters. This requires far-reaching contributions: First, we develop groundbreaking new plasticity-enhanced model-based 4D reconstruction methods that automatically adapt to new scenes. Second, we develop radically new machine-learning-based dense 4D reconstruction methods. Third, these model- and learning-based methods are combined in two revolutionary new classes of 4DRecon methods: 1) advanced fusion-based methods and 2) methods with deep architectural integration. Both 1) and 2) are automatically designed in the 4D Real World Reconstruction Loop, a revolutionary new design paradigm in which 4DRecon methods refine and adapt themselves while continuously processing unlabeled real-world input. This overcomes the previously unbreakable scalability barrier to real-world scene diversity, complexity and generality. This paradigm shift opens up a new research direction in graphics and vision and has far-reaching relevance across many scientific fields. It enables new applications of profound societal reach and significant economic impact, e.g., for visual media and virtual/augmented reality, and for future autonomous and robotic systems.
Max ERC Funding
1 977 000 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym Active-DNA
Project Computationally Active DNA Nanostructures
Researcher (PI) Damien WOODS
Host Institution (HI) NATIONAL UNIVERSITY OF IRELAND MAYNOOTH
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary During the 20th century computer technology evolved from bulky, slow, special-purpose mechanical engines to the now ubiquitous silicon chips and software that are one of the pinnacles of human ingenuity. The goal of the field of molecular programming is to take the next leap and build a new generation of matter-based computers using DNA, RNA and proteins. This will be accomplished by computer scientists, physicists and chemists designing molecules to execute "wet" nanoscale programs in test tubes. The workflow includes proposing theoretical models, mathematically proving their computational properties, physical modelling and implementation in the wet-lab.
The past decade has seen remarkable progress in building static 2D and 3D DNA nanostructures. However, unlike biological macromolecules and complexes that are built via specified self-assembly pathways, that execute robotic-like movements, and that undergo evolution, the activity of human-engineered nanostructures is severely limited. We will need sophisticated algorithmic ideas to build structures that rival active living systems. Active-DNA aims to address this challenge by achieving a number of objectives on computation, DNA-based self-assembly and molecular robotics. Active-DNA research work will range from defining models and proving theorems that characterise the computational and expressive capabilities of such active programmable materials to experimental work implementing active DNA nanostructures in the wet-lab.
Max ERC Funding
2 349 603 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym ADULT
Project Analysis of the Dark Universe through Lensing Tomography
Researcher (PI) Hendrik Hoekstra
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary The discoveries that the expansion of the universe is accelerating due to an unknown “dark energy” and that most of the matter is invisible highlight our lack of understanding of the major constituents of the universe. These surprising findings set the stage for research in cosmology at the start of the 21st century. The objective of this proposal is to advance observational constraints to a level where we can distinguish between physical mechanisms that aim to explain the properties of dark energy and the observed distribution of dark matter throughout the universe. We use a relatively new technique called weak gravitational lensing: the accurate measurement of correlations in the orientations of distant galaxies enables us to map the dark matter distribution directly and to extract the cosmological information that is encoded by the large-scale structure.
To study the dark universe we will analyse data from a new state-of-the-art imaging survey: the Kilo-Degree Survey (KiDS) will cover 1500 square degrees in 9 filters. The combination of its large survey area and the availability of exquisite photometric redshifts for the sources makes KiDS the first project that can place interesting constraints on the dark energy equation of state using lensing data alone. Combined with complementary results from Planck, our measurements will provide one of the best views of the dark side of the universe before much larger space-based projects commence.
To reach the desired accuracy we need to carefully measure the shapes of distant background galaxies. We also need to account for any intrinsic alignments that arise due to tidal interactions rather than through lensing. Reducing these observational and physical biases to negligible levels is a necessary step to ensure the success of KiDS and an important part of our preparation for more challenging projects such as the European-led space mission Euclid.
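As background for the lensing observable referred to above (a standard definition, not specific to this proposal), the cosmological signal is commonly summarised by the two-point shear correlation functions measured from galaxy ellipticities:
\[
\xi_{\pm}(\theta) \;=\; \langle \gamma_{t}\gamma_{t}\rangle(\theta) \;\pm\; \langle \gamma_{\times}\gamma_{\times}\rangle(\theta),
\]
where \(\gamma_{t}\) and \(\gamma_{\times}\) are the tangential and cross components of the shear for galaxy pairs separated by angle \(\theta\); measuring these functions in photometric-redshift bins (lensing tomography) is what constrains the dark energy equation of state.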
Max ERC Funding
1 316 880 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym AfricanWomen
Project Women in Africa
Researcher (PI) catherine GUIRKINGER
Host Institution (HI) UNIVERSITE DE NAMUR ASBL
Call Details Starting Grant (StG), SH1, ERC-2017-STG
Summary Rates of domestic violence and the relative risk of premature death for women are higher in sub-Saharan Africa than in any other region. Yet we know remarkably little about the economic forces, incentives and constraints that drive discrimination against women in this region, making it hard to identify policy levers to address the problem. This project will help fill this gap.
I will investigate gender discrimination from two complementary perspectives. First, through the lens of economic history, I will investigate the forces driving trends in women’s relative well-being since slavery. To quantify the evolution of well-being of sub-Saharan women relative to men, I will use three types of historical data: anthropometric indicators (relative height), vital statistics (to compute numbers of missing women), and outcomes of formal and informal family law disputes. I will then investigate how major economic developments and changes in family laws differentially affected women’s welfare across ethnic groups with different norms on women’s roles and rights.
Second, using intra-household economic models, I will provide new insights into domestic violence and gender bias in access to crucial resources in present-day Africa. I will develop a new household model that incorporates gender identity and endogenous outside options to explore the relationship between women’s empowerment and the use of violence. Using the notion of strategic delegation, I will propose a new rationale for the separation of budgets often observed in African households and generate predictions of how improvements in women’s outside options affect welfare. Finally, with first-hand data, I will investigate intra-household differences in nutrition and work effort in times of food shortage from the points of view of efficiency and equity. I will use activity trackers as an innovative means of collecting high-quality data on work effort and thus overcome data limitations restricting the existing literature.
Max ERC Funding
1 499 313 €
Duration
Start date: 2018-08-01, End date: 2023-07-31
Project acronym ALGILE
Project Foundations of Algebraic and Dynamic Data Management Systems
Researcher (PI) Christoph Koch
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "Contemporary database query languages are ultimately founded on logic and feature an additive operation – usually a form of (multi)set union or disjunction – that is asymmetric in that additions or updates do not always have an inverse. This asymmetry puts a greater part of the machinery of abstract algebra for equation solving outside the reach of databases. However, such equation solving would be a key functionality that problems such as query equivalence testing and data integration could be reduced to: In the current scenario of the presence of an asymmetric additive operation they are undecidable. Moreover, query languages with a symmetric additive operation (i.e., which has an inverse and is thus based on ring theory) would open up databases for a large range of new scientific and mathematical applications.
The goal of the proposed project is to reinvent database management systems with a foundation in abstract algebra and specifically in ring theory. The presence of an additive inverse makes it possible to cleanly define differences between queries. This gives rise to a database analog of differential calculus that leads to radically new incremental and adaptive query evaluation algorithms that substantially outperform state-of-the-art techniques. These algorithms enable a new class of systems which I call Dynamic Data Management Systems. Such systems can maintain continuously fresh query views at extremely high update rates and have important applications in interactive large-scale data analysis. There is a natural connection between differences and updates, motivating the group-theoretic study of updates that will lead to better ways of creating out-of-core data processing algorithms for new storage devices. Basing queries on ring theory leads to a new class of systems, Algebraic Data Management Systems, which herald a convergence of database systems and computer algebra systems."
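To make the ring-based idea concrete, here is a minimal sketch (my illustration, not the project's system): relations are represented as multisets with integer multiplicities, so every update has an additive inverse and a join view can be maintained from deltas instead of being recomputed.

# Minimal sketch (illustrative only): signed multisets over the integers give
# every update an inverse, so the delta of a join can be applied incrementally.
from collections import defaultdict

def add(a, b):
    # Ring addition: pointwise sum of multiplicities; negative counts are deletions.
    out = defaultdict(int)
    for rel in (a, b):
        for t, m in rel.items():
            out[t] += m
    return {t: m for t, m in out.items() if m != 0}

def join(a, b):
    # Join on the first attribute; multiplicities multiply (ring product).
    out = defaultdict(int)
    for (k1, v1), m1 in a.items():
        for (k2, v2), m2 in b.items():
            if k1 == k2:
                out[(k1, v1, v2)] += m1 * m2
    return dict(out)

R, S = {("a", 1): 1}, {("a", 2): 1}
view = join(R, S)                      # materialised view of R joined with S

dR = {("a", 3): 1, ("a", 1): -1}       # insert one tuple, delete another
dS = {("a", 9): 1}
# delta(R join S) = dR join S + R join dS + dR join dS
view = add(view, add(add(join(dR, S), join(R, dS)), join(dR, dS)))

The same view would be obtained by recomputing the join over the updated relations; the point of the delta form is that only the changed tuples are touched.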
Max ERC Funding
1 480 548 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym ALGOCom
Project Novel Algorithmic Techniques through the Lens of Combinatorics
Researcher (PI) Parinya Chalermsook
Host Institution (HI) AALTO KORKEAKOULUSAATIO SR
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary Real-world optimization problems pose major challenges to algorithmic research. For instance, (i) many important problems are believed to be intractable (i.e. NP-hard) and (ii) with the growth of data size, modern applications often require decision making under incomplete and dynamically changing input data. After several decades of research, central problems in these domains have remained poorly understood (e.g., is there an asymptotically most efficient binary search tree?). Existing algorithmic techniques either reach their limitations or are inherently tailored to special cases.
This project attempts to close this gap in the state of the art and seeks new interplay across multiple areas of algorithms, such as approximation algorithms, online algorithms, fixed-parameter tractable (FPT) algorithms, exponential-time algorithms, and data structures. We propose new directions from the structural perspective that connect the aforementioned algorithmic problems to basic questions in combinatorics.
Our approaches fall into one of three broad schemes: (i) new structural theory, (ii) intermediate problems, and (iii) transfer of techniques. These directions partially build on the PI's successes in resolving more than ten classical problems in this context.
Resolving the proposed problems will likely revolutionize our understanding of algorithms and data structures and potentially unify techniques in multiple algorithmic regimes. Any progress is, in fact, already a significant contribution to the algorithms community. We suggest concrete intermediate goals that are of independent interest and carry lower risk, so they are suitable for Ph.D. students.
Max ERC Funding
1 411 258 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym ALPHA
Project Alpha Shape Theory Extended
Researcher (PI) Herbert Edelsbrunner
Host Institution (HI) INSTITUTE OF SCIENCE AND TECHNOLOGYAUSTRIA
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Alpha shapes were invented in the early 1980s, and their implementation in three dimensions in the early 1990s was at the forefront of the exact arithmetic paradigm that enabled fast and correct geometric software. In the late 1990s, alpha shapes motivated the development of the wrap algorithm for surface reconstruction, and of persistent homology, which was the starting point of rapidly expanding interest in topological algorithms aimed at data analysis questions.
We now see alpha shapes, wrap complexes, and persistent homology as three aspects of a larger theory, which we propose to fully develop. This viewpoint was a long time coming and finds its clear expression within a generalized version of discrete Morse theory. This unified framework offers new opportunities, including
(I) the adaptive reconstruction of shapes driven by the cavity structure;
(II) the stochastic analysis of all aspects of the theory;
(III) the computation of persistence of dense data, both in scale and in depth;
(IV) the study of long-range order in periodic and near-periodic point configurations.
These capabilities will significantly deepen as well as widen the theory and enable new applications in the sciences. To gain focus, we concentrate on low-dimensional applications in structural molecular biology and particle systems.
Max ERC Funding
1 678 432 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym AMDROMA
Project Algorithmic and Mechanism Design Research in Online MArkets
Researcher (PI) Stefano LEONARDI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Online markets currently form an important share of the global economy. The Internet hosts classical markets (real estate, stocks, e-commerce) as well as enabling new markets with previously unknown features (web-based advertisement, viral marketing, digital goods, crowdsourcing, sharing economy). Algorithms play a central role in many decision processes involved in online markets. For example, algorithms run electronic auctions, trade stocks, adjust prices dynamically, and harvest big data to provide economic information. Thus, it is of paramount importance to understand the algorithmic and mechanism design foundations of online markets.
The algorithmic research issues that we consider involve algorithmic mechanism design, online and approximation algorithms, modelling uncertainty in online market design, large-scale optimization, and large-scale data analysis and mining. The aim of this research project is to combine these fields to consider research questions that are central for today's Internet economy. We plan to apply these techniques to solve fundamental algorithmic problems motivated by Internet advertisement, sharing-economy market design, and crowdsourcing and online labour marketplaces. While my planned research is centered on foundational work with rigorous design and analysis of algorithms and mechanisms, it will also include, as an important component, empirical validation on large-scale real-life datasets.
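As a minimal, self-contained illustration of the mechanism-design viewpoint (a textbook example, not one of the project's mechanisms), the sealed-bid second-price auction charges the winner the runner-up's bid, which makes truthful bidding a dominant strategy:

# Minimal sketch (illustrative only): a sealed-bid second-price (Vickrey) auction,
# the textbook truthful mechanism that algorithmic mechanism design generalises
# to complex online markets.
def second_price_auction(bids):
    """bids: dict bidder -> bid. Returns (winner, price_paid)."""
    if len(bids) < 2:
        raise ValueError("need at least two bidders")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]
    price = ranked[1][1]      # winner pays the second-highest bid
    return winner, price

if __name__ == "__main__":
    print(second_price_auction({"alice": 10.0, "bob": 7.5, "carol": 9.0}))
    # -> ('alice', 9.0)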
Max ERC Funding
1 780 150 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym ANIMETRICS
Project Measurement-Based Modeling and Animation of Complex Mechanical Phenomena
Researcher (PI) Miguel Angel Otaduy Tristan
Host Institution (HI) UNIVERSIDAD REY JUAN CARLOS
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary Computer animation has traditionally been associated with applications in virtual-reality-based training, video games or feature films. However, interactive animation is gaining relevance in a more general scope, as a tool for early-stage analysis, design and planning in many applications in science and engineering. The user can get quick and visual feedback of the results, and then proceed by refining the experiments or designs. Potential applications include nanodesign, e-commerce or tactile telecommunication, but they also reach as far as, e.g., the analysis of ecological, climate, biological or physiological processes.
The application of computer animation is extremely limited in comparison to its potential outreach due to a trade-off between accuracy and computational efficiency. This trade-off is induced by inherent sources of complexity such as nonlinear or anisotropic behaviors, heterogeneous properties, or high dynamic ranges of effects.
The Animetrics project proposes a modeling and animation methodology that consists of a multi-scale decomposition of complex processes, the description of the process at each scale through a combination of simple local models, and the fitting of the parameters of those local models using large amounts of data from example effects. The modeling and animation methodology will be explored on specific problems arising in complex mechanical phenomena, including viscoelasticity of solids and thin shells, multi-body contact, granular and liquid flow, and fracture of solids.
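A minimal sketch of the measurement-based fitting step (assuming, purely for illustration, a linear-in-parameters local model fitted by ordinary least squares; the project targets richer, nonlinear local models):

# Minimal sketch (assumption: linear-in-parameters local model, ordinary least squares).
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "measurements": responses of a local model with unknown parameters.
X = rng.normal(size=(200, 3))                     # per-sample inputs (e.g. strains)
theta_true = np.array([2.0, -0.5, 1.3])           # local-model parameters to recover
y = X @ theta_true + 0.05 * rng.normal(size=200)  # noisy observed responses

# Fit the parameters from example data.
theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(theta_hat)                                  # close to theta_true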
Max ERC Funding
1 277 969 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym ANTICS
Project Algorithmic Number Theory in Computer Science
Researcher (PI) Andreas Enge
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary "During the past twenty years, we have witnessed profound technological changes, summarised under the terms of digital revolution or entering the information age. It is evident that these technological changes will have a deep societal impact, and questions of privacy and security are primordial to ensure the survival of a free and open society.
Cryptology is a main building block of any security solution, and at the heart of projects such as electronic identity and health cards, access control, digital content distribution or electronic voting, to mention only a few important applications. During the past decades, public-key cryptology has established itself as a research topic in computer science; tools of theoretical computer science are employed to “prove” the security of cryptographic primitives such as encryption or digital signatures and of more complex protocols. It is often forgotten, however, that all practically relevant public-key cryptosystems are rooted in pure mathematics, in particular, number theory and arithmetic geometry. In fact, the so-called security “proofs” are all conditional on the algorithmic intractability of certain number-theoretic problems, such as factorisation of large integers or discrete logarithms in algebraic curves. Unfortunately, there is a large cultural gap between computer scientists using a black-box security reduction to a supposedly hard problem in algorithmic number theory and number theorists, who are often interested in solving small and easy instances of the same problem. The theoretical grounds on which current algorithmic number theory operates are actually rather shaky, and cryptologists are generally unaware of this fact.
The central goal of ANTICS is to rebuild algorithmic number theory on the firm grounds of theoretical computer science."
Max ERC Funding
1 453 507 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym APMPAL-HET
Project Asset Prices and Macro Policy when Agents Learn and are Heterogeneous
Researcher (PI) Albert MARCET TORRENS
Host Institution (HI) FUNDACIÓ MARKETS, ORGANIZATIONS AND VOTES IN ECONOMICS
Call Details Advanced Grant (AdG), SH1, ERC-2017-ADG
Summary Based on the APMPAL (ERC) project, we continue to develop the frameworks of internal rationality (IR) and optimal signal extraction (OSE). Under IR, investors/consumers behave rationally given their subjective beliefs about prices, and these beliefs are compatible with the data. Under OSE, the government has partial information: it knows how policy influences observed variables and performs signal extraction.
We develop further the foundations of IR and OSE with an emphasis on heterogeneous agents. We study sovereign bond crises and heterogeneity of beliefs in asset pricing models under IR, using survey data on expectations. Under IR the assets’ stochastic discount factor depends on the agents’ decision function and beliefs; this modifies some key asset pricing results. We extend OSE to models with state variables, forward-looking constraints and heterogeneity.
Under IR agents’ prior beliefs determine the effects of a policy reform. If the government does not observe prior beliefs it has partial information, thus OSE should be used to analyse policy reforms under IR.
If IR heterogeneous workers forecast their productivity either from their own wage or their neighbours’ in a network, low current wages discourage search and human capital accumulation, leading to low productivity. This can explain low development of a country or social exclusion of a group. Worker subsidies redistribute wealth and can increase productivity if they “teach” agents to exit a low-wage state.
We build DSGE models under IR for prediction and policy analysis. We develop time-series tools for predicting macro and asset market variables, using information available to the analyst, and we introduce non-linearities and survey expectations using insights from models under IR.
We study how IR and OSE change the view on macro policy issues such as tax smoothing, debt management, Taylor rule, level of inflation, fiscal/monetary policy coordination, factor taxation or redistribution.
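For concreteness, a standard textbook form (not the project's specific model) of how subjective beliefs enter asset prices under internal rationality: the agent's Euler equation holds under her subjective measure \(\mathcal{P}^i\) rather than under rational expectations,
\[
P_t \;=\; E_t^{\mathcal{P}^i}\!\left[\beta\,\frac{u'(C^i_{t+1})}{u'(C^i_t)}\,\bigl(P_{t+1}+D_{t+1}\bigr)\right],
\]
so the stochastic discount factor \(\beta\,u'(C^i_{t+1})/u'(C^i_t)\) depends on the agent's consumption plan (decision function) and beliefs, as stated above.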
Max ERC Funding
1 524 144 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ASAP
Project Adaptive Security and Privacy
Researcher (PI) Bashar Nuseibeh
Host Institution (HI) THE OPEN UNIVERSITY
Call Details Advanced Grant (AdG), PE6, ERC-2011-ADG_20110209
Summary With the prevalence of mobile computing devices and the increasing availability of pervasive services, ubiquitous computing (Ubicomp) is a reality for many people. This reality is generating opportunities for people to interact socially in new and richer ways, and to work more effectively in a variety of new environments. More generally, Ubicomp infrastructures – controlled by software – will determine users’ access to critical services.
With these opportunities come higher risks of misuse by malicious agents. Therefore, the role and design of software for managing use and protecting against misuse is critical, and engineering software that is functionally effective while safeguarding user assets from harm is a key challenge. Indeed, the very nature of Ubicomp means that software must adapt to the changing needs of users and their environment, and, more critically, to the different threats to users’ security and privacy.
ASAP proposes to radically re-conceptualise software engineering for Ubicomp in ways that are cognisant of the changing functional needs of users, of the changing threats to user assets, and of the changing relationships between them. We propose to deliver adaptive software capabilities for supporting users in managing their privacy requirements, and adaptive software capabilities to deliver the secure software that underpins those requirements. A key novelty of our approach is its holistic treatment of security and human behaviour. To achieve this, it draws upon contributions from requirements engineering, security & privacy engineering, and human-computer interaction. Our aim is to contribute to software engineering that empowers and protects Ubicomp users. Underpinning our approach will be the development of representations of security and privacy problem structures that capture user requirements, the context in which those requirements arise, and the adaptive software that aims to meet those requirements.
Max ERC Funding
2 499 041 €
Duration
Start date: 2012-10-01, End date: 2018-09-30
Project acronym ASSESS
Project Episodic Mass Loss in the Most Massive Stars: Key to Understanding the Explosive Early Universe
Researcher (PI) Alceste BONANOS
Host Institution (HI) NATIONAL OBSERVATORY OF ATHENS
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary Massive stars dominate their surroundings during their short lifetimes, while their explosive deaths impact the chemical evolution and spatial cohesion of their hosts. After birth, their evolution is largely dictated by their ability to remove layers of hydrogen from their envelopes. Multiple lines of evidence point to violent, episodic mass-loss events being responsible for removing a large part of the massive stellar envelope, especially in low-metallicity galaxies. Episodic mass loss, however, is neither understood theoretically nor accounted for in state-of-the-art models of stellar evolution, which has far-reaching consequences for many areas of astronomy. We aim to determine whether episodic mass loss is a dominant process in the evolution of the most massive stars by conducting the first extensive, multi-wavelength survey of evolved massive stars in the nearby Universe. The project hinges on the fact that mass-losing stars form dust and are bright in the mid-infrared. We plan to (i) derive physical parameters of a large sample of dusty, evolved targets and estimate the amount of ejected mass, (ii) constrain evolutionary models, (iii) quantify the duration and frequency of episodic mass loss as a function of metallicity. The approach involves applying machine-learning algorithms to existing multi-band and time-series photometry of luminous sources in ~25 nearby galaxies. Dusty, luminous evolved massive stars will thus be automatically classified and follow-up spectroscopy will be obtained for selected targets. Atmospheric and SED modeling will yield parameters and estimates of time-dependent mass loss for ~1000 luminous stars. The emerging trend for the ubiquity of episodic mass loss, if confirmed, will be key to understanding the explosive early Universe and will have profound consequences for low-metallicity stars, reionization, and the chemical evolution of galaxies.
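A minimal sketch of the classification step (assumptions: scikit-learn, a random-forest classifier, and the hypothetical feature/label files named below are my choices; the abstract specifies only that machine-learning algorithms are applied to multi-band and time-series photometry):

# Minimal sketch (assumptions: scikit-learn random forest; feature/label files are hypothetical).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical per-source features: e.g. mid-IR magnitude, optical-IR colour,
# variability amplitude, periodicity statistic.
X = np.load("features.npy")            # shape (n_sources, n_features)
y = np.load("labels.npy")              # spectroscopic classes for the training set

clf = RandomForestClassifier(n_estimators=300, class_weight="balanced", random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())   # rough accuracy estimate

clf.fit(X, y)
candidates = np.load("unlabelled_features.npy")  # sources without spectra
predicted_class = clf.predict(candidates)        # candidates for follow-up spectroscopy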
Max ERC Funding
1 128 750 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym Asterochronometry
Project Galactic archeology with high temporal resolution
Researcher (PI) Andrea MIGLIO
Host Institution (HI) THE UNIVERSITY OF BIRMINGHAM
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The Milky Way is a complex system, with dynamical and chemical substructures, where several competing processes such as mergers, internal secular evolution, gas accretion and gas flows take place. To study in detail how such a giant spiral galaxy was formed and evolved, we need to reconstruct the sequence of its main formation events with high (~10%) temporal resolution.
Asterochronometry will determine accurate, precise ages for tens of thousands of stars in the Galaxy. We will take an approach distinguished by a number of key aspects, including developing novel star-dating methods that fully utilise the potential of individual pulsation modes, coupled with a careful appraisal of systematic uncertainties on age deriving from our limited understanding of stellar physics.
We will then capitalise on opportunities provided by the timely availability of astrometric, spectroscopic, and asteroseismic data to build and data-mine chrono-chemo-dynamical maps of regions of the Milky Way probed by the space missions CoRoT, Kepler, K2, and TESS. We will quantify, by comparison with predictions of chemodynamical models, the relative importance of various processes which play a role in shaping the Galaxy, for example mergers and dynamical processes. We will use chrono-chemical tagging to look for evidence of aggregates, and precise and accurate ages to reconstruct the early star formation history of the Milky Way’s main constituents.
The Asterochronometry project will also provide stringent observational tests of stellar structure and answer some of the long-standing open questions in stellar modelling (e.g. efficiency of transport processes, mass loss on the giant branch, the occurrence of products of coalescence / mass exchange). These tests will improve our ability to determine stellar ages and chemical yields, with wide impact e.g. on the characterisation and ensemble studies of exoplanets, on evolutionary population synthesis, integrated colours and thus ages of galaxies.
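For background (these are the standard global asteroseismic scaling relations, whereas the project itself exploits individual pulsation modes), the seismic observables constrain stellar mass and radius, and hence age, via
\[
\frac{\Delta\nu}{\Delta\nu_\odot} \simeq \left(\frac{M}{M_\odot}\right)^{1/2}\left(\frac{R}{R_\odot}\right)^{-3/2},
\qquad
\frac{\nu_{\max}}{\nu_{\max,\odot}} \simeq \frac{M}{M_\odot}\left(\frac{R}{R_\odot}\right)^{-2}\left(\frac{T_{\rm eff}}{T_{\rm eff,\odot}}\right)^{-1/2},
\]
which, given an effective temperature, can be inverted for M and R; the mass of a red giant then largely fixes its age through stellar models.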
Max ERC Funding
1 958 863 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym ATMO
Project Atmospheres across the Universe
Researcher (PI) Pascal TREMBLIN
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Starting Grant (StG), PE9, ERC-2017-STG
Summary Which molecules are present in the atmosphere of exoplanets? What are their mass, radius and age? Do they have clouds, convection (atmospheric turbulence), fingering convection, or a circulation induced by irradiation? These questions are fundamental in exoplanetology in order to study issues such as planet formation and exoplanet habitability.
Yet the impact of fingering convection and of circulation induced by irradiation remains poorly understood:
- Fingering convection (triggered by gradients of mean-molecular-weight) has already been suggested to happen in stars (accumulation of heavy elements) and in brown dwarfs and exoplanets (chemical transition e.g. CO/CH4). A large-scale efficient turbulent transport of energy through the fingering instability can reduce the temperature gradient in the atmosphere and explain many observed spectral properties of brown dwarfs and exoplanets. Nonetheless, this large-scale efficiency is not yet characterized and standard approximations (Boussinesq) cannot be used to achieve this goal.
- The interaction between atmospheric circulation and the fingering instability is an open question in the case of irradiated exoplanets. Fingering convection can change the location and magnitude of the hot spot induced by irradiation, whereas the hot deep atmosphere induced by irradiation can change the location of the chemical transitions that trigger the fingering instability.
This project will characterize the impact of fingering convection in the atmosphere of stars, brown dwarfs, and exoplanets and its interaction with the circulation in the case of irradiated planets. By developing innovative numerical models, we will characterize the reduction of the temperature gradient of the atmosphere induced by the instability and study the impact of the circulation. We will then predict and interpret the mass, radius, and chemical composition of exoplanets that will be observed with future missions such as the James Webb Space Telescope (JWST).
Max ERC Funding
1 500 000 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym Auger-Horizon
Project A large-scale radio detector for the Pierre Auger cosmic-ray Observatory – precision measurements of ultra-high-energy cosmic rays
Researcher (PI) Jörg HÖRANDEL
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Advanced Grant (AdG), PE9, ERC-2017-ADG
Summary Cosmic rays (ionized atomic nuclei) are the only matter from beyond our solar system, or even from extragalactic space, that we can directly investigate. Up to energies of 10^17 eV they most likely originate in our Galaxy. The highest-energy cosmic rays (>10^18 eV) can no longer be magnetically bound to the Galaxy and are most likely of extragalactic origin.
The very existence of these particles raises the question of their origin – how and where are they accelerated? How do they propagate through the universe and interact? How can we directly probe extragalactic matter and how can we locate its origin?
A key to understanding the origin of cosmic rays is to measure the particle species (atomic mass). A precise mass measurement will make it possible to discriminate between astrophysical models and will clarify the reason for the observed suppression of the cosmic-ray flux at the highest energies, namely whether it reflects the maximum energy of the accelerators or energy losses during propagation.
I address these questions by employing a new technique to precisely measure the cosmic-ray mass composition, which my group pioneered, the radio detection of air showers (induced by high-energy cosmic rays in the atmosphere) on very large scales, detecting horizontal air showers with zenith angles from 60° to 90°.
The new set-up will be the world's largest radio array, operated together with the well-established Auger surface and fluorescence detectors, forming a unique facility to measure the properties of cosmic rays with unprecedented precision for energies above 10^17.5 eV. The radio technique is a cost-effective and robust method to measure the cosmic-ray energy and mass, complementary to established techniques. The energy scale of the radio measurements is established from first principles. The proposed detectors will also enhance the detection capabilities for high-energy neutrinos and the search for new physics through precision measurements of the electromagnetic and muonic shower components.
Max ERC Funding
3 499 249 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym AV-SMP
Project Algorithmic Verification of String Manipulating Programs
Researcher (PI) Anthony LIN
Host Institution (HI) TECHNISCHE UNIVERSITAET KAISERSLAUTERN
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary String is among the most fundamental and commonly used data types in virtually all modern programming languages, especially with the rapidly growing popularity of scripting languages (e.g. JavaScript and Python). Programs written in such languages tend to perform heavy string manipulations, which are complex to reason about and could easily lead to programming mistakes. In some cases, such mistakes could have serious consequences, e.g., in the case of client-side web applications, cross-site scripting (XSS) attacks that could lead to a security breach by a malicious user.
The central objective of the proposed project is to develop novel verification algorithms for analysing the correctness (esp. with respect to safety and termination properties) of programs with string variables, and to transform them into robust verification tools. To meet this key objective, we will make fundamental breakthroughs on both theoretical and tool-implementation challenges. On the theoretical side, we address two important problems: (1) design expressive constraint languages over strings (in combination with other data types like integers) that permit decidability with good complexity, and (2) design generic semi-algorithms for verifying string programs that have strong theoretical performance guarantees. On the implementation side, we will address the challenging problem of designing novel implementation methods that can substantially speed up the basic string analysis procedures in practice. Finally, as a proof of concept, we will apply our technologies to two key application domains: (1) automatic detection of XSS vulnerabilities in web applications, and (2) automatic grading systems for a programming course.
The project will not only make fundamental theoretical contributions — potentially solving long-standing open problems in the area — but also yield powerful methods that can be used in various applications.
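To make the notion of a string constraint concrete, the toy sketch below checks the satisfiability of a small constraint that mixes string operations with an integer (length) condition by brute-force enumeration over a bounded alphabet and length. Real string solvers rely on symbolic decision procedures rather than enumeration, and the particular constraint is a made-up example.

```python
# Toy illustration of a string constraint combined with a length (integer) constraint:
# find a string x over {a, b} such that x contains "ab", x ends with "b", and len(x) <= 4.
# Real string solvers use symbolic decision procedures; this brute-force search over a
# bounded alphabet and length only illustrates what "satisfiability" means here.
from itertools import product

def satisfiable(alphabet, max_len, predicate):
    """Enumerate all strings up to max_len and return a witness, or None."""
    for n in range(max_len + 1):
        for chars in product(alphabet, repeat=n):
            x = "".join(chars)
            if predicate(x):
                return x
    return None

witness = satisfiable("ab", 4, lambda x: "ab" in x and x.endswith("b") and len(x) <= 4)
print(witness)  # e.g. "ab"
```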
Max ERC Funding
1 496 687 €
Duration
Start date: 2017-11-01, End date: 2022-10-31
Project acronym BAHAMAS
Project A holistic approach to large-scale structure cosmology
Researcher (PI) Ian MCCARTHY
Host Institution (HI) LIVERPOOL JOHN MOORES UNIVERSITY
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The standard model of cosmology, the ΛCDM model, is remarkably successful at explaining a wide range of observations of our Universe. However, it is now being subjected to much more stringent tests than ever before, and recent large-scale structure (LSS) measurements appear to be in tension with its predictions. Is this tension signalling that new physics is required? For example, time-varying dark energy, or perhaps a modified theory of gravity? A contribution from massive neutrinos? Before coming to such bold conclusions we must be certain that all of the important systematic errors in the LSS tests have been accounted for.
Presently, the largest source of systematic uncertainty is the modelling of complicated astrophysical phenomena associated with galaxy formation. In particular, energetic feedback processes associated with star formation and black hole growth can heat and expel gas from collapsed structures and modify the large-scale distribution of matter. Furthermore, the LSS field is presently separated into many sub-fields (each using different models that usually neglect feedback), preventing a coherent analysis.
Cosmological hydrodynamical simulations are the only method that can follow all the relevant matter components and self-consistently capture the effects of feedback. I have been leading the development of large-scale simulations with physically-motivated prescriptions for feedback that are unrivalled in their ability to reproduce the observed properties of massive systems. With ERC support, I will build a team to exploit these developments and produce a suite of simulations designed specifically for LSS cosmology applications, with the effects of feedback realistically accounted for, which will allow us to unite the different LSS tests. My team and I will make the first self-consistent comparisons with the full range of LSS cosmology tests, and critically assess the evidence for physics beyond the standard model.
Max ERC Funding
1 725 982 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym BANDWIDTH
Project The cost of limited communication bandwidth in distributed computing
Researcher (PI) Keren CENSOR-HILLEL
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary Distributed systems underlie many modern technologies, a prime example being the Internet. The ever-increasing abundance of distributed systems necessitates their design and usage to be backed by strong theoretical foundations.
A major challenge that distributed systems face is the lack of a central authority, which brings many aspects of uncertainty into the environment, in the form of unknown network topology or unpredictable dynamic behavior. A practical restriction of distributed systems, which is at the heart of this proposal, is the limited bandwidth available for communication between the network components.
A central family of distributed tasks is that of local tasks, which are informally described as tasks that can be solved by sending information over only a relatively small number of hops. A cornerstone example is the need to break symmetry and provide a better utilization of resources, which can be obtained by the task of producing a valid coloring of the nodes given some small number of colors. Amazingly, there are still huge gaps between the known upper and lower bounds for the complexity of many local tasks. This holds even if one allows powerful assumptions of unlimited bandwidth. While some known algorithms indeed use small messages, the complexity gaps are even larger compared to the unlimited-bandwidth case. This is not a mere coincidence: in fact, the existing theoretical infrastructure is provably incapable of giving stronger lower bounds for many local tasks under limited bandwidth.
This proposal zooms in on this crucial blind spot in the current literature on the theory of distributed computing, namely, the study of local tasks under limited bandwidth. The goal of this research is to produce fast algorithms for fundamental distributed local tasks under restricted bandwidth, as well as understand their limitations by providing lower bounds.
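For readers unfamiliar with local tasks, the sketch below simulates a classic randomized (Δ+1)-coloring scheme in synchronous rounds: each uncoloured node proposes a random colour not used by its coloured neighbours and keeps it if no neighbour proposed or holds the same colour. Each message carries only a single colour identifier, in the spirit of bandwidth-restricted models; this toy simulation is an illustration, not one of the algorithms proposed here.

```python
# Minimal sketch of a classic randomized (Delta+1)-coloring scheme, simulated
# synchronously: in each round every uncoloured node proposes a random colour not
# used by coloured neighbours, and keeps it if no neighbour proposed the same one.
# Each message is a single colour id (O(log n) bits); this toy is purely illustrative.
import random

def randomized_coloring(adj):
    """adj: dict node -> set of neighbours. Returns a proper colouring."""
    delta = max(len(nbrs) for nbrs in adj.values())
    palette = range(delta + 1)
    colour = {v: None for v in adj}
    while any(c is None for c in colour.values()):
        proposals = {}
        for v in adj:
            if colour[v] is None:
                taken = {colour[u] for u in adj[v] if colour[u] is not None}
                proposals[v] = random.choice([c for c in palette if c not in taken])
        for v, c in proposals.items():
            # keep the proposal only if no neighbour proposed or already holds it
            if all(proposals.get(u) != c and colour[u] != c for u in adj[v]):
                colour[v] = c
    return colour

graph = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
print(randomized_coloring(graph))
```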
Max ERC Funding
1 486 480 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym Beacon
Project Beacons in the Dark
Researcher (PI) Paulo César Carvalho Freire
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary BEACON aims at performing an ambitious multi-disciplinary (optical, radio astronomy and theoretical physics) study to enable a fundamentally improved understanding of gravitation and space-time. For almost a century Einstein's general relativity has been the last word on gravity. However, superstring theory predicts new gravitational phenomena beyond relativity. In this proposal I will attempt to detect these new phenomena, with a sensitivity 20 times better than state-of-the-art attempts. A successful detection would take physics beyond its current understanding of the Universe.
These new gravitational phenomena are the emission of dipolar gravitational waves and the violation of the strong equivalence principle (SEP). I plan to look for them by timing newly discovered binary pulsars. I will improve upon the best current limits on dipolar gravitational wave emission by a factor of 20 within the duration of this proposal. I also plan to develop a test of the SEP using a new pulsar/main-sequence star binary. The precision of this test is likely to surpass the current best limits within the time frame of this proposal and then keep improving indefinitely with time. This happens because this is the cleanest gravitational experiment ever carried out.
In order to further these goals, I plan to build the ultimate pulsar observing system. By taking advantage of recent technological advances in microwave engineering (particularly sensitive ultra-wide band receivers), digital electronics (fast analogue-to-digital converters and digital spectrometers), and computing, my team and I will be able to greatly improve the sensitivity and precision of pulsar timing experiments and exploit the capabilities of modern radio telescopes to their limits.
Pulsars are the beacons that will guide me in these new, uncharted seas.
Max ERC Funding
1 892 376 €
Duration
Start date: 2011-09-01, End date: 2016-08-31
Project acronym BEAMING
Project Detecting massive-planet/brown-dwarf/low-mass-stellar companions with the beaming effect
Researcher (PI) Moshe Zvi Mazeh
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Advanced Grant (AdG), PE9, ERC-2011-ADG_20110209
Summary "I propose to lead an international observational effort to characterize the population of massive planets, brown dwarf and stellar secondaries orbiting their parent stars with short periods, up to 10-30 days. The effort will utilize the superb, accurate, continuous lightcurves of more than hundred thousand stars obtained recently by two space missions – CoRoT and Kepler. I propose to use these lightcurves to detect non-transiting low-mass companions with a new algorithm, BEER, which I developed recently together with Simchon Faigler. BEER searches for the beaming effect, which causes the stellar intensity to increase if the star is moving towards the observer. The combination of the beaming effect with other modulations induced by a low-mass companion produces periodic modulation with a specific signature, which is used to detect small non-transiting companions. The accuracy of the space mission lightcurves is enough to detect massive planets with short periods. The proposed project is equivalent to a radial-velocity survey of tens of thousands of stars, instead of the presently active surveys which observe only hundreds of stars.
We will use an assortment of telescopes to perform radial-velocity follow-up observations in order to confirm the existence of the detected companions, and to derive their masses and orbital eccentricities. We will discover many tens, if not hundreds, of new massive planets and brown dwarfs with short periods, and many thousands of new binaries. The findings will enable us to map the mass, period, and eccentricity distributions of planets and stellar companions, determine the upper mass limit of planets, understand the nature of the brown-dwarf desert, and put strong constraints on the theory of planet and binary formation and evolution.
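As a back-of-the-envelope illustration of the signal being searched for, the sketch below combines the standard radial-velocity semi-amplitude of a two-body orbit with the often-quoted approximation that the relative beaming modulation is of order 4K/c. The exact prefactor depends on the stellar spectrum and the observing band, and the example system is an arbitrary assumption, so the numbers should be read as orders of magnitude only.

```python
# Rough sketch: size of the photometric beaming signal for a close companion.
# The radial-velocity semi-amplitude K follows the standard two-body formula, and
# the relative beaming modulation is approximated here as ~4*K/c (the exact
# prefactor depends on the stellar spectrum and observing band, so treat the
# numbers as order-of-magnitude illustrations only).
import math

G = 6.674e-11          # m^3 kg^-1 s^-2
C = 2.998e8            # m/s
M_SUN = 1.989e30       # kg
M_JUP = 1.898e27       # kg
DAY = 86400.0          # s

def beaming_amplitude(m_star, m_comp, period_days, sin_i=1.0, ecc=0.0):
    """Return (K in m/s, fractional beaming amplitude ~4K/c)."""
    p = period_days * DAY
    k = ((2 * math.pi * G / p) ** (1 / 3)
         * m_comp * sin_i
         / (m_star + m_comp) ** (2 / 3)
         / math.sqrt(1 - ecc ** 2))
    return k, 4 * k / C

k, amp = beaming_amplitude(M_SUN, 5 * M_JUP, period_days=3.0)
print(f"K ~ {k:.0f} m/s, beaming amplitude ~ {amp*1e6:.0f} ppm")
```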
Max ERC Funding
1 737 600 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym BEHAVFRICTIONS
Project Behavioral Implications of Information-Processing Frictions
Researcher (PI) Jakub STEINER
Host Institution (HI) NARODOHOSPODARSKY USTAV AKADEMIE VED CESKE REPUBLIKY VEREJNA VYZKUMNA INSTITUCE
Call Details Consolidator Grant (CoG), SH1, ERC-2017-COG
Summary BEHAVFRICTIONS will use novel models focussing on information-processing frictions to explain choice patterns described in behavioral economics and psychology. The proposed research will provide microfoundations that are essential for (i) identification of stable preferences, (ii) counterfactual predictions, and (iii) normative conclusions.
(i) Agents who face information-processing costs must trade the precision of choice against information costs. Their behavior thus reflects both their stable preferences and the context-dependent procedures that manage their errors stemming from imperfect information processing. In the absence of micro-founded models, the two drivers of the behavior are difficult to disentangle for outside observers. In some pillars of the proposal, the agents follow choice rules that closely resemble logit rules used in structural estimation. This will allow me to reinterpret the structural estimation fits to choice data and to make a distinction between the stable preferences and frictions.
(ii) Such a distinction is important in counterfactual policy analysis because the second-best decision procedures that manage the errors in choice are affected by the analysed policy. Incorporation of the information-processing frictions into existing empirical methods will improve our ability to predict effects of the policies.
(iii) My preliminary results suggest that when an agent is prone to committing errors, biases--such as overconfidence, confirmatory bias, or perception biases known from prospect theory--arise under second-best strategies. By providing the link between the agent's environment and the second-best distribution of the perception errors, my models will delineate environments in which these biases shield the agents from the most costly mistakes from environments in which the biases turn into maladaptations. The distinction will inform the normative debate on debiasing.
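The sketch below illustrates the logit choice rule mentioned in (i): choice probabilities are softmax in utilities with a noise parameter, and an entropy-style information cost (in the spirit of rational-inattention models) makes higher precision more expensive. The functional form of the cost and all numerical values are illustrative assumptions, not the proposal's model.

```python
# Minimal sketch of a logit choice rule: choice probabilities are softmax in
# utilities, with precision 1/lambda. Higher precision means fewer errors but,
# under an entropy-style information cost (a rational-inattention-flavoured
# proxy used here only for illustration), higher processing costs.
import math

def logit_choice(utilities, lam):
    """Return choice probabilities p_i proportional to exp(u_i / lam)."""
    m = max(utilities)                       # stabilise the exponentials
    w = [math.exp((u - m) / lam) for u in utilities]
    z = sum(w)
    return [x / z for x in w]

def expected_utility(utilities, probs):
    return sum(u * p for u, p in zip(utilities, probs))

def entropy_cost(probs, kappa):
    """Information cost proportional to the entropy reduction from uniform choice."""
    n = len(probs)
    h_uniform = math.log(n)
    h = -sum(p * math.log(p) for p in probs if p > 0)
    return kappa * (h_uniform - h)

u = [1.0, 0.8, 0.2]
for lam in (1.0, 0.2, 0.05):
    p = logit_choice(u, lam)
    net = expected_utility(u, p) - entropy_cost(p, kappa=0.1)
    print(f"lambda={lam}: probs={[round(x, 2) for x in p]}, net payoff={net:.3f}")
```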
Max ERC Funding
1 321 488 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym BigEarth
Project Accurate and Scalable Processing of Big Data in Earth Observation
Researcher (PI) Begüm Demir
Host Institution (HI) TECHNISCHE UNIVERSITAT BERLIN
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary During the last decade, a huge number of earth observation (EO) satellites with optical and Synthetic Aperture Radar sensors onboard have been launched, and advances in satellite systems have increased the amount, variety and spatial/spectral resolution of EO data. This has led to massive EO data archives containing huge numbers of remote sensing (RS) images, from which mining and retrieving useful information is challenging. In view of that, content based image retrieval (CBIR) has attracted great attention in the RS community. However, existing RS CBIR systems have limitations in: i) the characterization of high-level semantic content and spectral information present in RS images, and ii) large-scale RS CBIR problems, since their search mechanisms are time-demanding and not scalable in operational applications. The BigEarth project aims to develop highly innovative feature extraction and content based retrieval methods and tools for RS images, which can significantly improve the state-of-the-art both in the theory and in the tools currently available. To this end, very important scientific and practical problems will be addressed by focusing on the main challenges that Big EO data pose for RS image characterization, indexing and search in massive archives. In particular, novel methods and tools will be developed, aiming to: 1) characterize and exploit high-level semantic content and spectral information present in RS images; 2) extract features directly from compressed RS images; 3) achieve accurate and scalable RS image indexing and retrieval; and 4) integrate feature representations of different RS image sources into a unified form of feature representation. Moreover, a benchmark archive with a large number of multi-source RS images will be constructed. From an application point of view, the developed methodologies and tools will have a significant impact on many EO data applications, such as the accurate and scalable retrieval of specific man-made structures and burned forest areas.
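The core retrieval loop that the project aims to make semantically richer and more scalable can be illustrated in a few lines: represent each archive image by a feature vector and rank the archive by similarity to the query vector. The random descriptors below are placeholders; operational RS CBIR systems use learned descriptors and scalable index structures rather than an exhaustive scan.

```python
# Minimal sketch of the core content-based retrieval loop: every archive image is
# represented by a feature vector, and a query is answered by ranking the archive
# by similarity to the query vector. The random features below are placeholders
# for real (e.g. learned) RS image descriptors.
import numpy as np

rng = np.random.default_rng(0)
archive = rng.random((1000, 64))                 # 1000 images, 64-dim toy descriptors
archive /= np.linalg.norm(archive, axis=1, keepdims=True)

def retrieve(query, features, k=5):
    """Return indices of the k most similar archive images (cosine similarity)."""
    q = query / np.linalg.norm(query)
    scores = features @ q
    return np.argsort(scores)[::-1][:k]

query = rng.random(64)
print(retrieve(query, archive, k=5))
```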
Max ERC Funding
1 491 479 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym BIGlobal
Project Firm Growth and Market Power in the Global Economy
Researcher (PI) Swati DHINGRA
Host Institution (HI) LONDON SCHOOL OF ECONOMICS AND POLITICAL SCIENCE
Call Details Starting Grant (StG), SH1, ERC-2017-STG
Summary According to the European Commission, to design effective policies for ensuring a “more dynamic, innovative and competitive” economy, it is essential to understand the decision-making process of firms as they differ a lot in terms of their capacities and policy responses (EC 2007). The objective of my future research is to provide such an analysis. BIGlobal will examine the sources of firm growth and market power to provide new insights into welfare and policy in a globalized world.
Much of the analysis of the global economy is set in a paradigm of markets that allocate resources efficiently, leaving little role for policy. But big firms dominate economic activity, especially across borders. How do firms grow, and what is the effect of their market power on the welfare impact of globalization? This project will determine how firm decisions matter for the aggregate gains from globalization, the division of these gains across different individuals, and their implications for policy design.
Over the next five years, I will incorporate richer firm behaviour into models of international trade to understand how trade and industrial policies impact the growth process, especially in less developed markets. The specific questions I will address include: how trade and competition policy can ensure that consumers benefit from globalization when firms engaged in international trade have market power; how domestic policies to encourage agribusiness firms affect the extent to which small farmers gain from trade; how industrial policies affect firm growth through input linkages; and what the impact of banking globalization is on the growth of firms in the real sector.
Each project will combine theoretical work with rich data from developing economies to expand the frontier of knowledge on trade and industrial policy, and to provide a basis for informed policymaking.
Max ERC Funding
1 313 103 €
Duration
Start date: 2017-12-01, End date: 2022-11-30
Project acronym BIONET
Project Network Topology Complements Genome as a Source of Biological Information
Researcher (PI) Natasa Przulj
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary Genetic sequences have had an enormous impact on our understanding of biology. The expectation is that biological network data will have a similar impact. However, progress is hindered by a lack of sophisticated graph theoretic tools that will mine these large networked datasets.
In recent breakthrough work at the boundary of computer science and biology, supported by my USA NSF CAREER award, I developed sensitive network analysis, comparison and embedding tools which demonstrated that the protein-protein interaction networks of eukaryotes are best modeled by geometric graphs. These tools also established an unprecedented, phenotypically validated link between network topology and biological function and disease. Now I propose to substantially extend these preliminary results and design sensitive and robust network alignment methods that will lead to uncovering unknown biology and evolutionary relationships. The potential ground-breaking impact of such network alignment tools could parallel that of the BLAST family of sequence alignment tools, which has revolutionized our understanding of biological systems and therapeutics. Furthermore, I propose to develop additional sophisticated graph theoretic techniques to mine network data and hence complement the biological information that can be extracted from sequence. I propose to exploit these new techniques for biological applications in collaboration with experimentalists at Imperial College London: 1. aligning biological networks of species whose genomes are closely related but that have very different phenotypes, in order to uncover systems-level factors that contribute to pronounced differences; 2. comparing and contrasting stress response pathways and metabolic pathways in bacteria in a unified systems-level framework and exploiting the findings for: (a) bioengineering of micro-organisms for industrial applications (production of bio-fuels, bioremediation, production of biopolymers); (b) biomedical applications.
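As a minimal illustration of sequence-free, topology-based comparison, the sketch below summarises two small networks by simple structural signatures (degree sequence and triangle count) and contrasts them. The actual tools referred to above rely on far richer statistics, such as graphlet degree distributions; this example only conveys the basic idea.

```python
# Minimal sketch of comparing two networks by simple topological signatures
# (degree sequence and triangle participation). Real graph-theoretic tools use
# far richer statistics; this only illustrates a topology-based, sequence-free
# comparison of two small undirected graphs.
from itertools import combinations

def signature(adj):
    """adj: dict node -> set of neighbours. Return (sorted degrees, triangle count)."""
    degrees = sorted(len(nbrs) for nbrs in adj.values())
    triangles = sum(1 for u, v, w in combinations(adj, 3)
                    if v in adj[u] and w in adj[u] and w in adj[v])
    return degrees, triangles

g1 = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}      # triangle plus a pendant node
g2 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}      # a 4-cycle, no triangle

print(signature(g1))   # ([1, 2, 2, 3], 1)
print(signature(g2))   # ([2, 2, 2, 2], 0)
```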
Max ERC Funding
1 638 175 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym BITCRUMBS
Project Towards a Reliable and Automated Analysis of Compromised Systems
Researcher (PI) Davide BALZAROTTI
Host Institution (HI) EURECOM
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary "The vast majority of research in computer security is dedicated to the design of detection, protection, and prevention solutions. While these techniques play a critical role to increase the security and privacy of our digital infrastructure, it is enough to look at the news to understand that it is not a matter of ""if"" a computer system will be compromised, but only a matter of ""when"". It is a well known fact that there is no 100% secure system, and that there is no practical way to prevent attackers with enough resources from breaking into sensitive targets. Therefore, it is extremely important to develop automated techniques to timely and precisely analyze computer security incidents and compromised systems. Unfortunately, the area of incident response received very little research attention, and it is still largely considered an art more than a science because of its lack of a proper theoretical and scientific background.
The objective of BITCRUMBS is to rethink the Incident Response (IR) field from its foundations by proposing a more scientific and comprehensive approach to the analysis of compromised systems. BITCRUMBS will achieve this goal in three steps: (1) by introducing a new systematic approach to precisely measure the effectiveness and accuracy of IR techniques and their resilience to evasion and forgery; (2) by designing and implementing new automated techniques to cope with advanced threats and the analysis of IoT devices; and (3) by proposing a novel forensics-by-design development methodology and a set of guidelines for the design of future systems and software.
To provide the right context for these new techniques and show the impact of the project in different fields and scenarios, BITCRUMBS plans to address its objectives using real case studies borrowed from two different domains: traditional computer software and embedded systems.
Max ERC Funding
1 991 504 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym Bits2Cosmology
Project Time-domain Gibbs sampling: From bits to inflationary gravitational waves
Researcher (PI) Hans Kristian ERIKSEN
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The detection of primordial gravitational waves created during the Big Bang ranks among the greatest potential intellectual achievements in modern science. During the last few decades, the instrumental progress necessary to achieve this has been nothing short of breathtaking, and today we are able to measure the microwave sky with better than one-in-a-million precision. However, from the latest ultra-sensitive experiments such as BICEP2 and Planck, it is clear that instrumental sensitivity alone will not be sufficient to make a robust detection of gravitational waves. Contamination in the form of astrophysical radiation from the Milky Way, for instance thermal dust and synchrotron radiation, obscures the cosmological signal by orders of magnitude. Even more critical, though, are second-order interactions between this radiation and the instrument characterization itself, which lead to a highly non-linear and complicated problem.
I propose a ground-breaking solution to this problem that allows for joint estimation of cosmological parameters, astrophysical components, and instrument specifications. The engine of this method is called Gibbs sampling, which I have already applied extremely successfully to basic CMB component separation. The new and critical step is to apply this method to raw time-ordered observations recorded directly by the instrument, as opposed to pre-processed frequency maps. While representing a ~100-fold increase in input data volume, this step is unavoidable in order to break through the current foreground-induced systematics floor. I will apply this method to the best currently available and future data sets (WMAP, Planck, SPIDER and LiteBIRD), and thereby derive the world's tightest constraint on the amplitude of inflationary gravitational waves. Additionally, the resulting ancillary science in the form of robust cosmological parameters and astrophysical component maps will represent the state of the art in observational cosmology for years to come.
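To convey the core idea of the Gibbs-sampling engine, the toy sketch below alternates draws from the two conditional distributions of a Gaussian model with unknown mean and variance. The real pipeline generalises such alternating conditional draws to millions of sky, foreground, and instrument parameters defined directly on time-ordered data; the priors and synthetic data here are standard textbook choices made up for illustration.

```python
# Toy two-parameter Gibbs sampler (unknown mean and variance of Gaussian data),
# showing the alternating conditional draws that a full CMB pipeline generalises
# to many sky and instrument parameters. Priors here (flat on the mean,
# ~1/sigma^2 on the variance) are standard textbook choices, not the project's.
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(loc=2.0, scale=3.0, size=500)   # synthetic "observations"
n, dbar = data.size, data.mean()

mu, sigma2 = 0.0, 1.0                             # arbitrary starting point
samples = []
for step in range(5000):
    # 1) draw the mean conditional on the variance: N(dbar, sigma2/n)
    mu = rng.normal(dbar, np.sqrt(sigma2 / n))
    # 2) draw the variance conditional on the mean: inverse-gamma(n/2, ss/2)
    ss = np.sum((data - mu) ** 2)
    sigma2 = 1.0 / rng.gamma(n / 2, 2.0 / ss)
    if step >= 1000:                              # discard burn-in
        samples.append((mu, sigma2))

post = np.array(samples)
print("posterior mean of mu    :", post[:, 0].mean())
print("posterior mean of sigma2:", post[:, 1].mean())
```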
Max ERC Funding
1 999 205 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym Boom & Bust Cycles
Project Boom and Bust Cycles in Asset Prices: Real Implications and Monetary Policy Options
Researcher (PI) Klaus Adam
Host Institution (HI) UNIVERSITAET MANNHEIM
Call Details Starting Grant (StG), SH1, ERC-2011-StG_20101124
Summary I seek to increase our understanding of the origin of asset price boom and bust cycles and propose to construct structural dynamic equilibrium models that make it possible to formalize their interaction with the dynamics of consumption, hours worked, the current account, stock market trading activity, and monetary policy. For this purpose I propose developing macroeconomic models that relax the assumption of common knowledge of beliefs and preferences, incorporating instead subjective beliefs and learning about market behavior. These features allow for sustained deviations of asset prices from fundamentals in a setting where all agents behave in an individually rational way.
The first research project derives the derivative-price implications of asset price models with learning agents and determines the limits to arbitrage required for learning models to be consistent with the existence of only weak incentives for improving forecasts and beliefs. The second project introduces housing, collateral constraints and open-economy features into existing asset pricing models under learning to explain a range of cross-sectional facts about the behavior of the current account that have been observed in the recent housing boom and bust cycle. The third project constructs quantitatively plausible macro asset pricing models that can explain the dynamics of consumption and hours worked jointly with the occurrence of asset price boom and bust cycles. The fourth project develops a set of monetary policy models that allow studying the interaction between monetary policy, the real economy and asset prices, and determines how monetary policy should optimally react to asset price movements. The last project explains the aggregate trading patterns on stock exchanges over boom and bust cycles and improves our understanding of the forces supporting the large cross-sectional heterogeneity in return expectations revealed in survey data.
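The subjective-belief mechanism at the heart of these models can be sketched in a few lines: agents extrapolate recent capital gains with a constant-gain updating rule, prices respond to those beliefs, and the resulting feedback can produce protracted boom and bust episodes. The functional forms, the projection bound on beliefs, and all parameter values below are illustrative assumptions rather than the models proposed here.

```python
# Stylised sketch of asset pricing under subjective price-growth beliefs with
# constant-gain learning: agents extrapolate recent capital gains, prices respond
# to those beliefs, and the feedback can generate protracted boom and bust episodes.
# Functional forms and parameters are illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)
beta, gain, T = 0.98, 0.03, 300
cap = 0.999 / beta                     # projection: keep beliefs in the stable region

m = 1.0                                # subjective expected gross price growth
prices = [beta / (1 - beta * m)]       # price of a claim to a unit dividend
for t in range(1, T):
    d = 1.0 + 0.02 * rng.standard_normal()           # small dividend shock
    p = beta / (1 - beta * m) * d                     # price given current beliefs
    prices.append(p)
    realized_growth = prices[-1] / prices[-2]
    m = min(m + gain * (realized_growth - m), cap)    # constant-gain belief update

print(f"price range over the simulation: {min(prices):.1f} .. {max(prices):.1f}")
```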
Max ERC Funding
769 440 €
Duration
Start date: 2011-09-01, End date: 2017-04-30
Project acronym BRiCPT
Project Basic Research in Cryptographic Protocol Theory
Researcher (PI) Jesper Buus Nielsen
Host Institution (HI) AARHUS UNIVERSITET
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary In cryptographic protocol theory, we consider a situation where a number of entities want to solve some problem over a computer network. Each entity has some secret data it does not want the other entities to learn, yet they all want to learn something about the common set of data. For instance, in an electronic election, they want to know the number of yes-votes without revealing who voted what; in an electronic auction, they want to find the winner without leaking the bids of the losers.
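One classical building block behind such protocols is additive secret sharing. The sketch below is only a minimal illustration with hypothetical function names: each voter splits a yes/no vote into random shares held by different tally servers, so no single server learns any vote yet the sum can be reconstructed. A real election protocol additionally needs verifiability and security against malicious parties.

```python
# Minimal sketch: tallying yes-votes with additive secret sharing modulo a prime.
import random

P = 2_147_483_647  # a public prime modulus (illustrative choice)

def share_vote(vote, num_servers):
    """Split `vote` (0 or 1) into additive shares that sum to it mod P."""
    shares = [random.randrange(P) for _ in range(num_servers - 1)]
    shares.append((vote - sum(shares)) % P)
    return shares

def tally(all_shares, num_servers):
    """Each server sums the shares it received; the sums reveal only the total."""
    server_sums = [sum(s[i] for s in all_shares) % P for i in range(num_servers)]
    return sum(server_sums) % P

votes = [1, 0, 1, 1, 0]                      # private yes/no votes
shared = [share_vote(v, 3) for v in votes]   # three non-colluding tally servers
print(tally(shared, 3))                      # -> 3 yes-votes; individual votes stay hidden
```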
A main focus of the project is to develop new techniques for solving such protocol problems. We are particularly interested in techniques which can automatically construct a protocol solving a problem given only a description of what the problem is. My focus will be theoretical basic research, but I believe that advancing the theory of secure protocol compilers will have an immense impact on how secure protocols are developed in practice.
When one develops complex protocols, it is important to be able to verify their correctness before they are deployed, particularly when the purpose of the protocols is to protect information: if an error is only found and corrected after deployment, the sensitive data may already have been compromised. Therefore, cryptographic protocol theory develops models of what it means for a protocol to be secure, and techniques for analyzing whether a given protocol is secure or not.
Another main focus of the project is to develop better security models. Existing security models either make it possible to prove some protocols secure which are not secure in practice, or make it impossible to prove the security of some protocols which are believed to be secure in practice. My focus will again be on theoretical basic research, but I believe that better security models are important for advancing a practice where protocols are verified as secure before being deployed.
Max ERC Funding
1 171 019 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym Browsec
Project Foundations and Tools for Client-Side Web Security
Researcher (PI) Matteo MAFFEI
Host Institution (HI) TECHNISCHE UNIVERSITAET WIEN
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary The constantly increasing number of attacks on web applications shows how their rapid development has not been accompanied by adequate security foundations and demonstrates the lack of solid security enforcement tools. Indeed, web applications expose a gigantic attack surface, which hinders a rigorous understanding and enforcement of security properties. Hence, despite worthwhile efforts to design secure web applications, users will for the foreseeable future be confronted with vulnerable, or maliciously crafted, code. Unfortunately, end users currently have no way to reliably protect themselves from malicious applications.
BROWSEC will develop a holistic approach to client-side web security, laying its theoretical foundations and developing innovative security enforcement technologies. In particular, BROWSEC will deliver the first client-side tool to secure web applications that is practical, in that it is implemented as an extension and can thus be easily deployed at large, and also provably sound, i.e., backed up by machine-checked proofs that the tool provides end users with the required security guarantees. At the core of the proposal lies a novel monitoring technique, which treats the browser as a blackbox and intercepts its inputs and outputs in order to prevent dangerous information flows. With this lightweight monitoring approach, we aim at enforcing strong security properties without requiring any expensive and, given the dynamic nature of web applications, statically infeasible program analysis.
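As a rough illustration of the black-box input/output monitoring idea (tracking only explicit flows, with hypothetical names, and not taken from the BROWSEC tool itself): a monitor records sensitive values that enter from a given origin and blocks any outgoing request that would carry one of them to a different origin.

```python
# Toy illustration of black-box I/O monitoring (explicit flows only).
from urllib.parse import urlparse

class FlowMonitor:
    def __init__(self):
        self.tainted = {}  # secret value -> origin it came from

    def observe_input(self, origin, value):
        """Record a sensitive value (e.g. a session cookie) received from `origin`."""
        self.tainted[value] = origin

    def allow_output(self, url, body):
        """Permit the request unless it leaks a recorded secret to another origin."""
        dest = urlparse(url).netloc
        for value, origin in self.tainted.items():
            if value in body and dest != origin:
                return False
        return True

monitor = FlowMonitor()
monitor.observe_input("bank.example", "s3cr3t-token")
print(monitor.allow_output("https://bank.example/transfer", "session=s3cr3t-token"))  # True
print(monitor.allow_output("https://evil.example/collect", "stolen=s3cr3t-token"))    # False
```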
BROWSEC is thus a multidisciplinary research effort, promising practical impact and delivering breakthrough advancements in various disciplines, such as web security, JavaScript semantics, software engineering, and program verification.
Max ERC Funding
1 990 000 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym CALCULUS
Project Commonsense and Anticipation enriched Learning of Continuous representations sUpporting Language UnderStanding
Researcher (PI) Marie-Francine MOENS
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Natural language understanding (NLU) by machines is of great scientific, economic and social value. Humans perform the NLU task efficiently by relying on their capability to imagine or anticipate situations. They engage commonsense and world knowledge that is often acquired through perceptual experiences to make explicit what is left implicit in language. Inspired by these characteristics, CALCULUS will design, implement and evaluate innovative paradigms supporting NLU, combining old but powerful ideas for language understanding from the early days of artificial intelligence with new approaches from machine learning. The project focuses on the effective learning of anticipatory, continuous, non-symbolic representations of event frames and narrative structures of events that are trained on language and visual data. The grammatical structure of language is grounded in the geometric structure of visual data while embodying aspects of commonsense and world knowledge. The reusable representations are evaluated in a selection of NLU tasks requiring efficient real-time retrieval of the representations and parsing of the targeted written texts. Finally, we will evaluate the inference potential of the anticipatory representations in situations not seen in the training data and when inferring spatial and temporal information in metric real-world spaces that is not mentioned in the processed language. The machine learning methods focus on latent variable models relying on Bayesian probabilistic models and neural networks, and on settings with limited, manually annotated training data. The best models will be integrated in a demonstrator that translates the language of stories to events happening in a 3-D virtual world. The PI has the interdisciplinary expertise in natural language processing, joint processing of language and visual data, information retrieval and machine learning needed for the successful realization of the project.
Max ERC Funding
2 227 500 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym CAstRA
Project Comet and Asteroid Re-Shaping through Activity
Researcher (PI) Jessica AGARWAL
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Starting Grant (StG), PE9, ERC-2017-STG
Summary The proposed project will significantly improve our insight into the processes that have changed a comet nucleus or asteroid since its formation. These processes typically go along with activity, the observable release of gas and/or dust. Understanding the evolutionary processes of comets and asteroids will allow us to answer the crucial question of which aspects of these present-day bodies still provide essential clues to their formation in the protoplanetary disc of the early solar system.
Ground-breaking progress in understanding these fundamental questions can now be made thanks to the huge and unprecedented data set returned between 2014 and 2016 by the European Space Agency’s Rosetta mission to comet 67P/Churyumov-Gerasimenko, and by recent major advances in the observational study of active asteroids facilitated by the increased availability of sky surveys and follow-on observations with world-class telescopes.
The key aims of this proposal are to
- Obtain a unified quantitative picture of the different erosion processes active in comets and asteroids,
- Investigate how ice is stored in comets and asteroids,
- Characterize the ejected dust (size distribution, optical and thermal properties) and relate it to dust around other stars,
- Understand in which respects comet 67P can be considered as representative of a wider sample of comets or even asteroids.
We will follow a highly multi-disciplinary approach analyzing data from many Rosetta instruments, ground- and space-based telescopes, and connect these through numerical models of the dust dynamics and thermal properties.
Max ERC Funding
1 484 688 €
Duration
Start date: 2018-03-01, End date: 2023-02-28
Project acronym CGinsideNP
Project Complexity Inside NP - A Computational Geometry Perspective
Researcher (PI) Wolfgang MULZER
Host Institution (HI) FREIE UNIVERSITAET BERLIN
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary Traditional complexity theory focuses on the dichotomy between P and NP-hard problems. Lately, it has become increasingly clear that this misses a major part of the picture. Results by the PI and others offer glimpses on a fascinating structure hiding inside NP: new computational problems that seem to lie between polynomial and NP-hard have been identified; new conditional lower bounds for problems with large polynomial running times have been found; long-held beliefs on the difficulty of problems in P have been overturned. Computational geometry plays a major role in these developments, providing some of the main questions and concepts.
We propose to explore this fascinating landscape inside NP from the perspective of computational geometry, guided by three complementary questions:
(A) What can we say about the complexity of search problems derived from existence theorems in discrete geometry? These problems offer a new perspective on complexity classes previously studied in algorithmic game theory (PPAD, PLS, CLS). Preliminary work indicates that they have the potential to answer long-standing open questions on these classes.
(B) Can we provide meaningful conditional lower bounds on geometric problems for which we have only algorithms with large polynomial running time? Prompted by a question raised by the PI and collaborators, such lower bounds were developed for the Frechet distance. Are similar results possible for problems not related to distance measures? If so, this could dramatically extend the traditional theory based on 3SUM-hardness to a much more diverse and nuanced picture.
(C) Can we find subquadratic decision trees and faster algorithms for 3SUM-hard problems? After recent results by Pettie and Gronlund on 3SUM and by the PI and collaborators on the Frechet distance, we have the potential to gain new insights on this large class of well-studied problems and to improve long-standing complexity bounds for them.
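For context on the problem class in (C), the sketch below shows the classical quadratic-time algorithm for 3SUM (sort, then a two-pointer scan per element); it is a textbook baseline, not anything proposed by the project, whose goal is precisely to probe whether substantially faster algorithms or subquadratic decision trees exist for 3SUM-hard problems.

```python
# Classical O(n^2) algorithm for 3SUM: decide whether three array elements sum to zero.
def has_3sum(nums):
    a = sorted(nums)
    n = len(a)
    for i in range(n - 2):
        lo, hi = i + 1, n - 1
        while lo < hi:                      # two-pointer scan over the sorted suffix
            s = a[i] + a[lo] + a[hi]
            if s == 0:
                return True
            if s < 0:
                lo += 1
            else:
                hi -= 1
    return False

print(has_3sum([-5, 1, 4, 2, -1]))  # True: -5 + 1 + 4 = 0
print(has_3sum([1, 2, 3, 4]))       # False
```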
Max ERC Funding
1 486 800 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym CHEMPLAN
Project Astrochemistry and the Origin of Planetary Systems
Researcher (PI) Ewine Fleur Van Dishoeck
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Advanced Grant (AdG), PE9, ERC-2011-ADG_20110209
Summary When interstellar clouds collapse to form new stars and planets, the surrounding gas and dust become part of the infalling envelopes and rotating disks, thus providing the basic material from which new solar systems are made. Instrumentation to probe the physics and chemistry in low-mass star-forming regions has so far lacked spatial resolution. I propose here an integrated observational-modeling-laboratory program to survey protostars and disks on the relevant scales of 1-50 AU where planet formation takes place. The observations are centered on new data coming from the Atacama Large Millimeter / submillimeter Array (ALMA), and the analysis includes unique new data from key programs on Herschel, Spitzer and VLT that I am (co-)leading. The combination of millimeter and infrared data allows the full range of temperatures from 10-2000 K in star- and planet-forming regions to be probed, for both gas and solids. The molecular line data are used as diagnostics of physical parameters (such as UV field, cosmic ray ionization rate, kinematics, mixing, shock strength, grain growth, gas/dust ratios) as well as to follow the chemistry of water and complex organic molecules from cores to disks, which ultimately may be delivered to terrestrial planets. The implications for the history of volatile material in our own solar system and exo-planetary atmospheres will be assessed by comparing models and data with cometary taxonomy and, ultimately, feeding them into planet population synthesis models. Altogether, this program will bring the link between interstellar chemistry and solar system and exo-planetary research to a new level.
The project will train four PhD students in a truly interdisciplinary environment in which they are exposed to all aspects of molecular astrophysics and have access to ample ALMA expertise, and it will prepare two postdocs for future faculty positions.
Max ERC Funding
2 499 150 €
Duration
Start date: 2012-07-01, End date: 2018-06-30
Project acronym CHROMPHYS
Project Physics of the Solar Chromosphere
Researcher (PI) Mats Per-Olof Carlsson
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE9, ERC-2011-ADG_20110209
Summary CHROMPHYS aims at a breakthrough in our understanding of the solar chromosphere by combining the development of sophisticated radiation-magnetohydrodynamic simulations with observations from the upcoming NASA SMEX mission Interface Region Imaging Spectrograph (IRIS).
The enigmatic chromosphere is the transition between the solar surface and the eruptive outer solar atmosphere. The chromosphere harbours and constrains the mass and energy loading processes that define the heating of the corona, the acceleration and the composition of the solar wind, and the energetics and triggering of solar outbursts (filament eruptions, flares, coronal mass ejections) that govern near-Earth space weather and affect mankind's technological environment.
CHROMPHYS targets the following fundamental physics questions about the chromospheric role in the mass and energy loading of the corona:
- Which types of non-thermal energy dominate in the chromosphere and beyond?
- How does the chromosphere regulate mass and energy supply to the corona and the solar wind?
- How do magnetic flux and matter rise through the chromosphere?
- How does the chromosphere affect the free magnetic energy loading that leads to solar eruptions?
CHROMPHYS proposes to answer these questions by producing a new, physics-based vista of the chromosphere through a three-fold effort:
- develop the techniques of high-resolution numerical MHD physics to the level needed to realistically predict and analyse small-scale chromospheric structure and dynamics,
- optimise and calibrate diverse observational diagnostics by synthesizing these in detail from the simulations, and
- obtain and analyse data from IRIS using these diagnostics complemented by data from other space missions and the best solar telescopes on the ground.
Max ERC Funding
2 487 600 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym CIVICS
Project Criminality, Victimization and Social Interactions
Researcher (PI) Katrine Vellesen LOKEN
Host Institution (HI) NORGES HANDELSHOYSKOLE
Call Details Starting Grant (StG), SH1, ERC-2017-STG
Summary A large social science literature tries to describe and understand the causes and consequences of crime, usually focusing on individuals’ criminal activity in isolation. The ambitious aim of this research project is to establish a broader perspective of crime that takes into account the social context in which it takes place. The findings will inform policymakers on how to better use funds both for crime prevention and the rehabilitation of incarcerated criminals.
Criminal activity is often a group phenomenon, yet little is known about how criminal networks form and what can be done to break them up or prevent them from forming in the first place. Overlooking victims of crime and their relationships to criminals has led to an incomplete and distorted view of crime and its individual and social costs. While a better understanding of these social interactions is crucial for designing more effective anti-crime policy, existing research in criminology, sociology and economics has struggled to identify causal effects due to data limitations and difficult statistical identification issues.
This project will push the research frontier by combining register datasets that have never been merged before, and by using several state-of-the-art statistical methods to estimate causal effects related to criminal peer groups and their victims. More specifically, we aim to do the following:
-Use recent advances in network modelling to describe the structure and density of various criminal networks and study network dynamics following the arrest/incarceration or death of a central player in a network.
-Obtain a more accurate measure of the societal costs of crime, including actual measures for lost earnings and physical and mental health problems, following victims and their offenders both before and after a crime takes place.
-Conduct a randomized controlled trial within a prison system to better understand how current rehabilitation programs affect criminal and victim networks.
Max ERC Funding
1 187 046 €
Duration
Start date: 2018-03-01, End date: 2023-02-28
Project acronym CLOUDMAP
Project Cloud Computing via Homomorphic Encryption and Multilinear Maps
Researcher (PI) Jean-Sebastien Coron
Host Institution (HI) UNIVERSITE DU LUXEMBOURG
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary The past thirty years have seen cryptography move from arcane to commonplace: the Internet, mobile phones, banking systems, etc. Homomorphic cryptography now offers the tantalizing goal of being able to process sensitive information in encrypted form, without needing to compromise on the privacy and security of the citizens and organizations that provide the input data. More recently, cryptographic multilinear maps have revolutionized cryptography with the emergence of indistinguishability obfuscation (iO), which in theory can be used to realize numerous advanced cryptographic functionalities that previously seemed beyond reach. However, the security of multilinear maps is still poorly understood, and many iO schemes have been broken; moreover, all current constructions of iO are impractical.
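As a minimal illustration of what processing data in encrypted form means, the sketch below uses the multiplicative homomorphism of textbook RSA with toy parameters. This is not semantically secure and is far simpler than the fully homomorphic and multilinear-map constructions the project targets; it only shows that a party holding ciphertexts alone can compute an encryption of the product.

```python
# Textbook RSA is multiplicatively homomorphic: Enc(a) * Enc(b) decrypts to a * b.
# Toy parameters, no padding, illustration only. Requires Python 3.8+ for pow(e, -1, m).
p, q = 61, 53
n = p * q                          # public modulus
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

enc = lambda m: pow(m, e, n)
dec = lambda c: pow(c, d, n)

a, b = 7, 6
c = (enc(a) * enc(b)) % n          # multiply ciphertexts only, no decryption needed
print(dec(c), a * b)               # both print 42
```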
The goal of the CLOUDMAP project is to make these advanced cryptographic tasks usable in practice, so that citizens do not have to compromise on the privacy and security of their input data. This goal can only be achieved by considering the mathematical foundations of these primitives, working "from first principles", rather than focusing on premature optimizations. To achieve this goal, our first objective will be to better understand the security of the underlying primitives of multilinear maps and iO schemes. Our second objective will be to develop new approaches to significantly improve their efficiency. Our third objective will be to build applications of multilinear maps and iO that can be implemented in practice.
Max ERC Funding
2 491 266 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym CME
Project Concurrency Made Easy
Researcher (PI) Bertrand Philippe Meyer
Host Institution (HI) POLITECNICO DI MILANO
Call Details Advanced Grant (AdG), PE6, ERC-2011-ADG_20110209
Summary The “Concurrency Made Easy” project is an attempt to achieve a conceptual breakthrough on the most daunting challenge in information technology today: mastering concurrency. Concurrency, once a specialized technique for experts, is forcing itself onto the entire IT community because of a disruptive phenomenon: the “end of Moore’s law as we know it”. Increases in performance can no longer happen through raw hardware speed, but only through concurrency, as in multicore architectures. Concurrency is also critical for networking, cloud computing and the progress of natural sciences. Software support for these advances lags, mired in concepts from the 1960s such as semaphores. Existing formal models are hard to apply in practice. Incremental progress is not sufficient; neither are techniques that place the burden on programmers, who cannot all be expected to become concurrency experts. The CME project attempts a major shift on the side of the supporting technology: languages, formal models, verification techniques. The core idea of the CME project is to make concurrency easy for programmers, by building on established ideas of modern programming methodology (object technology, Design by Contract) shifting the concurrency difficulties to the internals of the model and implementation.
The project includes the following elements.
1. Sound conceptual model for concurrency. The starting point is the influential previous work of the PI: concepts of object-oriented design, particularly Design by Contract, and the SCOOP concurrency model.
2. Reference implementation, integrated into an IDE.
3. Performance analysis.
4. Theory and formal basis, including full semantics.
5. Proof techniques, compatible with proof techniques for the sequential part.
6. Complementary verification techniques such as concurrent testing.
7. Library of concurrency components and examples.
8. Publication, including a major textbook on concurrency.
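SCOOP itself is an Eiffel mechanism, so the sketch below is only a loose Python analogy of the core idea stated above, namely hiding synchronization inside the object so that client code reads almost sequentially. It uses an active-object-style proxy (all names illustrative), not the SCOOP model.

```python
# Loose analogy (not SCOOP): an "active object" proxy hides synchronization inside
# the object. Method calls are queued and executed one at a time by a dedicated
# worker thread; clients get futures back and never write locks themselves.
import queue
import threading
from concurrent.futures import Future

class Active:
    def __init__(self, target):
        self._target = target
        self._calls = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def _run(self):
        while True:
            fut, name, args, kwargs = self._calls.get()
            try:
                fut.set_result(getattr(self._target, name)(*args, **kwargs))
            except Exception as exc:
                fut.set_exception(exc)

    def __getattr__(self, name):
        def call(*args, **kwargs):
            fut = Future()
            self._calls.put((fut, name, args, kwargs))
            return fut
        return call

class Counter:
    def __init__(self):
        self.value = 0
    def increment(self):
        self.value += 1
        return self.value

counter = Active(Counter())                  # all access serialized internally
futures = [counter.increment() for _ in range(1000)]
print(futures[-1].result())                  # 1000, with no client-side locking
```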
Max ERC Funding
2 482 957 €
Duration
Start date: 2012-04-01, End date: 2018-09-30
Project acronym COBOM
Project Convective Boundary Mixing in Stars
Researcher (PI) Isabelle Baraffe
Host Institution (HI) THE UNIVERSITY OF EXETER
Call Details Advanced Grant (AdG), PE9, ERC-2017-ADG
Summary Stellar evolution models are fundamental to nearly all fields of astrophysics, from exoplanet to galactic and extra-galactic research.
The heart of the COBOM project is to develop a global physical picture of fundamental mixing processes in stars in order to derive robust and predictive stellar evolution models.
The complex dynamics of flows at convective boundaries is a key process in stellar interiors that drives the transport of chemical species and heat, strongly affecting the structure and the evolution of many types of stars. The same physical processes can also drive the transport of angular momentum, affecting the rotational evolution and the generation of magnetic fields of stars. The treatment of mixing processes at convective boundaries (also referred to as overshooting) is currently one of the major uncertainties in stellar evolution theory. This mixing can dramatically affect the size of a convective core, the lifetime of major burning phases or the surface chemistry over a wide range of stellar masses.
The main objectives of this project are to (1) develop a global theoretical framework to describe mixing and heat transport at convective boundaries in stellar interiors, (2) derive new physically-based transport coefficients and parametrizations for one-dimensional stellar evolution models and (3) test the new formalisms against a wide range of observations.
We will accomplish these goals by performing the most comprehensive study ever performed of mixing processes in stars using a fundamentally new approach. We will combine the power of multi-dimensional fully compressible time implicit magneto-hydrodynamic simulations and rare event statistics, which are usually applied in finance or climate science.
The key strength of the project is to establish a direct link between multi-dimensional results and observations (asteroseismology, eclipsing binaries, color-magnitude diagrams) via the exploitation of 1D stellar evolution models.
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym CoCoSym
Project Symmetry in Computational Complexity
Researcher (PI) Libor BARTO
Host Institution (HI) UNIVERZITA KARLOVA
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary The last 20 years of rapid development in the computational-theoretic aspects of fixed-language Constraint Satisfaction Problems (CSPs) have been fueled by a connection between the complexity of problems in this class and a certain concept capturing their symmetry.
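In the standard algebraic approach, the symmetries in question are the polymorphisms of the constraint relations: operations that, applied coordinate-wise to tuples of a relation, always land back in the relation. The sketch below is only a brute-force membership check with illustrative examples, not anything specific to the project.

```python
# Brute-force check that an operation is a polymorphism of a relation.
from itertools import product

def is_polymorphism(op, arity, relation):
    """Applying `op` coordinate-wise to any `arity` tuples of `relation`
    must yield a tuple that is again in `relation`."""
    rel = set(relation)
    width = len(next(iter(rel)))
    for rows in product(rel, repeat=arity):
        image = tuple(op(*(row[i] for row in rows)) for i in range(width))
        if image not in rel:
            return False
    return True

LEQ = {(0, 0), (0, 1), (1, 1)}        # Boolean relation "x implies y"
NEQ = {(0, 1), (1, 0)}                # Boolean disequality

print(is_polymorphism(min, 2, LEQ))   # True: min preserves implication
print(is_polymorphism(min, 2, NEQ))   # False: min((0,1),(1,0)) = (0,0) is not in NEQ
```

Which operations of this kind a constraint language admits is what governs the complexity of its CSP in the algebraic theory referred to throughout this summary.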
My vision is that this connection will eventually evolve into the organizing principle of computational complexity and will lead to solutions of fundamental problems such as the Unique Games Conjecture or even the P-versus-NP problem. In order to break through the current limits of this algebraic approach, I will concentrate on specific goals designed to
(A) discover suitable objects capturing symmetry that reflect the complexity in problem classes, where such an object is not known yet;
(B) make the natural ordering of symmetries coarser so that it reflects the complexity more faithfully;
(C) delineate the borderline between computationally hard and easy problems;
(D) strengthen characterizations of existing borderlines to increase their usefulness as tools for proving hardness and designing efficient algorithms; and
(E) design efficient algorithms based on direct and indirect uses of symmetries.
The specific goals concern the fixed-language CSP over finite relational structures and its generalizations to infinite domains (iCSP) and weighted relations (vCSP), in which the algebraic theory is highly developed and the limitations are clearly visible.
The approach is based on joining the forces of the universal algebraic methods in finite domains, model-theoretical and topological methods in the iCSP, and analytical and probabilistic methods in the vCSP. The starting point is to generalize and improve the Absorption Theory from finite domains.
Max ERC Funding
1 211 375 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym CONCERTO
Project Intensity mapping of the atomic carbon CII line: the promise of a new observational probe of dusty star-formation in post-reionization and reionization epoch
Researcher (PI) Guilaine LAGACHE
Host Institution (HI) UNIVERSITE D'AIX MARSEILLE
Call Details Advanced Grant (AdG), PE9, ERC-2017-ADG
Summary I propose to construct a spectrometer to map in 3-D the intensity due to line emission, a technique known as Intensity Mapping. Instead of detecting individual galaxies, this emerging technique measures signal fluctuations produced by the combined emission of the galaxy population on large regions of the sky in a wide frequency (i.e. redshift) band, and thus increases sensitivity to faint sources. Capitalizing on a recent technology breakthrough, our intensity mapping experiment will measure the 3-D fluctuations of the [CII] line at redshifts 4.5<z<8.5. [CII] is one of the most valuable star formation tracers at high redshift. My project will answer the outstanding questions of whether dusty star-formation contributes to early galaxy evolution, and whether dusty galaxies play an important role in shaping cosmic reionization.
My team will first build, test, and finally install the instrument on the APEX antenna following an agreement with APEX partners. The spectrometer will be based on the state-of-the-art development of new millimetre-wave arrays using Kinetic Inductance Detectors. Spectra (200-360 GHz) will be obtained by a fast Martin-Puplett interferometer. Then, we will observe with CONCERTO a few square degrees and offer a straightforward alternative for probing star formation and dust build-up in the early Universe. Finally, CONCERTO will set to music the various cosmic evolution probes. Cross-correlation of the signals will be used in particular to capture the topology of the end of the reionization era.
CONCERTO will be one of two instruments in the world to perform intensity mapping of the [CII] line in the short term. The novel methodology is extremely promising as it targets an unexplored observable touching on some of the fundamental processes building the early universe. In the flourishing of new ideas in the intensity-mapping field, CONCERTO lies at the forefront.
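As a quick consistency check (assuming the [CII] 158-micron fine-structure line rest frequency of roughly 1900.5 GHz), the quoted 200-360 GHz band indeed covers the targeted redshift range:

```python
# Observed frequency of the redshifted [CII] line: nu_obs = nu_rest / (1 + z).
NU_REST_CII = 1900.54  # GHz, approximate [CII] rest frequency

for z in (4.5, 8.5):
    nu_obs = NU_REST_CII / (1 + z)
    print(f"z = {z}: observed [CII] frequency ~ {nu_obs:.0f} GHz")
# z = 4.5 -> ~346 GHz, z = 8.5 -> ~200 GHz, within the 200-360 GHz spectral coverage
```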
Max ERC Funding
3 499 942 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym COOPERATION
Project Putting Strong Reciprocity into Context: The Role of Incentives, Social Norms, and Culture for Voluntary Cooperation
Researcher (PI) Simon Gaechter
Host Institution (HI) THE UNIVERSITY OF NOTTINGHAM
Call Details Advanced Grant (AdG), SH1, ERC-2011-ADG_20110406
Summary Many important social problems, from the workplace to climate change, require the cooperation of individuals in situations in which collective welfare is jeopardized by self-interest and contractual solutions that align collective and individual interest are not feasible. While this suggests a bleak outcome if people are selfish, recent research in the behavioural sciences suggests that rather than being selfish, many people are non-strategic ‘strong reciprocators’ who cooperate if others cooperate and who punish unfair behaviour even if such cooperation or punishment is individually costly. The fundamental importance of strong reciprocity is that it helps achieve cooperation in situations in which self-interest predicts its breakdown.
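The kind of social dilemma described here is commonly formalized as a linear public goods game. The sketch below uses illustrative parameters only (endowment 20, marginal per capita return 0.5, four players) to show why free-riding is individually optimal while full contribution maximizes group welfare.

```python
# Linear public goods game: each player keeps endowment minus contribution and
# receives a share of the scaled group pot. With 0 < mpcr < 1 contributing nothing
# is individually optimal, yet with mpcr * n > 1 full contribution maximizes welfare.
def payoffs(contributions, endowment=20, mpcr=0.5):
    pot_share = mpcr * sum(contributions)
    return [endowment - c + pot_share for c in contributions]

print(payoffs([0, 0, 0, 0]))      # everyone free-rides:   [20, 20, 20, 20]
print(payoffs([20, 20, 20, 20]))  # everyone cooperates:   [40, 40, 40, 40]
print(payoffs([0, 20, 20, 20]))   # a lone free-rider earns most: [50, 30, 30, 30]
```

Strong reciprocators who conditionally cooperate, and who punish the lone free-rider at a cost to themselves, are what can sustain the cooperative outcome in such settings.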
The major ambition and innovation of this research programme is to “put strong reciprocity into context” by investigating how incentives, social and cultural context, and gender and personality differences shape strong reciprocity and, as a consequence, cooperation.
I propose four linked work packages, which all address key open questions of interest to economists and other behavioural scientists. First, I investigate how incentives influence strong reciprocity: Under which conditions do incentives undermine or enhance strong reciprocity and thereby cooperation? Second, I investigate how strong reciprocity relates to social norms of cooperation and is shaped by social context. Third, I use cross-cultural experiments to study the role of cultural influences on strong reciprocity and how culture interacts with incentive structures: when does culture matter for cooperation? Finally, I study personality and gender differences in strong reciprocity.
All projects use economic experiments and insights from across the behavioural sciences. The overarching objective is to develop a ‘behavioural economics of cooperation’, that is, the basic science of relevant behavioural principles that are needed to achieve sustainable cooperation.
Max ERC Funding
2 072 844 €
Duration
Start date: 2012-05-01, End date: 2017-04-30
Project acronym CORNET
Project Provably Correct Networks
Researcher (PI) Costin RAICIU
Host Institution (HI) UNIVERSITATEA POLITEHNICA DIN BUCURESTI
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary Networks are the backbone of our society, but configuring them is error-prone and tedious: misconfigured networks result in headline-grabbing outages that affect many users and hurt company revenues, as well as security breaches that endanger millions of customers. There are currently no guarantees that deployed networks correctly implement their operator’s policy.
Existing research has focused on two directions: a) low level analysis and instrumentation of real networking code prevents memory bugs in individual network elements, but does not capture network-wide properties desired by operators such as reachability or loop freedom; b) high-level analysis of network-wide properties to verify operator policies on abstract network models; unfortunately, there are no guarantees that the models are an accurate representation of the real network code, and often low-level errors invalidate the conclusions of the high-level analysis.
We propose to achieve provably correct networks by simultaneously targeting both low-level security concerns and network-wide policy compliance checking. Our key proposal is to rely on exhaustive network symbolic execution for verification and to automatically generate provably correct implementations from network models. Generating efficient code that is equivalent to the model poses great challenges that we will address with three key contributions:
a) We will develop a novel theoretical equivalence framework based on symbolic execution semantics, as well as equivalence-preserving model transformations to automatically optimize network models for runtime efficiency.
b) We will develop compilers that take network models and generate functionally equivalent and efficient executable code for different targets (e.g. P4 and C).
c) We will design algorithms that generate and insert runtime guards that ensure correctness of the network with respect to the desired policy even when legacy boxes are deployed in the network.
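To make the central idea concrete, here is a minimal, hypothetical Python sketch (illustrative only, not the project's tooling): a "symbolic packet" is represented as the set of header values it may still take, a toy match-action table partitions that set into path conditions, and a network-wide policy is then checked over all resulting paths.

RULES = [                              # (match on destination address, action)
    (range(0, 64),   "fwd:port1"),
    (range(64, 128), "fwd:port2"),
    (range(0, 256),  "drop"),          # default rule
]

def symbolic_exec(dst_space):
    """Split the symbolic input space across the rules; first match wins."""
    remaining = set(dst_space)
    paths = []                         # list of (path condition, action)
    for match, action in RULES:
        hit = remaining & set(match)
        if hit:
            paths.append((hit, action))
            remaining -= hit           # those packets have taken this rule
    return paths

paths = symbolic_exec(range(256))
for cond, action in paths:
    print(f"{len(cond):3d} packets -> {action}")

# Network-wide policy check: "no packet with dst < 128 is ever dropped".
assert not any(action == "drop" and min(cond) < 128 for cond, action in paths)

Real data planes additionally involve multiple header fields, stateful elements and packet rewriting, which is where the proposed equivalence framework and model-to-code compilers come in.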
Max ERC Funding
1 325 000 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym COSMIC-LITMUS
Project Turning cosmic shear into a litmus test for the standard model of cosmology
Researcher (PI) Hendrik Jurgen HILDEBRANDT
Host Institution (HI) RUHR-UNIVERSITAET BOCHUM
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The standard model of cosmology is impressively consistent with a large number of observations. Its parameters have been determined with great accuracy with the Planck CMB (cosmic microwave background) mission. However, recently local determinations of the Hubble constant as well as observations of strong and weak gravitational lensing have found some tension with Planck. Are those observations first glimpses at a crack in the standard model and hints of an evolving dark energy component? With this ERC Consolidator Grant I will answer these questions by greatly increasing the robustness of one of those cosmological probes, the weak lensing effect of the large scale structure of the Universe also called cosmic shear.
In order to reach this goal I will concentrate on the largest outstanding source of systematic error: photometric redshifts (photo-z). I will exploit the unique combination of two European imaging surveys in the optical and infrared wavelength regime, an additional narrow-band imaging survey with extremely precise photo-z, and spectroscopic calibration data from a recently approved ESO large program on the VLT. Using angular cross-correlations and machine-learning I will calibrate the photo-z in a two-stage process making sure that this crucial systematic uncertainty will keep pace with the growing statistical power of imaging surveys. This will yield an uncertainty on the amplitude of the clustering of dark matter that is smaller than the best constraints from the CMB.
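As a toy illustration of the cross-correlation (clustering-redshift) idea, the short Python sketch below (entirely mock numbers and assumed bias values, not the survey pipeline) recovers a redshift distribution from cross-correlation amplitudes measured against narrow spectroscopic slices, up to bias factors that must themselves be calibrated.

import numpy as np

# w_x(z_i) ~ b_phot * b_spec(z_i) * n_phot(z_i) * (dark-matter clustering term),
# so the photometric dN/dz can be recovered from the cross-correlation signal.
z = np.linspace(0.15, 1.15, 11)                   # spec-z slice centres (mock)
w_x = np.array([0.020, 0.050, 0.090, 0.120, 0.110,
                0.080, 0.050, 0.030, 0.020, 0.010, 0.005])   # mock amplitudes
b_phot = 1.3                                      # assumed photometric bias
b_spec = 1.0 + 0.5 * z                            # assumed spectroscopic bias

dz = z[1] - z[0]
n_phot = w_x / (b_phot * b_spec)                  # unnormalised dN/dz
n_phot /= n_phot.sum() * dz                       # normalise to unit area
print("recovered mean redshift: %.3f" % ((z * n_phot).sum() * dz))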
I will also apply these methods to ESA’s Euclid mission launching in 2020, which will fail if photo-z are not better understood by then. If the discrepancy between lensing and CMB measurements holds this would potentially result in a revolution of our understanding of the Universe. Regardless of this spectacular short-term possibility I will turn cosmic shear – one of the most powerful cosmological probes of dark energy – into a litmus test for our cosmological paradigm.
Max ERC Funding
1 931 493 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym COSMICLENS
Project Cosmology with Strong Gravitational Lensing
Researcher (PI) Frederic Yves Michel COURBIN
Host Institution (HI) ECOLE POLYTECHNIQUE FEDERALE DE LAUSANNE
Call Details Advanced Grant (AdG), PE9, ERC-2017-ADG
Summary Measuring cosmological distances has revolutionized our understanding of the Universe, and is still doing so! Early work in the 1920s led to the discovery of the expansion of the Universe. More precise distance measurements in the 90s with type-Ia supernovae revealed that this expansion is accelerating, with crucial consequences in cosmology and physics. Is the acceleration due to some repulsive form of dark energy? To Einstein's cosmological constant? Do we need to consider new physics? Answering these fundamental questions requires a precise measurement of the Hubble parameter, H0, which is my goal using the time delay (TD) method in strongly lensed quasars.
The TD method exploits well-known physics on galaxy scales. It is one of the very few techniques that can yield H0 to <2% using a single methodology. It involves no calibration, and is truly independent of any other cosmological probe. Capitalizing on the successful pathfinders COSMOGRAIL (PI: Courbin) and H0LiCOW (PI: Suyu, CoI: Courbin), the time has come to fully exploit TDs with an observational, modeling and technical boost, organized in 2 phases.
Phase I will secure H0 to 2% using the current chain of analysis, with feasible enhancements beyond the current state-of-the-art. This will confirm or refute the tension seen between H0 values with different cosmological probes. Phase II targets 1% precision, improving the FoM of Stage-IV cosmological surveys by 40%. The 4 proposed Work Packages can transform the field within the next 5 years by 1- implementing the first high-cadence photometric monitoring of lensed quasars to measure 50 new TDs, 2- providing new flexible non-parametric lens models based on sparse regularization of the reconstructed source and lens mass/light distributions, 3- providing a modular end-to-end simulation framework to mock lensed systems from hydro-simulations and to evaluate in detail the impact of model degeneracies on H0, 4- discovering new suitable lensed quasars in current surveys.
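For readers less familiar with time-delay cosmography, the standard relations behind the method (textbook material, not specific to this proposal) link the measured delays to a distance that scales inversely with H0:

\[
\Delta t_{ij} = \frac{D_{\Delta t}}{c}\,\big[\phi(\theta_i,\beta)-\phi(\theta_j,\beta)\big],
\qquad
D_{\Delta t} \equiv (1+z_d)\,\frac{D_d D_s}{D_{ds}} \propto \frac{1}{H_0},
\qquad
\phi(\theta,\beta) = \tfrac{1}{2}(\theta-\beta)^2 - \psi(\theta),
\]

where \(\psi\) is the lens potential, \(z_d\) the lens redshift, and \(D_d\), \(D_s\), \(D_{ds}\) the angular-diameter distances to the lens, to the source, and between them; measuring the delays and modelling \(\psi\) therefore yields H0 directly.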
Max ERC Funding
3 129 689 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym CPROVER
Project Validation of Concurrent Software Across Abstraction Layers
Researcher (PI) Daniel Heinrich Friedrich Kroening
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary The cost of software quality assurance (QA) dominates the cost of IT development and maintenance projects. QA is frequently on the critical path to market. Effective software QA is therefore decisive for the competitiveness of numerous industries that rely on IT, and essential for government tasks that rely heavily on IT.
This research programme will provide a pragmatic solution to the most pressing issue in software QA in mainstream software engineering: the use of concurrency. Programmers make use of numerous flavors of concurrency in order to achieve better scalability, save power, increase reliability, and boost performance. The need for software that makes diligent use of concurrent computational resources has been exacerbated by power-efficient multi-core CPUs, which are now widely deployed, but still underutilized due to the lack of appropriate software. Concurrent software is particularly difficult to test, as bugs depend on particular interleavings between the sequential computations. Defects are therefore difficult to reproduce and diagnose, and often elude even very experienced programmers.
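The following self-contained Python sketch (illustrative only) shows why such defects are so elusive: two threads perform an unsynchronized read-modify-write on a shared counter, and updates are lost only under particular thread interleavings.

import threading

counter = 0                        # shared state updated without synchronization

def worker(n):
    global counter
    for _ in range(n):
        tmp = counter              # read
        tmp += 1                   # modify
        counter = tmp              # write -- another thread may interleave here

threads = [threading.Thread(target=worker, args=(100_000,)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# The result is 200000 only if no unlucky interleaving occurred; lost updates
# appear sporadically, which is why such defects resist conventional testing.
print(counter)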
We propose to develop new, ground-breaking reasoning and testing technology for this kind of software, with the goal of cutting the staff effort in QA of concurrent software in half. We will use a tightly integrated combination of scalable and performant testing technology with Model Checking and abstract interpretation engines to prune the search. Every aspect of the research programme is geared towards improving the productivity of the average application programmer. Our theories and reasoning technology will therefore be implemented in a seamless fashion within the existing, well-accepted programming environments Visual Studio and Eclipse, in close collaboration with Microsoft and IBM.
Max ERC Funding
1 368 355 €
Duration
Start date: 2011-12-01, End date: 2017-11-30
Project acronym CRASH
Project CRyptographic Algorithms and Secure Hardware
Researcher (PI) François-Xavier Standaert
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary Side-channel attacks are an important threat against cryptographic implementations in which an adversary takes advantage of physical leakages, such as the power consumption of a smart card, in order to recover secret information. By circumventing the models in which standard security proofs are obtained, they can lead to powerful attacks against a large class of devices. As a consequence, formalizing implementation security and efficiently preventing side-channel attacks is one of the most challenging open problems in modern cryptography. Physical attacks imply new optimization criteria, with potential impact on the way we conceive algorithms and the way we design circuits. By putting together mathematical and electrical engineering problems, just as they are raised in reality, the CRASH project is expected to develop concrete foundations for the next generation of cryptographic algorithms and their implementation. For this purpose, three main directions will be considered. First, we will investigate sound evaluation tools for side-channel attacks and validate them on different prototype chips. Second, we will consider the impact of physical attacks on the mathematical aspects of cryptography, both destructively (i.e. by developing new attacks and advanced cryptanalysis tools) and constructively (i.e. by investigating new cipher designs and security proof techniques). Third, we will evaluate the possibility of integrating physical security analysis into the design tools of integrated circuits (e.g. in order to obtain “physical security aware” compilers). Summarizing, this project aims to break the barrier between the abstractions of mathematical cryptography and the concrete peculiarities of physical security in present microelectronic devices. By considering the system and algorithmic issues in a unified way, it is expected to get rid of the incompatibilities between the separate formalisms that are usually considered in order to explain these concurrent realities.
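As a concrete illustration of the kind of physical leakage involved, the toy Python sketch below (simulated traces and a random stand-in S-box; hypothetical, not the project's evaluation tools) recovers a key byte by correlating hypothetical Hamming-weight leakage with the measured power values.

import numpy as np

rng = np.random.default_rng(0)
SBOX = rng.permutation(256)                 # toy S-box standing in for AES
hw = np.array([bin(x).count("1") for x in range(256)])

# Simulate a leaky device: power consumption ~ Hamming weight of the S-box
# output, plus measurement noise.
secret_key = 0x3A
plaintexts = rng.integers(0, 256, size=2000)
traces = hw[SBOX[plaintexts ^ secret_key]] + rng.normal(0, 1.0, size=2000)

def corr(a, b):
    a = a - a.mean(); b = b - b.mean()
    return (a @ b) / np.sqrt((a @ a) * (b @ b))

# Correlation power analysis: the correct guess maximises |correlation|
# between hypothetical leakage and the measured traces.
scores = [abs(corr(hw[SBOX[plaintexts ^ k]], traces)) for k in range(256)]
print("recovered key byte:", hex(int(np.argmax(scores))))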
Max ERC Funding
1 498 874 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym D-SynMA
Project Distributed Synthesis: from Single to Multiple Agents
Researcher (PI) Nir PITERMAN
Host Institution (HI) GOETEBORGS UNIVERSITET
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary Computing is changing from living on our desktops and in dedicated devices to being everywhere. In phones, sensors, appliances, and robots – computers (from now on, devices) are everywhere and affect all aspects of our lives. The techniques to make them safe and reliable are being investigated and are starting to emerge and consolidate. However, these techniques only enable devices to work in isolation or to co-exist. We currently do not have techniques that enable the development of real autonomous collaboration between devices. Such techniques will revolutionize all usage of devices and, as a consequence, our lives. Manufacturing, supply chains, transportation, infrastructure, and earth and space exploration would all be transformed by techniques that enable the development of collaborating devices.
When considering isolated (and co-existing) devices, reactive synthesis – automatic production of plans from high level specification – is emerging as a viable tool for the development of robots and reactive software. This is especially important in the context of safety-critical systems, where assurances are required and systems need to have guarantees on performance. The techniques that are developed today to support robust, assured, reliable, and adaptive devices rely on a major change in focus of reactive synthesis. The revolution of correct-by-construction systems from specifications is occurring and is being pushed forward.
However, to carry this approach over to real collaboration between devices, theoretical frameworks that enable distributed synthesis are required. Such foundations will enable the correct-by-construction revolution to unleash its potential and allow a multiplicative increase of utility by cooperative computation.
d-SynMA will take distributed synthesis to this new frontier by considering novel interaction and communication concepts that would create an adaptable framework of correct-by-construction application of collaborating devices.
Max ERC Funding
1 871 272 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym D5S
Project Direct Statistical Simulation of the Sun and Stars
Researcher (PI) Steven Tobias
Host Institution (HI) UNIVERSITY OF LEEDS
Call Details Advanced Grant (AdG), PE9, ERC-2017-ADG
Summary This proposal (D5S) addresses a key problem of astrophysics – the origin of magnetic activity in the sun and solar-type stars. This is a problem not only of outstanding theoretical importance but also significant practical impact – solar activity has major terrestrial consequences. An increase in activity can lead to an increase in the number and violence of solar flares and coronal mass ejections, with profound consequences for our terrestrial environment, causing disruption to satellites and power. Predictions of magnetic activity are highly desired by government and industry groups alike. A deep understanding of the mechanisms leading to solar magnetic activity is required. The variable magnetic field is generated by a dynamo in the solar interior. Though this mechanism is known to involve the interaction of magnetohydrodynamic (MHD) turbulence with rotation, no realistic model for dynamo action currently exists. D5S utilises two recent significant breakthroughs to construct new models for magnetic field generation in the sun and other solar-type stars. The first of these involves an entirely new approach termed Direct Statistical Simulation (DSS) (developed by the PI), where the statistics of the astrophysical flows are solved directly (enabling the construction of more realistic models). This approach is coupled to a breakthrough (recently published by the PI in Nature) in our understanding of the physics of MHD turbulence at the extreme parameters relevant to solar interiors. D5S also uses the methodology of DSS to provide statistical subgrid models for Direct Numerical Simulation (DNS). This will increase the utility, fidelity and predictability of such models for solar magnetic activity. Either of these new approaches, taken in isolation, would lead to significant progress in our understanding of magnetic field generation in stars. Taken together, as in this proposal, they will provide a paradigm shift in our theories for solar magnetic activity.
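To give a flavour of what solving the statistics directly means, one common realization of DSS is a low-order cumulant expansion; written purely schematically (an illustrative sketch, not the proposal's specific formulation), for a quadratically nonlinear system \(\partial_t q = \mathcal{L}[q] + \mathcal{N}[q,q]\) one decomposes \(q = c_1 + q'\) with first cumulant \(c_1 = \langle q \rangle\) and second cumulant \(c_2 = \langle q' q' \rangle\) and evolves

\[
\partial_t c_1 = \mathcal{L}[c_1] + \mathcal{N}[c_1,c_1] + \langle \mathcal{N}[q',q'] \rangle,
\qquad
\partial_t c_2 \approx \big(\mathcal{L} + \mathcal{N}[c_1,\cdot\,]\big)\, c_2 + \text{forcing and dissipation},
\]

with third and higher cumulants discarded, so that mean flows and their statistics are computed directly rather than accumulated from long turbulence simulations.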
Max ERC Funding
2 499 899 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym DARKLIGHT
Project ILLUMINATING DARK ENERGY WITH THE NEXT GENERATION OF COSMOLOGICAL REDSHIFT SURVEYS
Researcher (PI) Luigi Guzzo
Host Institution (HI) UNIVERSITA DEGLI STUDI DI MILANO
Call Details Advanced Grant (AdG), PE9, ERC-2011-ADG_20110209
Summary Galaxy redshift surveys have been central in establishing the current successful cosmological model. Reconstructing the large-scale distribution of galaxies in space and time, they provide us with a unique probe of the basic constituents of the Universe, their evolution and the background fundamental physics. A new generation of even larger surveys is planned for the coming decade, with the aim of solving the remaining mysteries of the standard model using high-precision measurements of galaxy clustering. These entail the nature of the “dark sector” and in particular the origin of the accelerated cosmic expansion. While data accumulation has already started, the analysis capabilities needed to reach the required percent level in both accuracy and precision are not ready yet.
I propose to establish a focused research group to develop these tools and optimally analyze the new data, while being directly involved in their collection. New techniques such as redshift-space distortions and well-known but still debated probes such as galaxy clusters will be refined to a new level. They will be combined with more standard methods such as baryonic acoustic oscillations and external data such as CMB anisotropies. Performance will be validated on mock samples from large numerical simulations and the methods then applied to state-of-the-art data with enhanced control over systematic errors to obtain the best achievable measurements.
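As background on why redshift-space distortions are such a sharp test of gravity (standard linear-theory relations, not specific to this proposal): in the linear Kaiser approximation the redshift-space galaxy power spectrum is anisotropic,

\[
P_s(k,\mu) = \big(b + f\,\mu^2\big)^2 P_m(k),
\qquad
f(z) \simeq \Omega_m(z)^{\gamma},
\]

where \(b\) is the galaxy bias, \(\mu\) the cosine of the angle to the line of sight and \(f\) the linear growth rate; general relativity predicts \(\gamma \approx 0.55\), so measuring \(f\) at several redshifts tests directly whether the accelerated expansion requires dark energy or a modification of gravity.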
These new capabilities will be decisive in enabling ongoing and future surveys to tackle the key open problems in cosmology: What is the nature of dark energy? Is it produced by an evolving scalar field? Or does it rather require a modification of the laws of gravity? How does it relate to dark matter? What is the role of neutrinos? The answer to these questions may well revolutionize our view of physics.
Max ERC Funding
1 723 600 €
Duration
Start date: 2012-05-01, End date: 2017-10-31
Project acronym DEBRIS
Project Debris in extrasolar planetary systems
Researcher (PI) Mark Charles Wyatt
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary This proposal concerns the debris discs of nearby stars; i.e., discs of asteroids, comets and dust. Such dust can be imaged, providing clues to the underlying planetary system. Debris images have already predicted planets later confirmed in direct imaging. Most debris lies in cold outer (~100AU) regions of planetary systems, but a growing number of stars have hot dust in regions where terrestrial planets are expected (few AU). This proposal aims to learn about the planetary systems of nearby stars through study of their debris discs. Specific focus is on the frontier area of characterisation and modelling of dust within planetary systems, which is important for the design of missions to detect habitable planets, a high priority goal for the next decade. The PI has played a significant role in debris disc studies, and proposes to consolidate an independent research team in Cambridge. The proposal covers 3 studies supported by 3 PDRAs. Specific objectives are: 1) Debris disc observations: Carry out survey for cold debris around unbiased sample of nearest 500 stars with Herschel and SCUBA2. Follow up bright discs with high resolution imaging using ALMA and JWST to characterise sub-structure from planets and search for dust at multiple radii. Pioneer survey for hot dust using polarisation and interferometry. 2) Debris disc modelling: Develop new model to follow the interplay between collisions, radiation pressure, P-R drag, sublimation, disintegration, and dynamical interactions with planets. Use model to consider nature of small particle halos, resonant ring structures formed by terrestrial planets, and level of cometary dust scattered into inner regions. 3) Debris disc origin: Demonstrate constraints placed on planet formation models through studies of dust from Earth-moon forming impacts, effect of planetesimals on late-stage planetary dynamics, population synthesis explaining planets and debris, constraints on primordial size and stirring of debris.
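A standard relation (textbook material, not specific to this proposal) shows how the observed dust temperature locates debris within a planetary system: blackbody grains in equilibrium with the stellar radiation field have

\[
T_{\rm bb} \approx 278\,\mathrm{K}\,\Big(\frac{L_\ast}{L_\odot}\Big)^{1/4}\Big(\frac{r}{1\,\mathrm{AU}}\Big)^{-1/2},
\]

so cold (~30-50 K) emission points to Kuiper-belt-like regions around 100 AU, while warm dust at ~150-300 K must lie in the terrestrial-planet region at a few AU for a solar-luminosity star.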
Max ERC Funding
1 497 920 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym DeciGUT
Project A Grand Unified Theory of Decidability in Logic-Based Knowledge Representation
Researcher (PI) Sebastian Rudolph
Host Institution (HI) TECHNISCHE UNIVERSITAET DRESDEN
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary "Logic-based knowledge representation (KR) constitutes a vital area of IT. The field inspires and guides scientific and technological developments enabling intelligent management of large and complex knowledge resources. Elaborate languages for specifying knowledge (so-called ontology languages) and querying it have been defined and standardized. Algorithms for automated reasoning and intelligent querying over knowledge resources are being developed, implemented and practically deployed on a wide scale.
Thereby, decidability investigations play a pivotal role to characterize what reasoning or querying tasks are at all computationally solvable.
Past decades have seen a proliferation of new decidable formalisms for KR, dominated by two major paradigms: description logics and rule-based approaches, most notably existential rules. Recently, these research lines have started to converge and first progress has been made toward identifying commonalities among the various formalisms. Still, the underlying principles for establishing their decidability remain disparate, ranging from proof-theoretic notions to model-theoretic ones.
DeciGUT will accomplish a major breakthrough in the field by establishing a "Grand Unified Theory" of decidability. We will provide a novel, powerful model-theoretic criterion inspired by advanced graph-theoretic notions. We will prove that the criterion indeed ensures decidability and that it subsumes most of (if not all) currently known decidable formalisms in the KR field.
We will exploit our results toward the definition of novel decidable KR languages of unprecedented expressivity. We will ultimately extend our framework to encompass more advanced KR features beyond standard first order logic such as counting and non-monotonic aspects.
Our research will draw from and significantly impact the scientific fields of AI, Database Theory and Logic, but also give rise to drastically improved practical information management technology.
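To illustrate why decidability is delicate in this setting, the following minimal Python sketch (hypothetical and illustrative only) applies a single existential rule, Person(x) -> exists y. hasParent(x, y) and Person(y), using a restricted chase: every applicable rule application introduces a fresh null, so without a termination or decidability criterion the derivation never stops.

from itertools import count

facts = {("Person", "alice")}
fresh = (f"_n{i}" for i in count())            # supply of fresh nulls

def chase_step(facts):
    """Apply the rule to every Person without a known parent (one round)."""
    new = set()
    for _, x in [f for f in facts if f[0] == "Person"]:
        if not any(g[0] == "hasParent" and g[1] == x for g in facts | new):
            y = next(fresh)                    # existential witness
            new |= {("hasParent", x, y), ("Person", y)}
    return new

for _ in range(4):                             # bounded here; unbounded in general
    delta = chase_step(facts)
    if not delta:
        break
    facts |= delta

print(sorted(facts))

Decidability criteria of the kind targeted by DeciGUT delimit when reasoning over such potentially infinite structures nevertheless remains computable.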
Max ERC Funding
1 814 937 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym DeepInternal
Project Going Deep and Blind with Internal Statistics
Researcher (PI) Michal IRANI
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Unsupervised visual inference can often be performed by exploiting the internal redundancy inside a single visual datum (an image or a video). The strong repetition of patches inside a single image/video provides a powerful data-specific prior for solving a variety of vision tasks in a “blind” manner: (i) Blind in the sense that sophisticated unsupervised inferences can be made with no prior examples or training; (ii) Blind in the sense that complex ill-posed Inverse-Problems can be solved, even when the forward degradation is unknown.
While the above fully unsupervised approach achieved impressive results, it relies on internal data alone, hence cannot enjoy the “wisdom of the crowd” which Deep-Learning (DL) so wisely extracts from external collections of images, yielding state-of-the-art (SOTA) results. Nevertheless, DL requires huge amounts of training data, which restricts its applicability. Moreover, some internal image-specific information, which is clearly visible, remains unexploited by today's DL methods. One such example is shown in Fig.1.
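The following small numpy sketch (a toy construction, not the proposal's method) makes the notion of internal redundancy concrete: in a roughly periodic image, essentially every small patch has a near-duplicate elsewhere in the same image, and it is this internal statistic that serves as an image-specific prior.

import numpy as np

rng = np.random.default_rng(0)
tile = rng.random((5, 5))
img = np.tile(tile, (4, 4)) + 0.02 * rng.random((20, 20))   # periodic + noise

def patches(im, k=5):
    H, W = im.shape
    return np.array([im[i:i + k, j:j + k].ravel()
                     for i in range(H - k + 1) for j in range(W - k + 1)])

P = patches(img)                                   # (256, 25) patch matrix
d = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)  # pairwise distances
np.fill_diagonal(d, np.inf)                        # ignore trivial self-matches
print("fraction of patches with a close internal match:",
      float(np.mean(d.min(axis=1) < 0.15)))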
We propose to combine the power of these two complementary approaches – unsupervised Internal Data Recurrence, with Deep Learning, to obtain the best of both worlds. If successful, this will have several important outcomes including:
• A wide range of low-level & high-level inferences (image & video).
• A continuum between Internal & External training – a platform to explore theoretical and practical tradeoffs between amount of available training data and optimal Internal-vs-External training.
• Enable totally unsupervised DL when no training data are available.
• Enable supervised DL with modest amounts of training data.
• New applications, disciplines and domains, which are enabled by the unified approach.
• A platform for substantial progress in video analysis (which has been lagging behind so far due to the strong reliance on exhaustive supervised training data).
Max ERC Funding
2 466 940 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym DeepSPIN
Project Deep Learning for Structured Prediction in Natural Language Processing
Researcher (PI) André Filipe TORRES MARTINS
Host Institution (HI) INSTITUTO DE TELECOMUNICACOES
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary Deep learning is revolutionizing the field of Natural Language Processing (NLP), with breakthroughs in machine translation, speech recognition, and question answering. New language interfaces (digital assistants, messenger apps, customer service bots) are emerging as the next technologies for seamless, multilingual communication among humans and machines.
From a machine learning perspective, many problems in NLP can be characterized as structured prediction: they involve predicting structurally rich and interdependent output variables. In spite of this, current neural NLP systems ignore the structural complexity of human language, relying on simplistic and error-prone greedy search procedures. This leads to serious mistakes in machine translation, such as words being dropped or named entities mistranslated. More broadly, neural networks are missing the key structural mechanisms for solving complex real-world tasks requiring deep reasoning.
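A toy numerical example (hypothetical scores, not a real decoder) shows how greedy search fails on structured outputs: committing to the locally best first token yields a complete sequence that scores far worse than the global optimum, which a wide-enough beam or exact search would find.

# Log-scores for a two-step output; step-2 scores are conditioned on step 1.
step1 = {"A": -0.25, "B": -0.75}
step2 = {"A": {"x": -3.0, "y": -3.5},
         "B": {"x": -0.25, "y": -2.0}}

# Greedy search: commit to the locally best token at each step.
t1 = max(step1, key=step1.get)
t2 = max(step2[t1], key=step2[t1].get)
print("greedy :", [t1, t2], step1[t1] + step2[t1][t2])      # ['A', 'x'] -3.25

# Exhaustive search (what a wide-enough beam approximates) is globally optimal.
best = max((step1[a] + step2[a][b], [a, b]) for a in step1 for b in step2[a])
print("optimal:", best[1], best[0])                         # ['B', 'x'] -1.0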
This project attacks these fundamental problems by bringing together deep learning and structured prediction, with a highly disruptive and cross-disciplinary approach. First, I will endow neural networks with a "planning mechanism" to guide structural search, letting decoders learn the optimal order by which they should operate. This makes a bridge with reinforcement learning and combinatorial optimization. Second, I will develop new ways of automatically inducing latent structure inside the network, making it more expressive, scalable and interpretable. Synergies with probabilistic inference and sparse modeling techniques will be exploited. To complement these two innovations, I will investigate new ways of incorporating weak supervision to reduce the need for labeled data.
Three highly challenging applications will serve as testbeds: machine translation, quality estimation, and dependency parsing. To maximize technological impact, a collaboration is planned with a start-up company in the crowd-sourcing translation industry.
Max ERC Funding
1 436 000 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym DeeViSe
Project Deep Learning for Dynamic 3D Visual Scene Understanding
Researcher (PI) Bastian LEIBE
Host Institution (HI) RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary Over the past 5 years, deep learning has exercised a tremendous and transformational effect on the field of computer vision. However, deep neural networks (DNNs) can only realize their full potential when applied in an end-to-end manner, i.e., when every stage of the processing pipeline is differentiable with respect to the network’s parameters, such that all of those parameters can be optimized together. Such end-to-end learning solutions are still rare for computer vision problems, in particular for dynamic visual scene understanding tasks. Moreover, feed-forward processing, as done in most DNN-based vision approaches, is only a tiny fraction of what the human brain can do. Feedback processes, temporal information processing, and memory mechanisms form an important part of our human scene understanding capabilities. Those mechanisms are currently underexplored in computer vision.
The goal of this proposal is to remove this bottleneck and to design end-to-end deep learning approaches that can realize the full potential of DNNs for dynamic visual scene understanding. We will make use of the positive interactions and feedback processes between multiple vision modalities and combine them to work towards a common goal. In addition, we will impart deep learning approaches with a notion of what it means to move through a 3D world by incorporating temporal continuity constraints, as well as by developing novel deep associative and spatial memory mechanisms.
The results of this research will enable deep neural networks to reach significantly improved dynamic scene understanding capabilities compared to today’s methods. This will have an immediate positive effect for applications in need for such capabilities, most notably for mobile robotics and intelligent vehicles.
Max ERC Funding
2 000 000 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym DEMOBLACK
Project Demography of black hole binaries in the era of gravitational wave astronomy
Researcher (PI) Michela MAPELLI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PADOVA
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The first direct detection of gravitational waves demonstrated that double black hole (BH) binaries exist, and can host surprisingly massive objects (> 20 solar masses). Most theoretical models do not predict the existence of such massive BHs, and the formation channels of BH binaries are essentially unconstrained. Dynamically formed BH binaries are the most elusive ones: current models either neglect them or study them in idealized systems. With DEMOBLACK, I will draw the first satisfactory picture of BH binary demography, by modeling realistic BH dynamics in a well-motivated cosmological context. I propose a novel approach for the study of BH dynamics: I will simulate the formation of BH binaries in star clusters self-consistently, starting from the hydrodynamics of the parent molecular cloud and accounting for the impact of stellar evolution, feedback, and dynamics on BH binaries. The key tool of DEMOBLACK is SEVN, my new population-synthesis code. With SEVN, I predicted the formation of massive BHs from metal-poor stars, before the first direct detection of gravitational waves. I will interface SEVN with a hydrodynamical code and with an N-body code, to study the formation of BH binaries self-consistently. I will then model the history of BH binaries across cosmic time, accounting for the evolution of metallicity. This novel approach is decisive to break degeneracies between dynamically formed and primordial BH binaries, and to make predictions for future observations by ground-based and space-borne gravitational wave interferometers.
Max ERC Funding
1 994 764 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym DEVTAXNET
Project Tax Evasion in Developing Countries. The Role of Firm Networks
Researcher (PI) Dina Deborah POMERANZ
Host Institution (HI) UNIVERSITAT ZURICH
Call Details Starting Grant (StG), SH1, ERC-2017-STG
Summary Tax evasion leads to billions of Euros of losses in government revenue around the world. This not only affects public budgets but can also create large distortions between activities that are fully taxed and others that escape taxation through evasion. These issues are particularly severe in developing countries, where evasion is especially high and governments struggle to raise funds for basic services and infrastructure, while at the same time trying to grow independent of international aid.
It is widely suspected that some of the most common and difficult-to-detect forms of evasion involve interactions across firm networks. However, due to severe data limitations, the existing literature has mostly considered taxpayers as isolated units. Empirical evidence on tax compliance in firm networks is extremely sparse.
This proposal describes three Sub-Projects to fill this gap. They are made possible thanks to access I have obtained, through five years of prior research and policy engagement, to unique datasets from Chile and Ecuador on both the networks of supply chains and of joint ownership structures.
The first Sub-Project focuses on international firm networks. It aims to analyze profit shifting of multinational firms to low tax jurisdictions, exploiting a natural experiment in Chile that strongly increased monitoring of international tax norms.
The second Sub-Project investigates the analogous issue at the intranational level: profit shifting and tax collusion in networks of firms within the same country. Despite much anecdotal evidence, this behavior has received little rigorous empirical scrutiny.
The final Sub-Project is situated at the nexus between international and national firms. It seeks to estimate a novel form of spillovers of FDI: the impact on tax compliance of local trading partners of foreign-owned firms.
DEVTAXNET will provide new insights about the role of firm networks for tax evasion that are valuable to academics and policy makers alike.
Max ERC Funding
1 288 125 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym DLT
Project Deep Learning Theory: Geometric Analysis of Capacity, Optimization, and Generalization for Improving Learning in Deep Neural Networks
Researcher (PI) Guido Francisco MONTUFAR CUARTAS
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary Deep Learning is one of the most vibrant areas of contemporary machine learning and one of the most promising approaches to Artificial Intelligence. Deep Learning drives the latest systems for image, text, and audio processing, as well as an increasing number of new technologies. The goal of this project is to make progress on key open problems in Deep Learning, specifically regarding the capacity, optimization, and regularization of these algorithms. The idea is to consolidate a theoretical basis that allows us to pin down the inner workings behind the present success of Deep Learning and make it more widely applicable, in particular in situations with limited data and challenging problems in reinforcement learning. The approach is based on the geometry of neural networks and exploits innovative mathematics, drawing on information geometry and algebraic statistics. This timely proposal holds promise to streamline the progress of Deep Learning into new frontiers.
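As an illustration of the kind of geometric capacity analysis referred to above (the setup and function names below are illustrative, not taken from the proposal), a one-hidden-layer ReLU network on a one-dimensional input splits the input line into linear regions at the points where a hidden unit switches on or off, and counting these breakpoints gives a simple capacity measure. A minimal sketch in Python, assuming a random network:

import numpy as np

def count_linear_regions_1d(w, b):
    # Count the linear regions of x -> sum_i v_i * relu(w_i * x + b_i) on the real line.
    # Each hidden unit with w_i != 0 can contribute a breakpoint at x = -b_i / w_i;
    # the number of linear regions is at most the number of distinct breakpoints plus one
    # (the bound is attained for generic output weights v_i).
    w = np.asarray(w, dtype=float)
    b = np.asarray(b, dtype=float)
    active = w != 0
    breakpoints = np.unique(-b[active] / w[active])
    return breakpoints.size + 1

# Example: a random network with 8 hidden units (illustrative parameters).
rng = np.random.default_rng(0)
w = rng.normal(size=8)
b = rng.normal(size=8)
print(count_linear_regions_1d(w, b))  # at most 9 regions for 8 hidden units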
Max ERC Funding
1 500 000 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym DMIDAS
Project Astrophysical constraints on the identity of the dark matter
Researcher (PI) Carlos Silvestre FRENK
Host Institution (HI) UNIVERSITY OF DURHAM
Call Details Advanced Grant (AdG), PE9, ERC-2017-ADG
Summary The identity of the dark matter is a fundamental problem in Physics whose solution will have major implications for cosmology, astronomy and particle physics. There is compelling evidence that the dark matter consists of elementary particles created shortly after the Big Bang, but searches for them in the laboratory and from astrophysical sources have proved inconclusive. The currently favoured candidate is cold dark matter or CDM. This forms the basis of the standard model of cosmology, LCDM, whose predictions, dating back to the 1980s, turned out to agree remarkably well with observations covering a staggering range of epochs and scales, from the temperature structure of the cosmic microwave background radiation to the large-scale pattern of galaxy clustering. Yet, this agreement is not exclusive to CDM: models based on other types of particles -- warm, self-interacting or asymmetric, for example -- agree equally well with these data but differ on scales smaller than individual bright galaxies. These are the scales targeted in this application in which we propose a comprehensive investigation of small-scale structure, with the aim of testing dark matter candidates, by focusing on three key astrophysical diagnostics: strong gravitational lensing, dwarf galaxies and stellar halos. We propose a joint theoretical and observational programme exploiting three major developments: SWIFT, a new code developed at Durham that will enable cosmological hydrodynamics simulations an order of magnitude larger than is possible today; SuperBIT, an innovative balloon-borne wide-field imaging telescope that will collect gravitational lensing data for hundreds of galaxy clusters; and DESI, a spectro-photometric survey that will acquire 10 times more spectra of stars in the Milky Way than previous surveys. The particle models that we will consider have predictive power and are disprovable. Our programme has the potential to rule out many dark matter particle candidates, including CDM.
Max ERC Funding
2 493 439 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym DYNACORP
Project Dynamic Structural Corporate Finance: Linking Theory and Empirical Testing
Researcher (PI) Christopher Anthony Hennessy
Host Institution (HI) LONDON BUSINESS SCHOOL
Call Details Starting Grant (StG), SH1, ERC-2011-StG_20101124
Summary There are three components to this project: Theory; Empirical Testing; and Dissemination. All components are linked to the current policy question of how taxes influence debt and systemic risk, and all use novel dynamic structural models. I am unique in explicitly linking such models to empirical testing.
Theory: “Learning, Capital Structure and Systemic Risk.” Standard dynamic structural models of financing assume firms know the stochastic process governing cash flow. I will first consider a partial equilibrium model. Here firms are exposed to rare event risk, with the true probability being unknown. Firms learn and update beliefs regarding risk. Relative to standard models, firms are debt conservative and there is leverage persistence. In many cases, firms increase leverage only if they have avoided a negative shock long enough. In order to analyze asset pricing implications, I plan to embed such firms in a general equilibrium setting with a common catastrophic risk having unknown probability. Firms rationally respond to “Great Moderations” by increasing leverage. Recessions are more severe after long tranquil periods due to high debt overhang. A third paper, Re-Examining the Link Between Leverage and Systematic Risk, considers cross sectional asset pricing implications of credit shocks. The standard levered beta formula is erroneous, and the pre-tax cost of capital increases with debt. Together, the models show privately optimal debt is lower than recognized, and that tax breaks for debt reduce welfare.
Empirical Testing: “Natural Experiment Policy Evaluation—A Structural Critique.” A common approach to testing whether taxes influence corporate financing and investment decisions is to compare leverage and investment before/after tax changes. I use a structural model as a laboratory to show that lack of a statistically significant change is not sufficient to reject the null that “taxes matter.” I will first consider an economy where the tax rate is a Markov process. Flotation costs on debt and real irreversibility limit the response of financing and investment to changes in shadow prices. More importantly, responses to tax changes are attenuated whenever they are partially anticipated and not permanent. Standard tests violate rational expectations by implicitly assuming tax changes come as surprises, with each new change being viewed as permanent, until the next surprise. My argument implies that standard tax experiments cannot falsify the null that taxes affect behaviour. Further, one cannot generalize elasticities if the policy transition matrix differs. I will propose an alternative Bayesian approach to hypothesis testing. My argument casts doubt on standard interpretations of historical evidence of tax change effects, suggesting true elasticities may be much higher. I will consider extending this argument to settings with endogenous policy choices.
Dissemination: The objective of this phase is to lower entry barriers by making the methodology accessible via a non-technical primer, and by making the models readily available using a user-friendly online platform.
Max ERC Funding
1 103 996 €
Duration
Start date: 2011-10-01, End date: 2015-09-30
Project acronym DYNAMIC MINVIP
Project Dynamic Minimal prior knowledge for model based Computer Vision and Scene Analysis
Researcher (PI) Bodo Rosenhahn
Host Institution (HI) GOTTFRIED WILHELM LEIBNIZ UNIVERSITAET HANNOVER
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary Efficient solutions for open problems in computer vision are often achieved with the help of suitable prior knowledge, e.g. stemming from labeled databases, physical simulation or geometric invariances. Yet the minimal amount of prior knowledge needed to satisfactorily solve computer vision tasks has been largely neglected. Even more importantly, there is a need to steer the amount of prior knowledge in a dynamic fashion. Especially for scene analysis, database knowledge can become so large and complex that it cannot be integrated efficiently for optimization. On the other hand, there exist geometric priors which are efficient and compact, but they have to be integrated and exploited explicitly in vision systems. As a consequence, there is a need to develop methods for deriving geometric prior knowledge from (statistical) database knowledge, and thereby to obtain compressed priors that contain the relevant information from a given database. Besides efficient regularization during scene analysis, specific tasks require the amount of prior knowledge to be treated dynamically, e.g. to maintain the individuality of patterns or to avoid a bias from a given database. Our beyond-state-of-the-art research will focus on answering the following questions:
1) How to limit statistical prior knowledge to geometric priors for solving markerless Motion Capture dynamically with sufficient accuracy?
2) How to stabilize tracking without introducing a database bias, or to enforce individuality?
3) How to extract (geometric) motion characteristics for individual motion transfer and interpretation?
Advancing minimal dynamic prior knowledge means seeking the essence and granularity of priors. This will have a profound impact well beyond computer vision (e.g. for cognitive sciences or robotics). We strongly believe that we have the necessary competence to pursue this project. Preliminary results have been well received by the community.
Max ERC Funding
1 430 000 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym DYNAMIC MODELS
Project Solving dynamic models: Theory and Applications
Researcher (PI) Felix Egbert Kübler
Host Institution (HI) UNIVERSITAT ZURICH
Call Details Starting Grant (StG), SH1, ERC-2011-StG_20101124
Summary The computation of equilibria in dynamic stochastic general equilibrium models with heterogeneous agents has become increasingly important in macroeconomics and public finance. For a given example economy, i.e. a given specification of preferences, technologies and market arrangements, these methods compute an (approximate) equilibrium and allow for quantitative statements about one equilibrium of the example economy. Through these so-called 'computational experiments' many economic insights can be obtained by analyzing quantitative features of realistically calibrated models. Unfortunately, economists often use ad hoc computational methods with poorly understood properties that produce approximate solutions of unknown quality.
The research project outlined in this proposal has three goals: building theoretical foundations for analyzing dynamic equilibrium models, developing efficient and stable algorithms for the computation of equilibria in large-scale models, and applying these algorithms to macroeconomic policy analysis.
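To make the notion of a computational experiment concrete, the following is a minimal sketch (a textbook illustration under simplifying assumptions, not one of the project's own algorithms) of value function iteration for a deterministic growth model with log utility and full depreciation, a special case with the known closed-form policy k' = αβ k^α that can be used to check the quality of the approximation:

import numpy as np

# Illustrative parameters (not from the proposal).
alpha, beta = 0.3, 0.95                    # capital share, discount factor
kgrid = np.linspace(0.05, 0.5, 200)        # capital grid
y = kgrid[:, None] ** alpha                # output for each current capital level
c = y - kgrid[None, :]                     # consumption if next-period capital is kgrid[j]
util = np.where(c > 0, np.log(c), -1e10)   # log utility, infeasible choices penalised

V = np.zeros(len(kgrid))
for _ in range(1000):                      # value function iteration
    V_new = np.max(util + beta * V[None, :], axis=1)
    if np.max(np.abs(V_new - V)) < 1e-8:
        break
    V = V_new

policy = kgrid[np.argmax(util + beta * V[None, :], axis=1)]
closed_form = alpha * beta * kgrid ** alpha   # known solution for this special case
print(np.max(np.abs(policy - closed_form)))   # small, up to grid error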
Max ERC Funding
1 114 800 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym DYSMOIA
Project Dynamic Structural Economic Models: Identification and Estimation
Researcher (PI) Thierry Jean Magnac
Host Institution (HI) FONDATION JEAN-JACQUES LAFFONT,TOULOUSE SCIENCES ECONOMIQUES
Call Details Advanced Grant (AdG), SH1, ERC-2011-ADG_20110406
Summary The objective of this project is to enhance knowledge in the construction, identification and estimation of dynamic structural microeconomic models that are used for policy evaluation. This research proposal is built around specific economic applications that involve inter-temporal trade-offs for one or several decision makers. It first seeks to develop original identification results in these applications, paying special attention to partial identification issues and to constructive identification results from which estimation techniques can easily be derived. In each specific application, empirical estimates using micro-data will then be used to construct and analyse counterfactuals. The whole sequence of original identification, estimation and prediction results aims at enhancing the quality and credibility of economic policy evaluations.
These research questions will be addressed in frameworks in which dynamic choices are continuous, such as human capital investments, or discrete, such as college choice decisions. This extends to dynamic games, as in the analysis of firms' entry into a market.
This research proposal develops micro-econometric analyses devoted to earnings dynamics, consumption smoothing and incomplete markets, firms' entry, school matching mechanisms, as well as the dynamics of undergraduate studies and the dynamics of retirement. It involves studies in labor economics, consumer behavior as well as financial econometrics, empirical industrial organization and the economics of education. One last theme of this project is devoted to research in theoretical econometrics analyzing questions derived from the empirical projects. Each empirical project will cross-fertilize the others and will feed into theoretical econometric analyses related to point or partial identification in various dimensions. In turn, theoretical analyses will inform identification and estimation in each of those specific applications.
Max ERC Funding
1 722 000 €
Duration
Start date: 2012-06-01, End date: 2018-05-31
Project acronym E-MARS
Project Evolution of Mars
Researcher (PI) Cathy Monique Quantin
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary The primary questions that drive the Mars exploration program focus on life. Has the Martian climate ever been favorable for the development of life? Such a scenario would imply a planetary system distinct from today's, with a magnetic field able to retain the atmosphere. Where is the evidence of such past climatic and internal conditions? The clues for answering these questions are locked up in the geologic record of the planet. The volume of data acquired in the past 15 years by the four Martian orbiters (ESA and NASA) reaches the petabyte scale, which is overwhelming relative to the size of the Martian community. e-Mars proposes to build a science team composed of the PI, two post-doctoral researchers, one PhD student and one engineer to exploit the data characterizing the surface of Mars. e-Mars proposes an unprecedented approach that combines topographic data, imagery in diverse spectral domains and hyperspectral data from multiple orbital instruments to study the evolution of Mars and to propose pertinent landing sites for upcoming missions. e-Mars will focus on three scientific themes: the composition of the Martian crust, to constrain the early evolution of the planet; the search for possibly habitable places, based on evidence of past liquid water activity from both the morphological record and the locations of hydrated minerals; and the study of current climatic and geological processes driven by the CO2 cycle. These scientific themes will be supported by three axes of methodological development: geodatabase management via Geographic Information Systems (GIS), automatic hyperspectral data analysis, and age estimates of planetary surfaces based on small-crater counts.
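As an illustration of the last methodological axis (a generic sketch, not the project's actual pipeline or any calibrated chronology), crater-count age estimation starts from the cumulative size-frequency distribution of craters on a mapped surface, N(>D) per unit area, whose density is then compared against a crater chronology model. The sketch below only computes the cumulative distribution and a power-law slope from a hypothetical list of crater diameters:

import numpy as np

def cumulative_sfd(diameters_km, area_km2):
    # Cumulative crater size-frequency distribution N(>=D) per km^2.
    # diameters_km : measured crater diameters on the counted surface (hypothetical input).
    # area_km2     : area of the counted surface.
    d = np.sort(np.asarray(diameters_km, dtype=float))
    n_cumulative = np.arange(len(d), 0, -1) / area_km2   # craters with diameter >= d[i]
    return d, n_cumulative

# Hypothetical counts on a 10^4 km^2 surface unit (synthetic, power-law-like diameters).
rng = np.random.default_rng(1)
diam = rng.pareto(2.0, size=500) + 1.0
d, n = cumulative_sfd(diam, area_km2=1.0e4)

# Slope of the cumulative distribution in log-log space (illustrative fit only;
# a real age estimate would compare the measured density to a chronology function).
slope, _ = np.polyfit(np.log10(d), np.log10(n), 1)
print(slope)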
Max ERC Funding
1 392 000 €
Duration
Start date: 2011-11-01, End date: 2017-10-31
Project acronym ECOGAL
Project Star Formation and the Galactic Ecology
Researcher (PI) Ian Bonnell
Host Institution (HI) THE UNIVERSITY COURT OF THE UNIVERSITY OF ST ANDREWS
Call Details Advanced Grant (AdG), PE9, ERC-2011-ADG_20110209
Summary We will construct the first self-consistent models of star formation that follow the galactic-scale flows where molecular clouds form, yet still resolve the star formation and feedback events down to sub-parsec scales. By following the full galactic ecology, the life cycle of gas from the interstellar medium into stars and their radiative and kinematic output back into the galaxy, we will develop a comprehensive theory of star formation. The link between the large-scale dynamics of the galaxy and the small-scale star formation provides the ground-breaking nature of this proposal.
Star formation produces a wide range of outcomes in nearby molecular clouds, yet on large scales it yields star formation rates that are strongly correlated with galactic-scale gas densities. These observed properties of star-forming galaxies have inspired a plethora of theoretical ideas, but until now there has been no means of testing these analytical theories.
We will use galactic-disc simulations to determine how molecular clouds form through self-gravity, spiral shocks and/or cloud-cloud collisions. We will use these self-consistent models of molecular clouds to follow the local gravitational collapse to form individual stars and stellar clusters. We will include ionisation, stellar winds and supernovae in the ISM to study how feedback can support or destroy molecular clouds, as well as trigger successive generations of young stars. We will also conduct Galactic-bulge-scale simulations to model how gas flows into, and star formation occurs in, the Galactic centre.
The primary goals of this proposal are to understand what determines the local and global rates, efficiencies and products of star formation in galaxies, and to develop a complete theory of star formation that can be applied to galaxy formation and cosmology.
Max ERC Funding
2 210 523 €
Duration
Start date: 2012-05-01, End date: 2018-04-30
Project acronym ECONOMICHISTORY
Project Contracts, Institutions, and Markets in Historical Perspective
Researcher (PI) Maristella Botticini
Host Institution (HI) UNIVERSITA COMMERCIALE LUIGI BOCCONI
Call Details Advanced Grant (AdG), SH1, ERC-2011-ADG_20110406
Summary A growing number of scholars are studying the interactions between cultural values, social and religious norms, institutions, and economic outcomes. The rise of markets, as well as the development of contracts that enable mutually beneficial transactions among agents, is among the central themes in the literature on long-term economic growth.
This project contributes to both strands of literature by studying the invention and development of marine insurance contracts in medieval Italy and their subsequent spread all over Europe. It brings the economic approach to previously unexplored historical data housed in archives in Florence, Genoa, Pisa, Palermo, Prato, and Venice.
The interest in the historical origin and development of marine insurance contracts is twofold. First, marine contracts are the “parents” of all the other insurance contracts (e.g., fire, life, health, etc) that were developed in subsequent centuries to cope with risk. Second, their invention, as well as other innovations in business practices in the Middle Ages, contributed to the growth of international trade in subsequent centuries.
The key novelty of the project stems from combining contract theory with information from thousands of insurance contracts between 1300 and 1550 to explain why marine insurance developed in medieval Italy and then Europe, to study the empirical determinants of insurance contracts in medieval Italy, and to analyze how medieval merchants coped with adverse selection and moral hazard problems.
Most scholars agree that marine insurance was unknown to the ancient world. Italian merchants developed the first insurance contracts and other innovations in business practices during and in the aftermath of the Commercial Revolution that swept Europe from roughly 1275 to about 1325. Marine insurance contracts may have developed as a spin-off of earlier contracts which shifted the risk from one party to another (e.g., sea loan, insurance loan). Alternatively, in the early or mid-fourteenth century, sedentary merchants that provided the capital to travelling merchants invented a new type of contract, when they discovered that the existing contract forms had shortcomings in transferring and dividing sea risk.
A sample of the questions that this project will address includes:
- Why did insurance contracts and a marine insurance market first develop in medieval times and not earlier, even though merchants had faced the risks associated with maritime trade since antiquity?
- What were the empirical determinants of contract form (e.g., insurance premium) in the medieval insurance market?
- How did medieval merchants compute insurance premia without having the formal notion of probability that was developed only in the mid-seventeenth century?
- How did medieval merchants cope with the typical problems that plague insurance markets, i.e., adverse selection and moral hazard?
Max ERC Funding
1 113 900 €
Duration
Start date: 2012-07-01, End date: 2018-06-30
Project acronym EDECS
Project Exploring Dark Energy through Cosmic Structures: Observational Consequences of Dark Energy Clustering
Researcher (PI) Pier Stefano Corasaniti
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary Understanding the nature of Dark Energy (DE) in the Universe is the central challenge of modern cosmology. Einstein’s Cosmological Constant (Λ) provides the simplest explanation fitting the available cosmological data thus far. However, its unnaturally tuned value indicates that other hypotheses must be explored. Furthermore, current observations do not by any means rule out alternative models in favor of the simplest “concordance” ΛCDM. In the absence of theoretical prejudice, observational tests have mainly focused on the DE equation of state. However, the detection of the inhomogeneous nature of DE would provide smoking-gun evidence that DE is dynamical, ruling out Λ. This key aspect has been mostly overlooked so far, particularly in the optimization design of the next generation of surveys dedicated to DE searches, which will map the distribution of matter in the Universe with unprecedented accuracy. The success of these observations relies upon the ability to model the non-linear gravitational processes which affect the collapse of Dark Matter (DM) at small and intermediate scales. Therefore, it is of the highest importance to investigate the role of DE inhomogeneities throughout the non-linear evolution of cosmic structure formation. To achieve this, we will use specifically designed high-resolution numerical simulations and analytical methods to study the non-linear regime in different DE models. The hypothesis to be tested is whether the intrinsic clustering of DE can alter the predictions of the standard ΛCDM model. We will investigate the observational consequences on the DM density field and the properties of DM halos. The results will have a profound impact in the quest for DE and reveal new observable imprints on the distribution of cosmic structures, whose detection may disclose the ultimate origin of the DE phenomenon.
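One standard analytical ingredient in comparing DE models (sketched here generically; the parameter values and the constant-w, smooth-DE parameterisation are illustrative assumptions, not the project's models) is the linear growth factor D(a), obtained by integrating the growth equation D'' + (3/a + E'/E) D' = (3/2) Ωm0 D / (a^5 E^2) for a given dark energy equation of state w:

import numpy as np
from scipy.integrate import solve_ivp

def growth_factor(w=-1.0, om=0.3, a_ini=1e-3):
    # Linear growth factor D(a=1), normalised to D = a deep in matter domination.
    # Assumes a flat universe with matter density om and constant dark energy
    # equation of state w (illustrative parameterisation, smooth DE only).
    ol = 1.0 - om
    E = lambda a: np.sqrt(om * a**-3 + ol * a**(-3.0 * (1.0 + w)))
    dlnE_da = lambda a: (-3 * om * a**-4 - 3 * (1 + w) * ol * a**(-3 * (1 + w) - 1)) / (2 * E(a)**2)

    def rhs(a, y):
        D, dD = y
        return [dD, -(3.0 / a + dlnE_da(a)) * dD + 1.5 * om * D / (a**5 * E(a)**2)]

    sol = solve_ivp(rhs, (a_ini, 1.0), [a_ini, 1.0], rtol=1e-8, atol=1e-10)
    return sol.y[0, -1]

# Compare a cosmological constant with a w = -0.8 dark energy model. DE clustering,
# the focus of the project, would modify the source term of this equation.
print(growth_factor(w=-1.0), growth_factor(w=-0.8))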
Max ERC Funding
1 468 800 €
Duration
Start date: 2012-04-01, End date: 2017-08-31
Project acronym EGGS
Project The first Galaxies
Researcher (PI) Johan Peter Uldall Fynbo
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary The goal of this project is to discover the first galaxies that formed after the Big Bang. The astrophysics of galaxy formation is deeply fascinating. From tiny density fluctuations of quantum mechanical nature, believed to have formed during an inflationary period a tiny fraction of a second after the Big Bang, structure slowly formed through gravitational collapse. This process is strongly dependent on the nature of the dominant, but unknown, form of matter - the dark matter. In the project proposed here I will study the epoch of first galaxy formation and the subsequent few billion years of cosmic evolution using gamma-ray bursts and Lyman-α (Lyα) emitting galaxies as probes. I am the principal investigator on two observational projects utilizing these probes. In the first project, over three years starting in October 2009, I will use the new X-shooter spectrograph on the European Southern Observatory Very Large Telescope to build a sample of ~100 gamma-ray bursts with UV/optical/near-IR spectroscopic follow-up. The objective of this project is to measure primarily the metallicities, molecular content, and dust content of the gamma-ray burst host galaxies. I am primarily interested in the redshift range from 9 to 2, corresponding to about 500 million years to 3 billion years after the Big Bang. In the second project we will use the new European Southern Observatory survey telescope VISTA. I am co-PI of the Ultra-VISTA project that, over the next 5 years starting December 2009, will create an ultradeep image (about 2000 hr of total integration time) of a piece of sky known as the COSMOS field. I am responsible for the part of the project that will use a narrow-band filter to search for Lyα emitting galaxies at a redshift of 8.8 (corresponding to about 500 million years after the Big Bang) - believed to correspond to the epoch of formation of some of the very first galaxies.
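For reference on the narrow-band technique (a simple back-of-the-envelope calculation, not a description of the actual Ultra-VISTA filter), the rest-frame Lyα line at 1215.67 Å observed from a source at redshift z is shifted to λ_obs = 1215.67 Å × (1 + z), i.e. about 1.19 μm at z = 8.8, which is why the search requires a near-infrared narrow-band filter:

# Observed wavelength of redshifted Lyman-alpha (back-of-the-envelope check).
LYA_REST_ANGSTROM = 1215.67

def observed_lya(z):
    # Observed Lyman-alpha wavelength in microns for a source at redshift z.
    return LYA_REST_ANGSTROM * (1.0 + z) / 1.0e4   # 1 micron = 10^4 Angstrom

print(observed_lya(8.8))   # ~1.19 micron, in the near-infrared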
Max ERC Funding
1 002 000 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym ELECTRIC CHALLENGES
Project Current Tools and Policy Challenges in Electricity Markets
Researcher (PI) Natalia FABRA PORTELA
Host Institution (HI) UNIVERSIDAD CARLOS III DE MADRID
Call Details Consolidator Grant (CoG), SH1, ERC-2017-COG
Summary The fight against climate change is among Europe’s top policy priorities. In this research agenda, I propose to push out the frontier in the area of Energy and Environmental Economics by carrying out policy-relevant research on a pressing issue: how to design optimal regulatory and market-based solutions to achieve a least cost transition towards a low-carbon economy.
The European experience provides unique natural experiments with which to test some of the most contentious issues that arise in the context of electricity markets, including the potential to change households’ demand patterns through dynamic pricing, the scope of renewables to mitigate market power and depress wholesale market prices, and the design and performance of the auctions for renewable support. While there is a body of policy work on these issues, it generally does not meet the required research standards.
In this research, I will rely on cutting-edge theoretical, empirical, and simulation tools to disentangle these topics, while providing key economic insights that are relevant beyond electricity markets. On the theory front, I propose to develop new models that incorporate the intermittency of renewables to characterize optimal bidding as a key, broadly omitted ingredient in previous analysis. In turn, these models will provide a rigorous structure for the empirical and simulation analysis, which will rely both on traditional econometrics for causal inference as well as on state-of-the-art machine learning methods to construct counterfactual scenarios for policy analysis.
While my focus is on energy and environmental issues, my research will also provide methodological contributions for other areas - particularly those related to policy design and policy evaluation. The conclusions of this research should prove valuable for academics, as well as to policy makers to assess the impact of environmental and energy policies and redefine them where necessary.
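As a minimal sketch of the counterfactual-construction idea mentioned above (entirely synthetic data and a generic learner; the project's actual methods, data and policy variables are not specified here), one can train a model on pre-policy observations and use its post-policy predictions as the no-policy counterfactual against which observed outcomes are compared:

import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic hourly electricity demand driven by temperature and hour of day.
n = 5000
temp = rng.normal(18, 6, n)
hour = rng.integers(0, 24, n)
demand = 50 + 2 * np.abs(temp - 18) + 10 * np.sin(2 * np.pi * hour / 24) + rng.normal(0, 2, n)

# A hypothetical policy (e.g. dynamic pricing) starts at the midpoint and lowers demand by 5%.
post = np.arange(n) >= n // 2
demand_obs = demand * np.where(post, 0.95, 1.0)

X = np.column_stack([temp, hour])

# Train on the pre-policy period only, then predict the post-policy counterfactual.
model = GradientBoostingRegressor().fit(X[~post], demand_obs[~post])
counterfactual = model.predict(X[post])

effect = demand_obs[post].mean() - counterfactual.mean()
print(effect)   # close to -5% of mean demand, up to estimation error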
Max ERC Funding
1 422 375 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ELVER
Project Engineering with Logic and Verification: Mathematically Rigorous Engineering for Safe and Secure Computer Systems
Researcher (PI) Peter SEWELL
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Computer systems have become critical to modern society, but they are pervasively subject to security flaws and malicious attacks, with large-scale exposures of confidential data, denial-of-service and ransom attacks, and the threat of nation-state attackers: they are trusted, but are far from trustworthy. This is especially important for the major pan-industry components of our information infrastructure: processors, programming languages, operating systems, etc.
The basic problem is that conventional engineering techniques suffice only to make systems that *usually* work. The usual test-and-debug development methods, with poorly specified abstractions described in prose, lack the mathematical rigour of other engineering disciplines - yet the huge investment in legacy systems and skills makes it hard to improve.
ELVER will develop *mathematically rigorous* methods for specifying, testing, and reasoning about *real systems*, focussed on the core mechanisms used by hardware and software to enforce security boundaries. It will establish mathematical models for the industry ARM architecture, used pervasively in mobile phones and embedded devices, and the CHERI research architecture, which protects against many attacks. Using these, ELVER will build tools for analysis of system software, develop techniques for mathematical proof of safety and security properties, and explore improved systems programming languages. ELVER will build on successful collaborations with ARM, IBM, and the C/C++ ISO standards committees. It will directly impact mainstream processor architectures, languages, and development methods, smoothly complementing existing methods while simultaneously enabling longer-term research towards the gold standard of provably secure systems.
ELVER will thus demonstrate the feasibility and benefits of a more rigorous approach to system engineering, putting future systems on more solid foundations, and hence making them safer and more secure.
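To illustrate the notion of an executable formal specification that the summary appeals to (this is not ELVER's actual tooling or architecture models, and the three-instruction machine below is entirely hypothetical), a minimal interpreter in Python shows how an architecture model written as code can be run against test programs rather than described only in prose:

```python
# Toy executable specification of a 3-instruction ISA (hypothetical, for illustration).
# A formal model written as an interpreter can be run against test programs,
# which is one ingredient of the rigorous-specification approach.

from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class State:
    regs: List[int] = field(default_factory=lambda: [0] * 4)  # 4 general registers
    pc: int = 0                                               # program counter

def step(state: State, program: List[Tuple]) -> State:
    """Execute one instruction; the tuple encoding is a made-up example."""
    op, *args = program[state.pc]
    if op == "MOVI":            # MOVI rd, imm  ->  rd := imm
        rd, imm = args
        state.regs[rd] = imm
        state.pc += 1
    elif op == "ADD":           # ADD rd, ra, rb  ->  rd := ra + rb (mod 2^64)
        rd, ra, rb = args
        state.regs[rd] = (state.regs[ra] + state.regs[rb]) % 2**64
        state.pc += 1
    elif op == "BNZ":           # BNZ ra, target  ->  branch if ra != 0
        ra, target = args
        state.pc = target if state.regs[ra] != 0 else state.pc + 1
    else:
        raise ValueError(f"undefined instruction {op}")
    return state

def run(program):
    st = State()
    while st.pc < len(program):
        st = step(st, program)
    return st.regs

# Sum 1..5 with a small loop: r0 accumulator, r1 counter, r2 constant -1.
prog = [("MOVI", 1, 5), ("MOVI", 2, -1),
        ("ADD", 0, 0, 1), ("ADD", 1, 1, 2), ("BNZ", 1, 2)]
assert run(prog)[0] == 15
```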
Max ERC Funding
2 473 844 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym EMBED
Project Embedded Markets and the Economy
Researcher (PI) Matthew ELLIOTT
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Starting Grant (StG), SH1, ERC-2017-STG
Summary EMBED takes a microeconomic approach to investigating the macroeconomic implications of market transactions being embedded in social relationships. Sociologists and economists have documented the importance of relationships for mediating trade in a wide range of market settings. EMBED seeks to investigate the implications of this for the economy as a whole.
Ethnographic work suggests that relationships foster common understandings which limit opportunistic behaviour. Subproject 1 will develop the first relational contracting theory of networked markets to capture this, and test its predictions using data from the Bundesbank. When dynamic business relationships are modelled formally, they can be viewed as social capital. We will investigate whether this social capital is destroyed by economic shocks, and if so how long it takes to rebuild.
Subproject 2 will run a field experiment. We will intervene in a networked market to create new relationships in a variety of ways. The varying success of these approaches will help us better understand the role of relationships in markets. Moreover, as a result we will obtain exogenous variation in the market structure that will help identify the effects relationships have on market outcomes.
Subproject 3 will explore frictions in the clearing of networked markets. As the data requirements to empirically test between different theories are extremely demanding, laboratory experiments will be run. Breaking convention, these experiments will be protocol-free, although interactions will be closely monitored. This will create a more level playing field for testing different theories while also creating scope for the market to develop efficiency enhancing norms.
Subproject 4 will examine firm level multi-sourcing and production technology decisions, and how these feed into the creation of supply chains. The fragility of these supply chains will be investigated and equilibrium supply chains compared across countries.
Max ERC Funding
1 449 106 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym EMPCONSFIN
Project Empirical Analyses of Markets for Consumer Financial Products and their Effects
Researcher (PI) Alessandro GAVAZZA
Host Institution (HI) LONDON SCHOOL OF ECONOMICS AND POLITICAL SCIENCE
Call Details Consolidator Grant (CoG), SH1, ERC-2017-COG
Summary This proposal presents three broad projects on information frictions in households' credit markets and on the consequences of these frictions for durable good markets. Specifically, an influential theoretical literature in information economics has shown that borrowing constraints can arise in equilibrium when borrowers and lenders have asymmetric information about borrowers' risks. Hence, the first project aims to provide the first empirical analyses of markets (i.e., demand and supply) with asymmetric information and nonexclusive trades---i.e., markets in which households can purchase multiple insurance contracts, such as in life insurance markets, or can open multiple credit lines, such as in credit card markets. The second project aims to study recent regulations of fees and prices in markets for consumer financial products, such as mortgages, that could have the unintended consequences of increasing households' cost of credit and, thus, of tightening their borrowing constraints. Finally, the third project aims to study the role of borrowing constraints in durable goods markets, with a special focus on car markets during the Great Recession.
All these projects aim to develop and estimate structural models using data from different markets. I further plan to use the estimated structural parameters to perform counterfactual policy analyses in each of the specific markets analyzed in these projects.
Max ERC Funding
1 550 945 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym EPOCHS
Project The Formation of the First Galaxies and Reionization with the James Webb Space Telescope
Researcher (PI) CHRISTOPHER CONSELICE
Host Institution (HI) THE UNIVERSITY OF NOTTINGHAM
Call Details Advanced Grant (AdG), PE9, ERC-2017-ADG
Summary Within the first few hundred million years after the Big Bang the first galaxies and stars were born. Sometime soon after, these first objects produced enough energetic photons to reionize the neutral gas in the universe. This frontier of early galaxy assembly has not yet been observed, but will be uncovered by deep imaging and spectroscopy taken with the James Webb Space Telescope (JWST). Key problems include: how the very first galaxies were assembled, and evolved, in their first few Gyr, and the history of reionization. With this ERC-funded EPOCHS project I will lead a major effort to investigate these questions using JWST GTO time to discover galaxies before, during, and after the epoch of reionization. This proposal has three interconnected and complementary themes: (i) Identifying the first galaxies and characterizing their UV luminosities, stellar masses, and star formation rates at 7<z<12. JWST imaging and spectroscopy will allow us to make significant progress beyond the current state of the art, and to use these measures to test models of the earliest galaxy assembly. (ii) Using these galaxies we will map the process of reionization: the sources of it, and the time-scale of its onset and duration. Using new diagnostics we will address uncertainties that currently plague this calculation, including escape fractions and the number of ionizing photons, using UV emission lines, spectral shapes, and measuring hardness ratios with radiative transfer models. (iii) We will measure the rest-frame optical structures of galaxies at 3<z<7 to reveal the formation modes of galaxies when they assembled their first masses and structures. We will determine how and when compact galaxies, mergers, dissipative formation in star forming disks, and the formation of bulges and disks are occurring. This includes measuring the formation history of internal components in 3<z<7 galaxies, allowing us to examine how quenching is occurring ‘inside-out’ or ‘outside-in’.
Max ERC Funding
1 951 138 €
Duration
Start date: 2020-05-01, End date: 2025-04-30
Project acronym ESCADA
Project Energy-optimized Symmetric Cryptography by Algebraic Duality Analysis
Researcher (PI) Joan DAEMEN
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary The main scientific contribution of this project will be a breakthrough in the understanding of cryptanalytic and side channel attacks on symmetric cryptosystems. We will do this by a unification of attacks that will be a stepping stone to the holy grail of symmetric cryptography: provable security of concrete cryptosystems. The main real-world impact is that we will build cryptosystems that are much more efficient than those used today while having the same strength. Depending on the platform, higher efficiency translates to lower energy/power (in-body sensors, contactless payment cards etc.), but also lower latency (authentication for, e.g., car brakes or airbags) and/or lower heat dissipation (on-the-fly encryption of high bandwidth data streams). In a software implementation it simply means fewer CPU cycles per byte.
We build our cryptosystems as modes, on top of block ciphers or permutations. For these primitives we adopt the classical technique of iterating a simple round function (more rounds means more security but less efficiency). We focus on round functions of algebraic degree 2. Their relative simplicity will allow a unification of all cryptanalytic attacks that exploit propagation of affine varieties and polynomial ideals (their dual) through the rounds, and will let us precisely estimate their success rates. Moreover, we will design modes that strongly restrict the exposure of the primitive(s) to attackers and that permit security reductions to specific properties of the underlying primitive(s) in a formally verifiable way. In comparison to the classical pseudorandom and ideal permutation models, this will allow reducing the number of rounds while preserving security with high assurance. We will also study side channel attacks on our round functions and ways to defend against them. We will make ASIC prototypes and implement novel efficient countermeasures against side channel attacks and use this to evaluate their effectiveness in practice.
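As an illustration of what "algebraic degree 2" means for a round function (a toy sketch, not one of the project's designs), the chi-style map below computes each output bit as a_i XOR (NOT a_{i+1} AND a_{i+2}), a quadratic polynomial over GF(2); iterating it together with a linear rotation and a round constant mimics the iterated-round structure the summary describes:

```python
# Chi-like nonlinear layer on a small bit-vector (illustrative toy, not a project design).
# Each output bit a_i ^ (~a_{i+1} & a_{i+2}) is a polynomial of algebraic degree 2
# over GF(2), the degree singled out in the abstract above.

def chi(bits):
    """Apply the chi map to a list of 0/1 bits (indices taken cyclically)."""
    n = len(bits)
    return [bits[i] ^ ((bits[(i + 1) % n] ^ 1) & bits[(i + 2) % n]) for i in range(n)]

def rotate(bits, r):
    """A linear (degree-1) mixing step: cyclic rotation."""
    return bits[r:] + bits[:r]

def round_function(bits, round_constant):
    """One toy round: nonlinear chi, linear rotation, round-constant addition."""
    out = rotate(chi(bits), 2)
    out[0] ^= round_constant
    return out

state = [1, 0, 1, 1, 0]
for rc in [1, 0, 1]:          # iterate a few rounds; more rounds, more mixing
    state = round_function(state, rc)
print(state)
```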
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym ESEARCH
Project Direct Empirical Evidence on Labor Market Search Theories
Researcher (PI) Thomas LE BARBANCHON
Host Institution (HI) UNIVERSITA COMMERCIALE LUIGI BOCCONI
Call Details Starting Grant (StG), SH1, ERC-2017-STG
Summary Our project proposes to provide new empirical evidence on the search strategies of both job seekers and of recruiters in the labor market. This evidence will enhance our understanding of the information asymmetries at the root of search frictions.
We will leverage the extraordinary opportunities offered by online job boards, which record search activities in detail. For the first time, we will match these data with administrative data from unemployment-employment registers. This will enable us to jointly observe search activity and core economic outcomes (wage, job duration) on very large samples.
We will design randomized controlled trials, where we recommend new matches to both job seekers and recruiters. This will test for the extent of geographical and skill mismatch in the labor market. We will further test the assumptions of directed search models by displaying to job seekers the real-time length of the queue in front of vacancies. Finally, we will use new item-to-item collaborative filtering algorithms (amazon-type recommendations) to quantify the social value of the private information that job seekers gather when they screen vacancies.
Using quasi-experimental research designs, we will provide the first precise estimates of the direct and cross effects of search subsidies - unemployment insurance and reduction in vacancy advertising costs - on the search strategies of both sides of the market. We will then test the empirical relevance of behavioral mechanisms, such as reference-dependence or over-optimism.
We expect our direct empirical evidence on search strategies to trigger new developments in search theories. Our results will guide policy-makers who design job boards and search subsidies to both recruiters and job seekers. We hope that the social impact of our research will be to reduce frictional unemployment and to increase the productivity of workers through a reduction of mismatch in the labor market.
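For concreteness, the "amazon-type" item-to-item collaborative filtering mentioned above can be sketched as cosine similarity between vacancy interaction vectors; the toy job-seeker-by-vacancy click matrix below is invented and is not the project's data or algorithm:

```python
# Minimal item-to-item collaborative filtering on a toy job-seeker x vacancy matrix.
# Entry (s, v) = 1 if seeker s clicked vacancy v; all numbers are made up.
import numpy as np

clicks = np.array([
    [1, 1, 0, 0, 1],
    [1, 0, 1, 0, 0],
    [0, 1, 0, 1, 1],
    [1, 1, 1, 0, 0],
], dtype=float)

# Cosine similarity between vacancy columns.
norms = np.linalg.norm(clicks, axis=0, keepdims=True)
sim = (clicks.T @ clicks) / (norms.T @ norms + 1e-12)
np.fill_diagonal(sim, 0.0)          # a vacancy should not recommend itself

def recommend(seeker_row, k=2):
    """Score unseen vacancies by similarity to the vacancies the seeker clicked."""
    scores = sim @ seeker_row
    scores[seeker_row > 0] = -np.inf  # drop already-seen vacancies
    return np.argsort(scores)[::-1][:k]

print(recommend(clicks[1]))  # suggestions for seeker 1
```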
Max ERC Funding
1 250 250 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym ESEMO
Project Estimation of General Equilibrium Labor Market Search Models
Researcher (PI) Claudio Michelacci
Host Institution (HI) Istituto Einaudi per l'Economia e la Finanza
Call Details Advanced Grant (AdG), SH1, ERC-2011-ADG_20110406
Summary "My proposal deals with the estimation of Dynamic Stochastic General Equilibrium models with important heterogeneity at the level of firms and households and frictions in the labor market. In the estimation I will exploit mixed frequency data (monthly, quarterly and annual) available in different countries. I will also efficiently take care of possible missing values in the data. This might require developing new estimation techniques. The contribution of the project will be in dealing with important empirically relevant questions. I will address issues that lie at the boundaries between labour economics, business cycle analysis, monetary economics, finance, and growth. In particular I will answer the following questions:
1. How are business cycle costs distributed across different individuals? How costly is involuntary unemployment?
2. Which view best characterizes the process of technology adoption at business cycle frequencies? In particular does Schumpeterian creative destruction play a role in characterizing the adoption of new technologies over the business cycle?
3. What are the welfare costs of the search inefficiencies present in the process of worker reallocation over the business cycle?
4. What are the sources of business cycle fluctuations? And in particular are technology shocks an important driving force?
5. What is the contribution of the job separation rate, and how important is the intensive margin relative to the extensive margin, in characterizing aggregate labor market fluctuations?
6. What are the main differences in the cyclical properties of the labor market across the OECD? And which institutions explain these differences?
7. What are the effects of financial sector shocks? And why has the Beveridge curve shifted during the last worldwide recession?
8. How should policy respond to the large variation in unemployment risk that individual workers face over their life cycle?"
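A common device for the mixed-frequency and missing-value issues raised in the abstract is a state-space filter that simply skips the measurement update when an observation is absent; the scalar Kalman filter below is a minimal sketch of that mechanism on simulated data, not the proposal's model:

```python
# Scalar Kalman filter where missing (NaN) observations skip the update step --
# a standard device for mixed-frequency or incomplete data in state-space
# estimation. Model and numbers are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
phi, q, r = 0.9, 0.5, 1.0        # AR(1) transition, state and observation noise variances

# Simulate a latent state and observe it only every third period (others are NaN).
T = 60
state = np.zeros(T)
for t in range(1, T):
    state[t] = phi * state[t - 1] + rng.normal(0, np.sqrt(q))
obs = state + rng.normal(0, np.sqrt(r), T)
obs[np.arange(T) % 3 != 0] = np.nan   # mimic a low-frequency series

x_hat, p = 0.0, 1.0                   # initial mean and variance
filtered = []
for y in obs:
    # prediction step
    x_hat, p = phi * x_hat, phi**2 * p + q
    # measurement update only when the observation exists
    if not np.isnan(y):
        k = p / (p + r)               # Kalman gain
        x_hat, p = x_hat + k * (y - x_hat), (1 - k) * p
    filtered.append(x_hat)

print("RMSE of filtered state:", np.sqrt(np.mean((np.array(filtered) - state) ** 2)))
```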
Max ERC Funding
1 659 169 €
Duration
Start date: 2012-03-01, End date: 2017-02-28
Project acronym EVALIDEA
Project Designing Institutions to Evaluate Ideas
Researcher (PI) Marco Maria Ottaviani
Host Institution (HI) UNIVERSITA COMMERCIALE LUIGI BOCCONI
Call Details Advanced Grant (AdG), SH1, ERC-2011-ADG_20110406
Summary "Not all new ideas are equally valuable from a social perspective. As the readers of this document know all too well, “picking the winners” is challenging because innovations have highly uncertain outcomes.
The aim of this research project is to develop a general theoretical framework to investigate the design of institutions and mechanisms for evaluating new ideas and innovations. The proposed framework examines how these institutions function, draws parallels between them, and suggests changes to ensure more accurate evaluation of ideas. Accurate evaluation is of paramount importance because in order for an idea to be successful, it is not enough that the idea be good. It is also necessary that the idea is recognized as good by those who evaluate it. As the evaluation process becomes more accurate, good ideas are more likely to be funded and incentives for the creation of good ideas are enhanced.
A key contribution of our framework is a characterization of the role played by evaluating institutions in overcoming the inefficiencies resulting from decentralized interaction. These institutions act as intermediaries between innovators and users, and thus are able to redress the market failure resulting when ideas are evaluated in a more decentralized way.
By viewing evaluating institutions through a common lens, we perform a comparative analysis of the workings of such diverse institutions as research funding bodies, government regulators, and screening panels of venture capitalists. With techniques from the economic theory of mechanism design, we intend to characterize the best institutional design for idea evaluation. We then compare this ideal benchmark with actual institutions and characterize how it depends on the primitive ingredients of the environment. Lastly, through empirical and experimental testing, we propose changes to institution design parameters and suggest modifications to the mechanisms created for the purpose of evaluating ideas."
Max ERC Funding
1 142 200 €
Duration
Start date: 2012-06-01, End date: 2017-05-31
Project acronym EvolvingEconomics
Project Human motivation: evolutionary foundations and their implications for economics
Researcher (PI) Karin Ingela Maria ALGER
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), SH1, ERC-2017-ADG
Summary Economics provides decision-makers with powerful tools to analyse a wide range of issues. The methodological unity of the discipline and its quest for a general understanding of market as well as non-market interactions have given the discipline great influence on policy. A core component of economics is its assumption that individuals act as if they each had some goal function that they seek to maximise, under the constraints they face and the information they have.
Despite significant advances in behavioural economics, there still is no consensus as to whether and why certain preferences are more likely than others. Further progress could be made if the factors that shape human motivation in the first place were understood. The aim of this project is to produce novel insights about such factors, by establishing evolutionary foundations of human motivation. The project's scope is ambitious. First, it will study two large classes of interactions: strategic interactions, and interactions within the realm of the family. Second, to obtain both depth and breadth of insights, it will consist of four different, but inter-related, components (three theoretical and one empirical), the ultimate goal being to significantly enhance our overall understanding of the factors that shape human motivation.
The methodology is ground-breaking in that it is strongly interdisciplinary. Parts of the body of knowledge built by biologists and evolutionary anthropologists in the past decades will be combined with state-of-the-art economics to produce insights that cannot be obtained within any single discipline. Focus will nonetheless be on addressing issues of importance for economists. The proposed research builds on extensive work done by the PI in the past decade. It will benefit from the years that the PI has invested in understanding the biology and the evolutionary anthropology literatures, and in contributing towards building an interdisciplinary research ecosystem in Toulouse, France.
Max ERC Funding
1 550 891 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym ExoAI
Project Deciphering super-Earths using Artificial Intelligence
Researcher (PI) Ingo WALDMANN
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Starting Grant (StG), PE9, ERC-2017-STG
Summary The discovery of extrasolar planets - i.e. planets orbiting other stars - has fundamentally transformed our understanding of planets, solar systems and our place in the Milky Way. Recent discoveries have shown that planets between 1-2 Earth radii are the most abundant in our galaxy, the so-called super-Earths. Yet, they are entirely absent from our own solar system. Their nature, chemistry, formation histories or climate remain very much a mystery. Estimates of their densities suggest a variety of possible planet types and formation/evolution scenarios but current degeneracies cannot be broken with mass/radius measures alone. Spectroscopy of their atmospheres can provide vital insight. Recently, the first atmosphere around a super-Earth, 55 Cnc e, was discovered, showcasing that these worlds are far more complex than simple densities allow us to constrain.
To achieve a more fundamental understanding, we need to move away from the status quo of treating individual planets as case-studies and analysing data ‘by hand’. A globally encompassing, self-consistent and self-calibrating approach is required. Here, I propose to move the field a significant step towards this goal with the ExoAI (Exoplanet Artificial Intelligence) framework. ExoAI will use state-of-the-art neural networks and Bayesian atmospheric retrieval algorithms applied to big data. Given all available data of an instrument, ExoAI will autonomously learn the best calibration strategy, intelligently recognise spectral features and provide a full quantitative atmospheric model for every planet observed. This uniformly derived catalogue of super-Earth atmospheric models will move us on from the individual case-studies and allow us to study the larger picture. We will constrain the underlying processes of planet formation/migration and bulk chemistries of super-Earths. The algorithm and the catalogue of atmospheric and instrument models will be made freely available to the community.
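In the broad sense used above, Bayesian atmospheric retrieval means sampling a posterior over atmospheric parameters given an observed spectrum; the toy below retrieves a single made-up absorption-depth parameter from synthetic data with a Metropolis-Hastings sampler and is in no way the ExoAI pipeline:

```python
# Toy Bayesian 'retrieval': infer one parameter of a made-up transit-depth model
# from synthetic noisy data with a Metropolis-Hastings sampler.
import numpy as np

rng = np.random.default_rng(0)
wavelength = np.linspace(1.0, 2.0, 40)                       # microns (arbitrary)

def model(depth):
    """Hypothetical spectrum: flat continuum plus one Gaussian absorption feature."""
    return 0.01 + depth * np.exp(-0.5 * ((wavelength - 1.4) / 0.05) ** 2)

true_depth, sigma = 3e-4, 5e-5
data = model(true_depth) + rng.normal(0, sigma, wavelength.size)

def log_posterior(depth):
    if not 0 < depth < 1e-2:                                  # flat prior on (0, 0.01)
        return -np.inf
    return -0.5 * np.sum((data - model(depth)) ** 2) / sigma**2

samples, current = [], 1e-4
lp = log_posterior(current)
for _ in range(20000):
    proposal = current + rng.normal(0, 2e-5)                  # random-walk proposal
    lp_new = log_posterior(proposal)
    if np.log(rng.uniform()) < lp_new - lp:
        current, lp = proposal, lp_new
    samples.append(current)

posterior = np.array(samples[5000:])                          # drop burn-in
print(f"depth = {posterior.mean():.2e} +/- {posterior.std():.2e}")
```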
Max ERC Funding
1 500 000 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym EXOKLEIN
Project The Climates and Habitability of Small Exoplanets Around Red Stars
Researcher (PI) Kevin HENG
Host Institution (HI) UNIVERSITAET BERN
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The detection of life beyond our Solar System is possible only via the remote sensing of the atmospheres of exoplanets. The recent discovery that small exoplanets are common around cool, red stars offers an exciting opportunity to study the atmospheres of Earth-like worlds. Motivated by this revelation, the EXOKLEIN project proposes to construct a holistic climate framework to understand astronomical observations in the context of the atmosphere, geochemistry and biosignatures of the exoplanet. The proposed research is divided into three major themes. Research Theme 1 aims to construct a virtual laboratory of an atmosphere that considers atmospheric dynamics, chemistry and radiation, as well as how they interact. This virtual laboratory enables us to understand the physical and chemical mechanisms involved, as well as predict the observed properties of an exoplanet. Research Theme 2 aims to generalize the carbonate-silicate cycle (also known as the long-term carbon cycle) by considering variations in rock composition, water acidity and atmospheric conditions. The carbonate-silicate cycle is important because it regulates the long-term presence of carbon dioxide (a vital greenhouse gas) in atmospheres. We also aim to investigate the role of the cycle in determining the fates of ocean-dominated exoplanets called “water worlds”. Research Theme 3 aims to investigate the long-term stability of biosignature gases in the context of the climate. Whether a gas uniquely indicates the presence of biology on an exoplanet depends on the atmospheric properties and ultraviolet radiation environment. We investigate three prime candidates for biosignature gases: methyl chloride, dimethylsulfide and ammonia. Overall, the EXOKLEIN project will significantly advance our understanding of whether the environments of rocky exoplanets around red stars are stable and conducive for life, and whether the tell-tale signatures of life may be detected by astronomers.
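The stabilizing logic of the carbonate-silicate cycle can be sketched with a toy steady-state calculation in the spirit of classic weathering-feedback parameterizations: weathering increases with CO2 and temperature, so the CO2 level settles where the weathering sink balances volcanic outgassing. Every constant below is illustrative and none of it is the project's climate framework:

```python
# Toy carbonate-silicate feedback: find the CO2 level at which silicate weathering
# balances volcanic outgassing. Functional forms follow classic textbook
# parameterizations; every constant here is illustrative only.
import numpy as np
from scipy.optimize import brentq

T0, P0 = 288.0, 1.0        # reference temperature [K] and CO2 partial pressure [arbitrary]
beta, dT = 0.5, 13.7       # weathering sensitivities (illustrative values)
greenhouse_gain = 4.0      # K of warming per e-folding of CO2 (made up)

def temperature(pco2):
    """Crude greenhouse relation: warming grows logarithmically with CO2."""
    return T0 + greenhouse_gain * np.log(pco2 / P0)

def weathering(pco2):
    """Relative silicate-weathering rate (dimensionless, 1 at the reference state)."""
    return (pco2 / P0) ** beta * np.exp((temperature(pco2) - T0) / dT)

def steady_state_pco2(outgassing):
    """CO2 level where the weathering sink equals the volcanic source."""
    return brentq(lambda p: weathering(p) - outgassing, 1e-6, 1e6)

for source in [0.5, 1.0, 2.0]:   # halve / keep / double outgassing
    p = steady_state_pco2(source)
    print(f"outgassing x{source}: pCO2 = {p:.2f}, T = {temperature(p):.1f} K")
```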
Max ERC Funding
1 984 729 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym EXPRESSIVE
Project EXPloring REsponsive Shapes for Seamless desIgn of Virtual Environments
Researcher (PI) Marie-Paule Renée Cani
Host Institution (HI) INSTITUT POLYTECHNIQUE DE GRENOBLE
Call Details Advanced Grant (AdG), PE6, ERC-2011-ADG_20110209
Summary Despite our great expressive skills, we humans lack an easy way of communicating the 3D shapes we imagine, and even more so when it comes to dynamic shapes. Over centuries humans used drawing and sculpture to convey shapes. These tools require significant expertise and time investment, especially if one aims to describe complex or dynamic shapes. With the advent of virtual environments one would expect digital modeling to replace these traditional tools. Unfortunately, conventional techniques in the area have failed, since even trained computer artists still create with traditional media and only use the computer to reproduce already designed content.
Could digital media be turned into a tool, even more expressive and simpler to use than a pen, to convey and refine both static and dynamic 3D shapes? This is the goal of this project. Achieving it will make shape design directly possible in virtual form, from early drafting to progressive refinement and finalization of an idea. To this end, models for shape and motion need to be totally rethought from a user-centered perspective. Specifically, we propose the new paradigm of responsive 3D shapes – a novel representation separating morphology from isometric embedding – to define high-level, dynamic 3D content that takes form, is refined, moves and deforms based on user intent, expressed through intuitive interaction gestures.
Scientifically, while the problem we address belongs to Computer Graphics, it calls for a new convergence with Geometry, Simulation and Human Computer Interaction. In terms of impact, the resulting “expressive virtual pen” for 3D content will not only serve the needs of artists, but also of scientists and engineers willing to refine their thoughts by interacting with prototypes of their objects of study, educators and media aiming at quickly conveying their ideas, as well as anyone willing to communicate a 3D shape. This project thus opens up new horizons for science, technology and society.
Max ERC Funding
2 498 116 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym EXPROTEA
Project Exploring Relations in Structured Data with Functional Maps
Researcher (PI) Maksims OVSJANIKOVS
Host Institution (HI) ECOLE POLYTECHNIQUE
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary We propose to lay the theoretical foundations and design efficient computational methods for analyzing, quantifying and exploring relations and variability in structured data sets, such as collections of geometric shapes, point clouds, and large networks or graphs, among others. Unlike existing methods that are tied and often limited to the underlying data representation, our goal is to design a unified framework in which variability can be processed in a way that is largely agnostic to the underlying data type.
In particular, we propose to depart from the standard representations of objects as collections of primitives, such as points or triangles, and instead to treat them as functional spaces that can be easily manipulated and analyzed. Since real-valued functions can be defined on a wide variety of data representations and as they enjoy a rich algebraic structure, such an approach can provide a completely novel unified framework for representing and processing different types of data. Key to our study will be the exploration of relations and variability between objects, which can be expressed as operators acting on functions and thus treated and analyzed as objects in their own right using the vast number of tools from functional analysis in theory and numerical linear algebra in practice.
Such a unified computational framework of variability will enable entirely novel applications including accurate shape matching, efficiently tracking and highlighting the most relevant changes in evolving systems, such as dynamic graphs, and analysis of shape collections. Thus, it will allow us not only to compare or cluster objects, but also to reveal where and how they are different and what makes instances unique, which can be especially useful in medical imaging applications. Ultimately, we expect our study to create a new rigorous, unified paradigm for computational variability, providing a common language and sets of tools applicable across diverse underlying domains.
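At its core, estimating a functional map reduces to a small linear-algebra problem: given descriptor functions expressed in truncated bases on two objects, find the matrix C that best transports one set onto the other. The sketch below uses random stand-ins for the bases and descriptors and is only meant to show the least-squares structure, not the project's methods:

```python
# Minimal functional-map estimation: given corresponding descriptor functions
# expressed in truncated bases on two shapes, find the matrix C with C @ A ~ B
# in the least-squares sense. Bases and descriptors are random stand-ins here.
import numpy as np

rng = np.random.default_rng(1)
k = 20          # basis size on each shape (e.g., Laplacian eigenfunctions)
d = 60          # number of descriptor functions used as constraints

A = rng.normal(size=(k, d))                          # descriptors on source shape, in its basis
C_true = np.linalg.qr(rng.normal(size=(k, k)))[0]    # ground-truth map (orthogonal toy)
B = C_true @ A + 0.01 * rng.normal(size=(k, d))      # descriptors on target, with noise

# Solve min_C ||C A - B||_F by transposing: A^T C^T ~ B^T, a standard least-squares problem.
C_est = np.linalg.lstsq(A.T, B.T, rcond=None)[0].T

print("recovery error:", np.linalg.norm(C_est - C_true) / np.linalg.norm(C_true))
```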
Max ERC Funding
1 499 845 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym FAIR
Project Fairness and the Moral Mind
Researcher (PI) Bertil TUNGODDEN
Host Institution (HI) NORGES HANDELSHOYSKOLE
Call Details Advanced Grant (AdG), SH1, ERC-2017-ADG
Summary The project provides a comprehensive and groundbreaking approach to the analysis of the moral mind and inequality acceptance. The first part of the project will provide a novel study of how the moral ideals of personal responsibility and individual freedom, which are fundamental values in most liberal societies, shape inequality acceptance. It will also provide the first experimental study of how people draw the moral circle, which is at the heart of the most pressing policy challenges facing the world today and strongly related to the question of global fairness. The second part will study how social institutions shape inequality acceptance and how it develops in childhood and adolescence, by providing two unique international studies of inequality acceptance in 60 countries across the world. These studies will provide novel insights on the distributive behavior of nationally representative samples of adults and children and on the cultural transmission of moral preferences in society. The project is rooted in behavioral and experimental economics, but will also draw on insights from other social sciences and philosophy. It will develop novel experimental paradigms to study the moral mind and the nature of inequality acceptance, including incentivized experiments on nationally representative populations, and combine structural and non-parametric empirical analysis with theory development. Taken together, the project represents a unique study of inequality acceptance in the social sciences that will address an important knowledge gap in the literature on inequality.
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym FairSocialComputing
Project Foundations for Fair Social Computing
Researcher (PI) Krishna GUMMADI
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Social computing represents a societal-scale symbiosis of humans and computational systems, where humans interact via and with computers, actively providing inputs to influence, and being influenced by, the outputs of the computations. Recently, several concerns have been raised about the unfairness of social computations pervading our lives, ranging from the potential for discrimination in machine learning-based predictive analytics and implicit biases in online search and recommendations to their general lack of transparency about what sensitive data about users they use or how they use them.
In this proposal, I propose ten fairness principles for social computations. They span all three main categories of organizational justice, including distributive (fairness of the outcomes or ends of computations), procedural (fairness of the process or means of computations), and informational fairness (transparency of the outcomes and process of computations), and they cover a variety of unfairness perceptions about social computations.
I describe the fundamental and novel technical challenges that arise when applying these principles to social computations. These challenges relate to the operationalization (measurement), synthesis and analysis of fairness in computations. Tackling them requires applying methodologies from a number of sub-areas within computer science, including machine learning, data mining, information retrieval, game theory, privacy, and distributed systems.
I discuss our recent breakthroughs in tackling some of these challenges, particularly our idea of fairness constraints, a flexible mechanism that allows us to constrain learning models to synthesize fair computations that are non-discriminatory, the first of our ten principles. I outline our plans to build upon our results to tackle the challenges that arise from the other nine fairness principles. Successful execution of the proposal will provide the foundations for fair social computing in the future.
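The abstract names fairness constraints but does not spell out an implementation. Purely as an illustrative sketch (all function and variable names below are hypothetical, not the project's code), the following Python snippet trains a logistic-regression classifier with a penalty on the covariance between a sensitive attribute and the decision score, one simple way a non-discrimination constraint can be imposed while learning.

import numpy as np

def train_fair_logreg(X, y, s, lam=1.0, lr=0.1, epochs=500):
    # Logistic regression with a demographic-parity style penalty: the squared
    # covariance between the sensitive attribute s and the decision score X @ w
    # is added to the logistic loss and minimised by gradient descent.
    # X: features, y: binary labels in {0, 1}, s: binary sensitive attribute.
    w = np.zeros(X.shape[1])
    s_centered = s - s.mean()
    for _ in range(epochs):
        scores = X @ w
        p = 1.0 / (1.0 + np.exp(-scores))
        grad_loss = X.T @ (p - y) / len(y)        # gradient of logistic loss
        cov = np.mean(s_centered * scores)        # covariance with the score
        grad_cov = X.T @ s_centered / len(y)      # its gradient w.r.t. w
        w -= lr * (grad_loss + lam * 2.0 * cov * grad_cov)
    return w

# Toy usage on synthetic data (hypothetical, for illustration only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
s = (rng.random(200) < 0.5).astype(float)
y = ((X[:, 0] + 0.5 * s + rng.normal(scale=0.5, size=200)) > 0).astype(float)
w = train_fair_logreg(X, y, s)

Larger values of lam trade predictive accuracy for parity between the groups defined by s; the actual mechanism developed in the project may differ substantially from this sketch.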
Max ERC Funding
2 487 500 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym FEEDGALAXIES
Project A new vantage point on how gas flows regulate the build-up of galaxies in the early universe
Researcher (PI) Michele FUMAGALLI
Host Institution (HI) UNIVERSITY OF DURHAM
Call Details Starting Grant (StG), PE9, ERC-2017-STG
Summary Galaxies reside within a web of gas that feeds the formation of new stars. Following star formation, galaxies eject some of their gas reservoir back into this cosmic web. This proposal addresses the fundamental questions of how these inflows and outflows regulate the evolution of galaxies. My research team will tackle two key problems: 1) how gas accretion regulates the build-up of galaxies; 2) how efficient outflows are at removing gas from star-forming regions. To characterise these flows across five billion years of cosmic history, we will pursue cutting-edge research on the halo gas, which is the material around the central galaxies, within dark matter halos. We will focus on scales ranging from a few kiloparsecs, where outflows originate, up to hundreds of kiloparsecs from galaxies, where inflows and outflows have visible impacts on halos. We will attack this problem using both simulations and observations with the largest telescopes on the ground and in space. With novel applications of absorption spectroscopy, we will gain a new vantage point on the astrophysics of these gas flows. Exploiting unprecedented datasets that I am currently assembling thanks to ground-breaking developments in instrumentation, we will directly connect the properties of halo gas to those of the central galaxies, investigating the impact that the baryonic processes probed in absorption have on galaxies seen in emission. In parallel, using new hydrodynamic simulations and radiative transfer calculations, we will go beyond present state-of-the-art methodologies to unveil the theory behind the origin of these gas flows, a crucial aspect to decode the physics probed by our observations. As a result of this powerful synergy between observations and simulations, this programme will provide the most advanced analysis of the impact that inflows and outflows have on galaxy evolution, shaping the direction of future work at 40m telescopes and the next generation of cosmological simulations.
Max ERC Funding
1 499 557 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym FEP
Project Foundations of Economic Preferences
Researcher (PI) Ernst Fehr
Host Institution (HI) UNIVERSITAT ZURICH
Call Details Advanced Grant (AdG), SH1, ERC-2011-ADG_20110406
Summary Preferences are a representation of individuals’ behavioral goals. Assumptions about individual preferences are a decisive component of almost all economic models. In fact, any social science that aims at explaining both individual behaviors and aggregate outcomes in terms of individuals’ goals and constraints has to make assumptions about preferences. Knowledge about preferences is thus key for the ability to predict the behavior of individuals and groups, and for assessing the welfare consequences of different policies. There are, however, still large empirical gaps in our knowledge about preferences. Extremely little is known about the social, economic and biological factors that causally shape them. There is also limited knowledge about how preferences are distributed in society, how they relate to demographic and socio-economic factors, how time, risk and social preferences are interrelated, and the extent to which preferences are stable across time and strategic situations. Therefore, we propose to study these foundational questions by applying economic and neuroeconomic tools that enable us to measure structural preference parameters and the social and biological forces that shape them. In particular, we will study the following four topics. (1) The distribution and stability of time, risk and social preference parameters based on nationally representative behavioral experiments. (2) The relationship between time, risk and social preferences. (3) The causal impact of the social environment on preferences. (4) The neural and genetic determinants of preferences. The proposed research program promises to yield important insights into the causal determinants, the structure and the relationships between time, risk and social preferences. This will inform and constrain theoretical models and policy conclusions based on such models.
Max ERC Funding
2 494 759 €
Duration
Start date: 2012-08-01, End date: 2018-07-31
Project acronym FINIMPMACRO
Project Financial Imperfections and Macroeconomic Implications
Researcher (PI) Tommaso Monacelli
Host Institution (HI) UNIVERSITA COMMERCIALE LUIGI BOCCONI
Call Details Starting Grant (StG), SH1, ERC-2011-StG_20101124
Summary We plan to study the implications of financial market imperfections, focusing on four main questions.
First, how do financial imperfections affect the optimal conduct of monetary and exchange rate policy in open economies? A key insight is that we characterize financial frictions as endogenous and only occasionally binding. This can have important implications for the optimal conduct of stabilization policy.
Second, how do financial and labor market imperfections interact? We extend the standard search-and-matching model to allow firms to issue debt. This feature affects the wage bargaining process endogenously, since firms, by leveraging, can pay lower wages. We study the ability of such a model to replicate the volatility and persistence of unemployment in the data, and the role of financial imperfections in affecting the transmission of productivity and financial shocks.
Third, does the effectiveness of tax policy depend on its redistributive content, and how is this affected by financial imperfections? We characterize the distributional feature of several Tax Acts in the US, and investigate empirically whether tax changes that “favor the poor” are more expansionary than cuts that “favor the rich”. We then build a theoretical framework with heterogeneous agents and financial frictions to rationalize our evidence.
Fourth, how do financial intermediaries affect the transmission channel of monetary policy? We extend the current New Keynesian framework for monetary policy analysis to study the role of financial intermediaries. We emphasize the role of three features: (i) asymmetric information in interbank markets; (ii) maturity mismatch in the banks’ balance sheets; (iii) the “paradox of securitization”, whereby a deeper diversification of idiosyncratic risk leads to a simultaneous increase in the sensitivity of banks’ balance sheets to aggregate risk.
Max ERC Funding
778 800 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym FINLAB
Project Finance and Labor
Researcher (PI) Marco Pagano
Host Institution (HI) UNIVERSITA DEGLI STUDI DI NAPOLI FEDERICO II
Call Details Advanced Grant (AdG), SH1, ERC-2011-ADG_20110406
Summary How does financial market development affect employment, wages and unemployment risk? And how do labor market institutions and workers’ behavior in turn affect corporate policies?
These issues are much under-researched, in spite of their prominence in public debate, often ideologically polarized between those who consider finance as socially harmful and those who view it as an efficient allocation machine. Most economic research indicates that financial development raises output growth, but is silent about its effects on the labor market: does it also raise employment and wages? If so, is it at the expense of greater employment risk and inequality? And how does the potential for systemic financial instability affect the answers to these questions?
The study of these issues naturally opens also another – equally under-researched – line of inquiry: that concerning the effects of labor relations on financial arrangements. Do corporate investment policies and leverage decisions take into account their own effects on firms’ bargaining position in wage negotiations? And if so, how are these corporate decisions affected by job protection regulation, union density or workers’ protection in bankruptcy? To what extent do companies insure workers against employment risk, and do family and non-family firms differ in this respect? Finally, can labor market competition damage the performance of employees with decision-making responsibilities? For instance, can it induce managers or traders to take excessively risky decisions, by providing them with an escape route once they make mistakes, especially when outcomes are observed long after decisions?
This research project aims to break new ground on both sets of issues, using a combination of analytical modelling and empirical analysis, which in some cases will require the collection of entirely new data.
Max ERC Funding
1 873 800 €
Duration
Start date: 2012-07-01, End date: 2018-06-30
Project acronym FirstGalaxies
Project Finding the most distant galaxies with NIRSpec guaranteed time on the James Webb Space Telescope
Researcher (PI) Andrew BUNKER
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE9, ERC-2017-ADG
Summary "Over the past 15 years, my team have pushed the frontier for the most distant known objects to higher redshifts, exploring galaxies when the Universe was young using the Hubble Space Telescope and large ground-based telescopes. As well as discovering galaxies within the first billion years (90percent of the way back in time to the Big Bang), our knowledge of the composition of the Universe has also grown - dark matter and dark energy dictate the expansion history and initial collapse of structures which ultimately form galaxies. We now know that the gas between the galaxies, initially plasma, became mostly neutral about 300,000 years after the Big Bang, but again became plasma about a billion years later. The first few generations of stars to form, with a contribution from high redshift quasars, might be responsible for this reionization, but we have yet to find the galaxies accounting for the bulk of the ionizing photons and key questions remain: what is the contribution from the faintest dwarf galaxies in the luminosity function at high redshift? what fraction of ionizing photons emitted by stars reach the intergalactic gas? is the first generation of stars forming from primordial hydrogen and helium more efficient in producing ionizing photons?
I am in a privileged position to address these questions, as a member of the ESA Instrument Science Team since 2005 for the near-infrared spectrograph (NIRSpec) on the James Webb Space Telescope (JWST), due to launch in May 2020. Much of our 900 hours of guaranteed time will be spectroscopy of high redshift galaxies, and I am leading the deep tier of our survey to get accurate redshifts (vital for luminosity functions), measure the stellar populations (ages and star formation rates), assess the escape fractions of ionizing photons and determine the metal enrichment (potentially finding the long-sought ""Population III"", the first stars to form). With this ERC grant I aim to assemble a team to achieve these science goals.
"
Max ERC Funding
2 049 961 €
Duration
Start date: 2020-04-01, End date: 2025-03-31
Project acronym FLEXBOT
Project Flexible object manipulation based on statistical learning and topological representations
Researcher (PI) Danica Kragic Jensfelt
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary A vision for the future is autonomous and semi-autonomous systems that perform complex tasks safely and robustly in interaction with humans and the environment. The action of such a system needs to be carefully planned and executed, taking into account the available sensory feedback and knowledge about the environment. Many of the existing approaches view motion planning as a geometrical problem, not taking uncertainty into account. Our goal is to study how different types of representations and algorithms from machine learning and classical mathematics can be used to solve some of the open problems in the area of action recognition and action generation.
FLEXBOT will explore how topological representations can be used for an integrated approach towards i) vision-based understanding of complex human hand motion, ii) mapping and control of robotic hands and iii) integrating the topological representations with models for high-level task encoding and planning.
Our research opens up new and important areas, both scientifically and technologically. Scientifically, we push for a new way of thinking in an area that has traditionally grown out of the mechanical modeling of bodies. Technologically, we will provide methods suitable for evaluating new designs of robotic and prosthetic hands. Further development of machine learning and computer vision methods will allow for scene understanding that goes beyond the assumption of worlds of rigid bodies, including complex objects such as hands.
Max ERC Funding
1 398 720 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym FlowMachines
Project Flow Machines: Interacting with Style
Researcher (PI) Francois Pachet
Host Institution (HI) UNIVERSITE PIERRE ET MARIE CURIE - PARIS 6
Call Details Advanced Grant (AdG), PE6, ERC-2011-ADG_20110209
Summary Content creation is a fundamental activity for developing identities in modern individuals. Yet creativity is hardly addressed by computer science. This project addresses the issue of content creation from the perspective of Flow machines. Flow machines are interactive systems that learn how to generate content, text or music, in the user’s style. Thanks to controlled generation mechanisms, the user can then steer the machine to generate content that fits with their intentions. Flow interactions induce a multiplicative effect that boosts creativity and prompts the user to reflect on their own style. This vision stems from the success stories of several computer-assisted musical systems that showed how interactive dialogs with self-learning interactions provoke flow states.
To enable full control of stylistic generation, the scientific challenge is the reification of style as a flexible texture. This challenge will be addressed by pursuing three original directions in the fields of statistical learning and combinatorial optimization: 1) the formulation of Markov-based generation as a constraint problem, 2) the development of feature generation techniques for feeding machine learning algorithms and 3) the development of techniques to transform descriptors into controllers.
Two large-scale studies will be conducted with well-known creators using these Flow machines, during which the whole creation process will be recorded, stored, and analyzed, providing the first complete chronicles of professional-level artifacts. The artifacts, a music album and a novel, will be published in their respective ecosystems, and the reaction of the audience will be measured and analyzed to further assess the impact of Flow machines on creation. The technologies developed and the pilot studies will serve as pioneering experiments to turn Flow machines into a field of study and explore other domains of creation.
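The abstract only names the idea of formulating Markov-based generation as a constraint problem. As an informal illustration (a toy example with made-up transition probabilities, not the project's system), the Python sketch below samples a Markov sequence constrained to end on a chosen symbol by filtering out infeasible transitions with a backward pass and then sampling forward.

import numpy as np

# Toy Markov chain over three symbols; the transition matrix is invented.
symbols = ["C", "E", "G"]
T = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.2, 0.4],
              [0.5, 0.3, 0.2]])

def sample_constrained(T, length, end_state, rng):
    # Backward filtering: alpha[t, i] is the probability mass of continuations
    # from symbol i at position t that satisfy the constraint on the last symbol.
    n = T.shape[0]
    alpha = np.zeros((length, n))
    alpha[-1, end_state] = 1.0
    for t in range(length - 2, -1, -1):
        alpha[t] = (T * alpha[t + 1]).sum(axis=1)
    # Forward sampling with transitions reweighted by feasibility, so every
    # sampled sequence is guaranteed to satisfy the end constraint.
    state = rng.choice(n, p=alpha[0] / alpha[0].sum())
    seq = [state]
    for t in range(1, length):
        probs = T[state] * alpha[t]
        probs /= probs.sum()
        state = rng.choice(n, p=probs)
        seq.append(state)
    return [symbols[i] for i in seq]

rng = np.random.default_rng(1)
print(sample_constrained(T, length=6, end_state=2, rng=rng))

The same backward-filtering idea extends to richer unary and binary constraints; how the project actually combines such constraints with learned stylistic models is not specified in the abstract.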
Max ERC Funding
2 240 120 €
Duration
Start date: 2012-08-01, End date: 2017-07-31
Project acronym FoTran
Project Found in Translation – Natural Language Understanding with Cross-Lingual Grounding
Researcher (PI) Jörg TIEDEMANN
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary "Natural language understanding is the ""holy grail"" of computational linguistics and a long-term goal in research on artificial intelligence. Understanding human communication is difficult due to the various ambiguities in natural languages and the wide range of contextual dependencies required to resolve them. Discovering the semantics behind language input is necessary for proper interpretation in interactive tools, which requires an abstraction from language-specific forms to language-independent meaning representations. With this project, I propose a line of research that will focus on the development of novel data-driven models that can learn such meaning representations from indirect supervision provided by human translations covering a substantial proportion of the linguistic diversity in the world. A guiding principle is cross-lingual grounding, the effect of resolving ambiguities through translation. The beauty of that idea is the use of naturally occurring data instead of artificially created resources and costly manual annotations. The framework is based on deep learning and neural machine translation and my hypothesis is that training on increasing amounts of linguistically diverse data improves the abstractions found by the model. Eventually, this will lead to universal sentence-level meaning representations and we will test our ideas with multilingual machine translation and tasks that require semantic reasoning and inference."
Max ERC Funding
1 817 622 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym FRAPPANT
Project Formal Reasoning About Probabilistic Programs: Breaking New Ground for Automation
Researcher (PI) Joost Pieter KATOEN
Host Institution (HI) RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Probabilistic programs describe recipes on how to infer statistical conclusions about data from a complex mixture of uncertain data and real-world observations. They can represent probabilistic graphical models far beyond the capabilities of Bayesian networks and are expected to have a major impact on machine intelligence.
Probabilistic programs are ubiquitous. They steer autonomous robots and self-driving cars, are key to describing security mechanisms, naturally code up randomised algorithms for solving NP-hard problems, and are rapidly encroaching on AI. Probabilistic programming aims to make probabilistic modeling and machine learning accessible to the programmer.
Probabilistic programs, though typically relatively small in size, are hard to grasp, let alone automatically checkable. Are they doing the right thing? What’s their precision? These questions are notoriously hard — even the most elementary question “does a program halt with probability one?” is “more undecidable” than the halting problem — and can (if at all) be answered with statistical evidence only. Bugs thus easily occur. Hard guarantees are called for. The objective of this project is to enable predictable probabilistic programming. We do so by developing formal verification techniques.
Whereas program correctness is pivotal in computer science, the formal verification of probabilistic programs is in its infancy. The project aims to fill this barren landscape by developing program analysis techniques, leveraging model checking, deductive verification, and static analysis. Challenging problems such as checking program equivalence, loop-invariant and parameter synthesis, program repair, program robustness and exact inference using weakest precondition reasoning will be tackled. The techniques will be evaluated in the context of probabilistic graphical models, randomised algorithms, and autonomous robots.
FRAPPANT will spearhead formally verifiable probabilistic programming.
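For readers unfamiliar with weakest precondition reasoning over probabilistic programs, a standard textbook-style example (not taken from the proposal) shows how a weakest pre-expectation is computed for a program that sets x by a fair coin flip:

\[
\begin{aligned}
\mathrm{wp}(x := E,\ f) &= f[E/x], \\
\mathrm{wp}(c_1\ [p]\ c_2,\ f) &= p \cdot \mathrm{wp}(c_1, f) + (1-p) \cdot \mathrm{wp}(c_2, f), \\
\mathrm{wp}\bigl(\{x := 1\}\ [\tfrac{1}{2}]\ \{x := 0\},\ x\bigr) &= \tfrac{1}{2} \cdot 1 + \tfrac{1}{2} \cdot 0 = \tfrac{1}{2},
\end{aligned}
\]

so the expected final value of x is 1/2. Loops additionally require quantitative loop invariants, which is one of the synthesis problems the project targets.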
Max ERC Funding
2 491 250 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym FUNGRAPH
Project A New Foundation for Computer Graphics with Inherent Uncertainty
Researcher (PI) George DRETTAKIS
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary The use of Computer Graphics (CG) is constantly expanding, e.g., in Virtual and Augmented Reality, requiring realistic interactive renderings of complex virtual environments at a much wider scale than available today. CG has many limitations we must overcome to satisfy these demands. High-quality accurate rendering needs expensive simulation, while fast approximate rendering algorithms have no guarantee on accuracy; both need manually-designed expensive-to-create content. Capture (e.g., reconstruction from photos) can provide content, but it is uncertain (i.e., inaccurate and incomplete). Image-based rendering (IBR) can display such content, but lacks flexibility to modify the scene. These different rendering algorithms have incompatible but complementary tradeoffs in quality, speed and flexibility; they cannot currently be used together, and only IBR can directly use captured content. To address these problems, FunGraph will revisit the foundations of Computer Graphics, so these disparate methods can be used together, introducing the treatment of uncertainty to achieve this goal.
FunGraph introduces estimation of rendering uncertainty, quantifying the expected error of rendering components, and propagation of input uncertainty of captured content to the renderer. The ultimate goal is to define a unified renderer exploiting the advantages of each approach in a single algorithm. Our methodology builds on the use of extensive synthetic (and captured) “ground truth” data, the domain of Uncertainty Quantification adapted to our problems and recent advances in machine learning – Bayesian Deep Learning in particular.
FunGraph will fundamentally transform computer graphics, and rendering in particular, by proposing a principled methodology based on uncertainty to develop a new generation of algorithms that fully exploit the spectacular (but previously incompatible) advances in rendering, and fully benefit from the wealth offered by constantly improving captured content.
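The abstract keeps the uncertainty machinery abstract; as a minimal, hypothetical sketch of one ingredient, propagating input uncertainty from captured content through a renderer, the Python snippet below perturbs the scene parameters under a simple Gaussian model and reports per-pixel mean and standard deviation over Monte Carlo samples. The toy_render function and all names are illustrative, not part of FunGraph.

import numpy as np

def propagate_input_uncertainty(render, mean_scene, scene_std, n_samples=64, rng=None):
    # Monte Carlo propagation of input uncertainty through a renderer.
    # 'render' is any deterministic function mapping a scene parameter vector to
    # an image; 'mean_scene' and 'scene_std' describe a Gaussian model of the
    # captured content's uncertainty. Returns the per-pixel mean image and the
    # per-pixel standard deviation, a simple uncertainty map.
    if rng is None:
        rng = np.random.default_rng(0)
    images = []
    for _ in range(n_samples):
        noisy_scene = rng.normal(mean_scene, scene_std)
        images.append(render(noisy_scene))
    images = np.stack(images)
    return images.mean(axis=0), images.std(axis=0)

# Toy 'renderer': shades an 8x8 image from a 3-parameter scene description.
def toy_render(scene):
    u = np.linspace(0.0, 1.0, 8)
    xx, yy = np.meshgrid(u, u)
    return scene[0] * xx + scene[1] * yy + scene[2]

mean_img, std_img = propagate_input_uncertainty(
    toy_render,
    mean_scene=np.array([0.5, 0.3, 0.1]),
    scene_std=np.array([0.05, 0.05, 0.02]))

Estimating the rendering error of approximate algorithms themselves, the other source of uncertainty named in the abstract, would require comparing against ground-truth renderings and is not covered by this sketch.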
Max ERC Funding
2 497 161 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym GasAroundGalaxies
Project Studying the gas around galaxies with the Multi Unit Spectroscopic Explorer and hydrodynamical simulations
Researcher (PI) Joop Schaye
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary "Gas accretion and galactic winds are two of the most important and poorly understood ingredients of models for the formation and evolution of galaxies. We propose to take advantage of two unique opportunities to embark on a multi-disciplinary program to advance our understanding of the circumgalactic medium (CGM).
We will use MUSE, a massive optical integral field spectrograph that we helped to develop and that will be commissioned on the VLT in 2012, to study the CGM in both absorption and emission. We will use 200 hours of guaranteed time to carry out deep redshift surveys of fields centred on bright z≈3.5 and z≈5 QSOs. This will yield hundreds of faint galaxies (mainly Lyα emitters) within 250 kpc of the lines of sight to the background QSOs, an order of magnitude increase compared to the best existing sample (bright, z≈2.3 galaxies). This will allow us to map the CGM in absorption in 3-D using HI and metal lines and to identify, for the first time, the counterparts to most metal absorbers. MUSE will also enable us to detect Lyα emission from the denser CGM (also using another 300 hours of guaranteed time targeting deep HST fields) and thus to directly explore its connection with galaxies and QSO absorbers.
We will use the new supercomputer of the Virgo consortium to carry out cosmological hydro simulations that contain 1-2 orders of magnitude more resolution elements than the largest existing (spatially adaptive) runs. We will use the results of our previous work to guide our choice of parameters in order to obtain a better match to the observed mass function of galaxies. In parallel, we will carry out a complementary program of zoomed simulations of individual galaxies. These will have the physics and resolution to include a cold gas phase and hence to bypass much of the "subgrid" physics used in the cosmological runs. Both types of simulations will be used to study the physics of gas flows around galaxies and to guide the interpretation of our observations.
Max ERC Funding
1 496 400 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym GEODESI
Project Theoretical and observational consequences of the Geometrical Destabilization of Inflation
Researcher (PI) Sébastien Maurice Marceau RENAUX-PETEL
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE9, ERC-2017-STG
Summary The GEODESI project aims at interpreting current and forthcoming cosmological observations in a renewed theoretical framework about cosmological inflation and its ending. The simplest toy models of inflation economically explain all current data, leaving no observational clue to guide theorists towards a finer physical understanding. In this context, I very recently unveiled a hitherto unnoticed instability at play in the primordial universe that potentially affects all inflationary models and drastically modifies the interpretation of cosmological observations in terms of fundamental physics. The so-called Geometrical Destabilization of inflation reshuffles our understanding of the origin of structures in the universe, offers a new mechanism to end inflation, and promises unrivaled constraints on high-energy physics. It is crucial to develop this fresh look before a host of high-quality data from large-scale structure surveys and cosmic microwave background observations become available within the 5-year timescale of the project.
With the ERC grant I plan to build a group at the Institute of Astrophysics of Paris (IAP-CNRS) with the objective of determining the full theoretical and observational consequences of the geometrical destabilization of inflation. We will combine insights from non-standard cosmological perturbation theory and lattice simulations to constrain realistic models of inflation in high-energy physics, producing accurate theoretical predictions for a wide variety of observables, including the spectra and the non-Gaussianities of primordial fluctuations and stochastic backgrounds of gravitational waves.
Max ERC Funding
1 476 672 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym GINE
Project General Institutional Equilibrium - theory and policy implications
Researcher (PI) Bard Harstad
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), SH1, ERC-2011-StG_20101124
Summary Existing institutional theory, including political economics and contract theory, convincingly shows that institutional details have large impacts on economic and policy outcomes. Once this is recognized, it follows that contracts should depend on the organisational design of the institution to which the contract is offered. Stage 1 of Project GINE aims at characterising optimal contracts as a function of this design. Stage 2 develops a framework for endogenising and characterising the optimal institutional design. At Stage 3, sets of institutions are endogenised at the same time, where the design of one is an optimal response to the designs of the others. This outcome is referred to as a general institutional equilibrium.
Such a theory or methodological framework has several immensely important applications. Development aid contracts should carefully account for the political structure in the recipient country; otherwise the effect of aid may surprise and be counterproductive. The major application motivating this study, however, is environmental policy. Not only must the optimal environmental policy be conditioned on political economy forces; it must also be a function of institutional details, such as the political system. This can explain why the choice of instrument differs across political systems, and why politicians often prefer standards rather than economic instruments. Furthermore, we still do not have a good knowledge of how to design effective and implementable international environmental treaties. The optimal treaty design as well as the best choice of policy instrument must take into account that certain institutions (e.g., interest groups, firm structures, and perhaps even local governance) respond endogenously to these policies.
Max ERC Funding
760 170 €
Duration
Start date: 2012-07-01, End date: 2016-06-30
Project acronym GOVERN
Project Local Governance and Dynamic Conflict in Developing Countries
Researcher (PI) Gerard Padro Miquel
Host Institution (HI) LONDON SCHOOL OF ECONOMICS AND POLITICAL SCIENCE
Call Details Starting Grant (StG), SH1, ERC-2011-StG_20101124
Summary "This proposal is divided into two main strands.
The first strand seeks to understand the effects of local marginal institutional change in autocracies. In particular, we will examine the introduction of local democracy in rural China. Our first contribution is to collect a representative panel of villages in rural China. With this unique data we will examine three main questions: First, we will establish the effect of the introduction of local elections on policies that are determined at the village level: land allocation, tax collection, public good provision and the enforcement of the one child policy. Second, the data will provide a unique opportunity to explore the interaction between formal and informal institutions of accountability by leveraging our information on social infrastructure in these villages. Third, we will determine whether leaders' characteristics change with the introduction of elections. These unique data will also set the stage for examining many other recent policy reforms in rural China.
The second strand of the proposal focuses the formal conflict literature on the study of insurgencies, a currently prevalent form of organized violence in developing countries. To capture the basic characteristics of these conflicts, we need models that allow for (i) meaningful conflict dynamics, (ii) a central role for the non-combatant population, (iii) a fundamental asymmetry between government and insurgents, and (iv) economic transfers and service provision as strategic instruments of the contenders. To reach this goal we will build a series of models whose main contribution to the formal literature on conflict is the introduction of tools from the dynamic principal-agent framework. Several building blocks will be analyzed before integrating them into a coherent theory of insurgency from which optimal policy and empirical implications can be derived.
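As a purely illustrative aside (not the project's model), the dynamic principal-agent logic invoked above can be reduced to a minimal relational-contract sketch: a government buys the cooperation of the non-combatant population with a stationary per-period transfer, and cooperation is incentive-compatible only if the discounted value of promised transfers exceeds the one-off payoff the insurgents can offer. All names and numbers below are assumptions.

# Toy sketch (assumed, not from the proposal): the government pays a
# stationary per-period transfer t to the population; cooperating forever is
# worth t / (1 - delta), while defecting to the insurgents yields a one-off
# payoff b followed by no further transfers.  Cooperation is therefore
# incentive-compatible iff t / (1 - delta) >= b, i.e. t >= b * (1 - delta).

def minimal_sustaining_transfer(b: float, delta: float) -> float:
    """Smallest per-period transfer that keeps the population cooperative."""
    return b * (1.0 - delta)

def government_buys_cooperation(v: float, b: float, delta: float) -> bool:
    """True if the per-period value of control v, net of the minimal transfer,
    is at least the (normalised) payoff of conceding, taken here to be 0."""
    return v - minimal_sustaining_transfer(b, delta) >= 0.0

if __name__ == "__main__":
    b, delta, v = 10.0, 0.5, 6.0                     # assumed numbers
    print(minimal_sustaining_transfer(b, delta))     # 5.0
    print(government_buys_cooperation(v, b, delta))  # True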
Max ERC Funding
805 089 €
Duration
Start date: 2012-07-01, End date: 2017-06-30
Project acronym GRAPH GAMES
Project Quantitative Graph Games: Theory and Applications
Researcher (PI) Krishnendu Chatterjee
Host Institution (HI) INSTITUTE OF SCIENCE AND TECHNOLOGY AUSTRIA
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary The theory of games played on graphs provides the mathematical foundations to study numerous important problems in branches of mathematics, economics, computer science, biology, and other fields. One key application area in computer science is the formal verification of reactive systems. The system is modeled as a graph, in which vertices of the graph represent states of the system, edges represent transitions, and paths represent behavior of the system. The verification of the system in an arbitrary environment is then studied as a problem of game played on the graph, where the players represent the different interacting agents. Traditionally, these games have been studied either with Boolean objectives, or single quantitative objectives. However, for the problem of verification of systems that must behave correctly in resource-constrained environments (such as an embedded system) both Boolean and quantitative objectives are necessary: the Boolean objective for correctness specification and quantitative objective for resource-constraints. Thus we need to generalize the theory of graph games such that the objectives can express combinations of quantitative and Boolean objectives. In this project, we will focus on the following research objectives for the study of graph games with quantitative objectives:
(1) develop the mathematical theory and algorithms for the new class of games on graphs obtained by combining quantitative and Boolean objectives;
(2) develop practical techniques (such as compositional and abstraction techniques) that allow our algorithmic solutions to be implemented efficiently on large game graphs;
(3) explore new application areas to demonstrate the application of quantitative graph games in diverse disciplines; and
(4) develop the theory of games on graphs with infinite state space and with quantitative objectives.
Since the theory of graph games is foundational in several disciplines, new algorithmic solutions are expected to have impact across these disciplines.
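As a concrete, deliberately minimal illustration of combining a Boolean with a quantitative objective, consider a game graph in which player MIN must reach a target set (the Boolean part) while minimising the accumulated edge cost (the quantitative part) against an adversarial player MAX. The sketch below is an assumed toy example, not an algorithm from the proposal; all identifiers and the example graph are hypothetical, and edge costs are taken to be non-negative.

import math

def solve_cost_reachability(vertices, edges, owner, targets):
    """vertices: list of vertex ids
    edges:    dict vertex -> list of (cost, successor) pairs, cost >= 0
    owner:    dict vertex -> 'MIN' or 'MAX'
    targets:  set of vertices satisfying the Boolean (reachability) objective
    Returns:  dict vertex -> game value (math.inf if MAX can avoid the targets)."""
    value = {v: (0.0 if v in targets else math.inf) for v in vertices}
    # Value iteration from 'infinity': values never increase, and with
    # non-negative costs only finitely many distinct path costs lie below any
    # finite value, so the loop terminates (at the game value in this setting).
    while True:
        changed = False
        for v in vertices:
            if v in targets or not edges.get(v):
                continue
            options = [c + value[u] for (c, u) in edges[v]]
            best = min(options) if owner[v] == 'MIN' else max(options)
            if best != value[v]:
                value[v], changed = best, True
        if not changed:
            return value

if __name__ == "__main__":
    # MAX can either end the game cheaply at the target 't' or hand the move
    # to MIN, who must then pay 10 rather than loop forever without reaching 't'.
    vertices = ["m", "a", "t"]
    owner = {"m": "MAX", "a": "MIN", "t": "MIN"}
    edges = {"m": [(0, "t"), (0, "a")], "a": [(10, "t"), (0, "m")]}
    print(solve_cost_reachability(vertices, edges, owner, {"t"}))
    # -> {'m': 10.0, 'a': 10.0, 't': 0.0}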
Max ERC Funding
1 163 111 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym GRB-SN
Project The Gamma Ray Burst – Supernova Connection and Shock Breakout Physics
Researcher (PI) Ehud Nakar
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary Long gamma ray bursts (long GRBs) and core-collapse supernovae (CCSNe) are two of the most spectacular explosions in the Universe. They are a focal point of research for many reasons. Nevertheless, despite considerable effort during the last several decades, there are still many fundamental open questions regarding their physics.
Long GRBs and CCSNe are related. We know that both are outcomes of the collapse of a massive star, and in some cases such a collapse simultaneously produces a GRB and a SN. However, we do not know how a single stellar collapse can produce these two apparently very different explosions. The GRB-SN connection raises many questions, but it also offers new opportunities to learn about both types of explosions.
The focus of the proposed research is on the connection between CCSNe and GRBs, and on the physics of shock breakout. As I explain in this proposal, shock breakouts play an important role in this connection; I will therefore develop a comprehensive theory of relativistic and Newtonian shock breakout. In addition, I will study the propagation of relativistic jets inside stars, including the effects of jet propagation and of the GRB engine on the emerging SN. This will be done through a set of interrelated projects that carefully combine analytic calculations and numerical simulations. Together, these projects will be the first to model a GRB and a SN that are simultaneously produced in a single star. This in turn will be used to gain new insights into long GRBs and CCSNe in general.
This research will also contribute directly to the study of cosmic explosions in general. Any observable cosmic explosion must go through a shock breakout, and considerable effort is currently invested in large field-of-view surveys searching for these breakouts. This program will provide a new theoretical basis for the interpretation of the upcoming observations.
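To make the role of shock breakout concrete: the standard order-of-magnitude criterion is that a radiation-mediated shock emerges once the optical depth of the material still ahead of it drops to roughly τ ≈ c/v_sh, so that photons diffuse out faster than the shock advances. The sketch below evaluates this condition for an assumed power-law density profile near the stellar surface; the profile, the opacity, and all numerical values are illustrative assumptions, not parameters taken from the proposal.

# Back-of-the-envelope sketch of the Newtonian breakout condition tau ~ c/v.
# For an assumed envelope rho(x) = rho0 * (x / R)**n at depth x below the
# surface, tau(x) = kappa * rho0 * x**(n+1) / ((n + 1) * R**n), and breakout
# occurs at the depth where tau(x) = c / v_shock.  All numbers are illustrative.

C_LIGHT = 3.0e10                      # speed of light [cm/s]

def breakout_depth(kappa, rho0, radius, n, v_shock):
    """Depth below the surface [cm] where tau equals c / v_shock."""
    tau_bo = C_LIGHT / v_shock
    return (tau_bo * (n + 1) * radius**n / (kappa * rho0)) ** (1.0 / (n + 1))

if __name__ == "__main__":
    kappa = 0.34                      # electron-scattering opacity [cm^2/g]
    rho0 = 10.0                       # assumed envelope density scale [g/cm^3]
    radius = 5.0e10                   # assumed stellar radius [cm]
    n = 3.0                           # radiative-envelope power-law index
    v_shock = 3.0e9                   # assumed shock velocity [cm/s] (0.1 c)
    x_bo = breakout_depth(kappa, rho0, radius, n, v_shock)
    print(f"breakout depth ~ {x_bo:.1e} cm "
          f"(~{x_bo / radius:.0e} of the stellar radius)")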
Max ERC Funding
1 468 180 €
Duration
Start date: 2012-01-01, End date: 2017-12-31