Project acronym ABEP
Project Asset Bubbles and Economic Policy
Researcher (PI) Jaume Ventura Fontanet
Host Institution (HI) Centre de Recerca en Economia Internacional (CREI)
Call Details Advanced Grant (AdG), SH1, ERC-2009-AdG
Summary Advanced capitalist economies experience large and persistent movements in asset prices that are difficult to justify with economic fundamentals. The internet bubble of the 1990s and the real estate market bubble of the 2000s are two recent examples. The predominant view is that these bubbles are a market failure, and are caused by some form of individual irrationality on the part of market participants. This project is based instead on the view that market participants are individually rational, although this does not preclude sometimes collectively sub-optimal outcomes. Bubbles are thus not a source of market failure by themselves but instead arise as a result of a pre-existing market failure, namely, the existence of pockets of dynamically inefficient investments. Under some conditions, bubbles partly solve this problem, increasing market efficiency and welfare. It is also possible, however, that bubbles do not solve the underlying problem and, in addition, create negative side-effects. The main objective of this project is to develop this view of asset bubbles, and produce an empirically relevant macroeconomic framework that allows us to address the following questions: (i) What is the relationship between bubbles and financial market frictions? Special emphasis is given to how the globalization of financial markets and the development of new financial products affect the size and effects of bubbles. (ii) What is the relationship between bubbles, economic growth and unemployment? The theory suggests the presence of virtuous and vicious cycles, as economic growth creates the conditions for bubbles to emerge, while bubbles create incentives for economic growth to happen. (iii) What is the optimal policy to manage bubbles? We need to develop the tools that allow policy makers to sustain those bubbles that have positive effects and burst those that have negative effects.
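For orientation, the textbook overlapping-generations condition behind this view (Samuelson 1958; Tirole 1985), stated here as background rather than as a result of the project: a pure bubble pays no dividends, so absence of arbitrage forces it to grow at the interest rate r, and with output growing at rate g its share of the economy evolves as

\[ b_{t+1} = (1+r)\,b_t \quad\Longrightarrow\quad \frac{b_t}{Y_t} = \frac{b_0}{Y_0}\left(\frac{1+r}{1+g}\right)^{t}, \]

which stays bounded exactly when r <= g, i.e. precisely in the dynamically inefficient region the summary refers to.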
Max ERC Funding
1 000 000 €
Duration
Start date: 2010-04-01, End date: 2015-03-31
Project acronym ACUITY
Project Algorithms for coping with uncertainty and intractability
Researcher (PI) Nikhil Bansal
Host Institution (HI) TECHNISCHE UNIVERSITEIT EINDHOVEN
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary The two biggest challenges in solving practical optimization problems are computational intractability and the presence of uncertainty: most problems are either NP-hard, or have incomplete input data that makes an exact computation impossible.
Recently, there has been huge progress in our understanding of intractability, based on spectacular algorithmic and lower-bound techniques. For several problems, especially those with only local constraints, we can design optimal approximation algorithms that are provably the best possible.
However, typical optimization problems usually involve complex global constraints and are much less understood. The situation is even worse for coping with uncertainty: most algorithms are based on ad-hoc techniques, and there is no deeper understanding of what makes various problems easy or hard.
This proposal describes several new directions, together with concrete intermediate goals, that will break important new ground in the theory of approximation and online algorithms. The particular directions we consider are to (i) extend the primal-dual method to systematically design online algorithms, (ii) build a structural theory of online problems based on work functions, (iii) develop new tools to use the power of strong convex relaxations, and (iv) design new algorithmic approaches based on non-constructive proof techniques.
The proposed research is at the cutting edge of algorithm design and builds upon the recent success of the PI in resolving several longstanding questions in these areas. Any progress is likely to be a significant contribution to theoretical computer science and combinatorial optimization.
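As a point of reference for what an online algorithm and its competitive ratio are, here is the textbook ski-rental problem, a classical warm-up rather than anything taken from the proposal:

```python
# Textbook ski-rental sketch (a classical warm-up for online algorithms, not
# taken from the proposal): renting costs 1 per day, buying costs B, and the
# total number of ski days is unknown in advance. The break-even strategy
# (rent for B-1 days, then buy) pays at most (2 - 1/B) times the offline
# optimum min(days, B), whatever horizon the adversary chooses.
def break_even_cost(days, B):
    return days if days < B else (B - 1) + B  # rent B-1 days, then buy once

def offline_opt(days, B):
    return min(days, B)                       # clairvoyant cost

B = 10
worst = max(break_even_cost(d, B) / offline_opt(d, B) for d in range(1, 200))
print(f"worst-case ratio over tested horizons: {worst:.2f}")  # 1.90 = 2 - 1/B
```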
Max ERC Funding
1 519 285 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym AdaptiveResponse
Project The evolution of adaptive response mechanisms
Researcher (PI) Franz WEISSING
Host Institution (HI) RIJKSUNIVERSITEIT GRONINGEN
Call Details Advanced Grant (AdG), LS8, ERC-2017-ADG
Summary In an era of rapid climate change there is a pressing need to understand whether and how organisms are able to adapt to novel environments. Such understanding is hampered by a major divide in the life sciences. Disciplines like systems biology or neurobiology make rapid progress in unravelling the mechanisms underlying the responses of organisms to their environment, but this knowledge is insufficiently integrated in eco-evolutionary theory. Current eco-evolutionary models focus on the response patterns themselves, largely neglecting the structures and mechanisms producing these patterns. Here I propose a new, mechanism-oriented framework that views the architecture of adaptation, rather than the resulting responses, as the primary target of natural selection. I am convinced that this change in perspective will yield fundamentally new insights, necessitating the re-evaluation of many seemingly well-established eco-evolutionary principles.
My aim is to develop a comprehensive theory of the eco-evolutionary causes and consequences of the architecture underlying adaptive responses. In three parallel lines of investigation, I will study how architecture is shaped by selection, how evolved response strategies reflect the underlying architecture, and how these responses affect the eco-evolutionary dynamics and the capacity to adapt to novel conditions. All three lines have the potential to make ground-breaking contributions to eco-evolutionary theory, including: the specification of evolutionary tipping points; resolving the puzzle that real organisms evolve much faster than predicted by current theory; a new and general explanation for the evolutionary emergence of individual variation; and a framework for studying the evolution of learning and other general-purpose mechanisms. By making use of concepts from information theory and artificial intelligence, the project will also introduce various methodological innovations.
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-12-01, End date: 2023-11-30
Project acronym ALGSTRONGCRYPTO
Project Algebraic Methods for Stronger Crypto
Researcher (PI) Ronald John Fitzgerald CRAMER
Host Institution (HI) STICHTING NEDERLANDSE WETENSCHAPPELIJK ONDERZOEK INSTITUTEN
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Our field is cryptology. Our overarching objective is to advance significantly the frontiers in the design and analysis of high-security cryptography for the future generation. In particular, we wish to enhance the efficiency, functionality and, last but not least, fundamental understanding of cryptographic security against very powerful adversaries. Our approach is to develop completely novel methods by deepening, strengthening and broadening the algebraic foundations of the field.
Concretely, our lens builds on the arithmetic codex. This is a general, abstract cryptographic primitive whose basic theory we recently developed and whose asymptotic part, which relies on algebraic geometry, enjoys crucial applications in surprising foundational results on constant communication-rate two-party cryptography. A codex is a linear (error-correcting) code that, when its ambient vector space is endowed just with coordinate-wise multiplication, can be viewed as simulating, up to some degree, richer arithmetical structures such as finite fields (or products thereof) or, more generally, finite-dimensional algebras over finite fields. Besides this degree, the notion also captures the coordinate-localities for which the simulation holds and those for which it fails entirely.
Our method is based on novel perspectives on codices which significantly widen their scope and strengthen their utility. In particular, we bring symmetries, computational and complexity-theoretic aspects, and connections with algebraic number theory, algebraic geometry and algebraic combinatorics into play in novel ways. Our applications range from public-key cryptography to secure multi-party computation.
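The classical Reed-Solomon instance of this idea (Shamir secret sharing with multiplicative shares) is a standard illustration of coordinate-wise multiplication simulating field arithmetic; the following minimal sketch over a small prime field is background, not the project's construction:

```python
# Minimal sketch: Shamir shares over GF(97). The coordinate-wise product of
# two share vectors lies on a degree-2T polynomial whose constant term is the
# product of the two secrets -- the prototypical "arithmetic code" behavior.
import random

P = 97                      # small prime field GF(97)
T = 1                       # polynomial degree (privacy threshold)
XS = [1, 2, 3, 4, 5]        # distinct evaluation points (need >= 2T+1 for products)

def share(secret, deg):
    """Shamir shares: evaluations of a random degree-`deg` poly with f(0) = secret."""
    coeffs = [secret] + [random.randrange(P) for _ in range(deg)]
    return [sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P for x in XS]

def reconstruct(shares, points):
    """Lagrange interpolation at x = 0 over GF(P)."""
    total = 0
    for xj, yj in zip(points, shares):
        num, den = 1, 1
        for xm in points:
            if xm != xj:
                num = num * (-xm) % P
                den = den * (xj - xm) % P
        total = (total + yj * num * pow(den, P - 2, P)) % P  # Fermat inverse
    return total

a, b = 12, 30
sa, sb = share(a, T), share(b, T)
prod_shares = [(x * y) % P for x, y in zip(sa, sb)]  # coordinate-wise product
assert reconstruct(prod_shares, XS) == (a * b) % P
print("recovered a*b =", reconstruct(prod_shares, XS))
```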
Our proposal is subdivided into 3 interconnected modules:
(1) Algebraic- and Number Theoretical Cryptanalysis
(2) Construction of Algebraic Crypto Primitives
(3) Advanced Theory of Arithmetic Codices
Max ERC Funding
2 447 439 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ALMP_ECON
Project Effective evaluation of active labour market policies in social insurance programs - improving the interaction between econometric evaluation estimators and economic theory
Researcher (PI) Bas Van Der Klaauw
Host Institution (HI) STICHTING VU
Call Details Starting Grant (StG), SH1, ERC-2007-StG
Summary In most European countries, social insurance programs such as welfare, unemployment insurance and disability insurance are characterized by low reemployment rates. Therefore, governments spend huge amounts of money on active labour market programs, which should help individuals find work. Recent surveys indicate that programs which aim at intensifying job search behaviour are much more effective than schooling programs for improving human capital. A second conclusion from these surveys is that, despite the scale of spending on these programs, evidence on their effectiveness is limited. This research proposal aims at developing an economic framework that will be used to evaluate the effectiveness of popular programs like offering reemployment bonuses, fraud detection, workfare and job search monitoring. The main innovation is that I will combine economic theory with recently developed econometric techniques and detailed administrative data sets, which have not been explored before. While most of the literature only focuses on short-term outcomes, the available data allow me to also consider the long-term effectiveness of programs. The key advantage of an economic model is that I can compare the effectiveness of the different programs, consider modifications of programs and combinations of programs. Furthermore, using an economic model I can construct profiling measures to improve the targeting of programs to subsamples of the population. This is particularly relevant if the effectiveness of programs differs between individuals or depends on the moment in time the program is offered. Therefore, the results from this research will not only be of scientific interest, but will also be of great value to policymakers.
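As a point of reference, the textbook McCall job-search model is a standard building block behind such frameworks (given here as background, not as the proposal's own model): an unemployed worker with flow benefit \(b\), discount rate \(r\) and job offers arriving at rate \(\lambda\) from a wage distribution \(F\) accepts any wage above a reservation wage \(\bar w\) solving

\[ \bar w \;=\; b + \frac{\lambda}{r} \int_{\bar w}^{\infty} (w - \bar w)\, dF(w). \]

Programs such as monitoring, reemployment bonuses or workfare can then be read as shifting \(b\), \(\lambda\) or the payoff from accepting a job, which is what makes their effects comparable within one model.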
Max ERC Funding
550 000 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym ANAMMOX
Project Anaerobic ammonium oxidizing bacteria: unique prokaryotes with exceptional properties
Researcher (PI) Michael Silvester Maria Jetten
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Advanced Grant (AdG), LS8, ERC-2008-AdG
Summary For over a century it was believed that ammonium could only be oxidized by microbes in the presence of oxygen. The possibility of anaerobic ammonium oxidation (anammox) was considered impossible. However, about 10 years ago the microbes responsible for the anammox reaction were discovered in a wastewater plant. This was followed by the identification of the responsible bacteria. Recently, the widespread environmental occurrence of the anammox bacteria was demonstrated, leading to the realization that anammox bacteria may play a major role in biological nitrogen cycling. The anammox bacteria are unique microbes with many unusual properties. These include the biological turnover of hydrazine, a well-known rocket fuel, the biological synthesis of ladderane lipids, and the presence of a prokaryotic organelle in the cytoplasm of anammox bacteria. The aim of this project is to obtain a fundamental understanding of the metabolism and ecological importance of the anammox bacteria. Such understanding contributes directly to our environment and economy because the anammox bacteria provide a new opportunity for nitrogen removal from wastewater that is cheaper and has lower carbon dioxide emissions than existing technology. Scientifically, the results will contribute to the understanding of how hydrazine and dinitrogen gas are made by the anammox bacteria. The research will show which gene products are responsible for the anammox reaction, and how their expression is regulated. Furthermore, the experiments proposed will show if the prokaryotic organelle in anammox bacteria is involved in energy generation. Together, the environmental and metabolic data will help to understand why anammox bacteria are so successful in the biogeochemical nitrogen cycle and thus shape our planet's atmosphere. The different research lines will employ state-of-the-art microbial and molecular methods to unravel the exceptional properties of these highly unusual and important anammox bacteria.
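For reference, the well-established overall stoichiometry of the anammox reaction (background chemistry, not a project result) is

\[ \mathrm{NH_4^+} + \mathrm{NO_2^-} \;\longrightarrow\; \mathrm{N_2} + 2\,\mathrm{H_2O}, \]

with the hydrazine (N2H4) mentioned above arising as an intermediate of this pathway.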
Max ERC Funding
2 500 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym ANIMETRICS
Project Measurement-Based Modeling and Animation of Complex Mechanical Phenomena
Researcher (PI) Miguel Angel Otaduy Tristan
Host Institution (HI) UNIVERSIDAD REY JUAN CARLOS
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary Computer animation has traditionally been associated with applications in virtual-reality-based training, video games or feature films. However, interactive animation is gaining relevance in a more general scope, as a tool for early-stage analysis, design and planning in many applications in science and engineering. The user can get quick and visual feedback of the results, and then proceed by refining the experiments or designs. Potential applications include nanodesign, e-commerce or tactile telecommunication, but they also reach as far as, e.g., the analysis of ecological, climate, biological or physiological processes.
The application of computer animation is extremely limited in comparison to its potential outreach due to a trade-off between accuracy and computational efficiency. This trade-off is induced by inherent complexity sources such as nonlinear or anisotropic behaviors, heterogeneous properties, or high dynamic ranges of effects.
The Animetrics project proposes a modeling and animation methodology that consists of a multi-scale decomposition of complex processes, the description of the process at each scale through a combination of simple local models, and the fitting of those local models' parameters using large amounts of data from example effects. The modeling and animation methodology will be explored on specific problems arising in complex mechanical phenomena, including viscoelasticity of solids and thin shells, multi-body contact, granular and liquid flow, and fracture of solids.
Max ERC Funding
1 277 969 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym APMPAL
Project Asset Prices and Macro Policy when Agents Learn
Researcher (PI) Albert Marcet Torrens
Host Institution (HI) FUNDACIÓ MARKETS, ORGANIZATIONS AND VOTES IN ECONOMICS
Call Details Advanced Grant (AdG), SH1, ERC-2012-ADG_20120411
Summary "A conventional assumption in dynamic models is that agents form their expectations in a very sophisticated manner. In particular, that they have Rational Expectations (RE). We develop some tools to relax this assumption while retaining fully optimal behaviour by agents. We study implications for asset pricing and macro policy.
We assume that agents have a consistent set of beliefs that is close, but not equal, to RE. Agents are ""Internally Rational"", that is, they behave rationally given their system of beliefs. Thus, it is conceptually a small deviation from RE. It provides microfoundations for models of adaptive learning, since the learning algorithm is determined by agents’ optimal behaviour. In previous work we have shown that this framework can match stock price and housing price fluctuations, and that policy implications are quite different.
In this project we intend to: i) develop further the foundations of internally rational (IR) learning, ii) apply this to explain observed asset price price behavior, such as stock prices, bond prices, inflation, commodity derivatives, and exchange rates, iii) extend the IR framework to the case when agents entertain various models, iv) optimal policy under IR learning and under private information when some hidden shocks are not revealed ex-post. Along the way we will address policy issues such as: effects of creating derivative markets, sovereign spread as a signal of sovereign default risk, tests of fiscal sustainability, fiscal policy when agents learn, monetary policy (more specifically, QE measures and interest rate policy), and the role of credibility in macro policy."
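A generic constant-gain learning recursion in the spirit of this literature (an illustration under assumed functional forms, not the project's actual model) shows how self-referential beliefs can generate boom-bust asset price paths:

```python
# Agents price an asset given a subjective belief m about gross price growth,
# via the fixed point p = beta*(m*p + dbar); realized growth then feeds back
# into m with a constant gain. A small initial dose of optimism produces a
# boom-bust price path. Parameter values are illustrative assumptions.
beta, gain, dbar = 0.95, 0.05, 1.0       # discount factor, learning gain, mean dividend
m = 1.02                                  # slightly optimistic initial growth belief
p_prev = beta * dbar / (1 - beta)         # rational-expectations benchmark price (19.0)
path = []
for t in range(300):
    m_used = min(m, 1 / beta - 1e-3)      # "projection facility": keeps beliefs in the
                                          # region where the implied price is finite
    p = beta * dbar / (1 - beta * m_used) # price implied by subjective beliefs
    m += gain * (p / p_prev - m)          # constant-gain update from realized growth
    p_prev = p
    path.append(p)
print(f"peak price {max(path):.1f}, final price {path[-1]:.1f} "
      f"(RE benchmark {beta * dbar / (1 - beta):.1f})")
```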
Max ERC Funding
1 970 260 €
Duration
Start date: 2013-06-01, End date: 2018-08-31
Project acronym APMPAL-HET
Project Asset Prices and Macro Policy when Agents Learn and are Heterogeneous
Researcher (PI) Albert MARCET TORRENS
Host Institution (HI) FUNDACIÓ MARKETS, ORGANIZATIONS AND VOTES IN ECONOMICS
Call Details Advanced Grant (AdG), SH1, ERC-2017-ADG
Summary Based on the APMPAL (ERC) project, we continue to develop the frameworks of internal rationality (IR) and optimal signal extraction (OSE). Under IR, investors/consumers behave rationally given their subjective beliefs about prices, and these beliefs are compatible with the data. Under OSE, the government has partial information: it knows how policy influences observed variables and it relies on signal extraction.
We develop further the foundations of IR and OSE with an emphasis on heterogeneous agents. We study sovereign bond crises and heterogeneity of beliefs in asset pricing models under IR, using survey data on expectations. Under IR the assets' stochastic discount factor depends on the agents' decision function and beliefs; this modifies some key asset pricing results. We extend OSE to models with state variables, forward-looking constraints and heterogeneity.
Under IR, agents' prior beliefs determine the effects of a policy reform. If the government does not observe prior beliefs it has partial information, so OSE should be used to analyse policy reforms under IR.
If heterogeneous IR workers forecast their productivity either from their own wage or their neighbours' in a network, low current wages discourage search and human capital accumulation, leading to low productivity. This can explain the low development of a country or the social exclusion of a group. Worker subsidies redistribute wealth and can increase productivity if they "teach" agents to exit a low-wage state.
We build DSGE models under IR for prediction and policy analysis. We develop time-series tools for predicting macro and asset market variables, using information available to the analyst, and we introduce non-linearities and survey expectations using insights from models under IR.
We study how IR and OSE change the view on macro policy issues such as tax smoothing, debt management, Taylor rule, level of inflation, fiscal/monetary policy coordination, factor taxation or redistribution.
Max ERC Funding
1 524 144 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym AUTAR
Project A Unified Theory of Algorithmic Relaxations
Researcher (PI) Albert Atserias Peri
Host Institution (HI) UNIVERSITAT POLITECNICA DE CATALUNYA
Call Details Consolidator Grant (CoG), PE6, ERC-2014-CoG
Summary For a large family of computational problems collectively known as constrained optimization and satisfaction problems (CSPs), four decades of research in algorithms and computational complexity have led to a theory that tries to classify them as algorithmically tractable vs. intractable, i.e. polynomial-time solvable vs. NP-hard. However, there remains an important gap in our knowledge in that many CSPs of interest resist classification by this theory. Some such problems of practical relevance include fundamental partition problems in graph theory, isomorphism problems in combinatorics, and strategy-design problems in mathematical game theory. To tackle this gap in our knowledge, the research of the last decade has been driven either by finding hard instances for algorithms that solve tighter and tighter relaxations of the original problem, or by formulating new hardness-hypotheses that are stronger but admittedly less robust than NP-hardness.
The ultimate goal of this project is closing the gap between the partial progress that these approaches represent and the original classification project into tractable vs. intractable problems. Our thesis is that the field has reached a point where, in many cases of interest, the analysis of the current candidate algorithms that appear to solve all instances could suffice to classify the problem one way or the other, without the need for alternative hardness-hypotheses. The novelty in our approach is a program to develop our recent discovery that, in some cases of interest, two methods from different areas match in strength: indistinguishability pebble games from mathematical logic, and hierarchies of convex relaxations from mathematical programming. Thus, we aim at making significant advances in the status of important algorithmic problems by looking for a general theory that unifies and goes beyond the current understanding of its components.
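For orientation, one standard way to write the level-\(t\) Sherali-Adams relaxation of a CSP over variables \(V\) and domain \(D\) is its local-distribution form (a textbook formulation given here as background, not taken from the proposal):

\[ \mu_S(\alpha) \ge 0, \qquad \sum_{\alpha \in D^S} \mu_S(\alpha) = 1 \qquad \text{for all } S \subseteq V,\ |S| \le t, \]
\[ \mu_{S'}(\alpha') \;=\; \sum_{\alpha \in D^S \,:\, \alpha|_{S'} = \alpha'} \mu_S(\alpha) \qquad \text{for all } S' \subseteq S, \]

so every small set of variables carries a "local distribution" over assignments whose marginals must agree, and raising the level \(t\) tightens the relaxation; pebble games from logic certify when two instances cannot be distinguished at a given level, which is the kind of match in strength the proposal builds on.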
Max ERC Funding
1 725 656 €
Duration
Start date: 2015-06-01, End date: 2020-05-31
Project acronym BabyVir
Project The role of the virome in shaping the gut ecosystem during the first year of life
Researcher (PI) Alexandra Petrovna ZHERNAKOVA
Host Institution (HI) ACADEMISCH ZIEKENHUIS GRONINGEN
Call Details Starting Grant (StG), LS8, ERC-2016-STG
Summary The role of intestinal bacteria in human health and disease has been intensively studied; however, the viral composition of the microbiome, the virome, remains largely unknown. As many of these viruses are bacteriophages, they are expected to be a major factor shaping the human microbiome. The dynamics of the virome during early life, and its interaction with host and environmental factors, are likely to have profound effects on human physiology. It is therefore extremely timely to study the virome in depth and on a wide scale.
This ERC project aims at understanding how the gut virome develops during the first year of life and how that relates to the composition of the bacterial microbiome. In particular, we will determine which intrinsic and environmental factors, including genetics and the mother’s microbiome and diet, interact with the virome in shaping the early gut microbiome ecosystem. In a longitudinal study of 1,000 newborns followed at 7 time points from birth till age 12 months, I will investigate: (1) the composition and evolution of the virome and bacterial microbiome in the first year of life; (2) the role of factors coming from the mother and from the host genome on virome and bacterial microbiome development and their co-evolution; and (3) the role of environmental factors, like infectious diseases, vaccinations and diet habits, on establishing the virome and overall microbiome composition during the first year of life.
This project will provide crucial knowledge about the composition and maturation of the virome during the first year of life, and its symbiotic relationship with the bacterial microbiome. This longitudinal dataset will be instrumental for the identification of microbiome markers of disease and for follow-up analysis of the long-term effects of microbiota maturation later in life. Knowledge of the role of viruses in shaping the microbiota may suggest future directions for manipulating the human gut microbiota in health and disease.
Max ERC Funding
1 499 881 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym BALANCED LETHALS
Project Untangling the Evolution of a Balanced Lethal System
Researcher (PI) Biense WIELSTRA
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Starting Grant (StG), LS8, ERC-2018-STG
Summary Natural selection is supposed to keep lethal alleles (dysfunctional or deleted copies of crucial genes) in check. Yet, in a balanced lethal system the frequency of lethal alleles is inflated. Because two forms of a chromosome carry distinct lethal alleles that are reciprocally compensated for by functional genes on the alternate chromosome form, both chromosome forms – and in effect their linked lethal alleles – are required for survival. The inability of natural selection to purge balanced lethal systems appears to defy evolutionary theory. How do balanced lethal systems originate and persist in nature? I suspect the answer to this pressing but neglected research question can be found in the context of supergenes in a balanced polymorphism – a current, hot topic in evolutionary biology. Chromosome rearrangements can lock distinct beneficial sets of alleles (i.e. supergenes) on two chromosome forms by suppressing recombination. Now, balancing selection would favour possession of both supergenes. However, as a consequence of suppressed recombination, unique lethal alleles could become fixed on each supergene, with natural selection powerless to prevent collapse of the arrangement into a balanced lethal system. I aim to explain the evolution of balanced lethal systems in nature. As an empirical example, I will use chromosome 1 syndrome, a balanced lethal system observed in newts of the genus Triturus. My research team will: Reconstruct the genomic architecture of this balanced lethal system at its point of origin [PI project]; Conduct comparative genomics with related, unaffected species [PhD project]; Determine gene order of the two supergenes involved [Postdoc project I]; and Model the conditions under which this balanced lethal system could theoretically have evolved [Postdoc project II]. Solving the paradox of chromosome 1 syndrome will allow us to understand balanced lethal systems in general and address the challenges they pose to evolutionary theory.
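A toy Mendelian sketch of the core constraint (an illustration under simplified random-mating assumptions, not the Triturus genetics):

```python
# Balanced lethal toy model: chromosome forms A and B each carry a distinct
# lethal, so only A/B heterozygotes survive. Crossing A/B x A/B parents gives
# 1/4 AA : 1/2 AB : 1/4 BB zygotes -- 50% zygote mortality every generation --
# yet both forms persist at frequency 1/2 indefinitely.
import random

def offspring():
    # each parent is A/B, so each transmits A or B with probability 1/2
    return tuple(sorted(random.choice("AB") for _ in range(2)))

N = 100_000
zygotes = [offspring() for _ in range(N)]
survivors = [z for z in zygotes if z == ("A", "B")]
print(f"zygote survival: {len(survivors) / N:.3f} (expected 0.500)")
```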
Max ERC Funding
1 499 869 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym BayesianMarkets
Project Bayesian markets for unverifiable truths
Researcher (PI) Aurelien Baillon
Host Institution (HI) ERASMUS UNIVERSITEIT ROTTERDAM
Call Details Starting Grant (StG), SH1, ERC-2014-STG
Summary Subjective data play an increasing role in modern economics. For instance, new welfare measurements are based on people’s subjective assessments of their happiness or their life satisfaction. A problem of such measurements is that people have no incentives to tell the truth. To solve this problem and make those measurements incentive compatible, I will introduce a new market institution, called Bayesian markets.
Imagine we ask people whether they are happy with their life. On Bayesian markets, they will trade an asset whose value is the proportion of people answering Yes. Only those answering Yes will have the right to buy the asset and those answering No the right to sell it. Bayesian updating implies that “Yes” agents predict a higher value of the asset than “No” agents do and, consequently, “Yes” agents want to buy it while “No” agents want to sell it. I will show that truth-telling is then the optimal strategy.
Bayesian markets reward truth-telling the same way as prediction markets (betting markets) reward people for reporting their true subjective probabilities about observable events. Yet, unlike prediction markets, they do not require events to be objectively observable. Bayesian markets apply to any type of unverifiable truths, from one’s own happiness to beliefs about events that will never be observed.
The present research program will first establish the theoretical foundations of Bayesian markets. It will then develop the proper methodology to implement them. Finally, it will disseminate the use of Bayesian markets via applications.
The first application will demonstrate how degrees of expertise can be measured and will apply it to risks related to climate change and nuclear power plants. It will contribute to the political debate by shedding new light on what true experts think about these risks. The second application will provide the first incentivized measures of life satisfaction and happiness.
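To see why truth-telling pays, here is a minimal numerical sketch of the updating logic (under an assumed common Beta prior, an illustration rather than the paper's general model):

```python
# The asset pays the population fraction p of "Yes" answers. An agent treats
# her own honest answer as one Bernoulli(p) draw and updates accordingly, so
# "Yes" agents expect a strictly higher asset value than "No" agents.
from fractions import Fraction

a, b = Fraction(2), Fraction(2)          # common prior: p ~ Beta(2, 2)
prior_mean = a / (a + b)                 # E[p] = 1/2

post_yes = (a + 1) / (a + b + 1)         # E[p | own answer = Yes] = 3/5
post_no = a / (a + b + 1)                # E[p | own answer = No]  = 2/5

# Any price strictly between the two posteriors makes buying profitable in
# expectation only for "Yes" agents and selling only for "No" agents, so
# trading on one's true answer is the optimal strategy.
price = prior_mean
assert post_no < price < post_yes
print(f"E[p|No]={post_no}, price={price}, E[p|Yes]={post_yes}")
```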
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym BIOCOM
Project Biotic community attributes and ecosystem functioning: implications for predicting and mitigating global change impacts
Researcher (PI) Fernando Tomás Maestre Gil
Host Institution (HI) UNIVERSIDAD REY JUAN CARLOS
Call Details Starting Grant (StG), LS8, ERC-2009-StG
Summary Increases in nutrient availability and temperature, and changes in precipitation patterns and biodiversity, are important components of global environmental change. Thus, it is imperative to understand their impacts on the functioning of natural ecosystems. Substantial research efforts are currently devoted to predicting how biodiversity will respond to global change. However, little is known about the relative importance of biodiversity, compared with other attributes of biotic communities such as species cover and spatial pattern, as a driver of ecosystem processes. Furthermore, the effects of global change on the relationships between these attributes and ecosystem functioning are virtually unknown. This project aims to evaluate the relationships between community attributes (species richness, composition, evenness, cover, and spatial pattern) and key processes related to ecosystem functioning under different global change scenarios. Its specific objectives are to: i) evaluate the relative importance of community attributes as drivers of ecosystem functioning, ii) assess how multiple global change drivers will affect key ecosystem processes, iii) test whether global change drivers modify the observed relationships between community attributes and ecosystem functioning, iv) develop models to forecast global change effects on ecosystem functioning, and v) set up protocols for the establishment of mitigation actions based on the results obtained. These objectives will be achieved by integrating experimental and modeling approaches conducted with multiple biotic communities at different spatial scales. Such an integrated framework has not been tackled before, and constitutes a groundbreaking advance over current research efforts on global change. This proposal will also open the door to new research lines exploring the functional role of community attributes and their importance as modulators of ecosystem responses to global change.
Max ERC Funding
1 463 374 €
Duration
Start date: 2010-01-01, End date: 2015-09-30
Project acronym BIODESERT
Project Biological feedbacks and ecosystem resilience under global change: a new perspective on dryland desertification
Researcher (PI) Fernando Tomás Maestre Gil
Host Institution (HI) UNIVERSIDAD DE ALICANTE
Call Details Consolidator Grant (CoG), LS8, ERC-2014-CoG
Summary Changes in climate and land use (e.g., increased grazing pressure) are two main global change components that also act as major desertification drivers. Understanding how drylands will respond to these drivers is crucial because they occupy 41% of the terrestrial surface and are home to over 38% of the world’s human population. Land degradation already affects ~250 million people in the developing world, who rely upon the provision of many ecosystem processes (multifunctionality). This proposal aims to develop a better understanding of the functioning and resilience of drylands (i.e. their ability to respond to and recover from disturbances) to major desertification drivers. Its objectives are to: 1) test how changes in climate and grazing pressure determine spatiotemporal patterns in multifunctionality in global drylands, 2) assess how biotic attributes (e.g., biodiversity, cover) modulate ecosystem resilience to climate change and grazing pressure at various spatial scales, 3) test and develop early warning indicators of desertification, and 4) forecast the onset of desertification and its ecological consequences under different climate and grazing scenarios. I will use various biotic communities/attributes, ecosystem services and spatial scales (from local to global), and will combine approaches from several disciplines. Such a comprehensive and highly integrated research endeavor is novel and constitutes a groundbreaking advance over current research efforts on desertification. This project will provide a mechanistic understanding of the processes driving multifunctionality under different global change scenarios, as well as key insights to forecast future scenarios for the provisioning of ecosystem services in drylands, and to test and develop early warning indicators of desertification. This is of major importance for attaining global sustainability and key Millennium Development Goals, such as the eradication of poverty.
Max ERC Funding
1 894 450 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym BRAINSIGNALS
Project Optical dissection of circuits underlying fast cholinergic signalling during cognitive behaviour
Researcher (PI) Huibert Mansvelder
Host Institution (HI) STICHTING VU
Call Details Starting Grant (StG), LS5, ERC-2011-StG_20101109
Summary Our ability to think, to memorize and to focus our thoughts depends on acetylcholine signaling in the brain. The loss of cholinergic signalling in, for instance, Alzheimer’s disease strongly compromises these cognitive abilities. The traditional view on the role of cholinergic input to the neocortex is that slowly changing levels of extracellular acetylcholine (ACh) mediate different arousal states. This view has been challenged by recent studies demonstrating that rapid phasic changes in ACh levels at the scale of seconds are correlated with focus of attention, suggesting that these signals may mediate defined cognitive operations. Despite a wealth of anatomical data on the organization of the cholinergic system, very little understanding exists of its functional organization. How the relatively sparse input of cholinergic transmission in the prefrontal cortex elicits such a profound and specific control over attention is unknown. The main objective of this proposal is to develop a causal understanding of how cellular mechanisms of fast acetylcholine signalling are orchestrated during cognitive behaviour.
In a series of studies, I have identified several synaptic and cellular mechanisms by which the cholinergic system can alter neuronal circuitry function, both in cortical and subcortical areas. I have used a combination of behavioral, physiological and genetic methods in which I manipulated cholinergic receptor functionality in the prefrontal cortex in a subunit-specific manner and found that ACh receptors in the prefrontal cortex control attention performance. Recent advances in optogenetic and electrochemical methods now make it possible to rapidly manipulate and measure acetylcholine levels in freely moving, behaving animals. Using these techniques, I aim to uncover which cholinergic neurons are involved in fast cholinergic signaling during cognition and to uncover the underlying neuronal mechanisms that alter prefrontal cortical network function.
Max ERC Funding
1 499 242 €
Duration
Start date: 2011-11-01, End date: 2016-10-31
Project acronym BUBPOL
Project Monetary Policy and Asset Price Bubbles
Researcher (PI) Jordi Galí Garreta
Host Institution (HI) Centre de Recerca en Economia Internacional (CREI)
Call Details Advanced Grant (AdG), SH1, ERC-2013-ADG
Summary "The proposed research project seeks to further our understanding on two important questions for the design of monetary policy:
(a) What are the effects of monetary policy interventions on asset price bubbles?
(b) How should monetary policy be conducted in the presence of asset price bubbles?
The first part of the project will focus on the development of a theoretical framework that can be used to analyze rigorously the implications of alternative monetary policy rules in the presence of asset price bubbles, and to characterize the optimal monetary policy. In particular, I plan to use such a framework to assess the merits of a “leaning against the wind” strategy, which calls for a systematic rise in interest rates in response to the development of a bubble.
The second part of the project will seek to produce evidence, both empirical and experimental, regarding the effects of monetary policy on asset price bubbles. The empirical evidence will seek to identify and estimate the sign and size of the response of asset price bubbles to interest rate changes, exploiting the potential differences in the joint behavior of interest rates and asset prices during “bubbly” episodes, in comparison to “normal” times. In addition, I plan to conduct some lab experiments in order to shed some light on the link between monetary policy and bubbles. Participants will trade two assets, a one-period riskless asset and a long-lived stock, in an environment consistent with the existence of asset price bubbles in equilibrium. Monetary policy interventions will take the form of changes in the short-term interest rate, engineered by the experimenter. The experiments will allow us to evaluate some of the predictions of the theoretical models regarding the impact of monetary policy on the dynamics of bubbles, as well as the effectiveness of “leaning against the wind” policies.
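For intuition on what an environment “consistent with the existence of asset price bubbles in equilibrium” can look like, here is a minimal sketch of a Blanchard-Watson-style rational bubble: conditional on survival the bubble grows fast enough that its expected growth rate equals the interest rate, so holding it is consistent with rational pricing. The parameters are purely illustrative and this is not the project's experimental design.

```python
# Minimal sketch, assuming a Blanchard-Watson rational bubble (illustrative
# parameters, not the project's design). Each period the bubble survives with
# probability p and grows by (1 + r) / p, so E[b_{t+1}] = (1 + r) * b_t.
import numpy as np

rng = np.random.default_rng(1)
r = 0.02       # short-term interest rate (the policy instrument)
p = 0.95       # per-period survival probability of the bubble
T = 100
b = np.empty(T)
b[0] = 1.0
for t in range(1, T):
    if rng.random() < p:
        b[t] = (1 + r) / p * b[t - 1]   # growth conditional on survival
    else:
        b[t] = 0.0                      # the bubble bursts and stays at zero
print("bubble path (first 10 periods):", np.round(b[:10], 3))
```

Note how raising r raises the growth rate the bubble must sustain to survive, which is one way to frame the “leaning against the wind” debate the proposal studies.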
Max ERC Funding
799 200 €
Duration
Start date: 2014-01-01, End date: 2017-12-31
Project acronym CAFES
Project Causal Analysis of Feedback Systems
Researcher (PI) Joris Marten Mooij
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Many questions in science, policy making and everyday life are of a causal nature: how would changing A influence B? Causal inference, a branch of statistics and machine learning, studies how cause-effect relationships can be discovered from data and how these can be used for making predictions in situations where a system has been perturbed by an external intervention. The ability to reliably make such causal predictions is of great value for practical applications in a variety of disciplines. Over the last two decades, remarkable progress has been made in the field. However, even though state-of-the-art causal inference algorithms work well on simulated data when all their assumptions are met, there is still a considerable gap between theory and practice. The goal of CAFES is to bridge that gap by developing theory and algorithms that will enable large-scale applications of causal inference in various challenging domains in science, industry and decision making.
The key challenge that will be addressed is how to deal with cyclic causal relationships ("feedback loops"). Feedback loops are very common in many domains (e.g., biology, economics and climatology), but have mostly been ignored so far in the field. Building on recently established connections between dynamical systems and causal models, CAFES will develop theory and algorithms for causal modeling, reasoning, discovery and prediction for cyclic causal systems. Extensions to stationary and non-stationary processes will be developed to advance the state of the art in causal analysis of time-series data. In order to optimally use available resources, computationally efficient and statistically robust algorithms for causal inference from observational and interventional data in the context of confounders and feedback will be developed. The work will be done with a strong focus on applications in molecular biology, one of the most promising areas for automated causal inference from data.
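To make the notion of a cyclic causal model concrete, the sketch below shows a linear system with a feedback loop, its observational equilibrium, and a prediction under an external intervention. The coefficients and variables are hypothetical and the linear-equilibrium formulation is only one of the connections between dynamical systems and causal models the project builds on.

```python
# Illustrative sketch, assuming a linear cyclic SCM x = B x + e with a
# feedback loop between x1 and x2. The observational equilibrium is
# x = (I - B)^{-1} e; an intervention do(x1 = c) removes x1's structural
# equation, fixes its value, and re-solves the system.
import numpy as np

B = np.array([[0.0, 0.6, 0.0],   # x1 <- x2 (feedback)
              [0.7, 0.0, 0.0],   # x2 <- x1 (feedback)
              [0.5, 0.0, 0.0]])  # x3 <- x1
e = np.array([1.0, 0.5, 0.2])    # exogenous disturbances
I = np.eye(3)

x_obs = np.linalg.solve(I - B, e)   # equilibrium of the cyclic system

# do(x1 = 2.0): cut the edges into x1, clamp its value, re-solve.
B_do, e_do = B.copy(), e.copy()
B_do[0, :] = 0.0
e_do[0] = 2.0
x_int = np.linalg.solve(I - B_do, e_do)

print("observational equilibrium:", np.round(x_obs, 3))
print("after do(x1 = 2.0):      ", np.round(x_int, 3))
```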
Max ERC Funding
1 405 652 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym CCC
Project Cracking the Cerebellar Code
Researcher (PI) Christiaan Innocentius De Zeeuw
Host Institution (HI) ERASMUS UNIVERSITAIR MEDISCH CENTRUM ROTTERDAM
Call Details Advanced Grant (AdG), LS5, ERC-2011-ADG_20110310
Summary Spike trains transfer information to and from neurons. Most studies so far assume that the average firing rate or “rate coding” is the predominant way of information coding. However, spikes occur at millisecond precision, and their actual timing or “temporal coding” can in principle strongly increase the information content of spike trains. The two coding mechanisms are not mutually exclusive. Neurons may switch between rate and temporal coding, or use a combination of both coding mechanisms at the same time, which would increase the information content of spike trains even further. Here, we propose to investigate the hypothesis that temporal coding plays, next to rate coding, important and specific roles in cerebellar processing during learning. The cerebellum is ideal to study this timely topic, because it has a clear anatomy with well-organized modules and matrices, a well-described physiology of different types of neurons with distinguishable spiking activity, and a central role in various forms of tractable motor learning. Moreover, uniquely in the brain, the main types of neurons in the cerebellar system can be genetically manipulated in a cell-specific fashion, which will allow us to investigate the behavioural importance of both coding mechanisms following cell-specific interference and/or during cell-specific visual imaging. Thus, for this proposal we will create conditional mouse mutants that will be subjected to learning paradigms in which we can disentangle the contributions of rate coding and temporal coding using electrophysiological and optogenetic recordings and stimulation. Together, our experiments should elucidate how neurons in the brain communicate during natural learning behaviour and how one may be able to intervene in this process to affect or improve procedural learning skills.
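A toy example of the rate-versus-temporal-coding distinction described above: the two spike trains below carry identical firing rates, so a pure rate code cannot separate them, while a simple timing statistic can. The trains and the chosen measure (inter-spike-interval variability) are assumptions for illustration, not the project's analysis pipeline.

```python
# Toy sketch: two spike trains with the same mean rate but different timing.
# A rate code sees them as identical; the ISI coefficient of variation does not.
import numpy as np

rng = np.random.default_rng(2)
T = 10.0                                            # recording length, seconds
regular = np.arange(0.05, T, 0.1)                   # precisely timed, 10 Hz
irregular = np.sort(rng.uniform(0, T, regular.size))  # Poisson-like, 10 Hz

for name, spikes in [("regular", regular), ("irregular", irregular)]:
    rate = spikes.size / T                 # rate code: identical for both
    isi = np.diff(spikes)
    cv = isi.std() / isi.mean()            # temporal measure: ~0 vs ~1
    print(f"{name:9s} rate = {rate:.1f} Hz, ISI CV = {cv:.2f}")
```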
Max ERC Funding
2 499 600 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym CHAMELEON
Project Intuitive editing of visual appearance from real-world datasets
Researcher (PI) Diego Gutierrez Pérez
Host Institution (HI) UNIVERSIDAD DE ZARAGOZA
Call Details Consolidator Grant (CoG), PE6, ERC-2015-CoG
Summary Computer-generated imagery is now ubiquitous in our society, spanning fields such as games and movies, architecture, engineering, or virtual prototyping, while also helping create novel ones such as computational materials. With the increase in computational power and the improvement of acquisition techniques, there has been a paradigm shift in the field towards data-driven techniques, which has yielded an unprecedented level of realism in visual appearance. Unfortunately, this leads to a series of problems, identified in this proposal: First, there is a disconnect between the mathematical representation of the data and any meaningful parameters that humans understand; the captured data is machine-friendly, but not human-friendly. Second, the many different acquisition systems lead to heterogeneous formats and very large datasets. And third, real-world appearance functions are usually nonlinear and high-dimensional. As a result, visual appearance datasets are increasingly unfit for editing operations, which limits the creative process for scientists, engineers, artists and practitioners in general. There is an immense gap between the complexity, realism and richness of the captured data, and the flexibility to edit such data.
We believe that the current research path leads to a fragmented space of isolated solutions, each tailored to a particular dataset and problem. We propose a research plan at the theoretical, algorithmic and application levels, putting the user at the core. We will learn key relevant appearance features in terms humans understand, from which intuitive, predictable editing spaces, algorithms, and workflows will be defined. In order to ensure usability and foster creativity, we will also extend our research to efficient simulation of visual appearance, exploiting the extra dimensionality of the captured datasets. Achieving our goals will finally enable us to reach the true potential of real-world captured datasets in many aspects of society.
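One possible ingredient of such an editing workflow, sketched under strong simplifying assumptions (random stand-in data and plain PCA rather than any model from the proposal): compress high-dimensional captured appearance samples into a few latent dimensions that can act as editing "sliders", modify one, and reconstruct.

```python
# Hedged sketch of one possible approach (not the project's method): build a
# low-dimensional editing space over captured appearance data with PCA.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(3)
# Hypothetical dataset: 500 materials x 4096 reflectance measurements each.
data = rng.normal(size=(500, 4096))

pca = PCA(n_components=5).fit(data)   # 5 candidate editing dimensions
z = pca.transform(data[:1])           # latent code of one material
z[0, 0] += 2.0                        # move one "slider"
edited = pca.inverse_transform(z)     # back to the measurement space
print("edited latent code:", np.round(z, 2))
```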
Max ERC Funding
1 629 519 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym CHANGING FAMILIES
Project Changing Families: Causes, Consequences and Challenges for Public Policy
Researcher (PI) Nezih Guner
Host Institution (HI) FUNDACIÓ MARKETS, ORGANIZATIONS AND VOTES IN ECONOMICS
Call Details Starting Grant (StG), SH1, ERC-2010-StG_20091209
Summary The household and family structure in every major industrialized country changed in a fundamental way during the last couple of decades. First, marriage is less important today, as divorce, cohabitation, and single-motherhood are much more common. Second, female labor force participation has increased dramatically. As a result of these changes, today's households are very far from the traditional breadwinner-husband and housekeeper-wife paradigm. These dramatic changes have generated significant public interest and a large body of literature that tries to understand their causes and consequences.
This project has two main goals. First, it studies changes in household and family structure. The particular questions that it tries to answer are: 1) What are the economic factors behind the rise in premarital sex and its destigmatization? What determines parents' incentives to socialize their children and affect their attitudes? 2) What are the causes and consequences of the recent rise in assortative mating and diverging marriage patterns by different educational groups? 3) Why are marriage patterns among blacks so different from those among whites in the U.S.?
The second aim of this project is to improve our understanding of income risk, the role of social insurance policies and labor market dynamics by building models that explicitly consider two-earner households. In particular, we ask the following set of questions: 1) What is the role of social insurance policies (income maintenance programs or progressive taxation) in an economy populated by two-earner households facing uninsurable idiosyncratic risk? 2) How do marriage and labor market dynamics interact, and how important is this interaction for our understanding of labor supply and marriage decisions?
Max ERC Funding
1 037 000 €
Duration
Start date: 2010-11-01, End date: 2015-10-31
Project acronym CITISENSE
Project Evolving communication systems in response to altered sensory environments
Researcher (PI) Wouter Halfwerk
Host Institution (HI) STICHTING VU
Call Details Starting Grant (StG), LS8, ERC-2018-STG
Summary How animal communication systems evolve is a fundamental question in ecology and evolution and crucial for our understanding of adaptation and speciation. I will make use of the process of urbanization to address how communication signals adapt to changes in the sensory environment. I will focus on the impact of noise and light pollution on acoustic communication of Neotropical frogs and address the following questions:
1) How do senders, such as a male frog, adjust their signals to altered sensory environments? I will assess the plasticity and heritability of the signal divergence found between urban and forest populations of the tungara frog. 2) How do signals evolve in response to direct (via senders) and indirect (via receivers) selection pressures? I will expose forest sites to noise and light pollution, parse out the importance of multiple selection pressures and carry out experimental evolution using artificial phenotypes.
3) What are the evolutionary consequences of signal divergence? I will assess inter- and intrasexual responses to signal divergence between urban and forest populations. 4) Can we predict how species adapt their signals to the sensory environment? I will use a trait-based comparative approach to study signal divergence among closely related species with known urban populations.
Our state-of-the-art automated sender-receiver system allows for experimental evolution using long-lived species and opens new ways to study selection pressures operating on animal behaviour under real field conditions. Our expected results will provide crucial insight into the early stages of signal divergence that may ultimately lead to reproductive isolation and speciation.
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym CITIZINGLOBAL
Project Citizens, Institutions and Globalization
Researcher (PI) Giacomo Antonio Maria PONZETTO
Host Institution (HI) Centre de Recerca en Economia Internacional (CREI)
Call Details Starting Grant (StG), SH1, ERC-2016-STG
Summary Globalization has brought the world economy unprecedented prosperity, but it poses governance challenges. It needs governments to provide the infrastructure for global economic integration and to refrain from destructive protectionism; yet it can engender popular discontent and a crisis of democracy. My proposal will study when trade- and productivity-enhancing policies enjoy democratic support; why voters may instead support inefficient, surplus-reducing policies; and how political structure reacts to globalization.
Part A studies the puzzling popularity of protectionism and how lobbies can raise it by manipulating information. It will study empirically whether greater transparency causes lower trade barriers. It will introduce salience theory to political economics and argue that voters overweight concentrated losses and disregard diffuse benefits. It will show that lobbies can raise protection by channeling information to insiders and advertising the plight of displaced workers.
Part B studies inefficient infrastructure policy and the ensuing spatial misallocation of economic activity. It will show that voters’ unequal knowledge lets local residents capture national policy. They disregard nationwide positive externalities, so investment in major cities is insufficient, but also nationwide taxes, so spending in low-density areas is excessive. It will argue that the fundamental attribution error causes voter opposition to growth-enhancing policies and efficient incentive schemes like congestion pricing.
Part C studies how the size of countries and international unions adapts to expanding trade opportunities. It will focus on three forces: cultural diversity, economies of scale and scope in government, and trade-reducing border effects. It will show they explain increasing country size in the 19th century; the rise and fall of colonial empires; and the recent emergence of regional and global economic unions, accompanied by a peaceful increase in the number of countries.
Max ERC Funding
960 000 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym CMTaaRS
Project Defective protein translation as a pathogenic mechanism of peripheral neuropathy
Researcher (PI) Erik Jan Marthe STORKEBAUM
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Consolidator Grant (CoG), LS5, ERC-2017-COG
Summary Familial forms of neurodegenerative diseases are caused by mutations in a single gene. It is unknown whether distinct mutations in the same gene or in functionally related genes cause disease through similar or disparate mechanisms. Furthermore, the precise molecular mechanisms underlying virtually all neurodegenerative disorders are poorly understood, and effective treatments are typically lacking.
This is also the case for Charcot-Marie-Tooth (CMT) peripheral neuropathy caused by mutations in five distinct tRNA synthetase (aaRS) genes. We previously generated Drosophila CMT-aaRS models and used a novel method for cell-type-specific labeling of newly synthesized proteins in vivo to show that impaired protein translation may represent a common pathogenic mechanism.
In this proposal, I aim to determine whether translation is also inhibited in CMT-aaRS mouse models, and whether all mutations cause disease through gain-of-toxic-function, or alternatively, whether some mutations act through a dominant-negative mechanism. In addition, I will evaluate whether all CMT-aaRS mutant proteins inhibit translation, and I will test the hypothesis, raised by our unpublished preliminary data shown here, that a defect in the transfer of the (aminoacylated) tRNA from the mutant synthetase to elongation factor eEF1A is the molecular mechanism underlying CMT-aaRS. Finally, I will validate the identified molecular mechanism in CMT-aaRS mouse models, as the most disease-relevant mammalian model.
I expect to elucidate whether all CMT-aaRS mutations cause disease through a common molecular mechanism that involves inhibition of translation. This is of key importance from a therapeutic perspective, as a common pathogenic mechanism allows for a unified therapeutic approach. Furthermore, this proposal has the potential to unravel the detailed molecular mechanism underlying CMT-aaRS, which would constitute a breakthrough and a requirement for rational drug design for this incurable disease.
Max ERC Funding
2 000 000 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym CoCoUnit
Project CoCoUnit: An Energy-Efficient Processing Unit for Cognitive Computing
Researcher (PI) Antonio Maria Gonzalez Colas
Host Institution (HI) UNIVERSITAT POLITECNICA DE CATALUNYA
Call Details Advanced Grant (AdG), PE6, ERC-2018-ADG
Summary There is a fast-growing interest in extending the capabilities of computing systems to perform human-like tasks in an intelligent way. These technologies are usually referred to as cognitive computing. We envision a next revolution in computing in the forthcoming years that will be driven by deploying many “intelligent” devices around us in all kinds of environments (work, entertainment, transportation, health care, etc.), backed up by “intelligent” servers in the cloud. These cognitive computing systems will provide new user experiences by delivering new services or improving the operational efficiency of existing ones, and altogether will enrich our lives and our economy.
A key characteristic of cognitive computing systems will be their capability to process in real time large amounts of data coming from audio and vision devices, and other types of sensors. This will demand very high computing power but at the same time extremely low energy consumption. This very challenging energy-efficiency requirement is a sine qua non for success, not only for mobile and wearable systems, where power dissipation and cost budgets are very low, but also for large data centers, where energy consumption is a main component of the total cost of ownership.
Current processor architectures (including general-purpose cores and GPUs) are not a good fit for this type of system, since they keep the same basic organization as early computers, which were mainly optimized for “number crunching”. CoCoUnit will take a disruptive direction by investigating unconventional architectures that can offer orders of magnitude better efficiency in terms of performance per energy and cost for cognitive computing tasks. The ultimate goal of this project is to devise a novel processing unit that will be integrated with the existing units of a processor (general-purpose cores and GPUs) and altogether will be able to deliver cognitive computing user experiences with extremely high energy-efficiency.
Max ERC Funding
2 498 661 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym COMPMUSIC
Project Computational models for the discovery of the world's music
Researcher (PI) Francesc Xavier Serra Casals
Host Institution (HI) UNIVERSIDAD POMPEU FABRA
Call Details Advanced Grant (AdG), PE6, ERC-2010-AdG_20100224
Summary Current IT research does not respond to the world's multi-cultural reality. It could be argued that we are imposing the paradigms of our market-driven western culture also on IT and that current IT research results will only facilitate the access of a small part of the world’s information to a small part of the world's population. Most IT research is being carried out with a western centred approach and as a result, our data models, cognition models, user models, interaction models, ontologies, … are all culturally biased. This fact is quite evident in music information research, since, despite the world's richness in musical cultures, most of the research is centred on CDs and metadata of our western commercial music. CompMusic wants to break this huge research bias. By approaching musical information modelling from a multicultural perspective it aims at advancing our state of the art while facilitating the discovery and reuse of the music produced outside the western commercial context. But the development of computational models to address the world’s music information richness cannot be done from the West looking out; we have to involve researchers and musical experts immersed in the different cultures. Their contribution is fundamental to develop the appropriate multicultural musicological and cognitive frameworks from which we should then carry out our research on finding appropriate musical features, ontologies, data representations, user interfaces and user centred approaches. CompMusic will investigate some of the most consolidated non-western classical music traditions, Indian (hindustani, carnatic), Turkish-Arab (ottoman, andalusian), and Chinese (han), developing the needed computational models to bring their music into the current globalized information framework. Using these music cultures as case studies, cultures that are alive and have a strong influence in current society, we can develop rich information models that can take advantage of the existing information coming from musicological and cultural studies, from mature performance practice traditions and from active social contexts. With this approach we aim at challenging the current western centred information paradigms, advancing our IT research, and contributing to our rich multicultural society.
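One concrete example of a culture-aware musical feature of the kind such research works with (an assumption for illustration, not necessarily this project's method): an octave-folded pitch histogram in cents, which, unlike a 12-bin western chromagram, can represent the microtonal intervals of makam or raga performance. The f0 track, tonic, and function name below are hypothetical.

```python
# Illustrative sketch, assuming a pre-computed fundamental-frequency (f0)
# track: an octave-folded pitch histogram at 10-cent resolution.
import numpy as np

def pitch_histogram_cents(f0_hz, tonic_hz, bin_cents=10):
    """Fold f0 values into one octave above the tonic and histogram them."""
    f0_hz = np.asarray(f0_hz, dtype=float)
    f0_hz = f0_hz[f0_hz > 0]                         # drop unvoiced frames
    cents = 1200.0 * np.log2(f0_hz / tonic_hz) % 1200.0
    bins = np.arange(0, 1200 + bin_cents, bin_cents)
    hist, _ = np.histogram(cents, bins=bins, density=True)
    return bins[:-1], hist

# Hypothetical f0 track alternating between the tonic and a neutral third
# (~350 cents), an interval a 12-bin chromagram cannot represent.
f0 = 220.0 * 2 ** (np.random.default_rng(4).choice([0, 350], 2000) / 1200.0)
centers, hist = pitch_histogram_cents(f0, tonic_hz=220.0)
print("strongest bins (cents):", centers[np.argsort(hist)[-3:]])
```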
Max ERC Funding
2 443 200 €
Duration
Start date: 2011-07-01, End date: 2017-06-30
Project acronym CompSCHoice
Project A Comprehensive Approach to School Choice and Education
Researcher (PI) Caterina Calsamiglia Costa
Host Institution (HI) INSTITUTE OF POLITICAL ECONOMY AND GOVERNANCE
Call Details Starting Grant (StG), SH1, ERC-2014-STG
Summary School choice is one of the most hotly debated policies in education. Advocates argue that school choice allows equal access to high quality schooling for all. High-income families have always had more choice, either through residential choice or through enrolment in private schools. Therefore increased choice should also improve equity by allowing minority and low-income students to choose too. On the other hand, school choice critics suggest that school choice can increase sorting between schools based on socio-economic status, suggesting that high-income families benefit more from these policies.
Three different and disconnected literatures in economics provide different and often contradicting answers to these questions. We propose a unified theoretical framework that merges these three literatures and allows for a comprehensive analysis of school choice design and its impact on actual choice, outcomes and segregation in schools and neighborhoods. Unique and newly constructed data sets are used to address novel empirical challenges. The data constructed for Barcelona shall become one of the largest and most comprehensive data sets not only on school choice but also on public education worldwide.
Using the data set from Barcelona we 1) estimate families’ preferences and, for the first time, evaluate the efficiency of different mechanisms through structural estimation of our model and counterfactual analysis. We then 2) evaluate the impact that peer effects have on parents' choice and on outcomes. Exploiting the occurrence of Hurricane Katrina in New Orleans and the aid programs implemented, we aim at 3) estimating the distribution of willingness to pay for quality schools among families with different socio-economic status. Last, we exploit a policy change in Catalunya in 2009 to 4) provide evidence on how increased flexibility of the school system to adapt to differential maturity levels affects individual short- and medium-term outcomes.
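For readers unfamiliar with the mechanisms this literature compares, here is a short sketch of student-proposing deferred acceptance, the canonical school-choice algorithm; it is illustrative only and not necessarily the mechanism in use in Barcelona, and the example students, schools, and priorities are invented.

```python
# Hedged sketch of one canonical school-choice mechanism (student-proposing
# deferred acceptance). Students propose down their preference lists; schools
# tentatively hold the best applicants up to capacity and reject the rest.
def deferred_acceptance(student_prefs, school_rank, capacity):
    """student_prefs: {student: [schools, best first]};
    school_rank: {school: {student: priority, lower is better}};
    capacity: {school: seats}. Returns {student: school or None}."""
    next_choice = {s: 0 for s in student_prefs}
    held = {c: [] for c in capacity}                   # tentative admits
    free = list(student_prefs)
    while free:
        s = free.pop()
        prefs = student_prefs[s]
        if next_choice[s] >= len(prefs):
            continue                                   # s stays unassigned
        c = prefs[next_choice[s]]
        next_choice[s] += 1
        held[c].append(s)
        held[c].sort(key=lambda t: school_rank[c][t])  # best priority first
        if len(held[c]) > capacity[c]:
            free.append(held[c].pop())                 # reject the worst held
    assigned = {s: c for c, lst in held.items() for s in lst}
    return {s: assigned.get(s) for s in student_prefs}

# Tiny hypothetical example: three students, two schools with one seat each.
prefs = {"ana": ["A", "B"], "bea": ["A", "B"], "cai": ["A", "B"]}
rank = {"A": {"ana": 0, "bea": 1, "cai": 2}, "B": {"ana": 0, "bea": 1, "cai": 2}}
print(deferred_acceptance(prefs, rank, {"A": 1, "B": 1}))
# -> {'ana': 'A', 'bea': 'B', 'cai': None}
```

Deferred acceptance is stable and strategy-proof for students, which is why counterfactual comparisons against manipulable mechanisms (such as the Boston mechanism) are central to this literature.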
Max ERC Funding
1 207 500 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym CoordinatedDopamine
Project Coordination of regional dopamine release in the striatum during habit formation and compulsive behaviour
Researcher (PI) Ingo Willuhn
Host Institution (HI) ACADEMISCH MEDISCH CENTRUM BIJ DE UNIVERSITEIT VAN AMSTERDAM
Call Details Starting Grant (StG), LS5, ERC-2014-STG
Summary The basal ganglia consist of a set of neuroanatomical structures that participate in the representation and execution of action sequences. Dopamine neurotransmission in the striatum, the main input nucleus of the basal ganglia, is a fundamental mechanism involved in learning and regulation of such actions. The striatum has multiple functional units, where the limbic striatum is thought to mediate motivational aspects of actions (e.g., goal-directedness) and the sensorimotor striatum their automation (e.g., habit formation). A long-standing question in the field is how limbic and sensorimotor domains communicate with each other, and specifically whether they do so during the automation of action sequences. It has been suggested that such coordination is implemented by reciprocal loop connections between striatal projection neurons and the dopaminergic midbrain. Although very influential in theory, the effectiveness of this limbic-sensorimotor “bridging” principle has yet to be verified. I hypothesize that during the automation of behaviour regional dopamine signalling is governed by a striatal hierarchy, and that dysregulation of this coordination leads to compulsive execution of automatic actions characteristic of several psychiatric disorders. To test this hypothesis, we will conduct electrochemical measurements with real-time resolution simultaneously in the limbic and sensorimotor striatum to assess the regional coordination of dopamine release in behaving animals. We developed novel chronically implantable electrodes to enable monitoring of dopamine dynamics throughout the development of habitual behaviour and its compulsive execution in transgenic rats - a species suitable for our complex behavioural assays. Novel rabies virus-mediated gene delivery for in vivo optogenetics in these rats will give us the unique opportunity to test whether specific loop pathways govern striatal dopamine transmission and are causally involved in habit formation and compulsive behaviour.
Max ERC Funding
1 500 000 €
Duration
Start date: 2015-05-01, End date: 2020-04-30
Project acronym CORTEXFOLDING
Project Understanding the development and function of cerebral cortex folding
Researcher (PI) Victor Borrell Franco
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Starting Grant (StG), LS5, ERC-2012-StG_20111109
Summary The mammalian cerebral cortex underwent a dramatic expansion in surface area during evolution. This process is recapitulated during development and is accompanied by folding of the cortical sheet, which allows fitting a large cortical surface within a limited cranial volume. A loss of cortical folds is linked to severe intellectual impairment in humans, so cortical folding is believed to be crucial for brain function. However, the developmental mechanisms responsible for cortical folding, and the influence of this folding on cortical function, remain largely unknown. The goal of this proposal is to understand the genetic and cellular mechanisms that control the developmental expansion and folding of the cerebral cortex, and the impact of these processes on its functional organization. Human studies have identified genes essential for the proper folding of the human cerebral cortex. Genetic manipulations in mice have unraveled specific functions for some of those genes in the development of the cerebral cortex. But because the mouse cerebral cortex does not fold naturally, the mechanisms of cortical expansion and folding in larger brains remain unknown. We will study these mechanisms in the ferret, an ideal model with a naturally folded cerebral cortex. We will combine the advantages of ferrets with cell biology, genetics and next-generation transcriptomics, together with state-of-the-art in vivo, in vitro and in silico approaches, including in vivo imaging of functional columnar maps. The successful execution of this project will provide insights into developmental and genetic risk factors for anomalies in human cortical topology, and into mechanisms responsible for the early formation of cortical functional maps.
Max ERC Funding
1 701 116 €
Duration
Start date: 2013-01-01, End date: 2018-06-30
Project acronym COSI
Project Cerebellar modules and the Ontogeny of Sensorimotor Integration
Researcher (PI) Martijn Schonewille
Host Institution (HI) ERASMUS UNIVERSITAIR MEDISCH CENTRUM ROTTERDAM
Call Details Starting Grant (StG), LS5, ERC-2015-STG
Summary The perfect execution of a voluntary movement requires the appropriate integration of current bodily state, sensory input and desired outcome. To ensure that this motor output becomes and remains appropriate, the brain needs to learn from the result of previous outputs. The cerebellum plays a central role in sensorimotor integration, yet, despite decades of studies, there is no generally accepted theory of cerebellar functioning. I recently demonstrated that cerebellar modules, identified based on anatomical connectivity and gene expression, differ distinctly in spike activity properties. It is my long-term goal to identify the ontogeny of anatomical and physiological differences between modules, and their functional consequences. My hypothesis is that these differences can explain existing controversies, and unify contradictory results into one central theory.
To this end, I have designed three key objectives. First, I will identify the development of connectivity and activity patterns at the input stage of the cerebellar cortex in relation to the cerebellar modules (key objective A). Next, I will relate the differences in gene expression levels between modules to differences in basal activity and strength of plasticity mechanisms in juvenile mice (key objective B). Finally, I will determine how module-specific output develops in relation to behavior, and what the effect of module-specific mutations is on cerebellum-dependent motor tasks and higher-order functions (key objective C).
Ultimately, the combined results of all key objectives will reveal how distinct differences between cerebellar modules develop, and how this ensemble ensures proper cerebellar information processing for optimal coordination of the timing and force of movements. Combined with the growing body of evidence for a cerebellar role in higher-order brain functions and neurodevelopmental disorders, a unifying theory would be fundamental for understanding how the juvenile brain develops.
Max ERC Funding
1 500 000 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym DCVFUSION
Project Telling the full story: how neurons send other signals than by classical synaptic transmission
Researcher (PI) Matthijs Verhage
Host Institution (HI) STICHTING VUMC
Call Details Advanced Grant (AdG), LS5, ERC-2012-ADG_20120314
Summary The regulated secretion of chemical signals in the brain occurs principally from two organelles, synaptic vesicles and dense core vesicles (DCVs). Synaptic vesicle secretion accounts for the well-characterized local, fast signalling in synapses. DCVs contain a diverse collection of cargo, including many neuropeptides that trigger a multitude of modulatory effects with quite robust impact, for instance on memory, mood, pain, appetite or social behavior. Dysregulation of neuropeptide secretion is firmly associated with many diseases such as cognitive and mood disorders, obesity and diabetes. In addition, many other signals depend on DCVs, for instance trophic factors and proteolytic enzymes, but also signals that typically do not diffuse, like guidance cues and pre-assembled active zones. Hence, it is beyond doubt that DCV signalling is a central factor in brain communication. However, many fundamental questions remain open on DCV trafficking and secretion. Therefore, the aim of this proposal is to characterize the molecular principles that account for DCV delivery at release sites and their secretion. I will address 4 fundamental questions: What are the molecular factors that drive DCV fusion in mammalian CNS neurons? How does Ca2+ trigger DCV fusion? What are the requirements of DCV release sites and where do they occur? Can DCV fusion be targeted to synthetic release sites in vivo? I will exploit >30 mutant mouse lines and new cell biological and photonic approaches that allow for the first time a quantitative assessment of DCV trafficking and fusion of many cargo types, in living neurons with single-vesicle resolution. Preliminary data suggest that DCV secretion is quite different from synaptic vesicle and chromaffin granule secretion. Together, these studies will produce the first systematic evaluation of the molecular identity of the core machinery that drives DCV fusion in neurons, the Ca2+ affinity of DCV fusion and the characteristics of DCV release sites.
Max ERC Funding
2 439 315 €
Duration
Start date: 2013-05-01, End date: 2019-04-30
Project acronym DIVERSE-EXPECON
Project Discriminative preferences and fairness ideals in diverse societies: An ‘experimental economics’ approach
Researcher (PI) Sigrid SUETENS
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT BRABANT
Call Details Consolidator Grant (CoG), SH1, ERC-2016-COG
Summary In economics, a distinction is made between statistical and taste-based discrimination (henceforth, TBD). Statistical discrimination refers to discrimination in a context with strategic uncertainty. Someone who is uncertain about the future behaviour of a person with a different ethnicity may rely on information about the different ethnic group to which this person belongs to form beliefs about the behaviour of that person. This may lead to discrimination. TBD refers to discrimination in a context without strategic uncertainty. It implies suffering a disutility when interacting with ‘different’ others. This project systematically studies TBD in ethnically diverse societies.
Identifying TBD is important because overcoming it requires different policies than overcoming statistical discrimination: they should aim at changing people's preferences rather than providing information about specific interaction partners. But identifying TBD is tricky. First, it is impossible to identify using uncontrolled empirical data because these data are characterised by strategic uncertainty. Second, people are generally reluctant to identify themselves as discriminators. In the project, I study TBD using novel economic experiments that circumvent these problems.
The project consists of three main objectives. First, I investigate whether and how preferences of European natives in social interactions depend on others’ ethnicity. Are natives as altruistic, reciprocal, and envious towards immigrants as towards other natives? Second, I study whether natives have different fairness ideals—what constitutes a fair distribution of resources from the perspective of an impartial spectator—when it comes to natives than when it comes to non-natives. Third, I analyse whether preferences and fairness ideals depend on exposure to diversity: do preferences and fairness ideals of natives change as contact with non-natives increases, and, if so, how?
Max ERC Funding
1 499 046 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym DROSADAPTATION
Project New approaches to long-standing questions: adaptation in Drosophila
Researcher (PI) Josefa Gonzalez Perez
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Consolidator Grant (CoG), LS8, ERC-2014-CoG
Summary Understanding how organisms adapt to their environments is a long-standing problem in biology with far-reaching implications: adaptation affects the ability of species to survive in changing environments, host-pathogen interactions, and resistance to pesticides and drugs. Despite recent progress, adaptation is to date a poorly understood process, largely due to limitations of current approaches that focus (i) on a priori candidate genes, (ii) on signals of selection at the DNA level without functional validation of the identified candidates, and (iii) on small sets of adaptive mutations that do not represent the variability present in natural populations. As a result, major questions, such as the relative importance of different types of mutations in adaptation and the role of epigenetic changes in adaptive evolution, remain largely unanswered.
To gain a deep understanding of adaptation, we need to systematically identify adaptive mutations across space and time, pinpoint their molecular mechanisms and discover their fitness effects. To this end, Drosophila melanogaster has proven to be an ideal organism. Besides the battery of genetic tools and resources available, D. melanogaster has recently adapted to live in out-of-Africa environments. We and others have already shown that transposable elements (TEs) have substantially contributed to the adaptation of D. melanogaster to different environmental challenges. Here, we propose to use state-of-the-art techniques, such as Illumina TruSeq sequencing and CRISPR/Cas9 genome editing, to systematically identify and characterize in detail adaptive TE insertions in D. melanogaster natural populations. Only by moving from gathering anecdotal evidence to applying global approaches will we be able to start constructing a quantitative and predictive theory of adaptation that will be relevant for other species as well.
Max ERC Funding
2 392 521 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym DYMOLAMO
Project Dynamic Modeling of Labor Market Mobility and Human Capital Accumulation
Researcher (PI) Joan LLULL CABRER
Host Institution (HI) FUNDACIÓ MARKETS, ORGANIZATIONS AND VOTES IN ECONOMICS
Call Details Starting Grant (StG), SH1, ERC-2018-STG
Summary In today’s globalized world, labor mobility is at the core of the political debate and a centerpiece for economic policy. The design of migration policies, such as selective, skill-biased, immigration policies, policies to encourage the integration of immigrants, or ones that facilitate geographical mobility to increase labor market opportunities of disadvantaged workers, requires a good understanding of a more fundamental issue: understanding the role of internal migration and immigration in shaping the career paths and human capital accumulation of workers. This project aims at providing a coherent analysis that allows us to understand the interactions between labor mobility and human capital accumulation, and their implications for economic policy design.
This project focuses on three main issues: labor mobility, labor market effects of immigration, and the interaction between the two. Our questions are: (a) What is the role of temporary and permanent contracts in shaping career paths and geographic mobility of workers? (b) Does the forgone human capital accumulation during a recession produce a lost generation? Is this alleviated by geographical mobility? (c) What is the role of geographical and occupational mobility in spreading or containing the effects of technological progress on wage inequality? (d) To what extent do selective immigration policies maximize native workers’ prospects and wellbeing? (e) How can we increase the degree of assimilation of immigrants?
To address these questions, we will develop dynamic equilibrium models that explicitly characterize human capital accumulation decisions of workers and how these decisions interact with migration. Our proposed models will introduce rich labor market structures and a variety of economic shocks. They will require the implementation of novel estimation methods, which we will also develop. The estimated models will be used to evaluate and design key economic policies for the labor market.
Max ERC Funding
1 400 250 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym DYNURBAN
Project Urban dynamics: learning from integrated models and big data
Researcher (PI) Diego PUGA
Host Institution (HI) FUNDACION CENTRO DE ESTUDIOS MONETARIOS Y FINANCIEROS
Call Details Advanced Grant (AdG), SH1, ERC-2015-AdG
Summary City growth is driven by a combination of systematic determinants and shocks. Random growth models predict realistic city size distributions but ignore, for instance, the strong empirical association between human capital and city growth. Models with systematic determinants predict degenerate size distributions. We will develop an integrated model that combines systematic and random determinants to explain the link between human capital, entrepreneurship and growth, while generating relevant city size distributions. We will calibrate the model to quantify the contribution of cities to aggregate growth.
Urban growth also has a poorly understood spatial component. Combining gridded data of land use, population, businesses and roads for 3 decennial periods we will track the evolution of land use in the US with an unprecedented level of spatial detail. We will pay particular attention to the magnitude and causes of “slash-and-burn” development: instances when built-up land stops meeting needs in terms of use and intensity and, instead of being redeveloped, it is abandoned while previously open space is built up.
Job-to-job flows across cities matter for efficiency, and during the recent crisis they have plummeted. We will study them with individual social security data. Even if there have only been small changes in the mismatch between unemployed workers and vacancies during the crisis, misallocation can increase substantially if workers shy away from moving to take a job in another city.
We will also study commuting flows for Spain and the UK based on anonymized cell phone location records. We will identify urban areas by iteratively aggregating municipalities if more than a given share of transit flows end in the rest of the urban area. We will also measure the extent to which people cross paths with others, opening the possibility of personal interactions, and assess the extent to which this generates productivity-enhancing agglomeration economies.
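The iterative aggregation rule described above lends itself to a simple fixed-point sketch. The code below is one plausible reading of that rule, not the project's implementation; the flow matrix, the seed municipality and the threshold are hypothetical placeholders.

```python
# Grow an urban area from a seed municipality: a municipality joins when
# more than `share` of its outgoing flows end inside the current area.
# All data below are hypothetical.

def grow_urban_area(seed, flows, share=0.5):
    """flows[m][n] = number of trips from municipality m to municipality n."""
    area = {seed}
    changed = True
    while changed:                     # re-scan until a fixed point is reached
        changed = False
        for m, dests in flows.items():
            if m in area:
                continue
            total = sum(dests.values())
            into_area = sum(v for n, v in dests.items() if n in area)
            if total > 0 and into_area / total > share:
                area.add(m)            # joining enlarges the area,
                changed = True         # so other municipalities may now qualify
    return area

flows = {
    "suburb1": {"core": 60, "suburb2": 10, "elsewhere": 30},
    "suburb2": {"core": 5, "suburb1": 20, "elsewhere": 75},
    "core":    {"suburb1": 10, "elsewhere": 90},
}
print(grow_urban_area("core", flows))  # {'core', 'suburb1'}
```

Because each added municipality enlarges the destination set, the loop re-scans until no further municipality clears the threshold, so the reported area is a fixed point of the rule.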
Max ERC Funding
1 292 586 €
Duration
Start date: 2016-08-01, End date: 2021-07-31
Project acronym E-Response
Project Evolutionary responses to a warming world: physiological genomics of seasonal timing
Researcher (PI) Marcel Erik Visser
Host Institution (HI) KONINKLIJKE NEDERLANDSE AKADEMIE VAN WETENSCHAPPEN - KNAW
Call Details Advanced Grant (AdG), LS8, ERC-2013-ADG
Summary The world is seasonal and organisms’ adjustment of their seasonal timing to this environmental variation is crucial for their fitness. Climate change is strongly impacting seasonal timing, which makes a better understanding of the potential for micro-evolution of timing in natural populations essential. As any phenotypic change ultimately involves changes in the physiological mechanism underlying timing, we need to unravel the genetics of these mechanisms. I will carry out a highly integrated eco-evo-devo project on the causes and consequences of genetic variation in timing of reproduction in great tits (Parus major), an ecological model species for which we recently developed state-of-the-art genomic tools. I will develop a powerful instrument to study this timing mechanism by creating selection lines of early and late reproducing birds using genome-wide, rather than phenotypic, selection. The phenotypic response of selection-line birds will be assessed both in controlled-environment aviaries and in birds introduced to the wild. To unravel how selection has altered the birds’ physiology I will measure key components of the physiological mechanism at the central, peripheral and egg-production levels. As a unique next step I will then introduce selection-line birds into a wild population to assess the fitness of these extreme phenotypes. This will enable me, for the first time, to estimate the selection on timing without confounds, which I will compare with traditional estimates using observational data. Finally, I will integrate genetics, physiology and ecology to hindcast the rate of genetic change in our wild population and validate this rate using DNA sampled over a 20-year period. This innovative project, which integrates state-of-the-art developments in ecology, genetics and physiology (eco-evo-devo), will set new standards for future studies in other wild species and will be of key importance for our predictions of evolutionary responses to a warming world.
Max ERC Funding
2 495 808 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym EARLYWARNING
Project Generic Early Warning Signals for Critical Transitions
Researcher (PI) Marten Scheffer
Host Institution (HI) WAGENINGEN UNIVERSITY
Call Details Advanced Grant (AdG), LS8, ERC-2010-AdG_20100317
Summary Abrupt shifts occasionally reshape complex systems in nature ranging in scale from lakes and reefs to regional climate systems. Such shifts sometimes represent critical transitions in the sense that they happen at tipping points where runaway change propels the system towards an alternative, contrasting state. Although the mechanism of critical transitions can often be reconstructed in hindsight, we are virtually unable to predict when they will happen in advance. Simulation models for complex environmental systems are simply not good enough to predict tipping points, and there is little hope that this will change over the coming decades. The proposed project is aimed at developing an alternative way to predict critical transitions. We aim at finding early warning signals for such transitions that are generic in the sense that they work irrespective of the (often poorly known) mechanisms responsible for the tipping points. Mathematical theory indicates that this might be possible. However, although excitement about these ideas is emerging, we are far from having a cohesive theory, let alone practical approaches for predicting critical transitions in large complex systems like lakes, coral reefs or the climate. I will work towards this goal with my team along three lines: 1) Develop a comprehensive theory of early warning signals using analytical mathematical techniques as well as models ranging in character from simple and transparent to elaborate and realistic; 2) Test the theory on experimental plankton systems kept in controlled microcosms; and 3) Analyze data from real systems that go through catastrophic transitions. The anticipated results would imply a major breakthrough in a field of research that is exciting as well as highly relevant to society. If we are successful, it would allow us to anticipate critical transitions even in large complex systems where we have little hope of predicting tipping points on the basis of mechanistic models.
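As a concrete illustration of what a generic early-warning signal can look like, the sketch below computes two indicators commonly examined in this literature, rolling variance and lag-1 autocorrelation, which tend to rise as a system slows down on approach to a tipping point. The window length and the simulated series are assumptions made for the example, not project results.

```python
# Rolling-window early-warning indicators on a toy time series whose
# resilience is made to decay (AR(1) coefficient drifting towards 1).
import numpy as np

def rolling_indicators(x, window=100):
    """Return rolling variance and lag-1 autocorrelation of series x."""
    var, ac1 = [], []
    for t in range(window, len(x)):
        w = x[t - window:t]
        w = w - w.mean()
        var.append(w.var())
        denom = (w[:-1] ** 2).sum()
        ac1.append((w[:-1] * w[1:]).sum() / denom if denom > 0 else 0.0)
    return np.array(var), np.array(ac1)

rng = np.random.default_rng(0)
n = 2000
phi = np.linspace(0.2, 0.97, n)          # memory slowly increases
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi[t] * x[t - 1] + rng.normal()

var, ac1 = rolling_indicators(x)
print(f"early ac1 {ac1[:50].mean():.2f} -> late ac1 {ac1[-50:].mean():.2f}")
# Both indicators trending upwards would flag an approaching transition.
```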
Max ERC Funding
2 299 171 €
Duration
Start date: 2011-06-01, End date: 2016-05-31
Project acronym ECHO
Project Extending Coherence for Hardware-Driven Optimizations in Multicore Architectures
Researcher (PI) Alberto ROS BARDISA
Host Institution (HI) UNIVERSIDAD DE MURCIA
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary Multicore processors are present nowadays in most digital devices, from smartphones to high-performance servers. The increasing computational power of these processors is essential for enabling many important emerging application domains such as big data, media, medical, or scientific modeling. A fundamental technique to improve performance is speculation, which consists in executing work before it is known whether it is actually needed. In hardware, speculation significantly increases energy consumption by performing unnecessary operations, while speculation in software (e.g., by compilers) is not applied by default, thus preventing performance optimizations. Since performance in current multicores is limited by their power budget, it is imperative to make multicores as energy-efficient as possible to increase performance even further.
In a multicore architecture, the cache coherence protocol is an essential component, since its unique but challenging role is to offer a simple and unified view of the memory hierarchy. This project envisions that extending the role of the coherence protocol to simplify other system components will be the key to overcoming the performance and energy limitations of current multicores. In particular, ECHO proposes to add simple but effective extensions to the cache coherence protocol in order to (i) reduce and even eliminate misspeculations at the processing cores and synchronization mechanisms and to (ii) enable speculative optimizations at compile time. The goal of this innovative approach is to improve the performance and energy efficiency of future multicore architectures. To accomplish the objectives proposed in this project, I will build on my 14 years of expertise in cache coherence, documented in over 40 high-impact publications.
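To make the coherence protocol's role concrete, here is a toy encoding of a textbook MSI protocol for a single cache line, reacting to local loads and stores and to remote requests. This is a didactic sketch, not ECHO's design; real protocols, and the extensions proposed in this project, are considerably richer.

```python
# Toy MSI cache-coherence state machine for one cache line in one cache.
# States: M(odified), S(hared), I(nvalid). Purely illustrative.

MSI = {
    # (state, event) -> (next_state, action)
    ("I", "load"):         ("S", "fetch line; other caches may keep S copies"),
    ("I", "store"):        ("M", "fetch line; invalidate other copies"),
    ("S", "load"):         ("S", "hit"),
    ("S", "store"):        ("M", "upgrade; invalidate other copies"),
    ("S", "remote_store"): ("I", "invalidate local copy"),
    ("M", "load"):         ("M", "hit"),
    ("M", "store"):        ("M", "hit"),
    ("M", "remote_load"):  ("S", "write back; downgrade to shared"),
    ("M", "remote_store"): ("I", "write back; invalidate"),
}

def step(state, event):
    next_state, action = MSI[(state, event)]
    print(f"{state} --{event}--> {next_state}: {action}")
    return next_state

state = "I"
for event in ["load", "store", "remote_load", "remote_store"]:
    state = step(state, event)
```

Keeping every cache's transitions consistent in this way is what yields the single, unified view of memory that the proposal builds on and proposes to extend.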
Max ERC Funding
1 999 955 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym ECO-MOM
Project Ecology of anaerobic methane oxidizing microbes
Researcher (PI) Michael Jetten
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Advanced Grant (AdG), LS8, ERC-2013-ADG
Summary For over a century it was believed that methane (CH4) could only be oxidized by micro-organisms in the presence of oxygen. The possibility of nitrate-dependent or metal-dependent anaerobic oxidation of CH4 (AOM) was generally dismissed. However, about 6 years ago the microbes responsible for the nitrate-AOM reaction were discovered. This was followed by molecular approaches that resulted in the identification of the responsible Methylomirabilis oxyfera bacteria. Recently, the widespread environmental occurrence of these bacteria was demonstrated, leading to the realization that AOM may play a significant role in the CH4 and nitrogen cycles. M. oxyfera is a unique microbe with unusual properties that we are only beginning to understand: the production of oxygen from NO by a putative NO dismutase and a very unusual polygonal cell shape. Even less is known about metals (Fe3+ or Mn4+) as electron acceptors for AOM. The aim of this project is to obtain a fundamental understanding of the metabolism and ecological importance of the M. oxyfera bacteria, and to enrich new metal-dependent AOM microbes. Such understanding contributes directly to our environment and economy because AOM is a new sustainable opportunity for nitrogen removal from wastewater. The results will show how the CH4, nitrogen and iron cycles are connected and may lead to new ways of mitigating methane emission. The biodiversity and contribution of AOM-microbes to the biogeochemical cycles in oxygen-limited ecosystems will be investigated, and new metal-AOM enrichments will be performed. Together the environmental and metabolic data will help to understand how and to what extent AOM-microbes contribute to the biogeochemical cycles and thus shape the atmosphere of our planet. The research lines will employ state-of-the-art methods to unravel the exceptional properties of these highly unusual and important microbes. The experiments will be performed in one of the world's best-equipped laboratories for microbial ecology.
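For orientation, the overall stoichiometry reported in the literature for nitrite-driven AOM by M. oxyfera, together with the putative NO dismutation step that produces the oxygen mentioned above, can be written as follows; these equations summarize published findings and are not results of this project.

```latex
% Overall nitrite-driven anaerobic oxidation of methane (as reported for
% M. oxyfera):
3\,\mathrm{CH_4} + 8\,\mathrm{NO_2^-} + 8\,\mathrm{H^+} \longrightarrow
  3\,\mathrm{CO_2} + 4\,\mathrm{N_2} + 10\,\mathrm{H_2O}

% Putative intra-aerobic step: dismutation of nitric oxide into N2 and O2,
% supplying the oxygen for methane activation:
2\,\mathrm{NO} \longrightarrow \mathrm{N_2} + \mathrm{O_2}
```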
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym ECOEVODEVO
Project Eco-evolutionary dynamics of community self-organization through ontogenetic asymmetry
Researcher (PI) Andre Marc De Roos
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Advanced Grant (AdG), LS8, ERC-2012-ADG_20120314
Summary Classical community ecology theory models dynamics solely as an interplay between top-down and bottom-up effects of population abundances, treating population composition as irrelevant. It ignores food-dependent ontogenetic development, in particular somatic growth, which characterizes most species and uniquely distinguishes organisms from fundamental units in physical or chemical multi-particle systems. Similarly, evolutionary theory has ignored the potential population feedback on food-dependent ontogenetic development. Classic theory has been shown to apply in case of ontogenetic symmetry in energetics, when dynamics of population abundance and composition are independent. Ontogenetic symmetry stipulates that mass-specific rates of net biomass turnover are independent of individual body size. Ontogenetic symmetry only represents a limiting, structurally unstable case, separating two stable domains with ontogenetic asymmetry in energetics, when either juveniles or adults have higher mass-specific net-biomass production. In case of ontogenetic asymmetry the dynamics of population abundance and composition become intimately linked, ultimately resulting in the emergence of positive feedbacks between densities of predators and their main prey. This transforms consumer-resource interactions into indivisible units, whose behavior can no longer be predicted from their constituent parts (the species). Ontogenetic asymmetry in energetics is thus a potent driver of self-organization in ecological communities. This research project aims at unraveling the eco-evolutionary dynamics of ontogenetic asymmetry in energetics, focusing on (1) the likelihood that ontogenetic asymmetry in energetics evolves as mechanism of self-organization in ecological communities, (2) the conditions that may have promoted or inhibited this evolution and (3) the extent to which ontogenetic asymmetry in energetics has contributed to the diversity of life and the development of complex life cycles.
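The symmetry condition stated above can be written compactly. The notation below is illustrative rather than the project's own: let ν(s, R) denote the mass-specific rate of net biomass production of an individual of body size s at resource density R.

```latex
% Ontogenetic symmetry: net biomass production per unit body mass is
% independent of body size,
\frac{\partial\, \nu(s, R)}{\partial s} = 0 \quad \text{for all } s .

% Ontogenetic asymmetry: juveniles (J) and adults (A) differ,
\nu_J(R) \neq \nu_A(R),
% with the two stable domains corresponding to \nu_J > \nu_A (juveniles
% outproduce adults) and \nu_A > \nu_J (adults outproduce juveniles).
```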
Max ERC Funding
1 779 634 €
Duration
Start date: 2013-06-01, End date: 2018-05-31
Project acronym EdGe
Project The molecular genetic architecture of educational attainment and its significance for cognitive health
Researcher (PI) Philipp Daniel Koellinger
Host Institution (HI) STICHTING VU
Call Details Consolidator Grant (CoG), SH1, ERC-2014-CoG
Summary Since many social and economic outcomes are moderately heritable, it is in principle possible to discover genetic variants associated with them. Such discoveries could yield new insights into the causal pathways underlying human behaviour, the complex interplay of environmental and genetic factors, and the relationship between socio-economic traits and health.
This proposal builds on a recent genome-wide association study on educational attainment (EA) led by the applicant (Rietveld et al. 2013, Science), which identified for the first time specific genetic variants robustly associated with a socio-economic outcome. The project will leverage the unique resources of the Social Science Genetic Association Consortium (SSGAC), which is co-led by the applicant.
The proposed research will extend existing knowledge by: 1) discovering additional genetic variants and causal pathways associated with EA; 2) developing methods to use the available genetic association results in novel, more efficient ways; 3) shedding new light on characteristics related to EA such as economic preferences, cognitive function, and cognitive health; 4) showing how policies promoting EA interact with genetic predisposition; 5) using genetic information to better understand the causal effects of educational policy interventions; 6) developing better tools to identify individuals at risk for cognition-related diseases before the onset of symptoms; and 7) identifying causal pathways of genetic influence on cognitive health via neurobiological measures. The project aims to elucidate the complex causal pathways connecting genes, environment, individual characteristics, and health-related outcomes; make methodological contributions applicable in genetic epidemiology and the social sciences; and contribute towards designing more effective public policy, which could improve public health and lower health costs.
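Risk-prediction tools of the kind envisaged in point 6 typically build on polygenic scores, which aggregate many small GWAS effects into a single predictor. The sketch below shows that construction; the genotypes and effect sizes are made-up placeholders, not results of this project.

```python
# Hedged sketch of a polygenic score: allele counts weighted by per-SNP
# effect sizes from a discovery GWAS. All data below are simulated
# placeholders for illustration.
import numpy as np

def polygenic_score(genotypes, effect_sizes):
    """genotypes: (n_people, n_snps) counts of the effect allele in {0,1,2};
    effect_sizes: (n_snps,) per-allele effect estimates."""
    return genotypes @ effect_sizes

rng = np.random.default_rng(1)
G = rng.integers(0, 3, size=(5, 1000))      # 5 people, 1000 SNPs
beta = rng.normal(0.0, 0.01, size=1000)     # small per-SNP effects
print(np.round(polygenic_score(G, beta), 3))  # higher = higher predicted value
```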
Max ERC Funding
1 870 135 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym EDSGEL
Project Likelihood-based estimation of non-linear and non-normal DSGE models
Researcher (PI) Juan Francisco Rubio-Ramirez
Host Institution (HI) FUNDACION CENTRO DE ESTUDIOS MONETARIOS Y FINANCIEROS
Call Details Starting Grant (StG), SH1, ERC-2009-StG
Summary DSGE models are the standard tool of quantitative macroeconomics. We use them to measure economic phenomena and to provide policy advice. However, since Kydland and Prescott (1982), the profession has argued about how to take these models to the data. Kydland and Prescott proposed to calibrate their model. Why? Macroeconomists could not compute their models efficiently. Moreover, the techniques required for estimating DSGE models using the likelihood did not exist. Finally, models were ranked very badly by likelihood ratio tests. Calibration offered a temporary solution. By focusing only on a very limited set of moments of the model, researchers could claim partial success and keep developing their theory. The landscape changed in the 1990s. There were developments along three fronts. First, macroeconomists learned how to efficiently compute equilibrium models with rich dynamics. Second, statisticians developed simulation techniques like Markov chain Monte Carlo (MCMC), which we require to estimate DSGE models. Third, and perhaps most important, computer power has become so cheap that we can now do things that were unthinkable 20 years ago. This proposal aims to estimate non-linear and/or non-normal DSGE models using a likelihood approach. Why non-linear models? Previous research has proved that second-order approximation errors in the policy function have first-order effects on the likelihood function. Why non-normal models? Time-varying volatility is key to understanding the Great Moderation. Kim and Nelson (1999), McConnell and Pérez-Quirós (2000), and Stock and Watson (2002) have documented a decline in the variance of output growth since the mid-1980s. Only DSGE models with richer structure than normal innovations can account for this.
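To make the likelihood approach concrete: for non-linear, non-normal state-space models the likelihood has no closed form, but it can be approximated by a bootstrap particle filter and then embedded in an MCMC sampler over the structural parameters. The sketch below uses a toy stochastic-volatility model, not any specific DSGE economy; all names and numbers are illustrative.

```python
import numpy as np

# Bootstrap particle filter approximating the log-likelihood of a toy
# non-linear, non-normal state-space model (stochastic volatility):
#   state:       h_t = rho * h_{t-1} + sigma * eta_t
#   observation: y_t = exp(h_t / 2) * eps_t   (non-linear in the state)
rng = np.random.default_rng(1)

def particle_loglik(y, n_particles=1_000, rho=0.9, sigma=0.2):
    # initialize particles from the stationary distribution of h
    h = rng.normal(0.0, sigma / np.sqrt(1 - rho**2), n_particles)
    loglik = 0.0
    for y_t in y:
        h = rho * h + sigma * rng.normal(size=n_particles)   # propagate
        obs_sd = np.exp(h / 2)
        w = np.exp(-0.5 * (y_t / obs_sd) ** 2) / (obs_sd * np.sqrt(2 * np.pi))
        loglik += np.log(w.mean())                           # marginal lik.
        h = rng.choice(h, size=n_particles, p=w / w.sum())   # resample
    return loglik

y = rng.normal(size=200)    # placeholder data
print(particle_loglik(y))
```

The noisy log-likelihood returned by the filter can then be plugged into, for example, a random-walk Metropolis-Hastings sampler over (rho, sigma).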
Max ERC Funding
909 942 €
Duration
Start date: 2010-07-01, End date: 2015-06-30
Project acronym EDST
Project Economic Development and Structural Transformation
Researcher (PI) Maria Paula BUSTOS
Host Institution (HI) FUNDACION CENTRO DE ESTUDIOS MONETARIOS Y FINANCIEROS
Call Details Starting Grant (StG), SH1, ERC-2016-STG
Summary The early development literature documented that the growth path of most advanced economies was accompanied by a process of structural transformation. As economies develop, the share of agriculture in employment falls and workers migrate to cities to find employment in the industrial and service sectors [Clark (1940), Kuznets (1957)]. In the first industrialized countries, technical improvements in agriculture favoured the development of industry and services by releasing labour, increasing demand and raising profits to finance other activities. However, several scholars noted that the positive effects of agricultural productivity on economic development are no longer operative in open economies. In addition, there is a large theoretical literature highlighting how market failures can retard structural transformation in developing countries. In particular, financial frictions might constrain the reallocation of capital and thus retard the process of labour reallocation. In this project, we propose to contribute to our understanding of structural transformation by providing direct empirical evidence on the effects of exogenous shocks to local agricultural and manufacturing productivity on the reallocation of capital and labour across sectors, firms and space in Brazil. For this purpose, we construct the first data set that permits us to jointly observe labour and credit flows across sectors and space. To exploit the spatial dimension of the capital allocation problem, we design a new empirical strategy which exploits the geographical structure of bank branch networks. Similarly, we propose to study the spatial dimension of the labour allocation problem by exploiting differences in migration costs across regions due to transportation and social networks.
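The summary does not spell out the empirical strategy, so the following is only a hypothetical illustration of one standard way to exploit bank branch networks: build region-level credit-supply shocks as branch-share-weighted averages of bank-level shocks, so that regions served by the same banks are linked through a common exposure. All data are simulated.

```python
import numpy as np

# Hypothetical sketch of an exposure-weighted credit-supply shock: each
# region's shock is the branch-share-weighted average of bank-level shocks.
# This construction is an assumption for illustration, not the proposal's
# stated design.
rng = np.random.default_rng(2)
n_regions, n_banks = 50, 10

# branches[r, b]: number of branches of bank b in region r (simulated)
branches = rng.integers(0, 20, size=(n_regions, n_banks))
totals = branches.sum(axis=1, keepdims=True)
shares = branches / np.maximum(totals, 1)       # row-normalized exposures

bank_shock = rng.normal(size=n_banks)           # e.g. bank funding shocks
region_shock = shares @ bank_shock              # region-level credit shock
print(region_shock[:5])
```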
Max ERC Funding
1 486 500 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym ELECTRIC CHALLENGES
Project Current Tools and Policy Challenges in Electricity Markets
Researcher (PI) Natalia FABRA PORTELA
Host Institution (HI) UNIVERSIDAD CARLOS III DE MADRID
Call Details Consolidator Grant (CoG), SH1, ERC-2017-COG
Summary The fight against climate change is among Europe’s top policy priorities. In this research agenda, I propose to push out the frontier in the area of Energy and Environmental Economics by carrying out policy-relevant research on a pressing issue: how to design optimal regulatory and market-based solutions to achieve a least-cost transition towards a low-carbon economy.
The European experience provides unique natural experiments with which to test some of the most contentious issues that arise in the context of electricity markets, including the potential to change households’ demand patterns through dynamic pricing, the scope for renewables to mitigate market power and depress wholesale market prices, and the design and performance of the auctions for renewable support. While there is a body of policy work on these issues, it generally does not meet the required research standards.
In this research, I will rely on cutting-edge theoretical, empirical, and simulation tools to disentangle these topics, while providing key economic insights that are relevant beyond electricity markets. On the theory front, I propose to develop new models that incorporate the intermittency of renewables to characterize optimal bidding, a key and broadly omitted ingredient in previous analyses. In turn, these models will provide a rigorous structure for the empirical and simulation analysis, which will rely both on traditional econometrics for causal inference and on state-of-the-art machine learning methods to construct counterfactual scenarios for policy analysis.
While my focus is on energy and environmental issues, my research will also provide methodological contributions for other areas, particularly those related to policy design and policy evaluation. The conclusions of this research should prove valuable for academics as well as for policy makers, who can use them to assess the impact of environmental and energy policies and to redefine them where necessary.
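As an illustration of the machine-learning counterfactual idea mentioned above: fit a flexible model of the outcome (say, electricity demand) on pre-policy data, predict the post-policy outcome that would have occurred absent the policy, and read the effect off the gap between observed and predicted outcomes. The sketch below is a minimal simulation, not the project's actual specification.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Minimal simulated sketch of ML-based counterfactuals for policy analysis:
# learn the outcome's dependence on observables in the pre-policy period,
# predict the no-policy outcome in the post-policy period, and take the
# mean gap between observed and predicted outcomes as the policy effect.
rng = np.random.default_rng(3)
coef = np.array([1.0, -0.5, 0.3, 0.0])
n_pre, n_post, true_effect = 500, 200, -0.4

X_pre = rng.normal(size=(n_pre, 4))     # e.g. temperature, hour, prices
y_pre = X_pre @ coef + rng.normal(0, 0.1, n_pre)
model = GradientBoostingRegressor().fit(X_pre, y_pre)

X_post = rng.normal(size=(n_post, 4))
y_post = X_post @ coef + true_effect + rng.normal(0, 0.1, n_post)

counterfactual = model.predict(X_post)  # predicted outcome absent the policy
print("estimated effect:", (y_post - counterfactual).mean())  # ~ -0.4
```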
Max ERC Funding
1 422 375 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ENCODING IN AXONS
Project Identifying mechanisms of information encoding in myelinated single axons
Researcher (PI) Maarten Kole
Host Institution (HI) KONINKLIJKE NEDERLANDSE AKADEMIE VAN WETENSCHAPPEN - KNAW
Call Details Starting Grant (StG), LS5, ERC-2010-StG_20091118
Summary A major challenge in neuroscience is to understand how information is stored and coded within single nerve cells (neurons) and across neuron populations in the brain. Nerve cell fibres (axons) are thought to provide the wiring to connect neurons and conduct the electrical nerve impulse (action potential; AP). Recent discoveries, however, show that the initial part of axons actively participates in modulating APs and provides a means to enhance the computational repertoire of neurons in the central nervous system. To decrease the temporal delay in information transmission over long distances, most axons are myelinated. Here, we will test the hypothesis that the degree of myelination of single axons directly and indirectly influences the mechanisms of AP generation and neural coding. We will use a novel approach of patch-clamp recording combined with immunohistochemical and ultrastructural identification to develop a detailed model of single myelinated neocortical axons. We will also investigate the neuron-glia interactions responsible for the myelination process and measure whether their development follows an activity-dependent process. Finally, we will elucidate the physiological and molecular similarities and discrepancies between myelinated and experimentally demyelinated single neocortical axons. These studies will provide a novel methodological framework to study central nervous system axons and yield basic insights into myelin physiology and pathophysiology.
Max ERC Funding
1 994 640 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym ENMUH
Project Estimation of Nonlinear Models with Unobserved Heterogeneity
Researcher (PI) Stephane Olivier Bonhomme
Host Institution (HI) FUNDACION CENTRO DE ESTUDIOS MONETARIOS Y FINANCIEROS
Call Details Starting Grant (StG), SH1, ERC-2010-StG_20091209
Summary Modern economic research emphasizes heterogeneity in various dimensions, such as individual preferences or firms’ technology. From an empirical perspective, the presence of unobserved heterogeneity (to the econometrician) creates challenging identification and estimation problems. In this proposal we explore these issues in a context where repeated observations are available for the same individual, so that the researcher has access to panel data. Most research to date adopts one of three approaches. One approach consists in modeling the distribution of unobserved heterogeneity, following a random-effects perspective (Chamberlain, 1984). Another approach looks for clever model-specific ways of differencing out the unobserved heterogeneity (Andersen, 1970, Honore and Kyriazidou, 2000). A more recent line of research relies on approximations that become more accurate when the number of observations per individual T gets large (Arellano and Hahn, 2006). Here we consider situations where T may be small, and the researcher does not restrict the distribution of the unobserved fixed effects. We will propose a new functional differencing approach which differences out the probability distribution of unobserved heterogeneity. This approach will generally be applicable in models with continuous dependent variables, emphasizing the possibility of point-identification of the structural parameters in those models. When outcomes are discrete, we will propose a nonlinear differencing strategy that delivers useful bounds on parameters in the presence of partial identification (Honore and Tamer, 2006).
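Functional differencing itself is beyond a few lines of code, but the classical idea it generalizes is easy to show: in a linear panel model, first differences within an individual eliminate the fixed effect, so the slope is estimable without any assumption on the heterogeneity distribution. A minimal simulated sketch:

```python
import numpy as np

# The classical differencing idea that functional differencing generalizes:
# in y_it = beta * x_it + alpha_i + eps_it, first differences within an
# individual eliminate alpha_i, so beta is estimable even though x is
# correlated with the unrestricted heterogeneity. Simulated data.
rng = np.random.default_rng(4)
n, T, beta = 500, 2, 1.5

alpha = rng.normal(0, 5, size=(n, 1))       # unobserved fixed effects
x = rng.normal(size=(n, T)) + alpha         # regressor correlated with alpha
y = beta * x + alpha + rng.normal(0, 0.1, size=(n, T))

dx = np.diff(x, axis=1).ravel()             # differencing removes alpha_i
dy = np.diff(y, axis=1).ravel()
beta_hat = (dx @ dy) / (dx @ dx)            # OLS on differenced data
print(beta_hat)                             # close to 1.5
```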
Max ERC Funding
1 410 000 €
Duration
Start date: 2010-12-01, End date: 2015-11-30
Project acronym EPOQUE
Project Engineering post-quantum cryptography
Researcher (PI) Peter SCHWABE
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Starting Grant (StG), PE6, ERC-2018-STG
Summary "Our digital society critically relies on protection of data and communication against espionage and cyber crime. Underlying all protection mechanisms is cryptography, which we are using
daily to protect, for example, internet communication or e-banking. This protection is threatened by the dawn of universal quantum computers, which will break large parts of the
cryptography in use today. Transitioning current cryptographic algorithms to crypto that resist attacks by large quantum computers, so called ""post-quantum cryptography"", is possibly the
largest challenge applied cryptography is facing since becoming a domain of public research in the second half of the last century. Large standardization bodies, most prominently ETSI and
NIST, have started efforts to evaluate concrete proposals of post-quantum crypto for standardization and deployment. NIST's effort follows in the tradition of successful public ""crypto
competitions"" with strong involvement by the academic cryptographic community. It is expected to run through the next 5 years.
This project will tackle the engineering challenges of post-quantum cryptography following two main research directions. The first direction investigates implementation characteristics of
submissions to NIST for standardization. These include speed on various platforms, code size, and RAM usage. Furthermore we will study so-called side-channel attacks and propose suitable
countermeasures. Side-channel attacks use information such as timing or power consumption of cryptographic devices to obtain secret information. The second direction is about protocol
integration. We will examine how different real-world cryptographic protocols can accommodate the drastically different performance characteristics of post-quantum cryptography, explore
what algorithms suit best the requirements of common usage scenarios of these protocols, and investigate if changes to the high-level protocol layer are advisable to improve overall system
performance."
Summary
"Our digital society critically relies on protection of data and communication against espionage and cyber crime. Underlying all protection mechanisms is cryptography, which we are using
daily to protect, for example, internet communication or e-banking. This protection is threatened by the dawn of universal quantum computers, which will break large parts of the
cryptography in use today. Transitioning current cryptographic algorithms to crypto that resist attacks by large quantum computers, so called ""post-quantum cryptography"", is possibly the
largest challenge applied cryptography is facing since becoming a domain of public research in the second half of the last century. Large standardization bodies, most prominently ETSI and
NIST, have started efforts to evaluate concrete proposals of post-quantum crypto for standardization and deployment. NIST's effort follows in the tradition of successful public ""crypto
competitions"" with strong involvement by the academic cryptographic community. It is expected to run through the next 5 years.
This project will tackle the engineering challenges of post-quantum cryptography following two main research directions. The first direction investigates implementation characteristics of
submissions to NIST for standardization. These include speed on various platforms, code size, and RAM usage. Furthermore we will study so-called side-channel attacks and propose suitable
countermeasures. Side-channel attacks use information such as timing or power consumption of cryptographic devices to obtain secret information. The second direction is about protocol
integration. We will examine how different real-world cryptographic protocols can accommodate the drastically different performance characteristics of post-quantum cryptography, explore
what algorithms suit best the requirements of common usage scenarios of these protocols, and investigate if changes to the high-level protocol layer are advisable to improve overall system
performance."
Max ERC Funding
1 500 000 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym ESCADA
Project Energy-optimized Symmetric Cryptography by Algebraic Duality Analysis
Researcher (PI) Joan DAEMEN
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary The main scientific contribution of this project will be a breakthrough in the understanding of cryptanalytic and side-channel attacks on symmetric cryptosystems. We will do this by a unification of attacks that will be a stepping stone to the holy grail of symmetric cryptography: provable security of concrete cryptosystems. The main real-world impact is that we will build cryptosystems that are much more efficient than those used today while having the same strength. Depending on the platform, higher efficiency translates to lower energy/power (in-body sensors, contactless payment cards, etc.), but also lower latency (authentication for e.g. car brakes or airbags) and/or lower heat dissipation (on-the-fly encryption of high-bandwidth data streams). In a software implementation it simply means fewer CPU cycles per byte.
We build our cryptosystems as modes, on top of block ciphers or permutations. For these primitives we adopt the classical technique of iterating a simple round function (more rounds means more security but less efficiency). We focus on round functions of algebraic degree 2. Their relative simplicity will allow a unification of all cryptanalytic attacks that exploit propagation of affine varieties and polynomial ideals (their dual) through the rounds, and will allow us to precisely estimate their success rates. Moreover, we will design modes that strongly restrict the exposure of the primitive(s) to attackers and that permit security reductions to specific properties of the underlying primitive(s) in a formally verifiable way. In comparison to the classical pseudorandom and ideal permutation models, this will allow reducing the number of rounds while preserving security with high assurance. We will also study side-channel attacks on our round functions and ways to defend against them. We will make ASIC prototypes, implement novel efficient countermeasures against side-channel attacks, and use this to evaluate their effectiveness in practice.
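A well-known example of a degree-2 ingredient of the kind the proposal builds on is the chi mapping used in Keccak/SHA-3, co-designed by the PI: each output bit equals x[i] XOR ((NOT x[i+1]) AND x[i+2]) with cyclic indices, so every output bit is a polynomial of algebraic degree 2 in the input bits. A minimal sketch:

```python
# chi, the non-linear step of Keccak/SHA-3 (co-designed by the PI), as an
# example of an algebraic-degree-2 mapping: every output bit is a degree-2
# polynomial of the input bits over GF(2).
def chi(bits):
    n = len(bits)
    return [bits[i] ^ ((bits[(i + 1) % n] ^ 1) & bits[(i + 2) % n])
            for i in range(n)]

row = [1, 0, 1, 1, 0]       # Keccak applies chi to 5-bit rows
print(chi(row))             # [0, 0, 1, 0, 0]
```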
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym Evoland
Project Evolution of regulatory landscapes at multiple timescales
Researcher (PI) Jose Luis GOMEZ-SKARMETA
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Advanced Grant (AdG), LS8, ERC-2016-ADG
Summary The evolution of animal morphology relies on changes in the developmental programs that control body plans and organ shape. Such changes are thought to arise from alterations in the expression of functionally conserved developmental genes and their vast downstream networks. Although this hypothesis has a profound impact on the way we view animal evolution, final proof is still lacking. The hypothesis calls for evolution to take place mainly through modifications of cis-regulatory elements (CREs) controlling gene expression. However, these genomic regions are precisely those that we understand the least and, until recently, basic knowledge of how regulatory information is organized in the 3D genome, or how to spatio-temporally assign CREs to their target genes, was lacking.
The advent of next-generation sequencing-based tools has made it possible to identify CREs genome-wide and to reveal how they are organized in the 3D genome. But this new knowledge has been largely ignored by most hypotheses on the evolution of gene expression, development and animal morphology. These new high-throughput methods have been applied mainly to selected model organisms, and due to the lack of sequence conservation of CREs across lineages, we still have very limited information about the impact of CREs on the evolution of animal morphology.
By integrating, in a systematic and phylogenetically driven manner, the contribution of CREs and their 3D organization to animal morphology at different evolutionary scales, we will for the first time link evolution, regulatory information, genome 3D architecture and morphology. We will apply this strategy to study animal morphology along the evolution of deuterostome body plans, the generation of fin morphological diversity in vertebrates, and the recent phenotypic changes in fish adapted to cave environments.
Our proposal will make ground-breaking advances in our understanding of the global principles underlying the evolution of cis-regulatory DNA and animal form.
Max ERC Funding
2 499 514 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym EXTREME
Project The Rise and Fall of Populism and Extremism
Researcher (PI) Maria PETROVA
Host Institution (HI) UNIVERSIDAD POMPEU FABRA
Call Details Starting Grant (StG), SH1, ERC-2018-STG
Summary In recent years, advanced democracies have seen a wave of electoral successes of populist politicians supporting extreme messages. Is populism caused by negative economic shocks? If so, what are the mechanisms? What explains heterogeneity in responses to such shocks? In this project, I will test empirically whether personal experiences, the information environment, and their interaction with aggregate economic shocks shape people’s political decisions. The project consists of three parts.
First, I will study how personal employment histories, potentially affected by globalization and technological shocks, individual predispositions, and the information environment influenced voting for Trump. I will use a unique database of more than 40 million resumes for the period 2010-2016, the largest available repository of job-seekers’ resumes in the US, which has not previously been used in academic research, and match it with zipcode-level economic and voting variables.
Second, I will study how negative social experiences during the formative years affect subsequent labor market outcomes, antisocial behavior, and support for the populist agenda. I will examine how corporal punishment in schools in the UK affected subsequent educational attainment, employment, antisocial behavior, and voting for UKIP and Brexit. I will digitize archival records on the regulation and practice of corporal punishment in different educational authorities in the UK during the 1970s and 1980s, combining them with contemporary outcomes.
Third, I will examine what makes people actively resist extremist regimes even when doing so carries high personal costs. I will study a historical example of resistance to the Nazi regime in Germany during WWII, which provides a unique methodological opportunity to study the determinants of resistance to extremism in a high-stakes environment. I will use a self-collected dataset on treason cases to measure resistance, combining it with data on bombing and exposure to foreign propaganda.
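The summary does not commit to an estimator for part 2, but a natural design, given that education authorities abolished corporal punishment at different times, is a two-way fixed-effects difference-in-differences comparison of cohorts within authority. The sketch below is a hypothetical illustration on simulated data.

```python
import numpy as np

# Hypothetical two-way fixed-effects difference-in-differences sketch:
# authorities abolish corporal punishment in different cohorts, so the
# effect is identified from within-authority before/after comparisons,
# netting out authority and cohort effects. Data are simulated.
rng = np.random.default_rng(5)
n_auth, n_cohorts, effect = 100, 20, -0.3

abolition = rng.integers(5, 15, size=n_auth)            # reform cohort
cohorts = np.arange(n_cohorts)
treated = (cohorts[None, :] >= abolition[:, None]).astype(float)

auth_fe = rng.normal(size=(n_auth, 1))
cohort_fe = np.linspace(0.0, 1.0, n_cohorts)[None, :]
y = auth_fe + cohort_fe + effect * treated + rng.normal(0, 0.1, (n_auth, n_cohorts))

def demean(a):  # two-way within transformation
    return a - a.mean(0, keepdims=True) - a.mean(1, keepdims=True) + a.mean()

td, yd = demean(treated).ravel(), demean(y).ravel()
print("estimated effect:", (td @ yd) / (td @ td))       # close to -0.3
```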
Max ERC Funding
1 467 736 €
Duration
Start date: 2019-01-01, End date: 2023-12-31