Project acronym AAS
Project Approximate algebraic structure and applications
Researcher (PI) Ben Green
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary This project studies several mathematical topics with a related theme, all of them part of the relatively new discipline known as additive combinatorics.
We look at approximate, or rough, variants of familiar mathematical notions such as group, polynomial or homomorphism. In each case we seek to describe the structure of these approximate objects, and then to give applications of the resulting theorems. This endeavour has already led to groundbreaking results in the theory of prime numbers, group theory and combinatorial number theory.
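For readers new to the area, one standard formalisation of such an approximate notion, the K-approximate group in the sense popularised by Tao, is sketched below as background (it is not quoted from the abstract):

```latex
% A K-approximate group: an "approximate" version of a subgroup.
Let $G$ be a group and $K \geq 1$. A finite set $A \subseteq G$ with
$e \in A$ and $A = A^{-1}$ is a \emph{$K$-approximate group} if its
product set is covered by at most $K$ translates of $A$:
\[
  A \cdot A \;=\; \{\, ab : a, b \in A \,\}
  \;\subseteq\; \bigcup_{i=1}^{K} x_i A
  \qquad \text{for some } x_1, \dots, x_K \in G .
\]
% For K = 1 this recovers an exact subgroup; larger K quantifies "roughness".
```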
Max ERC Funding
1 000 000 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym ABACUS
Project Advancing Behavioral and Cognitive Understanding of Speech
Researcher (PI) Bart De Boer
Host Institution (HI) VRIJE UNIVERSITEIT BRUSSEL
Call Details Starting Grant (StG), SH4, ERC-2011-StG_20101124
Summary I intend to investigate what cognitive mechanisms give us combinatorial speech. Combinatorial speech is the ability to make new words using pre-existing speech sounds. Humans are the only apes that can do this, yet we do not know how our brains do it, nor how exactly we differ from other apes. Using new experimental techniques to study human behavior and new computational techniques to model human cognition, I will find out how we deal with combinatorial speech.
The experimental part will study individual and cultural learning. Experimental cultural learning is a new technique that simulates cultural evolution in the laboratory. Two types of cultural learning will be used: iterated learning, which simulates language transfer across generations, and social coordination, which simulates the emergence of norms in a language community. Using the two types of cultural learning together with individual learning experiments will help to zero in, from three angles, on how humans deal with combinatorial speech. In addition, it will make a methodological contribution by comparing the strengths and weaknesses of the three methods.
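To make the iterated-learning paradigm concrete, here is a minimal, purely illustrative simulation sketch (the sound inventory, noise model and learner are assumptions of this sketch, not the project's experimental protocol): a toy lexicon is passed down a chain of learners, each acquiring it from a noisy sample of its predecessor's output.

```python
import random

SOUNDS = list("ptkbdgaiu")  # hypothetical sound inventory

def noisy_copy(word, error_rate=0.1):
    """Reproduce a word, occasionally substituting a sound (transmission noise)."""
    return "".join(random.choice(SOUNDS) if random.random() < error_rate else s
                   for s in word)

def learn(observed):
    """A deliberately simple learner: memorise the observed forms verbatim."""
    return list(observed)

def iterate(generations=10, lexicon_size=5, word_length=3):
    lexicon = ["".join(random.choices(SOUNDS, k=word_length))
               for _ in range(lexicon_size)]
    for g in range(generations):
        produced = [noisy_copy(w) for w in lexicon]  # the current generation speaks
        lexicon = learn(produced)                    # the next generation learns
        print(f"generation {g}: {lexicon}")

if __name__ == "__main__":
    iterate()
```

Real iterated-learning studies replace the trivial learner with human participants or richer models, which is what lets structure, rather than noise, accumulate over generations.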
The computer modeling part will formalize hypotheses about how our brains deal with combinatorial speech. Two models will be built: a high-level model that will establish the basic algorithms with which combinatorial speech is learned and reproduced, and a neural model that will establish in more detail how the algorithms are implemented in the brain. In addition, by increasing our understanding of how humans deal with speech, the models will help bridge the performance gap between human and computer speech recognition.
The project will advance science in four ways: it will provide insight into how our unique ability for using combinatorial speech works, it will tell us how this is implemented in the brain, it will extend the novel methodology of experimental cultural learning and it will create new computer models for dealing with human speech.
Max ERC Funding
1 276 620 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ABEL
Project "Alpha-helical Barrels: Exploring, Understanding and Exploiting a New Class of Protein Structure"
Researcher (PI) Derek Neil Woolfson
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Advanced Grant (AdG), LS9, ERC-2013-ADG
Summary "Recently through de novo peptide design, we have discovered and presented a new protein structure. This is an all-parallel, 6-helix bundle with a continuous central channel of 0.5 – 0.6 nm diameter. We posit that this is one of a broader class of protein structures that we call the alpha-helical barrels. Here, in three Work Packages, we propose to explore these structures and to develop protein functions within them. First, through a combination of computer-aided design, peptide synthesis and thorough biophysical characterization, we will examine the extents and limits of the alpha-helical-barrel structures. Whilst this is curiosity driven research, it also has practical consequences for the studies that will follow; that is, alpha-helical barrels made from increasing numbers of helices have channels or pores that increase in a predictable way. Second, we will use rational and empirical design approaches to engineer a range of functions within these cavities, including binding capabilities and enzyme-like activities. Finally, and taking the programme into another ambitious area, we will use the alpha-helical barrels to template other folds that are otherwise difficult to design and engineer, notably beta-barrels that insert into membranes to render ion-channel and sensor functions."
Max ERC Funding
2 467 844 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ABEP
Project Asset Bubbles and Economic Policy
Researcher (PI) Jaume Ventura Fontanet
Host Institution (HI) Centre de Recerca en Economia Internacional (CREI)
Call Details Advanced Grant (AdG), SH1, ERC-2009-AdG
Summary Advanced capitalist economies experience large and persistent movements in asset prices that are difficult to justify with economic fundamentals. The internet bubble of the 1990s and the real estate bubble of the 2000s are two recent examples. The predominant view is that these bubbles are a market failure, caused by some form of individual irrationality on the part of market participants. This project is based instead on the view that market participants are individually rational, although this does not preclude sometimes collectively sub-optimal outcomes. Bubbles are thus not a source of market failure by themselves but instead arise as a result of a pre-existing market failure, namely, the existence of pockets of dynamically inefficient investments. Under some conditions, bubbles partly solve this problem, increasing market efficiency and welfare. It is also possible, however, that bubbles do not solve the underlying problem and, in addition, create negative side-effects. The main objective of this project is to develop this view of asset bubbles, and to produce an empirically relevant macroeconomic framework that allows us to address the following questions: (i) What is the relationship between bubbles and financial market frictions? Special emphasis is given to how the globalization of financial markets and the development of new financial products affect the size and effects of bubbles. (ii) What is the relationship between bubbles, economic growth and unemployment? The theory suggests the presence of virtuous and vicious cycles, as economic growth creates the conditions for bubbles to pop up, while bubbles create incentives for economic growth to happen. (iii) What is the optimal policy to manage bubbles? We need to develop the tools that allow policy makers to sustain those bubbles that have positive effects and burst those that have negative effects.
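For background, a textbook gloss added here (in the spirit of Tirole's overlapping-generations analysis, not quoted from the abstract) of why rational bubbles hinge on dynamic inefficiency:

```latex
% Why rational bubbles require dynamic inefficiency (standard OLG logic).
A bubble asset paying no dividend must appreciate at the interest rate $r$:
\[
  b_{t+1} = (1+r)\, b_t .
\]
Relative to output $Y_t$ growing at rate $g$, the bubble share therefore obeys
\[
  \frac{b_{t+1}}{Y_{t+1}} = \frac{1+r}{1+g}\,\frac{b_t}{Y_t},
\]
so a positive bubble can persist only if $r \leq g$, that is, precisely when
the economy is dynamically inefficient, the ``pre-existing market failure''
the abstract refers to.
```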
Max ERC Funding
1 000 000 €
Duration
Start date: 2010-04-01, End date: 2015-03-31
Project acronym ABRSEIST
Project Antibiotic Resistance: Socio-Economic Determinants and the Role of Information and Salience in Treatment Choice
Researcher (PI) Hannes ULLRICH
Host Institution (HI) DEUTSCHES INSTITUT FUR WIRTSCHAFTSFORSCHUNG DIW (INSTITUT FUR KONJUNKTURFORSCHUNG) EV
Call Details Starting Grant (StG), SH1, ERC-2018-STG
Summary Antibiotics have contributed to a tremendous increase in human well-being, saving many millions of lives. However, antibiotics become obsolete the more they are used, as selection pressure promotes the development of resistant bacteria. The World Health Organization has proclaimed antibiotic resistance a major global threat to public health. Today, 700,000 deaths per year are due to untreatable infections. To win the battle against antibiotic resistance, new policies affecting the supply of and demand for existing and new drugs must be designed. I propose new research to identify and evaluate feasible and effective demand-side policy interventions targeting the relevant decision makers: physicians and patients. ABRSEIST will make use of a broad econometric toolset to identify mechanisms linking antibiotic resistance and consumption, exploiting a unique combination of physician-patient-level antibiotic resistance, treatment, and socio-economic data. Using machine learning methods adapted for causal inference, theory-driven structural econometric analysis, and randomization in the field, it will provide rigorous evidence on effective intervention designs. This research will improve our understanding of how prescribing, resistance, and the effect of antibiotic use on resistance are distributed in the general population, which has important implications for the design of targeted interventions. It will then estimate a structural model of general practitioners’ acquisition and use of information under uncertainty about resistance in prescription choice, allowing counterfactual analysis of information-improving policies such as mandatory diagnostic testing. The large-scale and structural econometric analyses allow flexible identification of physician heterogeneity, which ABRSEIST will exploit to design and evaluate targeted, randomized information nudges in the field. The result will be improved rational use of antibiotics and a toolset applicable in contexts of antibiotic prescribing.
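As a purely illustrative sketch of the kind of field evaluation the summary describes, the snippet below estimates the average treatment effect of a simulated randomized information nudge by a difference in means; all names, data and effect sizes are assumptions of the sketch, not ABRSEIST's design or results.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
nudge = rng.integers(0, 2, size=n)               # random assignment to the nudge
baseline = rng.normal(0.6, 0.1, size=n)          # physicians' baseline prescribing propensity
p = np.clip(baseline - 0.05 * nudge, 0.0, 1.0)   # assumed effect: nudge lowers prescribing by 5 points
prescribed = rng.binomial(1, p)                  # whether an antibiotic was prescribed

# Difference in means is unbiased for the ATE under random assignment.
treated, control = prescribed[nudge == 1], prescribed[nudge == 0]
ate = treated.mean() - control.mean()
se = np.sqrt(treated.var(ddof=1) / treated.size + control.var(ddof=1) / control.size)
print(f"estimated ATE: {ate:+.3f} (SE {se:.3f})")  # should be close to -0.05
```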
Max ERC Funding
1 498 920 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym ACAP
Project Agency Costs and Asset Pricing
Researcher (PI) Thomas Mariotti
Host Institution (HI) FONDATION JEAN-JACQUES LAFFONT,TOULOUSE SCIENCES ECONOMIQUES
Call Details Starting Grant (StG), SH1, ERC-2007-StG
Summary The main objective of this research project is to contribute to bridging the gap between the two main branches of financial theory, namely corporate finance and asset pricing. It is motivated by the conviction that these two aspects of financial activity should and can be analyzed within a unified framework. This research will borrow from these two approaches in order to construct theoretical models that allow one to analyze the design and issuance of financial securities, as well as the dynamics of their valuations. Unlike asset pricing, which takes the price of the fundamentals as given, the goal is to derive security price processes from a precise description of firms’ operations and internal frictions. Regarding the latter, and in line with traditional corporate finance theory, the analysis will emphasize the role of agency costs within the firm for the design of its securities. But the analysis will be pushed one step further by studying the impact of these agency costs on key financial variables such as stock and bond prices, leverage, book-to-market ratios, default risk, and the holding of liquidity by firms. One of the contributions of this research project is to show how these variables are interrelated when firms and investors agree upon optimal financial arrangements. The final objective is to derive a rich set of testable asset pricing implications that would eventually be brought to the data.
Max ERC Funding
1 000 000 €
Duration
Start date: 2008-11-01, End date: 2014-10-31
Project acronym ACB
Project The Analytic Conformal Bootstrap
Researcher (PI) Luis Fernando ALDAY
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE2, ERC-2017-ADG
Summary The aim of the present proposal is to establish a research team developing and exploiting innovative techniques to study conformal field theories (CFT) analytically. Our approach does not rely on a Lagrangian description but on symmetries and consistency conditions. As such it applies to any CFT, offering a unified framework to study generic CFTs analytically. The initial implementation of this program has already led to striking new results and insights for both Lagrangian and non-Lagrangian CFTs.
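For context, a gloss added here rather than taken from the proposal: the prototypical consistency condition of the conformal bootstrap is crossing symmetry of the four-point function, which constrains the CFT data with no Lagrangian input.

```latex
% Crossing symmetry for identical scalars \phi of dimension \Delta_\phi:
% expanding <\phi\phi\phi\phi> in conformal blocks g_{\Delta,\ell}(u,v)
% of the exchanged operators O in two channels must give equal results.
\[
  \sum_{\mathcal{O}} \lambda_{\mathcal{O}}^{2}\,
    g_{\Delta_{\mathcal{O}},\ell_{\mathcal{O}}}(u,v)
  \;=\;
  \left(\frac{u}{v}\right)^{\!\Delta_\phi}
  \sum_{\mathcal{O}} \lambda_{\mathcal{O}}^{2}\,
    g_{\Delta_{\mathcal{O}},\ell_{\mathcal{O}}}(v,u) ,
\]
% where u, v are the conformal cross-ratios and \lambda_O the OPE coefficients;
% the analytic bootstrap extracts CFT data from this equation in suitable limits.
```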
The overarching aims of my team will be: To develop an analytic bootstrap program for CFTs in general dimensions; to complement these techniques with more traditional methods and develop a systematic machinery to obtain analytic results for generic CFTs; and to use these results to gain new insights into the mathematical structure of the space of quantum field theories.
The proposal will bring together researchers from different areas. The objectives in brief are:
1) Develop an alternative to Feynman diagram computations for Lagrangian CFTs.
2) Develop machinery to compute loops for QFT on AdS, with and without gravity.
3) Develop an analytic approach to non-perturbative N=4 SYM and other CFTs.
4) Determine the space of all CFTs.
5) Gain new insights into the mathematical structure of the space of quantum field theories.
The outputs of this proposal will include a new way of doing perturbative computations based on symmetries; a constructive derivation of the AdS/CFT duality; new analytic techniques to attack strongly coupled systems and invaluable new lessons about the space of CFTs and QFTs.
Success in this research will lead to a completely new, unified way to view and solve CFTs, with a huge impact on several branches of physics and mathematics.
Max ERC Funding
2 171 483 €
Duration
Start date: 2018-12-01, End date: 2023-11-30
Project acronym ACCELERATES
Project Acceleration in Extreme Shocks: from the microphysics to laboratory and astrophysics scenarios
Researcher (PI) Luis Miguel De Oliveira E Silva
Host Institution (HI) INSTITUTO SUPERIOR TECNICO
Call Details Advanced Grant (AdG), PE2, ERC-2010-AdG_20100224
Summary What is the origin of cosmic rays? What are the dominant acceleration mechanisms in relativistic shocks? How do cosmic rays self-consistently influence the shock dynamics? How are relativistic collisionless shocks formed? These are longstanding scientific questions, closely tied to extreme plasma physics processes, in which a close interplay between the micro-instabilities and the global dynamics is critical.
Relativistic shocks are closely connected with the propagation of intense streams of particles pervasive in many astrophysical scenarios. It will also soon be possible to excite shocks in the laboratory with multi-PW lasers or intense relativistic particle beams.
Computational modeling is now established as a prominent research tool, enabling the fully kinetic modeling of these systems for the first time. With the fast-paced developments in high-performance computing, the time is ripe for a focused research programme on simulation-based studies of relativistic shocks. This proposal therefore focuses on using self-consistent ab initio massively parallel simulations to study the physics of relativistic shocks, bridging the gap between the multidimensional microphysics of shock onset, formation, and propagation and the global system dynamics. Particular focus will be given to the shock acceleration mechanisms and the radiation signatures of the various physical processes, with the goal of solving some of the central questions in plasma/relativistic phenomena in astrophysics and in the laboratory, and opening new avenues between theoretical/massive computational studies, laboratory experiments and astrophysical observations.
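To make "fully kinetic modeling" concrete, the sketch below is a deliberately minimal 1D electrostatic particle-in-cell (PIC) loop that develops the classic two-stream instability; the normalised units, weighting scheme and parameters are choices of this illustration, not the project's production codes.

```python
import numpy as np

# Toy 1D electrostatic PIC: two cold counter-streaming electron beams on a
# neutralising ion background. Units: plasma frequency = 1, box length L.
L, ng, npart, dt, steps = 4 * np.pi, 128, 20000, 0.1, 300
dx = L / ng
rng = np.random.default_rng(1)

x = rng.uniform(0, L, npart)                       # particle positions
v = np.where(rng.random(npart) < 0.5, 1.0, -1.0)   # two counter-streams at +/- 1
v += 0.01 * rng.standard_normal(npart)             # small thermal noise to seed modes
q = -L / npart                                     # macro-particle charge (mean density -1)

k = 2 * np.pi * np.fft.rfftfreq(ng, d=dx)          # wavenumbers for the Poisson solve

for step in range(steps):
    # deposit charge on the grid (nearest-grid-point weighting), add ion background
    rho = np.bincount((x / dx).astype(int) % ng, minlength=ng) * q / dx + 1.0
    # solve Poisson's equation  d^2 phi / dx^2 = -rho  in Fourier space
    rho_k = np.fft.rfft(rho)
    phi_k = np.zeros_like(rho_k)
    phi_k[1:] = rho_k[1:] / k[1:] ** 2
    E = -np.gradient(np.fft.irfft(phi_k, n=ng), dx)
    # gather the field at particle positions and push (electron: a = -E)
    a = -E[(x / dx).astype(int) % ng]
    v += a * dt
    x = (x + v * dt) % L

print("field energy:", 0.5 * np.sum(E ** 2) * dx)  # grows as the instability develops
```

Production kinetic codes add multiple dimensions, electromagnetics, relativistic pushers and massive parallelism, but the deposit/solve/gather/push cycle above is the core of the method.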
Max ERC Funding
1 588 800 €
Duration
Start date: 2011-06-01, End date: 2016-07-31
Project acronym ACCOPT
Project ACelerated COnvex OPTimization
Researcher (PI) Yurii NESTEROV
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE1, ERC-2017-ADG
Summary The amazing rate of progress in computer technologies and telecommunications presents many new challenges for Optimization Theory. New problems are usually very big in size, very special in structure, and possibly have distributed data support. This makes them unsolvable by standard optimization methods. In these situations, the old theoretical models, based on hidden Black-Box information, cannot work. New theoretical and algorithmic solutions are urgently needed. In this project we will concentrate on the development of fast optimization methods for problems of big and very big size. All the new methods will be endowed with provable efficiency guarantees for large classes of optimization problems arising in practical applications. Our main tool is the acceleration technique developed for standard Black-Box methods as applied to smooth convex functions. However, we will have to adapt it to deal with different situations.
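As a concrete reference point, here is a minimal sketch of the acceleration technique for smooth convex functions that the project takes as its starting point: Nesterov's accelerated gradient method, which attains the optimal O(1/k²) rate versus O(1/k) for plain gradient descent. The test problem and parameters are choices of this sketch, not part of the proposal.

```python
import numpy as np

def nesterov(grad, x0, lipschitz, iters=300):
    """Accelerated gradient method for an L-smooth convex function."""
    x, x_prev = x0.copy(), x0.copy()
    for k in range(1, iters + 1):
        y = x + (k - 1) / (k + 2) * (x - x_prev)  # extrapolation (momentum) step
        x_prev = x
        x = y - grad(y) / lipschitz               # gradient step from the extrapolated point
    return x

# Example: minimise the convex quadratic f(x) = 0.5 x^T A x - b^T x.
rng = np.random.default_rng(0)
M = rng.standard_normal((20, 10))
A, b = M.T @ M, rng.standard_normal(10)
L_smooth = np.linalg.eigvalsh(A).max()            # Lipschitz constant of the gradient
x_star = np.linalg.solve(A, b)
x_hat = nesterov(lambda x: A @ x - b, np.zeros(10), L_smooth)
print("error:", np.linalg.norm(x_hat - x_star))
```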
The first line of development will be based on the smoothing technique as applied to non-smooth functions. We propose to substantially extend this approach to generate approximate solutions in relative scale. The second line of research will be related to applying acceleration techniques to second-order methods minimizing functions with sparse Hessians. Finally, we aim to develop fast gradient methods for huge-scale problems. The size of these problems is so big that even the usual vector operations are extremely expensive. Thus, we propose to develop new methods with sublinear iteration costs. In our approach, the main source of improvements will be the proper use of problem structure.
Our overall aim is to be able to solve in a routine way many important problems which currently look unsolvable. Moreover, the theoretical development of Convex Optimization will reach a state where there is no gap between theory and practice: the theoretically most efficient methods will definitely outperform any homebred heuristics.
Max ERC Funding
2 090 038 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ACCUPOL
Project Unlimited Growth? A Comparative Analysis of Causes and Consequences of Policy Accumulation
Researcher (PI) Christoph KNILL
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Advanced Grant (AdG), SH2, ERC-2017-ADG
Summary ACCUPOL systematically analyzes an intuitively well-known but curiously under-researched phenomenon: policy accumulation. Societal modernization and progress bring about a continuously growing pile of policies in most political systems. At the same time, however, the administrative capacities for implementation are largely stagnant. While societally desirable in principle, ever more policies may hence deliver less in terms of actual policy achievements. Whether or not policy accumulation remains at a ‘sustainable’ rate thus crucially affects the long-term output legitimacy of modern democracies.
Given this development, the central focus of ACCUPOL lies on three questions: Do accumulation rates vary across countries and policy sectors? Which factors mitigate policy accumulation? And to what extent is policy accumulation really associated with an increasing prevalence of implementation deficits? In answering these questions, ACCUPOL radically departs from established research traditions in public policy.
First, the project develops new analytical concepts: Rather than relying on individual policy change as the unit of analysis, we consider policy accumulation to assess the growth of policy portfolios over time. In terms of implementation, ACCUPOL takes into account the overall prevalence of implementation deficits in a given sector instead of analyzing the effectiveness of individual implementation processes.
Second, this analytical innovation also implies a paradigmatic theoretical shift. Because existing theories focus on the analysis of individual policies, they are of limited help to understand causes and consequences of policy accumulation. ACCUPOL develops a novel theoretical approach to fill this theoretical gap.
Third, the project provides new empirical evidence on the prevalence of policy accumulation and implementation deficits focusing on 25 OECD countries and two key policy areas (social and environmental policy).
Max ERC Funding
2 359 000 €
Duration
Start date: 2018-10-01, End date: 2023-09-30