Project acronym BrainEnergy
Project Control of cerebral blood flow by capillary pericytes in health and disease
Researcher (PI) David ATTWELL
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Advanced Grant (AdG), LS5, ERC-2016-ADG
Summary Pericytes, located at intervals along capillaries, have recently been revealed as major controllers of brain blood flow. Normally, they dilate capillaries in response to neuronal activity, increasing local blood flow and energy supply. But in pathology they have a more sinister role. After an arterial blockage causes a stroke, the brain suffers from the so-called “no-reflow” phenomenon - a failure to fully reperfuse capillaries, even after the upstream occluded artery has been reperfused successfully. The resulting long-lasting decrease in energy supply damages neurons. I have shown that a major cause of no-reflow lies in pericytes: during ischaemia they constrict and then die in rigor. This reduces capillary diameter and blood flow, and probably degrades blood-brain barrier function. However, despite their crucial role in regulating blood flow physiologically and in pathology, little is known about the mechanisms by which pericytes function.
By using blood vessel imaging, patch-clamping, two-photon imaging, optogenetics, immunohistochemistry, mathematical modelling, and live human tissue obtained from neurosurgery, this programme of research will:
(i) define the signalling mechanisms controlling capillary constriction and dilation in health and disease;
(ii) identify the relative contributions of neurons, astrocytes and microglia to regulating pericyte tone;
(iii) develop approaches to preventing brain pericyte constriction and death during ischaemia;
(iv) define how pericyte constriction of capillaries and pericyte death contribute to Alzheimer’s disease;
(v) extend these results from rodent brain to human brain pericytes as a prelude to developing therapies.
The diseases to which pericytes contribute include stroke, spinal cord injury, diabetes and Alzheimer’s disease. These all have an enormous economic impact, as well as causing great suffering for patients and their carers. This work will provide novel therapeutic approaches for treating these diseases.
Max ERC Funding
2 499 954 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BYONIC
Project Beyond the Iron Curtain
Researcher (PI) Alessandro TAGLIABUE
Host Institution (HI) THE UNIVERSITY OF LIVERPOOL
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary As one of the largest carbon reservoirs in the Earth system, the ocean is central to understanding past, present and future fluctuations in atmospheric carbon dioxide. In this context, microscopic plants called phytoplankton are key as they consume carbon dioxide during photosynthesis and transfer part of this carbon to the ocean’s interior and ultimately the lithosphere. The overall abundance of phytoplankton also forms the foundation of ocean food webs and drives the richness of marine fisheries.
It is key that we understand the drivers of variation in phytoplankton growth, so we can explain changes in ocean productivity and the global carbon cycle, as well as project future trends with confidence. The numerical models we rely on for these tasks cannot currently do so, however, due to a major theoretical gap concerning the role of trace metals in shaping phytoplankton growth in the ocean. This omission is particularly acute at regional scales, where subtle interactions between trace metals can lead to co-limitation of biological activity. While we have long known that trace metals are fundamentally important to the photosynthesis and respiration of phytoplankton, it is only very recently that the large-scale oceanic datasets required by numerical models have become available. I am leading such efforts for the trace metal iron, but we urgently need to expand our approach to other essential trace metals such as cobalt, copper, manganese and zinc.
This project will combine knowledge of the biological requirements for trace metals with these newly emerging datasets to move ‘beyond the iron curtain’ and develop the first ever complete numerical model of resource limitation of phytoplankton growth, accounting for co-limiting interactions. Via a progressive combination of data synthesis and state-of-the-art modelling, I will deliver a step-change in how we understand the way resource availability controls life in the ocean.
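As a point of reference for what 'accounting for co-limiting interactions' involves, the sketch below contrasts two textbook ways of combining nutrient-limitation terms - Liebig's law of the minimum and a multiplicative form - using Monod (half-saturation) kinetics for several trace metals. The concentrations, half-saturation constants and choice of metals are purely illustrative and are not values from the project.

```python
import numpy as np

def monod(conc, k_half):
    """Monod limitation term in [0, 1] for a single resource."""
    return conc / (conc + k_half)

def growth_liebig(mu_max, conc, k_half):
    """Liebig's law of the minimum: growth set by the scarcest resource."""
    terms = [monod(c, k) for c, k in zip(conc, k_half)]
    return mu_max * min(terms)

def growth_multiplicative(mu_max, conc, k_half):
    """Multiplicative co-limitation: every scarce resource reduces growth."""
    terms = [monod(c, k) for c, k in zip(conc, k_half)]
    return mu_max * np.prod(terms)

# Illustrative concentrations (nM) and half-saturation constants for Fe, Co, Zn, Mn.
conc   = [0.1, 0.02, 0.5, 0.3]
k_half = [0.2, 0.01, 0.4, 0.1]
mu_max = 1.0  # maximum growth rate, d^-1

print("Liebig        :", growth_liebig(mu_max, conc, k_half))
print("Multiplicative:", growth_multiplicative(mu_max, conc, k_half))
```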
Max ERC Funding
1 668 418 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym COMPASS
Project COMPASS: Climate-relevant Ocean Measurements and Processes on the Antarctic continental Shelf and Slope
Researcher (PI) Karen HEYWOOD
Host Institution (HI) UNIVERSITY OF EAST ANGLIA
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary Processes on the Antarctic continental shelf and slope are crucially important for determining the rate of future sea level rise, setting the properties and volume of dense bottom water exported globally, and regulating the carbon cycle. Yet our ability to model and predict these processes over future decades remains rudimentary. This deficiency in understanding originates in a lack of observations in this inaccessible region. The COMPASS project seeks to rectify that by exploiting new technology - autonomous marine vehicles called gliders - to observe, quantify and elucidate processes on the continental shelf and slope of Antarctica that are important for climate.
The COMPASS objective is to make a step-change in our quantitative understanding of:
(i) the ocean front that marks the boundary between the Antarctic continental shelf and the open ocean, and its associated current system;
(ii) the interaction between ocean, atmosphere and sea-ice on the Antarctic continental shelf; and
(iii) the exchange of heat, salt and freshwater with the cavities beneath ice shelves.
These goals will be met by a series of targeted ocean glider campaigns around Antarctica, spanning different flow regimes, including areas where warm water is able to access the continental shelf and influence ice shelves, areas where the continental shelf is cold and fresh, and areas where the continental shelf hosts cold, salty, dense water that eventually spills into the abyss. A unique circumpolar assessment of ocean properties and dynamics, including instabilities and mixing, will be undertaken. COMPASS will develop new technology to deploy a profiling glider into inaccessible environments such as Antarctic polynyas (regions of open water surrounded by sea-ice). As well as scientific breakthroughs that will feed into future climate assessments, improving projections of future sea level rise and global temperatures, COMPASS will deliver enhanced designs for future ocean observing systems.
Max ERC Funding
3 499 270 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym Connections
Project Oligopoly Markets and Networks
Researcher (PI) Andrea Galeotti
Host Institution (HI) LONDON BUSINESS SCHOOL
Call Details Consolidator Grant (CoG), SH1, ERC-2016-COG
Summary Via our connections we learn about new ideas, the quality of products, new investment opportunities and job opportunities. We influence and are influenced by our circle of friends. Firms are interconnected in complex processes of production and distribution. A firm’s decisions in a supply chain depend on other firms’ choices in the same supply chain, as well as on firms' behaviour in competing chains. Research on networks over the last 20 years has provided a set of tools for studying systems of interconnected economic agents. This project will advance the state of the art by developing new applications of networks to better understand modern oligopoly markets.
The project is organised into two sub-projects. In sub-project 1, networks will be used to model the diffusion and adoption of network goods. Different consumers' network locations will summarise their different levels of influence. The objectives are to understand how firms incorporate information about consumers' influence into their marketing strategies—pricing strategy and product design. The sub-project will provide a rigorous framework for evaluating how firms' increasing ability to gather information on consumers’ influence affects outcomes in markets with network effects. In sub-project 2, networks will be used to model how inputs—e.g., intermediate goods and patents—are combined to deliver final goods. Possible applications are supply chains, communication networks and networks of patents. The objectives are to study firms' strategic behaviour, such as pricing and R&D investments, in a complex process of production and distribution, and to identify the basic network metrics that are useful for describing market power. This is particularly important for guiding competition authorities and the like when they evaluate mergers in complex interconnected markets.
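To make the idea of a network location summarising influence concrete, the sketch below computes Katz-Bonacich centrality for a small, hypothetical consumer network; in the networks literature this index captures how strongly each node amplifies behaviour elsewhere in the network, and it appears in standard characterisations of optimal pricing under network effects. The adjacency matrix and decay parameter are illustrative assumptions, not project inputs.

```python
import numpy as np

def katz_bonacich(adj, delta):
    """Katz-Bonacich centrality b = (I - delta*A)^(-1) 1, well defined when
    delta is below 1 / (spectral radius of A)."""
    n = adj.shape[0]
    spectral_radius = max(abs(np.linalg.eigvals(adj)))
    assert delta < 1.0 / spectral_radius, "delta too large for convergence"
    return np.linalg.solve(np.eye(n) - delta * adj, np.ones(n))

# Hypothetical 5-consumer influence network (1 = the two consumers interact).
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)

b = katz_bonacich(A, delta=0.2)
print("Katz-Bonacich centralities:", np.round(b, 3))
# Consumers with higher centrality are natural targets for discounts or early
# product seeding in a diffusion or pricing strategy.
```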
Max ERC Funding
829 000 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym ECCLES
Project Emergent Constraints on Climate-Land feedbacks in the Earth System
Researcher (PI) Peter COX
Host Institution (HI) THE UNIVERSITY OF EXETER
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary The Land Biosphere is a critical component of the Earth System, linking to climate through multiple feedback processes. Understanding these feedback processes is a huge intellectual challenge. In part because of the pioneering work of the PI (Cox et al., 2000), many of the climate projections reported in the IPCC 5th Assessment Report (AR5) now include climate-carbon cycle feedbacks. However, the latest Earth System Models (ESMs) continue to show a huge range in the projected responses of the land carbon cycle over the 21st century. This uncertainty threatens to undermine the value of these projections to inform climate policy. This project (ECCLES) is designed to produce significant reductions in the uncertainties associated with land-climate interactions, using the novel concept of Emergent Constraints - relationships between future projections and observable variations in the current Earth System that are common across the ensemble of ESMs. Emergent Constraints have many attractive features but chief amongst these is that they can make ensembles of ESMs more than the sum of their parts - allowing the full range of ESM projections to be used collectively, alongside key observations, to reduce uncertainties in future climate projections. The project will deliver: (i) a theoretical foundation for Emergent Constraints; (ii) new datasets on the changing function of the land biosphere; (iii) Emergent Constraints on land-climate interactions based on observed temporal and spatial variations; (iv) a new generation of scientists expert in land-climate interactions and Emergent Constraints. ECCLES will benefit from the expertise and experience of the PI, which includes training as a theoretical physicist, an early career developing models of the land biosphere for ESMs, and a current career in a department of mathematics where he is at the forefront of efforts to develop and apply the concept of Emergent Constraints (Cox et al., 2013, Wenzel et al., 2016).
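The emergent-constraint recipe can be summarised in a few lines: regress the projected quantity of interest on an observable present-day quantity across the ensemble of ESMs, then combine the fitted relationship with the real-world observation and its uncertainty to narrow the projection. The sketch below applies this generic recipe to entirely synthetic ensemble values; it illustrates the method, not any specific constraint from the project.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic ensemble: observable present-day metric x and projected change y
# for 15 hypothetical ESMs, with an underlying linear relationship plus scatter.
n_models = 15
x = rng.uniform(0.5, 2.0, n_models)
y = 1.5 * x + rng.normal(0.0, 0.3, n_models)

# Across-ensemble regression y ~ a*x + b.
a, b = np.polyfit(x, y, 1)
resid_sd = np.std(y - (a * x + b), ddof=2)

# Hypothetical observational estimate of x with uncertainty.
x_obs, x_obs_sd = 1.2, 0.1

# Constrained projection: propagate observational and regression uncertainty.
y_constrained = a * x_obs + b
y_constrained_sd = np.hypot(a * x_obs_sd, resid_sd)

print(f"Unconstrained ensemble: {y.mean():.2f} +/- {y.std(ddof=1):.2f}")
print(f"Emergent constraint   : {y_constrained:.2f} +/- {y_constrained_sd:.2f}")
```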
Max ERC Funding
2 249 834 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym EPP
Project Econometrics for Public Policy: Sampling, Estimation, Decision, and Applications
Researcher (PI) Toru KITAGAWA
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Starting Grant (StG), SH1, ERC-2016-STG
Summary One of the ultimate goals of economics is to inform policies that improve welfare. Although a vast amount of empirical work in economics aims to achieve this goal, the current state of the art in econometrics offers no concrete recommendation for how to estimate the welfare-maximizing policy. This project addresses statistically optimal and practically useful ways to learn the welfare-maximizing policy from data by developing novel econometric frameworks, sampling designs, and estimation approaches that can be applied to a wide range of real-world policy design problems.
The development of econometric methods for optimal empirical policy design proceeds by answering the following open questions. First, given a sampling process, how do we define optimal estimation of the welfare-maximizing policy? Second, what estimation method achieves this statistical optimality? Third, how do we solve the policy decision problem when the sampling process only set-identifies the social welfare criterion? Fourth, how can we integrate the sampling and estimation steps to develop a package of optimal sampling and optimal estimation procedures?
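As a toy illustration of what estimating a welfare-maximizing policy from data can look like (anticipating the treatment-assignment part listed below), the sketch searches a simple class of threshold rules on a covariate and selects the rule with the highest inverse-propensity-weighted welfare estimate from a hypothetical randomized experiment. The data-generating process, covariate and policy class are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical randomized experiment: covariate x, treatment d (prob 0.5), outcome y.
n = 2000
x = rng.uniform(0, 1, n)
d = rng.binomial(1, 0.5, n)
# Treatment helps only individuals with x above ~0.6 (unknown to the analyst).
y = 1.0 + d * (x - 0.6) + rng.normal(0, 0.5, n)
prop = 0.5  # known propensity score in the experiment

def empirical_welfare(threshold):
    """IPW estimate of mean outcome if treatment is assigned when x >= threshold."""
    assign = (x >= threshold).astype(float)
    weights = assign * d / prop + (1 - assign) * (1 - d) / (1 - prop)
    return np.mean(weights * y)

grid = np.linspace(0, 1, 101)
welfare = [empirical_welfare(t) for t in grid]
best = grid[int(np.argmax(welfare))]
print(f"Estimated welfare-maximizing rule: treat if x >= {best:.2f}")
```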
I divide the project into the following four parts. Each part is motivated by important empirical applications and has methodological challenges related to these four questions.
1) Estimation of treatment assignment policy
2) Estimation of optimal policy in other public policy applications
3) Policy design with set-identified social welfare
4) Sampling design for empirical policy design
Max ERC Funding
1 291 064 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym FLEXNEURO
Project Flexible and robust nervous system function from reconfiguring networks
Researcher (PI) Timothy O'LEARY
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Starting Grant (StG), LS5, ERC-2016-STG
Summary It is now possible to monitor and manipulate neurons in live, awake animals, revealing how patterns of neural activity represent information and give rise to behaviour. Very recent experiments show that many circuits have physiology and connectivity that is highly variable and that changes continually, even when an animal’s behaviour and environment are stable. Existing theories of brain function assume that neural circuit parameters only change as required during learning and development. This paradigm cannot explain how consistent behaviour can emerge from circuits that continually reconfigure, nor what mechanisms might drive variability and continual change. Understanding this deep puzzle requires new theory and new ways to interpret experimental data. I will develop a theory of reconfiguring circuits by significantly generalizing my previous work that uses control theory to show how network activity can be maintained in spite of variability and continual turnover of crucial circuit components. We will analyse how biological plasticity mechanisms steer collective properties of neurons and circuits toward functional states without requiring individual parameters to be fixed, resulting in circuit models with consistent output but variable and mutable internal structures. In close collaboration with leading experimentalists we will challenge these modelling principles to account for new findings which reveal that navigation, sensory percepts and learned associations are underpinned by surprisingly dynamic, variable circuit connectivity and physiology. This will generate new, exciting questions that will drive experiments and theory together: how can known plasticity mechanisms generate reconfigurable neural representations? Do continually reconfiguring networks possess unique functional flexibility and robustness, and are they vulnerable to specific pathologies? And how can we design new experiments to test theories of robust, reconfigurable networks?
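A minimal, hypothetical illustration of the control-theoretic point that circuit output can be regulated while individual parameters vary: the sketch below uses an integral-feedback rule to adjust two redundant 'conductances' until a scalar activity readout reaches a target. Different initial conditions end at different parameter combinations yet the same output - the kind of degeneracy the project builds on. The activity model, rates and target are illustrative, not the project's models.

```python
import numpy as np

def simulate(g_init, target=1.0, dt=0.01, steps=50_000, tau=50.0):
    """Integral control: both conductances are nudged by the same activity error."""
    g = np.array(g_init, dtype=float)
    for _ in range(steps):
        activity = 0.6 * g[0] + 0.4 * g[1]   # toy readout of circuit output
        error = target - activity
        g += dt * error / tau                # same error drives both parameters
        g = np.maximum(g, 0.0)               # conductances stay non-negative
    return g, 0.6 * g[0] + 0.4 * g[1]

for g0 in ([0.1, 0.1], [2.0, 0.2], [0.0, 3.0]):
    g, act = simulate(g0)
    print(f"start {g0} -> conductances {np.round(g, 2)}, activity {act:.3f}")
```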
Max ERC Funding
1 299 191 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym FUNCOPLAN
Project Functions of plasticity in adult-born neurons
Researcher (PI) Matthew Stuart GRUBB
Host Institution (HI) KING'S COLLEGE LONDON
Call Details Consolidator Grant (CoG), LS5, ERC-2016-COG
Summary The major objective of FUNCOPLAN is to examine groundbreaking questions on the functional role of newly-generated neurons in the adult brain. Using a combination of innovative approaches, our aim is to discover how plasticity in adult-born cells shapes information processing in neuronal circuits.
Adult neurogenesis produces new neurons in particular areas of the mammalian brain throughout life. Because they undergo a transient period of heightened plasticity, these freshly-generated cells are believed to bring unique properties to the circuits they join – a continual influx of new, immature cells is believed to provide a level of plasticity not achievable by the mature, resident network alone. But what exactly is the function of the additional plasticity provided by adult-born neurons? How does it influence information processing in neuronal networks?
These questions are vital for our fundamental understanding of how the brain works. We will address them by studying a unique population of cells that is continually generated throughout life: dopaminergic neurons in the olfactory bulb. These cells play a key role in the modulation of early sensory responses and are renowned for their plastic capacity. However, the role of this plasticity in shaping sensory processing remains completely unknown. FUNCOPLAN’s first objectives, therefore, are to discover novel experience-dependent plastic changes in the cellular features and sensory response properties of adult-born neurons. We will then go much further than this, however, by integrating our discoveries with state-of-the-art techniques for precisely manipulating activity in these cells in vivo. This wholly innovative approach will allow us to mimic the effects of plasticity in naïve circuits, or cancel the effects of plasticity in experience-altered networks. In this way, we will break new ground, demonstrating a unique contribution of plasticity in adult-born cells to the fundamental function of neuronal circuitry.
Max ERC Funding
2 000 000 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym GEOSTICK
Project Morphodynamic Stickiness: the influence of physical and biological cohesion in sedimentary systems
Researcher (PI) Daniel Roy PARSONS
Host Institution (HI) UNIVERSITY OF HULL
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary Our coasts, estuaries, & lowland river environments are among the systems most sensitive to sea-level rise & environmental change. In order to manage these systems, & adapt to future changes, we desperately need to be able to predict how they will alter under various scenarios. However, our models for these environments are not yet robust enough to predict, with confidence, very far into the future. Moreover, we also need to improve how we use our understanding of modern environments in reconstructing paleo-environments, where significant assumptions have been made in applying relationships derived from modern systems to ancient rocks.
One of the main reasons our models & geological interpretations of these environments are not yet good enough is that their formulations assume these systems are composed only of non-cohesive sands. However, mud is the most common sediment on Earth & many of these systems are actually dominated by biologically-active muds & complex sediment mixtures. We therefore need to find ways to incorporate the effect of sticky mud & sticky biological components into our predictions. Recent work my colleagues & I have published shows just how important such abiotic-biotic interactions can be: inclusion of only relatively small (<0.1% by mass) quantities of biological material into sediment mixtures can reduce alluvial bedform size by an order of magnitude.
However, this is just a start & there is much to do to advance our fundamental understanding & develop robust models that predict the combined effects of abiotic & biotic processes on the morphological evolution of these environments under changing drivers & conditions. GEOSTICK will deliver this advance, allowing us to test how sensitive these environments are, assess whether there are tipping points in their resilience & examine evidence for the evolution of life in the ancient sediments of early Earth and Mars.
Max ERC Funding
2 581 155 €
Duration
Start date: 2017-05-01, End date: 2022-04-30
Project acronym GLOBALFIRMS
Project Global Firms and Global Value Chains: Measurement and Mechanisms
Researcher (PI) Kalina MANOVA
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Consolidator Grant (CoG), SH1, ERC-2016-COG
Summary The growing fragmentation of production across firms and countries has revolutionized international trade in recent decades. Firms today choose which production stages to conduct themselves and which to outsource to other parties, which to complete at home and which to offshore abroad. Known as global value chains (GVCs), this phenomenon creates new challenges and opportunities for individual firms and aggregate economies. Of primary policy interest are the implications of GVCs for growth, the transmission of shocks across firms and borders, and the design of economic policies. Yet academic research has faced two major challenges: poor measurement and poorly understood mechanisms.
I propose an ambitious research program that will use exceptional new data and novel GVC measures for path-breaking GVC analysis. First, I will exploit unique panel data on firm production, management practices, export and import transactions for the world’s two largest export economies, China and the US; and unique panel data on firm production, export and import transactions, and the network of domestic firm-to-firm transactions for one of the most open economies, Belgium. Second, I will develop measures that comprehensively characterize three dimensions of firms’ GVC activity: value added (total/domestic/foreign), production line position (upstreamness), and network position (centrality). Third, I will empirically and theoretically examine the impact of GVCs on firm growth, shock transmission, and export-finance policy through six synergistic projects. Each project will make a distinct contribution by investigating new economic mechanisms, establishing new empirical facts, and combining theory and data for informative welfare calculations.
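To fix ideas on the 'production line position (upstreamness)' measure, the sketch below applies the standard input-output recursion U = (I - Delta)^(-1) 1, in which Delta[i, j] is the share of industry i's output absorbed as an input by industry j, to a small hypothetical three-industry table. The table is illustrative only; the project's measures would be constructed from the firm-level transaction data described above.

```python
import numpy as np

# Hypothetical 3-industry example.
# d[i, j] = value of industry i's output needed per dollar of industry j's output.
d = np.array([[0.10, 0.30, 0.20],
              [0.05, 0.10, 0.40],
              [0.00, 0.05, 0.10]])
Y = np.array([100.0, 80.0, 120.0])   # gross output of each industry

# Delta[i, j] = d[i, j] * Y[j] / Y[i]: share of i's output used as an input by j,
# as in the standard upstreamness recursion U = (I - Delta)^(-1) * 1.
Delta = d * Y[np.newaxis, :] / Y[:, np.newaxis]
U = np.linalg.solve(np.eye(3) - Delta, np.ones(3))
print("Upstreamness:", np.round(U, 2))   # higher = further from final demand
```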
The novelty of the data and the complex mechanisms driving GVCs make this research program highly ambitious. At the same time, the importance of understanding GVCs for economic policy and academic research make this agenda extraordinarily high-return.
Max ERC Funding
1 462 304 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym GlymphEye
Project The Ocular Glymphatic System
Researcher (PI) Maiken Nedergaard
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Advanced Grant (AdG), LS5, ERC-2016-ADG
Summary The glymphatic system, recently described by my team, is a highly organized brain-wide mechanism by which fluid wastes are removed from the brain. The glymphatic system clears 65% of amyloid-beta from the normal adult brain. A rapidly evolving literature has shown that the major neurodegenerative diseases of the eye, macular degeneration and glaucoma, may also result from the toxicity of uncleared protein wastes, including amyloid-beta. Yet the eye, like the brain, has no traditional lymphatic vessels. In this application, I propose that two of the most significant causes of human visual loss, macular degeneration and glaucoma – previously thought of as both intractable and unrelated – are instead mechanistically allied disorders that not only share a common causal pathway, but may both be therapeutically modified by targeting dysregulation of the glymphatic pathway. As such, this proposal seeks to link the biology of a fundamentally new pathway for both metabolic substrate and waste transport in the adult brain to diseases of the eye that have long been resistant to both understanding and treatment.
The objectives are:
WP1: Define the cellular mechanisms that drive ocular glymphatic transport of amyloid-beta using an ex vivo preparation of the optic nerve.
WP2: Use magnetic resonance imaging (MRI) to establish the existence of ocular glymphatic transport in live animals.
WP3: Determine whether the ocular glymphatic system, like the brain lymphatic system, is critically regulated by the sleep-wake cycle.
WP4: Test the hypothesis that age-related macular degeneration is caused by a suppression of ocular glymphatic transport, with secondary accumulation of toxic protein products in and subjacent to the retinal pigment epithelium.
WP5: Define the impact of increased intraocular pressure on glymphatic export of amyloid-beta, and test the hypothesis that a decrease in ocular glymphatic transport contributes to degeneration of retinal ganglion cells in glaucoma.
Max ERC Funding
2 176 250 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym GOCART
Project Gauging Ocean organic Carbon fluxes using Autonomous Robotic Technologies
Researcher (PI) Stephanie Anne HENSON
Host Institution (HI) NATURAL ENVIRONMENT RESEARCH COUNCIL
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary Climate change driven by CO2 emissions from human activities is a significant challenge facing mankind. An important component of Earth’s carbon (C) cycle is the ocean’s biological C pump; without it atmospheric CO2 would be ~50% higher than it is now. The pump consists of sinking organic matter which is remineralised back into CO2 in the deep ocean. The depth at which remineralisation occurs is the main factor affecting the amount of organic C stored in the ocean. Currently we do not understand how or why remineralisation depth varies in time, which limits our ability to make robust predictions of how the C cycle, and hence our climate, will change in the future. This is mainly due to the challenges of measuring remineralisation depth using conventional methods - a barrier which autonomous underwater vehicles are poised to overcome by providing high-frequency data over long periods. This technological innovation will revolutionise our understanding of this important planetary C flux.
I propose an ambitious project to address current uncertainties in remineralisation depth. GOCART encompasses new observations, obtained using cutting-edge technology and novel methodology, through to global climate modelling. Underwater glider deployments will be used to establish the characteristics and significance of temporal variability in organic C flux and remineralisation depth during the most dynamic period of the year. This will enable new insights into the factors driving variability in remineralisation depth, ultimately leading to development of a new model parameterisation incorporating temporal variability. Using an innovative modelling framework, this parameterisation will be tested for its potential to improve predictions of ocean C storage. GOCART represents a significant advance in quantifying temporal variability in remineralisation depth, which is key to reducing uncertainty in model predictions of ocean C storage, and yet currently almost entirely unknown.
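For orientation, the sketch below evaluates the widely used empirical 'Martin curve', F(z) = F(z_ref) * (z/z_ref)^(-b), a standard way of describing how sinking organic carbon flux attenuates with depth: a larger exponent b corresponds to shallower remineralisation and less carbon reaching the deep ocean. The flux value and exponents are illustrative and are not project results.

```python
import numpy as np

def martin_flux(z, f_ref, z_ref=100.0, b=0.86):
    """Sinking organic carbon flux at depth z (m) from the Martin power law."""
    return f_ref * (z / z_ref) ** (-b)

f_100 = 10.0                      # illustrative export flux at 100 m, mmol C m-2 d-1
depths = np.array([100, 500, 1000, 2000])

for b in (0.6, 0.86, 1.2):        # deeper vs shallower remineralisation profiles
    flux = martin_flux(depths, f_100, b=b)
    print(f"b = {b:4.2f}:", np.round(flux, 2))
```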
Max ERC Funding
1 999 110 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym HETEROPOLIS
Project The Design of Social Policy in a Heterogeneous World
Researcher (PI) Johannes SPINNEWIJN
Host Institution (HI) LONDON SCHOOL OF ECONOMICS AND POLITICAL SCIENCE
Call Details Starting Grant (StG), SH1, ERC-2016-STG
Summary Modern societies are characterized by tremendous heterogeneity in economic outcomes: from heterogeneity in wages and employment, to heterogeneity in capital income, wealth and health outcomes. It is unclear, however, how to map heterogeneity in these outcomes to heterogeneity in welfare. This mapping is crucial for the design of tax and benefit systems, providing insurance against individual risk and redistributing income between individuals, while maintaining proper incentives.
The main objectives of HETEROPOLIS are: 1) to provide new insights on the relation between inequality in earnings, wealth and consumption, 2) to develop a new consumption-based method to measure welfare inequality and heterogeneity in the marginal value of social transfers, 3) to provide and implement a simple, but general evidence-based framework to evaluate the differential design of social insurance based on observable heterogeneity, 4) to analyse selection effects due to unobservable heterogeneity and how they affect social insurance design, 5) to analyse heterogeneity in behavioural “biases” and their consequences for policy design.
The first part of HETEROPOLIS analyses the use of registry-based consumption measures to evaluate heterogeneity in welfare and exploits a newly developed data set based on administrative registers for the universe of Swedish households, which provides comprehensive and detailed information on income, wealth, labour market outcomes and other variables. The second part develops and implements a general evidence-based framework to evaluate the design of multi-faceted social insurance programs in a heterogeneous world. The final part of HETEROPOLIS analyses and estimates different sources of heterogeneity that affect market efficiency and justify further government interventions.
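In its simplest textbook form, a registry-based consumption measure is an accounting identity: consumption equals disposable income minus active saving, where active saving is the change in net wealth purged of capital gains. The sketch below applies that identity to hypothetical household records; the column names and the capital-gains adjustment are illustrative assumptions rather than the project's actual imputation procedure.

```python
import pandas as pd

# Hypothetical registry extract: one row per household-year.
df = pd.DataFrame({
    "household": [1, 1, 2, 2],
    "year": [2015, 2016, 2015, 2016],
    "disposable_income": [300_000, 310_000, 450_000, 470_000],
    "net_wealth": [500_000, 560_000, 1_200_000, 1_150_000],
    "capital_gains": [0, 20_000, 0, -80_000],   # e.g. revaluation of housing/stocks
})

df = df.sort_values(["household", "year"])
# Active saving = change in net wealth minus passive revaluation (capital gains).
df["d_wealth"] = df.groupby("household")["net_wealth"].diff()
df["active_saving"] = df["d_wealth"] - df["capital_gains"]
# Imputed consumption from the budget identity C = Y - active saving.
df["consumption"] = df["disposable_income"] - df["active_saving"]
print(df[["household", "year", "consumption"]].dropna())
```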
Max ERC Funding
1 497 505 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym ITHACA
Project An Information Theoretic Approach to Improving the Reliability of Weather and Climate Simulations
Researcher (PI) Timothy PALMER
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary The aim of this project is to develop a new synergy between climate and computer science to increase the accuracy and hence reliability of comprehensive weather and climate models. The scientific basis for this project lies in the PI’s pioneering research on stochastic sub-grid parametrisations for climate models. These parametrisations provide estimates of irreducible uncertainty in weather and climate models, and will be used to determine where numerical precision for model variables can be reduced without degradation. By identifying those bits that carry negligible information – typically in high-wavenumber components of the dynamical core and within parametrisation and Earth-System modules – computational resources can be reinvested into areas (resolution, process representation, ensemble size) where they are sorely needed. This project will determine scale-dependent estimates of information content as rigorously as possible based on a variety of new tools, which include information-theoretic diagnostics and emulators of imprecision, and in a variety of models, from idealised to comprehensive. The project will contribute significantly to the development of next-generation weather and climate models and is well timed for the advent of exascale supercomputing where energy efficiency is paramount and where movement of bits, being the single biggest determinant of power consumption, must be minimised. The ideas will be tested on emerging hardware capable of exploiting the benefits of mixed-precision arithmetic. A testable scientific hypothesis is presented: a proposed increase in forecast reliability arising from an increase in the forecast model’s vertical resolution, the cost being paid for by a reduction in precision of small-scale variables. This project can be expected to provide new scientific understanding of how different scales interact in the nonlinear climate system, for example in maintaining persistent atmospheric flow regimes.
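As a small illustration of the reduced-precision idea, the sketch below rounds the floating-point significand of a synthetic field to a chosen number of bits and compares the resulting error with an assumed stochastic-parametrisation noise level: bits whose removal leaves the error well below that irreducible uncertainty are candidates for discarding. The field, bit counts and noise amplitude are all illustrative.

```python
import numpy as np

def reduce_precision(x, sig_bits):
    """Emulate reduced precision by keeping `sig_bits` bits of the significand."""
    mantissa, exponent = np.frexp(x)                    # x = mantissa * 2**exponent
    mantissa = np.round(mantissa * 2.0**sig_bits) / 2.0**sig_bits
    return np.ldexp(mantissa, exponent)

rng = np.random.default_rng(2)
field = np.cumsum(rng.normal(size=10_000))              # synthetic model field
stochastic_sd = 1e-3 * np.std(field)                    # assumed irreducible noise

for bits in (52, 23, 10, 5):                            # double, single, ~half, lower
    err = np.std(field - reduce_precision(field, bits))
    flag = "negligible" if err < stochastic_sd else "significant"
    print(f"{bits:2d} significand bits: rounding error {err:.2e} ({flag})")
```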
Max ERC Funding
2 494 117 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym MANANDNATURE
Project Man and Nature in Developing Countries
Researcher (PI) Robin BURGESS
Host Institution (HI) LONDON SCHOOL OF ECONOMICS AND POLITICAL SCIENCE
Call Details Advanced Grant (AdG), SH1, ERC-2016-ADG
Summary The growth required to lift a billion people out of extreme poverty will require large increases in natural resource extraction and energy consumption. The negative externalities this growth creates – through degradation of forests and oceans, pollution and climate change – will affect us all. This is a proposal to create a new body of research on natural resource management and energy use in developing countries. It is distinctive for four reasons. First, it brings novel, applied micro techniques from development economics to the study of environmental and energy economics. Second, it harnesses new data collection technologies using satellites and randomized controlled trials. Third, we pioneer the use of political economy approaches to understand the gap between de jure and de facto policies. Finally, we innovate on policy design by embedding researchers with policy partners to co-generate research and ensure that findings scale directly into policies.
On natural resources, we propose three projects which use newly available satellite data. The first examines regression discontinuities along the Brazilian border to understand why deforestation has slowed in the Brazilian Amazon but not in neighbouring countries (an illustrative border-discontinuity sketch follows this summary). The second employs structural modeling to look at how economic and political factors influence the ignition and spread of forest fires in Indonesia. The third looks at whether regulating access to parts of the ocean can enhance its productivity and ability to absorb carbon.
On energy, we propose three collaborative projects which employ randomized trials to look at how to improve access to energy. The first examines how to get consumers to pay for the electricity they use in contexts where theft, non-payment and mispricing of electricity are rife. The second estimates a demand curve for solar electricity to understand how solar power may contribute to meeting rising energy demand. The third looks at the impacts of grid expansion in a largely unelectrified country.
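As a hedged illustration only (the summary names the regression-discontinuity design but no data or code), a border discontinuity of the kind used in the first natural-resources project can be sketched as a local-linear fit on either side of the border; the variable names, the 50 km bandwidth and the simulated pixels below are hypothetical.

    import numpy as np

    def rd_estimate(distance_km, deforested, bandwidth=50.0):
        # Local-linear regression discontinuity at the border (distance = 0).
        # distance_km: signed distance to the border (> 0 inside Brazil);
        # deforested: pixel-level indicator of forest clearing.
        d = np.asarray(distance_km, dtype=float)
        y = np.asarray(deforested, dtype=float)

        def border_rate(mask):
            # Fit y = a + b*d within the bandwidth on one side; the intercept a
            # is the predicted deforestation rate at the border on that side.
            X = np.column_stack([np.ones(mask.sum()), d[mask]])
            coef, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
            return coef[0]

        inside = (d > 0) & (d <= bandwidth)
        outside = (d < 0) & (d >= -bandwidth)
        return border_rate(inside) - border_rate(outside)

    # Hypothetical usage on simulated pixels with a 5-point drop inside Brazil.
    rng = np.random.default_rng(1)
    dist = rng.uniform(-100, 100, size=50_000)
    cleared = rng.binomial(1, np.where(dist > 0, 0.05, 0.10))
    print(f"estimated jump at the border: {rd_estimate(dist, cleared):+.3f}")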
Max ERC Funding
1 932 655 €
Duration
Start date: 2017-12-01, End date: 2022-11-30
Project acronym MEME
Project Memory Engram Maintenance and Expression
Researcher (PI) Tomas RYAN
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Starting Grant (StG), LS5, ERC-2016-STG
Summary The goal of this project is to understand how specific memory engrams are physically stored in the brain. Connectionist theories of memory storage have guided research into the neuroscience of memory for over half a century, but have received little direct proof due to experimental limitations. The major obstacle that has limited direct testing of such theories has been an inability to identify the cells and circuits that store specific memories. Memory engram technology, which allows the tagging and in vivo manipulation of specific engram cells, has recently allowed us to overcome this empirical limitation and has revolutionised the way memory can be studied in rodent models. Based on our research, it is now known that sparse populations of hippocampal neurons that were active during a defined learning experience are both sufficient and necessary for retrieval of specific contextual memories. More recently, we have established that hippocampal engram cells preferentially synapse directly onto postsynaptic engram cells. This “engram cell connectivity” could provide the neurobiological substrate for the storage of multimodal memories through a distributed engram circuit. However, it is currently unknown whether engram cell connectivity itself is important for memory function. The proposed integrative neuroscience project will employ interdisciplinary methods to directly probe the importance of engram cell connectivity for memory retrieval, storage, and encoding. The outcomes will directly inform a novel and comprehensive neurobiological model of memory engram storage.
Max ERC Funding
1 500 000 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym MICA
Project Mechanics of slow earthquake phenomena: an Integrated perspective from the Composition, geometry, And rheology of plate boundary faults
Researcher (PI) Ake Fagereng
Host Institution (HI) CARDIFF UNIVERSITY
Call Details Starting Grant (StG), PE10, ERC-2016-STG
Summary Major tectonic faults have, until recently, been thought to accommodate displacement either by continuous creep or by episodic, damaging earthquakes. High-resolution geophysical networks have now detected ‘slow earthquakes’, transient modes of displacement that are faster than creep but slower than earthquakes. This project aims to illuminate the unknown mechanism behind slow earthquakes through an integrated, multi-scale approach. MICA uses the unique natural laboratory of exhumed and active faults to build numerical models constrained by observed fault geometry and microstructurally defined deformation mechanisms, and thereby to determine, for the first time, the rheology of slow slip.
The first objective is to create a model of the slow earthquake source by constraining the micro- to kilometre-scale internal geometry of plate boundary faults and the spatial distribution of deformation mechanisms. Fault rocks also retain a deformation sequence, allowing insight into how deformation style evolves with time. Thus, a combination of drill samples from active faults and outcrops of exhumed analogues, from a range of depths, allows for a 4-D model from the micro- to the plate boundary scale.
By establishing the geometrical distribution of fault rocks, and deciphering their evolution in time, this project will apply geologically constrained numerical models and laboratory-constrained stress-strain relationships to determine bulk fault rheology as a function of space. Unlike past models, this project integrates scales from microstructures to plate-boundary-scale faults, bases rheological models on deformation mechanisms and fault structures constrained through detailed fieldwork, and takes account of state-of-the-art geophysical observations. The model focuses on understanding slow earthquakes, but will also address whether the slow earthquake source can host fast seismic slip, and what differentiates slowly slipping faults from faults hosting major earthquakes.
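For orientation only (the summary does not commit to a particular flow law), laboratory-constrained stress-strain relationships of the kind mentioned above are commonly written as a power-law creep equation, with A, n and Q treated here as laboratory-calibrated placeholders rather than values from the proposal:

    % illustrative power-law creep flow law; A, n and Q are placeholders
    \dot{\varepsilon} = A\,\sigma^{n} \exp\!\left(-\frac{Q}{RT}\right)

where \dot{\varepsilon} is strain rate, \sigma differential stress, Q an activation energy, R the gas constant and T temperature; combining such laws according to the mapped distribution of fault-rock types is one route to the bulk, spatially varying fault rheology targeted here.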
Max ERC Funding
1 499 244 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym MiMo
Project Inference in Microeconometric Models
Researcher (PI) Koen JOCHMANS
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Starting Grant (StG), SH1, ERC-2016-STG
Summary Unobserved differences between economic agents are an important driver behind the differences in their economic outcomes such as schooling decisions, wages, and employment durations. Allowing for such unobserved heterogeneity in economic modeling equips the specification with an additional dimension of realism but presents major challenges for econometric practice. Hence, reconciling heterogeneity in the data with econometric models is an issue of utmost importance.
The aim of this project is to develop inference methods for models with unobserved heterogeneity by exploiting the identifying power of longitudinal (panel) data. The project consists of three blocks. Together, they cover most modern applications of panel data.
The first block deals with inference on nonlinear models and enhances the performance of statistical hypothesis tests (a generic specification is sketched after this summary). So far, the literature has focused on point estimation. However, it is statistical inference that accounts for uncertainty in the data and forms the basis for testing economic restrictions. The second block makes progress on the estimation of models for network data. The importance of social and economic connections is well established, but few formal results are available. We exploit the fact that network data can be seen as a type of panel data to derive such results. The third block uses panel data to non-parametrically estimate dynamic discrete-choice models with unobserved type heterogeneity and/or latent state variables. Such results do not yet exist, even though dynamic discrete-choice models are a workhorse tool in labor economics and industrial organization.
The performance of the tools will be assessed theoretically and via simulation, and they will be applied to various empirical problems. Two examples of applications that we will study are the extensive margin of labor force participation and the determinants of the import and export behavior of firms and countries.
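Purely to fix ideas (the summary names no specific model), the kind of nonlinear panel specification with unobserved heterogeneity discussed in the first block can be written, for the outcome y of agent i in period t, as

    % generic nonlinear panel model with fixed effects (illustrative only)
    y_{it} = g\!\left(x_{it}'\beta + \alpha_i\right) + \varepsilon_{it},
    \qquad i = 1,\dots,N, \quad t = 1,\dots,T,

where the fixed effects \alpha_i capture agent-specific unobserved heterogeneity and \beta the common parameters of interest; with T small, estimating the many incidental parameters \alpha_i biases standard estimators and test statistics for \beta, which is one reason inference, rather than point estimation alone, needs dedicated treatment.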
Max ERC Funding
1 294 739 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym MINORG
Project The role of minerals in the oceanic carbon cycle
Researcher (PI) Caroline Louise Peacock
Host Institution (HI) UNIVERSITY OF LEEDS
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary The oceanic carbon cycle is key to regulating the Earth system because, in sediments and seawater, the balance between the degradation and preservation of organic carbon (OC) exerts a first-order control on atmospheric CO2 and O2. In sediments, OC is preserved over millions of years, while in seawater, a dissolved form of recalcitrant OC has recently been recognised as critical to OC storage over anthropogenic timescales. Both sedimentary and seawater OC are derived from living organisms, and should therefore be easily degraded. Their persistence is thus profoundly puzzling. Quite simply, we do not know how or why OC is preserved. A long-standing hypothesis suggests that protection of OC inside minerals might account for the vast OC stores preserved in sediments. In a NEW hypothesis, based on recent work by the PI and proposed here for the first time, the interaction of OC with minerals might ALSO account for the even larger stores of dissolved OC preserved in seawater. Together these concepts could revolutionise our understanding of OC degradation and preservation, but the extent to which minerals preserve OC in sediments and seawater is still unknown, largely because the mechanisms that control how OC interacts with minerals are almost entirely unconstrained. MINORG will quantify the role of minerals in the preservation of OC for the first time, by combining cutting-edge molecular-level techniques with the first ever comprehensive and fully integrated experimental and modelling campaign, to determine in unprecedented detail the exact mechanisms responsible for the interaction of OC with minerals, and its subsequent degradation and preservation behaviour. MINORG hypothesises that minerals play a MAJOR role in the preservation of OC, in both its sedimentary and seawater forms, and is uniquely poised to test this hypothesis. This project will contribute substantially to our quantitative understanding of the oceanic carbon cycle, and so to predicting current climate change.
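As a hedged illustration only (the summary does not specify how OC-mineral interaction will be parameterised), sorption of OC to mineral surfaces is often summarised by a Langmuir-type isotherm, with the parameters below treated as placeholders:

    % illustrative Langmuir isotherm for OC sorption to a mineral surface
    q = \frac{q_{\max}\,K\,C}{1 + K\,C}

where q is the OC sorbed per unit mass of mineral, C the dissolved OC concentration, q_max the sorption capacity and K an affinity constant; isotherms of this kind are one common way OC-mineral interaction is quantified, although the molecular-level mechanisms targeted by MINORG may require more detailed descriptions.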
Max ERC Funding
1 985 996 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym RECAP
Project constRaining the EffeCts of Aerosols on Precipitation
Researcher (PI) Philip STIER
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary Precipitation is of fundamental importance, so it is vital to understand its response to anthropogenic perturbations. Aerosols have been proposed to significantly affect precipitation [e.g. Ramanathan et al., 2001]. However, despite major research efforts, evidence for a systematic aerosol effect on precipitation remains “ambiguous” [IPCC AR5, Stocker et al., 2013].
The vast majority of prior research [even an entire World Meteorological Organisation assessment report: Levin and Cotton, 2009] has taken a process-driven approach: trying to infer aerosol effects on precipitation by modelling or observing the chain of microphysical processes, from aerosols acting as cloud condensation or ice nuclei, via cloud microphysics, to precipitation formation in individual clouds. However, this relies on a complete understanding of a very complex and uncertain process chain, and no clear strategies exist for scaling the response of individual clouds or cloud systems up to larger scales.
RECAP will break this deadlock by introducing a radically different approach to aerosol effects on precipitation. RECAP will systematically constrain the energetic control of aerosol effects on precipitation, delivering the first comprehensive and physically consistent assessment of the effect of aerosols on precipitation across scales, uniting energetic and process-driven approaches.
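For orientation only (the summary invokes but does not write down the budget), the energetic control of precipitation referred to above is usually expressed through the atmospheric energy balance, in which the latent heating supplied by precipitation is balanced by net atmospheric radiative cooling minus the surface sensible heat flux:

    % illustrative atmospheric energy constraint on mean precipitation
    L_v\,\bar{P} \;\approx\; Q_{\mathrm{cool}} - \mathrm{SH}

where L_v is the latent heat of vaporisation, \bar{P} the mean precipitation rate, Q_cool the net radiative cooling of the atmospheric column and SH the surface sensible heat flux; aerosol perturbations to the radiative cooling term therefore constrain precipitation changes without requiring the full microphysical process chain to be resolved.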
Max ERC Funding
2 225 713 €
Duration
Start date: 2017-05-01, End date: 2022-04-30