Project acronym ATM-GTP
Project Atmospheric Gas-to-Particle conversion
Researcher (PI) Markku KULMALA
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary Atmospheric Gas-to-Particle conversion (ATM-GTP) is a 5-year project focusing on one of the most critical atmospheric processes relevant to global climate and air quality: the first steps of atmospheric aerosol particle formation and growth. The project will concentrate on the currently missing, environment-specific knowledge about the interacting, non-linear, physical and chemical atmospheric processes associated with nano-scale gas-to-particle conversion (GTP). The main scientific objective of ATM-GTP is to create a deep understanding of atmospheric GTP in the sub-5 nm size range, particularly in heavily polluted Chinese megacities like Beijing and in pristine environments like Siberia and the Nordic high-latitude regions. We also aim to find out how nano-GTP is associated with air quality-climate interactions and feedbacks. We are interested in quantifying the effect of nano-GTP on the COBACC (Continental Biosphere-Aerosol-Cloud-Climate) feedback loop that is important in Arctic and boreal regions. Our approach will make it possible to identify mechanisms that reduce secondary air pollution by a factor of 5-10 and to make reliable estimates of global and regional aerosol loads, including the anthropogenic and biogenic contributions to these loads. We can then estimate the future role of the Northern Hemispheric biosphere in reducing global radiative forcing via the quantified feedbacks. The project is carried out by a world-leading scientist in atmospheric aerosol science, who is also one of the founders of terrestrial ecosystem meteorology, together with his research team. The project uses novel infrastructures including the SMEAR (Stations Measuring Ecosystem Atmospheric Relations) stations, related modelling platforms and regional data from Russia and China. The work will be carried out in synergy with several national, Nordic and EU research-innovation projects: Finnish Center of Excellence-ATM, Nordic CoE-CRAICC and EU-FP7-BACCHUS.
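As a rough, purely illustrative sketch of the sub-5 nm regime discussed above (all values are placeholders, not project results), the time a freshly nucleated cluster needs to grow through the sub-5 nm window, and a crude estimate of the fraction surviving coagulational scavenging, can be computed assuming a constant growth rate and a constant coagulation sink:

import math

# Assumed, illustrative values (not measurements from the project)
d_start_nm = 1.5          # diameter of a freshly nucleated cluster [nm]
d_end_nm = 5.0            # upper edge of the sub-5 nm window studied here [nm]
growth_rate_nm_h = 3.0    # condensational growth rate [nm/h]
coag_sink_per_h = 2.0     # coagulation sink onto pre-existing aerosol [1/h]

# Time needed to grow through the sub-5 nm window at constant growth rate
t_growth_h = (d_end_nm - d_start_nm) / growth_rate_nm_h

# Crude survival fraction assuming a constant loss rate while growing
survival = math.exp(-coag_sink_per_h * t_growth_h)

print(f"time to grow from {d_start_nm} to {d_end_nm} nm: {t_growth_h:.2f} h")
print(f"fraction surviving coagulational scavenging: {survival:.2f}")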
Max ERC Funding
2 500 000 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym BYONIC
Project Beyond the Iron Curtain
Researcher (PI) Alessandro TAGLIABUE
Host Institution (HI) THE UNIVERSITY OF LIVERPOOL
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary As one of the largest carbon reservoirs in the Earth system, the ocean is central to understanding past, present and future fluctuations in atmospheric carbon dioxide. In this context, microscopic plants called phytoplankton are key as they consume carbon dioxide during photosynthesis and transfer part of this carbon to the ocean’s interior and ultimately the lithosphere. The overall abundance of phytoplankton also forms the foundation of ocean food webs and drives the richness of marine fisheries.
It is key that we understand the drivers of variations in phytoplankton growth, so that we can explain changes in ocean productivity and the global carbon cycle, as well as project future trends with confidence. The numerical models we rely on for these tasks are prevented from doing so at present, however, by a major theoretical gap concerning the role of trace metals in shaping phytoplankton growth in the ocean. This gap is particularly acute at regional scales, where subtle interactions between trace metals can lead to co-limitation of biological activity. While we have long known that trace metals are fundamentally important to the photosynthesis and respiration of phytoplankton, it is only very recently that the large-scale oceanic datasets required by numerical models have become available. I am leading such efforts with the trace metal iron, but we urgently need to expand our approach to other essential trace metals such as cobalt, copper, manganese and zinc.
This project will combine knowledge of the biological requirements for trace metals with these newly emerging datasets to move ‘beyond the iron curtain’ and develop the first complete numerical model of resource limitation of phytoplankton growth, accounting for co-limiting interactions. Via a progressive combination of data synthesis and state-of-the-art modelling, I will deliver a step-change in how we think resource availability controls life in the ocean.
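As a minimal illustration of what co-limiting interactions mean in such a model (a generic, textbook-style parameterisation, not the model proposed here; the concentrations and half-saturation constants are invented), growth can be limited by several trace metals at once, with the limitation terms combined either by Liebig's law of the minimum or multiplicatively:

def monod(conc, k_half):
    """Single-resource limitation term in [0, 1]."""
    return conc / (conc + k_half)

def growth_rate(concs, k_halves, mu_max=1.0, rule="liebig"):
    """Growth rate under several potentially limiting resources."""
    terms = [monod(c, k) for c, k in zip(concs, k_halves)]
    if rule == "liebig":          # most limiting resource sets the rate
        limitation = min(terms)
    else:                         # multiplicative co-limitation
        limitation = 1.0
        for t in terms:
            limitation *= t
    return mu_max * limitation

# Hypothetical dissolved concentrations and half-saturation constants
# for iron, cobalt and zinc (nmol/L); placeholders, not measurements.
concs = {"Fe": 0.1, "Co": 0.05, "Zn": 0.5}
k_halves = {"Fe": 0.2, "Co": 0.02, "Zn": 0.3}

for rule in ("liebig", "multiplicative"):
    mu = growth_rate(concs.values(), k_halves.values(), rule=rule)
    print(rule, round(mu, 3))

The two combination rules give different growth rates for the same concentrations, which is exactly the kind of structural choice the data synthesis is meant to constrain.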
Max ERC Funding
1 668 418 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym C2Phase
Project Closure of the Cloud Phase
Researcher (PI) Corinna HOOSE
Host Institution (HI) KARLSRUHER INSTITUT FUER TECHNOLOGIE
Call Details Starting Grant (StG), PE10, ERC-2016-STG
Summary Whether and where clouds consist of liquid water, ice or both (i.e. their thermodynamic phase distribution) has major impacts on the clouds’ dynamical development, their radiative properties, their efficiency in forming precipitation, and their impacts on the atmospheric environment. Cloud ice formation in the temperature range between 0 and -37°C is initiated by aerosol particles acting as heterogeneous ice nuclei and propagates through the cloud via a multitude of microphysical processes. Enormous progress has been made in recent years concerning the understanding and model parameterization of primary ice formation. In addition, high-resolution atmospheric models with complex cloud microphysics schemes can now be employed for realistic case studies of clouds. Finally, new retrieval schemes for the cloud (top) phase have recently been developed for various satellites, including passive polar-orbiting and geostationary sensors, which provide good spatial and temporal coverage and a long data record.
We propose here to merge the bottom-up, forward modeling approach for the cloud phase distribution with the top-down view of satellites. C2Phase will conduct systematic closure studies for variables related to the cloud phase distribution such as the cloud ice area fraction, its distribution as a function of temperature and its temporal evolution, with a focus on Europe. For this, we will (1) use clustering techniques to separate different cloud regimes in model and satellite data, (2) explore the parameters and processes which the simulated phase distribution is most sensitive to, (3) investigate whether closure is reached between state-of-the-art cloud-resolving models and satellite observations, and how this closure can be improved by consistent and physically justified changes in microphysical parameterizations, and (4) use our results to improve the representation of mixed-phase clouds in weather and climate models and to quantify the impacts of these improvements.
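One closure diagnostic of the kind described above can be sketched as follows (an assumed, simplified workflow with synthetic data, not C2Phase code): the cloud-top ice fraction is binned by temperature in the same way for model output and for a satellite phase retrieval, and the two curves are then compared bin by bin:

import numpy as np

def ice_fraction_by_temperature(cloud_top_temp_c, is_ice, bin_edges):
    """Fraction of cloudy pixels classified as ice in each temperature bin."""
    frac = np.full(len(bin_edges) - 1, np.nan)
    idx = np.digitize(cloud_top_temp_c, bin_edges) - 1
    for b in range(len(bin_edges) - 1):
        sel = idx == b
        if sel.any():
            frac[b] = is_ice[sel].mean()
    return frac

# Synthetic demo data standing in for model / satellite pixel samples
rng = np.random.default_rng(0)
t_model = rng.uniform(-37, 0, 10000)
t_sat = rng.uniform(-37, 0, 10000)
ice_model = rng.random(10000) < (-t_model / 37)        # ice more likely when colder
ice_sat = rng.random(10000) < (-t_sat / 37) ** 1.3

edges = np.arange(-40, 5, 5)
print("model    :", ice_fraction_by_temperature(t_model, ice_model, edges).round(2))
print("satellite:", ice_fraction_by_temperature(t_sat, ice_sat, edges).round(2))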
Max ERC Funding
1 499 549 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym CancerFluxome
Project Cancer Cellular Metabolism across Space and Time
Researcher (PI) Tomer Shlomi
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), LS2, ERC-2016-STG
Summary The metabolism of cancer cells is altered to meet cellular requirements for growth, providing novel means to selectively target tumorigenesis. While extensively studied, our current view of cancer cellular metabolism is fundamentally limited by a lack of information on the variability in metabolic activity between distinct subcellular compartments and cells.
We propose to develop a spatio-temporal fluxomics approach for quantifying metabolic fluxes in the cytoplasm vs. mitochondria as well as their cell-cycle dynamics, combining mass-spectrometry based isotope tracing with cell synchronization, rapid cellular fractionation, and computational metabolic network modelling.
Spatio-temporal fluxomics will be used to revisit and challenge our current understanding of central metabolism and its adaptation to oncogenic events – an important endeavour considering that mitochondrial bioenergetics and biosynthesis are required for tumorigenesis and that evidence is accumulating for metabolic alterations throughout the cell cycle.
Our preliminary results show intriguing oscillations between oxidative and reductive TCA cycle flux throughout the cell-cycle. We will explore the extent to which cells adapt their metabolism to fulfil the changing energetic and anabolic demands throughout the cell-cycle, how metabolic oscillations are regulated, and their benefit to cells in terms of thermodynamic efficiency. Spatial flux analysis will be instrumental for investigating glutaminolysis - a ‘hallmark’ metabolic adaptation in cancer involving shuttling of metabolic intermediates and cofactors between mitochondria and cytoplasm.
On the clinical front, our spatio-temporal fluxomics analysis will enable us to disentangle oncogene-induced flux alterations, which play an important tumorigenic role, from artefacts originating from population averaging. A comprehensive view of how cells adapt their metabolism in response to oncogenic mutations will reveal novel targets for anti-cancer drugs.
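For readers unfamiliar with flux estimation, the core constraint behind computational metabolic network modelling can be sketched on a toy network (illustrative only; the network, the measured fluxes and the solver choice are assumptions, not the project's pipeline): at metabolic steady state the stoichiometric matrix S and the flux vector v satisfy S v = 0, so unmeasured fluxes can be estimated from measured ones:

import numpy as np

# Toy network: uptake -> A, A -> B (vAB), B -> C (vBC), B -> D (vBD)
# Rows = internal metabolites A, B; columns = fluxes in flux_names order
flux_names = ["v_uptake", "vAB", "vBC", "vBD"]
S = np.array([
    [ 1, -1,  0,  0],   # A: produced by uptake, consumed by vAB
    [ 0,  1, -1, -1],   # B: produced by vAB, consumed by vBC and vBD
], dtype=float)

# Suppose uptake and vBC were measured (e.g. from isotope tracing)
measured = {0: 10.0, 2: 6.0}

# Least-squares estimate of the remaining fluxes under S @ v = 0
unknown = [j for j in range(S.shape[1]) if j not in measured]
A = S[:, unknown]
b = -S[:, list(measured)] @ np.array(list(measured.values()))
v_unknown, *_ = np.linalg.lstsq(A, b, rcond=None)

v = dict(zip(unknown, v_unknown))
v.update(measured)
print({flux_names[j]: round(float(val), 2) for j, val in sorted(v.items())})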
Max ERC Funding
1 481 250 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym cis-CONTROL
Project Decoding and controlling cell-state switching: A bottom-up approach based on enhancer logic
Researcher (PI) Stein Luc AERTS
Host Institution (HI) VIB
Call Details Consolidator Grant (CoG), LS2, ERC-2016-COG
Summary Cell-state switching in cancer allows cells to transition from a proliferative to an invasive and drug-resistant phenotype. This plasticity plays an important role in cancer progression and tumour heterogeneity. We have made the striking observation that cancer cells of different origin can switch to a common survival state. During this epigenomic reprogramming, cancer cells re-activate genomic enhancers from specific regulatory programs, such as wound repair and epithelial-to-mesenchymal transition.
The goal of my project is to decipher the enhancer logic underlying this canalization effect towards a common survival state. We will then employ this new understanding of enhancer logic to engineer synthetic enhancers that are able to monitor and manipulate cell-state switching in real time. Furthermore, we will use enhancer models to identify cis-regulatory mutations that have an impact on cell-state switching and drug resistance. Such applications are currently hampered because there is a significant gap in our understanding of how enhancers work.
To tackle this problem we will use a combination of in vivo massively parallel enhancer-reporter assays, single-cell genomics on microfluidic devices, computational modelling, and synthetic enhancer design. Using these approaches we will pursue the following aims: (1) to identify functional enhancers regulating cell-state switching by performing in vivo genetic screens in mice; (2) to elucidate the dynamic trajectories whereby cells of different cancer types switch to a common survival cell-state, at single-cell resolution; (3) to create synthetic enhancer circuits that specifically kill cancer cells undergoing cell-state switching.
Our findings will have an impact on genome research, characterizing how cellular decision making is implemented by the cis-regulatory code; and on cancer research, employing enhancer logic in the context of cancer therapy.
Max ERC Funding
1 999 660 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym CITIZINGLOBAL
Project Citizens, Institutions and Globalization
Researcher (PI) Giacomo Antonio Maria PONZETTO
Host Institution (HI) Centre de Recerca en Economia Internacional (CREI)
Call Details Starting Grant (StG), SH1, ERC-2016-STG
Summary Globalization has brought the world economy unprecedented prosperity, but it poses governance challenges. It needs governments to provide the infrastructure for global economic integration and to refrain from destructive protectionism; yet it can engender popular discontent and a crisis of democracy. My proposal will study when trade- and productivity-enhancing policies enjoy democratic support; why voters may instead support inefficient, surplus-reducing policies; and how political structure reacts to globalization.
Part A studies the puzzling popularity of protectionism and how lobbies can raise it by manipulating information. It will study empirically if greater transparency causes lower trade barriers. It will introduce salience theory to political economics and argue that voters overweight concentrated losses and disregard diffuse benefits. It will show that lobbies can raise protection by channeling information to insiders and advertising the plight of displaced workers.
Part B studies inefficient infrastructure policy and the ensuing spatial misallocation of economic activity. It will show that voters’ unequal knowledge lets local residents capture national policy. They disregard nationwide positive externalities, so investment in major cities is insufficient, but also nationwide taxes, so spending in low-density areas is excessive. It will argue that the fundamental attribution error causes voter opposition to growth-enhancing policies and efficient incentive schemes like congestion pricing.
Part C studies how the size of countries and international unions adapts to expanding trade opportunities. It will focus on three forces: cultural diversity, economies of scale and scope in government, and trade-reducing border effects. It will show they explain increasing country size in the 19th century; the rise and fall of colonial empires; and the recent emergence of regional and global economic unions, accompanied by a peaceful increase in the number of countries.
Max ERC Funding
960 000 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym CLIMAHAL
Project Climate dimension of natural halogens in the Earth system: Past, present, future
Researcher (PI) Alfonso SAIZ LOPEZ
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DE INVESTIGACIONES CIENTIFICAS
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary Naturally-emitted very short-lived halogens (VSLH) have a profound impact on the chemistry and composition of the atmosphere, destroying greenhouse gases and altering aerosol production, which together can change the Earth's radiative balance. Therefore, natural halogens possess leverage to influence climate, although their contribution to climate change is not well established and most climate models have yet to consider their effects. Also, there is increasing evidence that natural halogens i) affect the air quality of coastal cities, ii) accelerate the atmospheric deposition of mercury (a toxic heavy metal), and iii) have natural ocean and ice emissions that are controlled by biological and photochemical mechanisms which may respond to climate change. Motivated by the above, this project aims to quantify the so far unrecognized natural halogen-climate feedbacks and the impact of these feedbacks on global atmospheric oxidizing capacity (AOC) and radiative forcing (RF) across pre-industrial, present and future climates. Answering these questions is essential to predict whether these climate-mediated feedbacks can reduce or amplify future climate change. To this end we will develop a multidisciplinary research approach, using laboratory and field observations and models interactively, that will allow us to peel apart the detailed physical processes behind the contribution of natural halogens to global climate change. Furthermore, the work plan also involves examining past and future climate impacts of natural halogens within a holistic Earth System model, in which we will represent the multidirectional halogen interactions in the coupled land-ocean-ice-biosphere-atmosphere system. This will provide a breakthrough in our understanding of the importance of these natural processes for the composition and oxidation capacity of the Earth's atmosphere and climate, both in the presence and absence of human influence.
Max ERC Funding
1 979 112 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym CloudBrake
Project How nature's smallest clouds slow down large-scale circulations critical for climate
Researcher (PI) Aloisia NUIJENS
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Starting Grant (StG), PE10, ERC-2016-STG
Summary Do even the smallest clouds simply drift with the wind?
Vast areas of our oceans and land are covered with shallow cumulus clouds. These low-level clouds are receiving increased attention as uncertainties in their representation in global climate models lead to a spread in predictions of future climate. This attention emphasizes radiative and thermodynamic impacts of clouds, which are thought to energize the large-scale Hadley circulation. But broadly overlooked is the impact of shallow cumuli on the trade-winds that drive this circulation. Reasons for this neglect are a lack of observations of vertical wind structure and the wide range of scales involved.
My project will test the hypothesis that shallow cumuli can also slow down the Hadley circulation by vertical transport of momentum. First, observations of clouds and winds will be explicitly connected and the causality of their relationship will be exposed using ground-based and airborne measurements and high-resolution modeling. Second, new lidar techniques aboard aircraft are exploited to validate low-level winds measured by the space-borne Aeolus wind lidar and collect high-resolution wind and turbulence data. Third, different models of momentum transport by shallow convection will be developed to represent its impact on winds. Last, evidence of global relationships between winds and shallow cumulus are traced in Aeolus and additional satellite data and the impact of momentum transport on circulations in a control and warmer climate is tested in a general circulation model.
This project exploits my expertise in observing and modeling clouds and convection focused on a hypothesis which, if true, will strongly influence our understanding of the sensitivity of circulations and the sensitivity of climate. It will increase the predictability of low-level winds and convergence patterns, which are important to many disciplines, including climate studies, numerical weather prediction and wind-energy research.
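To make the notion of convective momentum transport concrete, a common bulk mass-flux representation (assumed here purely for illustration; the profiles and numbers are placeholders, not the scheme to be developed in this project) approximates the momentum flux as M (u_cloud - u_env) and obtains the wind tendency from its vertical divergence:

import numpy as np

z = np.linspace(0.0, 3000.0, 31)          # height [m]
rho = 1.2 * np.exp(-z / 8000.0)           # air density [kg m-3], rough profile
u_env = 5.0 + 3.0e-3 * z                  # trade-wind-like sheared wind [m s-1]
u_cld = np.full_like(z, 5.0)              # assumed in-cloud wind [m s-1]

# Assumed updraft mass flux [kg m-2 s-1], peaking in the cloud layer
M = 0.02 * np.exp(-((z - 1500.0) / 600.0) ** 2)

flux = M * (u_cld - u_env)                # bulk approximation of rho*w'u'
dudt = -np.gradient(flux, z) / rho        # wind tendency du/dt [m s-2]

print("largest deceleration: %.2e m s-2 (about %.1f m/s per day)"
      % (dudt.min(), dudt.min() * 86400))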
Max ERC Funding
1 867 120 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym COMPASS
Project COMPASS: Climate-relevant Ocean Measurements and Processes on the Antarctic continental Shelf and Slope
Researcher (PI) Karen HEYWOOD
Host Institution (HI) UNIVERSITY OF EAST ANGLIA
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary Processes on the Antarctic continental shelf and slope are crucially important for determining the rate of future sea level rise, setting the properties and volume of dense bottom water exported globally, and regulating the carbon cycle. Yet our ability to model and predict these processes over future decades remains rudimentary. This deficiency in understanding originates in a lack of observations in this inaccessible region. The COMPASS project seeks to rectify that by exploiting new technology - autonomous marine vehicles called gliders - to observe, quantify and elucidate processes on the continental shelf and slope of Antarctica that are important for climate.
The COMPASS objective is to make a step-change in our quantitative understanding of:
(i) the ocean front that marks the boundary between the Antarctic continental shelf and the open ocean, and its associated current system;
(ii) the interaction between ocean, atmosphere and sea-ice on the Antarctic continental shelf; and
(iii) the exchange of heat, salt and freshwater with the cavities beneath ice shelves.
These goals will be met by a series of targeted ocean glider campaigns around Antarctica, spanning different flow regimes, including areas where warm water is able to access the continental shelf and influence ice shelves, areas where the continental shelf is cold and fresh, and areas where the continental shelf hosts cold, salty, dense water that eventually spills into the abyss. A unique circumpolar assessment of ocean properties and dynamics, including instabilities and mixing, will be undertaken. COMPASS will develop new technology to deploy a profiling glider into inaccessible environments such as Antarctic polynyas (regions of open water surrounded by sea-ice). As well as scientific breakthroughs that will feed into future climate assessments, improving projections of future sea level rise and global temperatures, COMPASS will deliver enhanced design for future ocean observing systems.
Max ERC Funding
3 499 270 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ComplexAssembly
Project The birth of protein complexes
Researcher (PI) Martin BECK
Host Institution (HI) EUROPEAN MOLECULAR BIOLOGY LABORATORY
Call Details Consolidator Grant (CoG), LS2, ERC-2016-COG
Summary Protein complexes are central to many cellular functions but our knowledge of how cells assemble protein complexes remains very sparse. Biophysical and structural data of assembly intermediates are extremely rare. Particularly in higher eukaryotes, it has become clear that complex assembly by random collision of subunits cannot cope with the spatial and temporal complexity of the intricate architecture of many cellular machines. Here I propose to combine systems biology approaches with in situ structural biology methods to visualize protein complex assembly. I want to investigate experimentally in which order the interfaces of protein complexes are formed and to which extent structures of assembly intermediates resemble those observed in fully assembled complexes. I want to develop methods to systematically screen for additional factors involved in assembly pathways. I furthermore want to test the hypothesis that mechanisms must exist in eukaryotes that coordinate local mRNA translation with the ordered formation of protein complex interfaces. I believe that in order to understand assembly pathways, these processes, which so far are often studied in isolation, need to be considered jointly and in a protein-complex-centric manner. The research proposed here will bridge across these different scientific disciplines. In the long term, a better mechanistic understanding of protein complex assembly and the structural characterization of critical intermediates will be of high relevance for scenarios under which a cell’s protein quality control system has to cope with stress, such as aging and neurodegenerative diseases. It might also facilitate the more efficient industrial production of therapeutically relevant proteins.
Max ERC Funding
1 957 717 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym Connections
Project Oligopoly Markets and Networks
Researcher (PI) Andrea Galeotti
Host Institution (HI) LONDON BUSINESS SCHOOL
Call Details Consolidator Grant (CoG), SH1, ERC-2016-COG
Summary Via our connections we learn about new ideas, the quality of products, new investment opportunities and job opportunities. We influence and are influenced by our circle of friends. Firms are interconnected in complex processes of production and distribution. A firm’s decisions in a supply chain depend on other firms’ choices in the same supply chain, as well as on firms' behaviour in competing chains. Research on networks in the last 20 years has provided a series of tools to study a system of interconnected economic agents. This project will advance the state of the art by further developing new applications of networks to better understand modern oligopoly markets.
The project is organised into two sub-projects. In sub-project 1 networks will be used to model diffusion and adoption of network goods. Different consumers' network locations will summarise different consumers' level of influence. The objectives are to understand how firms incorporate information about consumers' influence in their marketing strategies—pricing strategy and product design. It will provide a rigorous framework to evaluate how the increasing ability of firms to gather information on consumers’ influence affects outcomes of markets with network effects. In sub-project 2 networks will be used to model how inputs—e.g., intermediary goods and patents—are combined to deliver final goods. Possible applications are supply chains, communication networks and networks of patents. The objectives are to study firms' strategic behaviour, like pricing and R&D investments, in a complex process of production and distribution, and to understand the basic network metrics that are useful to describe market power. This is particularly important to provide a guide to competition authorities and the like when they evaluate mergers in complex interconnected markets.
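As one example of a network metric of the kind referred to above (chosen here purely for illustration; the project does not commit to this particular statistic), Katz-Bonacich centrality, (I - dA)^(-1) 1, is a standard summary of a node's position in network games and can be computed for a small production-network graph as follows:

import numpy as np

A = np.array([              # adjacency matrix of a toy production network
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
    [0, 0, 0, 0],
], dtype=float)

delta = 0.3                 # decay factor; must be below 1 / spectral radius
n = A.shape[0]
centrality = np.linalg.solve(np.eye(n) - delta * A, np.ones(n))
print("Katz-Bonacich centralities:", centrality.round(2))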
Max ERC Funding
829 000 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym COS-OCS
Project Carbonyl Sulphide: new ways of Observing the Climate System
Researcher (PI) Maarten KROL
Host Institution (HI) WAGENINGEN UNIVERSITY
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary The future climate of our planet strongly depends on the capacity of the biosphere to sequester atmospheric CO2, and on the abundance of stratospheric sulphate aerosols (SSA). These aerosols form a layer that resides at about 16 km altitude that, contrary to CO2, has a cooling effect on climate. These two climate-regulating mechanisms are intricately linked to the atmospheric trace gas carbonyl sulphide (COS).
COS is the most abundant sulphur compound in our atmosphere. The dominant COS source is biogenic activity in the ocean, while uptake by the terrestrial biosphere, and a small amount of destruction in the stratosphere, contribute to its removal. The COS loss to the biosphere could potentially be used to quantify photosynthetic CO2 uptake, while its stratospheric destruction is an important precursor for the formation of SSA. A deeper understanding of atmospheric COS variations would therefore signal a major step forward in our ability to diagnose CO2 uptake and SSA formation.
With this research program, I aim to fundamentally improve our limited understanding of the COS budget. The program combines innovative modelling and measurements. I aim to collect samples from aircraft, ship cruises, and stations across all latitudes, on which highly challenging analyses of COS and its isotopologues will be performed. To characterise the important transition to the stratosphere, vertical COS profiles up to 30 km will be sampled with so-called “AirCores”. A larger spatial coverage will come from currently untapped satellite data of COS isotopologues. My program will integrate these measurements into the first multispecies and isotope-enabled inverse modelling framework for COS, building on techniques I developed during the past decade. The measurements and model together will allow breakthroughs in the coupled COS and CO2 budgets, and unlock the potential of COS as new climate diagnostic.
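The basic logic of such an inverse modelling framework can be sketched with a generic linear-Gaussian example (a simplified, assumed setup with synthetic data, not the actual multispecies COS system): prior fluxes x_a with covariance B are updated with observations y through an observation operator H and observation error covariance R:

import numpy as np

rng = np.random.default_rng(1)

n_flux, n_obs = 4, 20                            # e.g. 4 regional COS fluxes
x_true = np.array([50.0, -30.0, 10.0, -5.0])     # "true" fluxes (arbitrary units)
H = rng.uniform(0.0, 1.0, size=(n_obs, n_flux))  # transport sensitivities
y = H @ x_true + rng.normal(0.0, 2.0, n_obs)     # noisy mixing-ratio observations

x_a = np.zeros(n_flux)                           # prior flux estimate
B = np.diag([40.0**2] * n_flux)                  # prior error covariance
R = np.diag([2.0**2] * n_obs)                    # observation error covariance

# Posterior mean and covariance of the linear-Gaussian problem
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
x_post = x_a + K @ (y - H @ x_a)
A_post = (np.eye(n_flux) - K @ H) @ B

print("posterior fluxes :", x_post.round(1))
print("posterior 1-sigma:", np.sqrt(np.diag(A_post)).round(1))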
Max ERC Funding
2 462 135 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym CrackEpitranscriptom
Project Cracking the epitranscriptome
Researcher (PI) Schraga SCHWARTZ
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), LS2, ERC-2016-STG
Summary Over 100 types of distinct modifications are catalyzed on RNA molecules post-transcriptionally. In an analogous manner to well-studied chemical modifications on proteins or DNA, modifications on RNA - and particularly on mRNA - harbor the exciting potential of regulating the complex and interlinked life cycle of these molecules. The most abundant modification in mammalian and yeast mRNA is N6-methyladenosine (m6A). We have pioneered approaches for mapping m6A in a transcriptome-wide manner, and we and others have identified factors involved in encoding and decoding m6A. While experimental disruption of these factors is associated with severe phenotypes, the role of m6A remains enigmatic. No single methylated site has been shown to causally underlie any physiological or molecular function. This proposal aims to establish a framework for systematically deciphering the molecular function of a modification and its underlying mechanisms and to uncover the physiological role of the modification in regulation of a cellular response. We will apply this framework to m6A in the context of meiosis in budding yeast, as m6A dynamically accumulates on meiotic mRNAs and as the methyltransferase catalyzing m6A is essential for meiosis. We will (1) aim to elucidate the physiological targets of methylation governing entry into meiosis, (2) seek to elucidate the function of m6A at the molecular level and understand its impact on the various steps of the mRNA life cycle, and (3) seek to understand the mechanisms underlying its effects. These aims will provide a comprehensive framework for understanding how the epitranscriptome, an emerging post-transcriptional layer of regulation, fine-tunes gene regulation and impacts cellular decision making in a dynamic response, and will set the stage towards dissecting the roles of m6A and of an expanding set of mRNA modifications in more complex and disease-related systems.
Max ERC Funding
1 402 666 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym CSEM
Project The Collaborative Seismic Earth Model Project
Researcher (PI) Andreas FICHTNER
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE10, ERC-2016-STG
Summary Seismic tomography images of the Earth's interior are key to the characterisation of earthquakes, natural resource exploration, seismic risk assessment, tsunami warning, and studies of geodynamic processes. While tomography has drawn a fascinating picture of our planet, today's individual researchers can exploit only a fraction of the rapidly expanding seismic data volume. Applications relying on tomographic images lag behind their potential; fundamental questions remain unanswered: Do mantle plumes exist in the deep Earth? What are the properties of active faults, and how do they affect earthquake ground motion?
To address these questions and to ensure continued progress of seismic tomography in the 'Big Data' era, I propose new technological developments that enable a paradigm shift in Earth model construction towards a Collaborative Seismic Earth Model (CSEM). Fully accounting for the physics of wave propagation in the complex 3D Earth, the CSEM is envisioned to evolve successively through a systematic group effort of my team, thus going beyond the tomographic models that individual researchers may construct today.
I will develop the technological foundation of the CSEM and integrate these developments in studies of large-earthquake rupture processes and the convective pattern of the Earth's mantle in relation to surface geology. The CSEM project will bridge the gap between regional and global tomography, and deliver the first multiscale model of the Earth where crust and mantle are jointly resolved. The CSEM will lead to a dramatic increase in the exploitable seismic data volume, and set new standards for the construction and reproducibility of tomographic Earth models.
Beyond this project, the CSEM will be openly accessible through the European Plate Observing System (EPOS). It will then offer Earth scientists the unique opportunity to join forces in the discovery of multiscale Earth structure by systematically building on each other's results.
Max ERC Funding
1 367 500 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym D-TECT
Project Does dust triboelectrification affect our climate?
Researcher (PI) Vasileios AMOIRIDIS
Host Institution (HI) NATIONAL OBSERVATORY OF ATHENS
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary The recent IPCC report identifies mineral dust and the associated uncertainties in climate projections as key topics for future research. Dust size distribution in climate models controls the dust-radiation-cloud interactions and is a major contributor to these uncertainties. Observations show that the coarse mode of dust can be sustained during long-range transport, while current understanding fails in explaining why the lifetime of large airborne dust particles is longer than expected from gravitational settling theories. This discrepancy between observations and theory suggests that other processes counterbalance the effect of gravity along transport. D-TECT envisages filling this knowledge gap by studying the contribution of triboelectrification (contact electrification) to particle removal processes. Our hypothesis is that triboelectric charging generates adequate electric fields to hold large dust particles up in the atmosphere. D-TECT aims to (i) parameterize the physical mechanisms responsible for dust triboelectrification; (ii) assess the impact of electrification on dust settling; (iii) quantify the climatic impacts of the process, particularly the effect on the dust size evolution during transport, on dry deposition and on CCN/IN reservoirs, and the effect of the electric field on particle orientation and on radiative transfer. The approach involves the development of a novel specialized high-power lidar system to detect and characterize aerosol particle orientation and a large-scale field experiment in the Mediterranean Basin using unprecedented ground-based remote sensing and airborne in-situ observation synergies. Considering aerosol-electricity interactions, the observations will be used to improve theoretical understanding and simulations of the dust life cycle. The project will provide new fundamental understanding, able to open new horizons for weather and climate science, including biogeochemistry, volcanic ash and extraterrestrial dust research.
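As a back-of-the-envelope check on the hypothesis that electric forces can counteract gravitational settling, the sketch below compares a Stokes settling velocity with the electric field needed to balance the weight of a charged coarse dust particle; all particle sizes, charges, and material constants are illustrative assumptions, not project results.

```python
import numpy as np

# Illustrative constants (assumptions, not project data)
g = 9.81        # gravitational acceleration, m s^-2
rho_p = 2650.0  # dust particle density, kg m^-3 (quartz-like)
rho_a = 1.2     # air density, kg m^-3
mu = 1.8e-5     # dynamic viscosity of air, Pa s
e = 1.602e-19   # elementary charge, C

def stokes_settling_velocity(d):
    """Terminal settling velocity (m/s) of a sphere of diameter d (m) in Stokes flow."""
    return (rho_p - rho_a) * g * d ** 2 / (18.0 * mu)

def field_to_suspend(d, n_charges):
    """Electric field (V/m) whose force on a particle carrying n_charges elementary
    charges would balance the particle's weight."""
    mass = rho_p * np.pi * d ** 3 / 6.0
    return mass * g / (n_charges * e)

d = 50e-6  # a 50-micron "giant" dust particle
print(f"Stokes settling velocity: {stokes_settling_velocity(d):.3f} m/s")
print(f"field to balance gravity with 1e5 charges: {field_to_suspend(d, 1e5):.2e} V/m")
```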
Max ERC Funding
1 968 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym DarkMix
Project Illuminating the dark side of surface meteorology: creating a novel framework to explain atmospheric transport and turbulent mixing in the weak-wind boundary layer
Researcher (PI) Christoph Karl THOMAS
Host Institution (HI) UNIVERSITAET BAYREUTH
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary Surface meteorology impacts the abundance and quality of life on Earth through the transfer and mixing of light, heat, water, CO2, and other substances controlling the resources for humans, plants, and animals. However, current theories and models fail when airflows and turbulence are weak, as during calm nights, leaving weather and climate forecasts uncertain. This ‘dark side’ of surface meteorology occupies substantial fractions of time and of our landscape; its physics is largely unknown, and it has eluded proper experimental investigation.
DarkMix creates technological and theoretical innovations to observe and explain transport and mixing in the weak-wind boundary layer. Its ambitious goal is a radically new framework incorporating unexplored mechanisms such as submeso-scale motions, flow instationarities, and directional shear to effect a quantum leap in understanding the air-plant-soil exchange. DarkMix will build the ground-breaking first-ever fiber-optic distributed temperature sensing harp to fully resolve the 3-dimensional flow and air temperature fields and enable unprecedented computation of eddy covariance fluxes at scales of seconds over 4 orders of magnitude (deci- to hundreds of meters). Both key innovations bear significant risks of technical and fundamental nature, which are mitigated by pursuing alternatives. Measurements will inform cutting-edge large eddy simulations to test hypotheses. The interdisciplinary dimension takes DarkMix to a unique set of weak-wind sites including a valley-bottom grassland, a forest in complex terrain, and a city to investigate topographic effects, the forest carbon cycle, and the urban heat island.
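For reference, the eddy covariance fluxes mentioned above follow from a Reynolds decomposition of co-sampled vertical wind and temperature; a minimal sketch of that standard computation, with synthetic 10 Hz data, is given below (the project's actual processing of distributed fiber-optic measurements is of course far richer).

```python
import numpy as np

def eddy_covariance_flux(w, T, rho=1.2, cp=1004.0):
    """Sensible heat flux from the eddy covariance method.

    w : vertical wind speed samples (m/s), T : air temperature samples (K),
    both sampled at the same high rate over one averaging interval.
    Returns H = rho * cp * mean(w'T') in W m^-2.
    """
    w = np.asarray(w, dtype=float)
    T = np.asarray(T, dtype=float)
    w_fluct = w - w.mean()   # deviations from the interval mean (Reynolds decomposition)
    T_fluct = T - T.mean()
    return rho * cp * np.mean(w_fluct * T_fluct)

# Synthetic 30-minute interval at 10 Hz with correlated w and T fluctuations
rng = np.random.default_rng(0)
w = rng.normal(0.0, 0.3, 18000)
T = 293.0 + 0.5 * w + rng.normal(0.0, 0.2, 18000)
print(f"H = {eddy_covariance_flux(w, T):.1f} W m^-2")
```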
DarkMix will open a new window for surface meteorology and its links to air quality, biogeochemistry, and climate change by giving physically meaningful and societally relevant answers to profound questions such as the exchange of greenhouse gases, hazards from ground fog, urban pollution, and agricultural losses through extreme cold air.
Max ERC Funding
1 898 104 €
Duration
Start date: 2017-05-01, End date: 2022-04-30
Project acronym DemandDemoc
Project Demand for Democracy
Researcher (PI) Davide Werner CANTONI
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Starting Grant (StG), SH1, ERC-2016-STG
Summary Historically, people around the world have demanded democratic institutions. Such democratic movements propel political change and also determine economic outcomes. In this project, we ask, how do political preferences, beliefs, and second-order beliefs shape the strategic decision to participate in a movement demanding democracy? Existing scholarship is unsatisfactory because it is conducted ex post: preferences, beliefs, and behavior have converged to a new equilibrium. In contrast, we examine a democratic movement in real time, studying the ongoing democracy movement in Hong Kong.
Our study is composed of four parts. In Part 1, we collect panel survey data from Hong Kong university students, a particularly politically active subpopulation. We collect data on preferences, behavior, beliefs, and second-order beliefs using incentivized and indirect elicitation to encourage truthful reporting. We analyze the associations among these variables to shed light on the drivers of participation in the democracy movement.
In Part 2, we exploit experimental variation in the provision of information to study political coordination. Among participants in the panel survey, we provide information regarding the preferences and beliefs of other students. We examine whether exposure to information regarding peers causes students to update their beliefs and change their behavior.
In Part 3, we extend the analysis in Part 1 to a nationally representative sample of Hong Kong citizens. To do so, we have added a module regarding political preferences, beliefs, and behavior (including incentivized questions and questions providing cover for responses to politically sensitive topics) to the HKPSSD panel survey.
In Part 4, we study preferences for redistribution – plausibly a central driver for demands for political rights – among Hong Kong citizens and mainland Chinese. We examine how these preferences differ across populations, as well as their link to support for democracy.
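One common technique for giving respondents cover on politically sensitive questions, of the kind mentioned in Part 3, is the item-count (list) experiment; the abstract does not specify which technique is used, so the sketch below is purely illustrative, showing the difference-in-means estimator on hypothetical data.

```python
import numpy as np

def list_experiment_estimate(control_counts, treatment_counts):
    """Difference-in-means estimator for an item-count (list) experiment.

    control_counts   : items endorsed by respondents shown J baseline items.
    treatment_counts : items endorsed by respondents shown the same J items
                       plus the sensitive item.
    The difference in mean counts estimates the share endorsing the sensitive item.
    """
    control = np.asarray(control_counts, dtype=float)
    treatment = np.asarray(treatment_counts, dtype=float)
    est = treatment.mean() - control.mean()
    se = np.sqrt(treatment.var(ddof=1) / len(treatment) +
                 control.var(ddof=1) / len(control))
    return est, se

# Hypothetical data: 4 baseline items in control, 5 items (incl. sensitive) in treatment
rng = np.random.default_rng(0)
control = rng.binomial(4, 0.5, size=500)
treatment = rng.binomial(4, 0.5, size=500) + rng.binomial(1, 0.3, size=500)
print(list_experiment_estimate(control, treatment))
```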
Max ERC Funding
1 494 647 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym DIVERSE-EXPECON
Project Discriminative preferences and fairness ideals in diverse societies: An ‘experimental economics’ approach
Researcher (PI) Sigrid SUETENS
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT BRABANT
Call Details Consolidator Grant (CoG), SH1, ERC-2016-COG
Summary In economics, a distinction is made between statistical and taste-based discrimination (henceforth, TBD). Statistical discrimination refers to discrimination in a context with strategic uncertainty. Someone who is uncertain about the future behaviour of a person with a different ethnicity may rely on information about the different ethnic group to which this person belongs to form beliefs about the behaviour of that person. This may lead to discrimination. TBD refers to discrimination in a context without strategic uncertainty. It implies suffering a disutility when interacting with ‘different’ others. This project systematically studies TBD in ethnically diverse societies.
Identifying TBD is important because overcoming it requires different policies than overcoming statistical discrimination: they should deal with changing preferences of people rather than providing information about specific interaction partners. But identifying TBD is tricky. First, it is impossible to identify using uncontrolled empirical data because these data are characterised by strategic uncertainty. Second, people are generally reluctant to identify themselves as a discriminator. In the project, I study TBD using novel economic experiments that circumvent these problems.
The project consists of three main objectives. First, I investigate whether and how preferences of European natives in social interactions depend on others’ ethnicity. Are natives as altruistic, reciprocal, and envious towards immigrants as they are towards other natives? Second, I study whether natives have different fairness ideals—what constitutes a fair distribution of resources from the perspective of an impartial spectator—when it comes to natives than when it comes to non-natives. Third, I analyse whether preferences and fairness ideals depend on exposure to diversity: do preferences and fairness ideals of natives change as contact with non-natives increases, and, if so, how?
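A standard way to test for a discriminatory gap in such allocation experiments is a permutation test comparing decisions towards in-group and out-group recipients; the sketch below uses hypothetical dictator-game allocations and is an illustration only, since the abstract does not specify the games or tests to be used.

```python
import numpy as np

def permutation_gap_test(alloc_ingroup, alloc_outgroup, n_perm=10000, seed=0):
    """Permutation test for a gap in allocations to in-group vs out-group recipients.

    alloc_ingroup, alloc_outgroup : amounts allocated by participants matched with
    a native (in-group) or immigrant-background (out-group) recipient.
    Returns the observed gap and a two-sided permutation p-value.
    """
    a = np.asarray(alloc_ingroup, float)
    b = np.asarray(alloc_outgroup, float)
    observed = a.mean() - b.mean()
    pooled = np.concatenate([a, b])
    rng = np.random.default_rng(seed)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                     # reassign group labels at random
        gap = pooled[:len(a)].mean() - pooled[len(a):].mean()
        if abs(gap) >= abs(observed):
            count += 1
    return observed, count / n_perm

# Hypothetical dictator-game allocations (out of 10 points)
rng = np.random.default_rng(4)
ingroup = rng.binomial(10, 0.42, 150)
outgroup = rng.binomial(10, 0.38, 150)
print(permutation_gap_test(ingroup, outgroup))
```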
Max ERC Funding
1 499 046 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym DRY-2-DRY
Project Do droughts self-propagate and self-intensify?
Researcher (PI) Diego González Miralles
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), PE10, ERC-2016-STG
Summary Droughts cause agricultural loss, forest mortality and drinking water scarcity. Their predicted increase in recurrence and intensity poses serious threats to future global food security. Several historically unprecedented droughts have already occurred over the last decade in Europe, Australia and the USA. The cost of the ongoing Californian drought is estimated to be about US$3 billion. Still today, the knowledge of how droughts start and evolve remains limited, and so does the understanding of how climate change may affect them.
Positive feedbacks from land have been suggested as critical for the occurrence of recent droughts: as rainfall deficits dry out soil and vegetation, the evaporation of land water is reduced, then the local air becomes too dry to yield rainfall, which further enhances drought conditions. Importantly, this is not just a 'local' feedback, as remote regions may rely on evaporated water transported by winds from the drought-affected region. Following this rationale, droughts self-propagate and self-intensify.
However, a global capacity to observe these processes is lacking. Furthermore, climate and forecast models are immature when it comes to representing the influences of land on rainfall. Do climate models underestimate this land feedback? If so, future drought aggravation will be greater than currently expected. At the moment, this remains largely speculative, given the limited number of studies of these processes.
I propose to use novel in situ and satellite records of soil moisture, evaporation and precipitation, in combination with new mechanistic models that can map water vapour trajectories and explore multi-dimensional feedbacks. DRY-2-DRY will not only advance our fundamental knowledge of the mechanisms triggering droughts, it will also provide independent evidence of the extent to which managing land cover can help 'dampen' drought events, and enable progress towards more accurate short-term and long-term drought forecasts.
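The self-intensification mechanism described above can be caricatured with a toy bucket model in which part of the locally evaporated water returns as rainfall; in the sketch below, the same external rainfall deficit produces a larger total rainfall decline when this land feedback is switched on. All parameter values are illustrative assumptions, not project results.

```python
import numpy as np

def drought_feedback_sim(nsteps=200, p_external=2.0, recycling=0.4,
                         evap_rate=0.05, runoff_rate=0.02, s0=60.0,
                         deficit=0.5, onset=100):
    """Toy bucket model of the soil-moisture / precipitation feedback.

    Evaporation is proportional to soil moisture; a fraction `recycling` of the
    evaporated water falls back locally as rain; the external moisture supply is
    cut by `deficit` after step `onset` to mimic drought onset.
    Returns soil moisture and total precipitation time series.
    """
    s, soil, precip = s0, [], []
    for t in range(nsteps):
        evap = evap_rate * s
        p_ext = p_external * (deficit if t >= onset else 1.0)
        p_tot = p_ext + recycling * evap         # land feedback via moisture recycling
        s = max(s + p_tot - evap - runoff_rate * s, 0.0)
        soil.append(s)
        precip.append(p_tot)
    return np.array(soil), np.array(precip)

_, p_fb = drought_feedback_sim(recycling=0.4)
_, p_no = drought_feedback_sim(recycling=0.0)
# The same external deficit causes a larger total rainfall decline when evaporation
# feeds back on precipitation (the self-intensification at issue).
print(f"rainfall decline with land feedback:    {p_fb[50] - p_fb[-1]:.2f}")
print(f"rainfall decline without land feedback: {p_no[50] - p_no[-1]:.2f}")
```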
Max ERC Funding
1 465 000 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym DYNMECH
Project Dynamic Mechanisms
Researcher (PI) Daniel Ferguson Garrett
Host Institution (HI) FONDATION JEAN-JACQUES LAFFONT,TOULOUSE SCIENCES ECONOMIQUES
Call Details Starting Grant (StG), SH1, ERC-2016-STG
Summary This project studies dynamic mechanisms. By “dynamic mechanisms”, we mean policies to which a principal (e.g., a seller, an employer, or a regulator) can commit to induce the agents (e.g., buyers, employees, or regulated firms) to take the desired actions over time. Several components of the project are envisaged:
- Competition in dynamic mechanisms.
o I propose a competitive setting in which agents (e.g., buyers or workers) learn about the offers of different principals over time. Agents may receive more than one offer at a time, leading to direct competition between mechanisms. Received offers are agents’ private information, permitting strategic delay of acceptance (for instance, an agent may want to wait to evaluate new offers that may be received in the future).
- Robust predictions for a rich class of stochastic processes.
o We study optimal dynamic mechanisms for agents whose preferences evolve stochastically with time. We develop an approach to partially characterizing these mechanisms which (unlike virtually all of the existing literature) does not depend on ad-hoc restrictions on the stochastic process for preferences.
- Efficient bilateral trade with budget balance: dynamic arrival of traders
o I study bilateral trade with budget balance, when traders (i) arrive over time, and (ii) have preferences which evolve stochastically with time. The project aims at an impossibility result in this setting: contrary to the existing literature which does not account for dynamic arrivals, budget-balanced efficient trade is typically impossible, even for very patient traders.
- Pre-event ticket sales and complementary investments
o We provide a rationale for the early allocation of capacity to customers for events such as flights and concerts based on customers’ demand for pre-event complementary investments (such as booking a hotel or a babysitter). We examine efficient and profit-maximizing mechanisms.
Max ERC Funding
1 321 625 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym EARTHBLOOM
Project Earth’s first biological bloom: An integrated field, geochemical, and geobiological examination of the origins of photosynthesis and carbonate production 3 billion years ago
Researcher (PI) Stefan Victor LALONDE
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE10, ERC-2016-STG
Summary The origin of oxygenic photosynthesis is one of the most dramatic evolutionary events that the Earth has ever experienced. At some point in Earth’s first two billion years, primitive bacteria acquired the ability to harness sunlight, oxidize water, release O2, and transform CO2 to organic carbon, and all with unprecedented efficiency. Today, oxygenic photosynthesis accounts for nearly all of the biomass on the planet, and exerts significant control over the carbon cycle. Since 2 billion years ago (Ga), it has regulated the climate of our planet, ensuring liquid water at the surface and enough oxygen to support complex life. The biological and geological consequences of oxygenic photosynthesis are so great that they effectively underpin what we think of as a habitable planet. Understanding the origins of photosynthesis is a paramount scientific challenge at the heart of some of humanity’s greatest questions: how did life evolve? how did Earth become a habitable planet? EARTHBLOOM addresses these questions head-on through the first comprehensive scientific study of Earth’s first blooming photosynthetic ecosystem, preserved as Earth’s oldest carbonate platform. This relatively unknown, >450m thick deposit, composed largely of 2.9 Ga fossil photosynthetic structures (stromatolites), is one of the most important early Earth fossil localities ever identified, and EARTHBLOOM is carefully positioned for major discovery. EARTHBLOOM will push the frontier of field data collection and sample screening using new XRF methods for carbonate analysis. EARTHBLOOM will also push the analytical frontier in the lab by applying the most sensitive metal stable isotope tracers for O2 at ultra-low levels (Mo, U, and Ce) coupled with novel isotopic “age of oxidation” constraints. By providing new constraints on atmospheric CO2, ocean pH, oxygen production, and nutrient availability, EARTHBLOOM is poised to redefine Earth’s surface environment at the dawn of photosynthetic life.
Max ERC Funding
1 848 685 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym ECCLES
Project Emergent Constraints on Climate-Land feedbacks in the Earth System
Researcher (PI) Peter COX
Host Institution (HI) THE UNIVERSITY OF EXETER
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary The Land Biosphere is a critical component of the Earth System, linking to climate through multiple feedback processes. Understanding these feedback processes is a huge intellectual challenge. In part because of the pioneering work of the PI (Cox et al., 2000), many of the climate projections reported in the IPCC 5th Assessment Report (AR5) now include climate-carbon cycle feedbacks. However the latest Earth System Models (ESMs) continue to show a huge range in the projected responses of the land carbon cycle over the 21st century. This uncertainty threatens to undermine the value of these projections to inform climate policy. This project (ECCLES) is designed to produce significant reductions in the uncertainties associated with land-climate interactions, using the novel concept of Emergent Constraints - relationships between future projections and observable variations in the current Earth System that are common across the ensemble of ESMs. Emergent Constraints have many attractive features but chief amongst these is that they can make ensembles of ESMs more than the sum of the parts - allowing the full range of ESM projections to be used collectively, alongside key observations, to reduce uncertainties in the future climate. The project will deliver: (i) a theoretical foundation for Emergent Constraints; (ii) new datasets on the changing function of the land biosphere; (iii) Emergent Constraints on land-climate interactions based on observed temporal and spatial variations; (iv) a new generation of scientists expert in land-climate interactions and Emergent Constraints. ECCLES will benefit from the expertise and experience of the PI, which includes training as a theoretical physicist, an early career developing models of the land biosphere for ESMs, and a current career in a department of mathematics where he is at the forefront of efforts to develop and apply the concept of Emergent Constraints (Cox et al., 2013, Wenzel et al., 2016).
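The emergent-constraint idea referred to above can be illustrated with a small sketch: regress a future projection on an observable quantity across a (here hypothetical) model ensemble, then propagate the real-world observation and its uncertainty through that relationship. None of the numbers below are project results.

```python
import numpy as np

def emergent_constraint(x_models, y_models, x_obs, x_obs_sigma, n_draws=100000, seed=0):
    """Emergent-constraint estimate of an uncertain projection.

    x_models : observable quantity simulated by each model in the ensemble.
    y_models : the corresponding future projection from each model.
    x_obs, x_obs_sigma : the real-world observation and its uncertainty.
    Fits the across-ensemble regression y = a + b*x and propagates the
    observational uncertainty (plus regression scatter) through it.
    """
    x = np.asarray(x_models, float)
    y = np.asarray(y_models, float)
    b, a = np.polyfit(x, y, 1)                      # the emergent relationship
    resid_sigma = np.std(y - (a + b * x), ddof=2)   # scatter about the relationship
    rng = np.random.default_rng(seed)
    x_draw = rng.normal(x_obs, x_obs_sigma, n_draws)
    y_draw = a + b * x_draw + rng.normal(0.0, resid_sigma, n_draws)
    return y_draw.mean(), y_draw.std()

# Hypothetical ensemble: 12 "models" following a linear relation plus scatter
rng = np.random.default_rng(1)
x_ens = rng.normal(1.0, 0.3, 12)
y_ens = 2.0 * x_ens + rng.normal(0.0, 0.2, 12)
print(emergent_constraint(x_ens, y_ens, x_obs=0.9, x_obs_sigma=0.1))
```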
Max ERC Funding
2 249 834 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym EDST
Project Economic Development and Structural Transformation
Researcher (PI) Maria Paula BUSTOS
Host Institution (HI) FUNDACION CENTRO DE ESTUDIOS MONETARIOS Y FINANCIEROS
Call Details Starting Grant (StG), SH1, ERC-2016-STG
Summary The early development literature documented that the growth path of most advanced economies was accompanied by a process of structural transformation. As economies develop, the share of agriculture in employment falls and workers migrate to cities to find employment in the industrial and service sectors [Clark (1940), Kuznets (1957)]. In the first industrialized countries, technical improvements in agriculture favoured the development of industry and services by releasing labour, increasing demand and raising profits to finance other activities. However, several scholars noted that the positive effects of agricultural productivity on economic development are no longer operative in open economies. In addition, there is a large theoretical literature highlighting how market failures can retard structural transformation in developing countries. In particular, financial frictions might constrain the reallocation of capital and thus retard the process of labour reallocation. In this project, we propose to contribute to our understanding of structural transformation by providing direct empirical evidence on the effects of exogenous shocks to local agricultural and manufacturing productivity on the reallocation of capital and labour across sectors, firms and space in Brazil. For this purpose, we construct the first data set that permits us to jointly observe labour and credit flows across sectors and space. To exploit the spatial dimension of the capital allocation problem, we design a new empirical strategy that exploits the geographical structure of bank branch networks. Similarly, we propose to study the spatial dimension of the labour allocation problem by exploiting differences in migration costs across regions due to transportation and social networks.
Max ERC Funding
1 486 500 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym EEC
Project Economic Engineering of Cooperation in Modern Markets
Researcher (PI) Axel OCKENFELS
Host Institution (HI) UNIVERSITAET ZU KOELN
Call Details Advanced Grant (AdG), SH1, ERC-2016-ADG
Summary Cooperation is essential for the functioning of the economy and society. Thus, with inappropriate mechanisms to harness self-interest by aligning it with the common good, the outcome of social and economic interaction can be bleak and even catastrophic.
Recent advances in computer technology have led to radical innovation in market design and trading strategies. This creates both new challenges and exciting opportunities for “engineering cooperation”. This project uses the economic engineering approach (as advocated by Alvin Roth) to address some of the most pressing cooperation problems of modern markets and societies.
I propose three work packages, each using innovative experimental methods and (behavioral) game theory in order to address a specific challenge:
The first one studies the design of electronic reputation mechanisms that promote cooperation in the digital world. Previous research has shown that mechanisms to promote trust on the Internet are flawed. Yet, there is little empirical and normative guidance on how to repair these systems, and engineer better ones.
The second studies the design of mechanisms that avoid arms races for speed in real-time financial and electricity market trading. Traders use algorithmic sniping strategies, even when they are collectively wasteful and seriously threatening market liquidity and stability. Yet, little is known about the robust properties of alternative market designs to eliminate sniping.
The third one studies how to design modern markets that align with ethical considerations. People sometimes have a distaste for certain kinds of modern transactions, such as reciprocal kidney exchange and buying pollution rights. Yet, little is known about the underlying nature and robustness of this distaste.
My project will generate important knowledge to improve the functioning of modern markets, and at the same time open new horizons in the sciences of cooperation and of “behavioral economic engineering”.
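One market design frequently discussed as a remedy for speed races is the frequent batch auction, in which orders accumulated over a short interval clear at a single price; the sketch below is a stylized illustration of such batch clearing and is not drawn from the project itself.

```python
def batch_auction_clearing(bids, asks):
    """Clear one batch of a stylized frequent batch auction.

    bids : list of (price, quantity) buy orders collected over the batch interval.
    asks : list of (price, quantity) sell orders collected over the same interval.
    All matched quantity trades at one price taken from the marginal (last-crossed)
    pair, so shaving microseconds off reaction times within the interval confers
    no advantage. Returns (clearing_price, traded_quantity); (None, 0) if no cross.
    """
    bids = sorted(([p, q] for p, q in bids), key=lambda o: -o[0])  # best bid first
    asks = sorted(([p, q] for p, q in asks), key=lambda o: o[0])   # best ask first
    price, traded, bi, ai = None, 0, 0, 0
    while bi < len(bids) and ai < len(asks) and bids[bi][0] >= asks[ai][0]:
        q = min(bids[bi][1], asks[ai][1])
        price = (bids[bi][0] + asks[ai][0]) / 2.0   # marginal pair sets the batch price
        traded += q
        bids[bi][1] -= q
        asks[ai][1] -= q
        if bids[bi][1] == 0:
            bi += 1
        if asks[ai][1] == 0:
            ai += 1
    return price, traded

# Hypothetical batch with three buy and three sell orders
print(batch_auction_clearing([(101, 5), (100, 3), (99, 4)],
                             [(98, 4), (100, 2), (102, 6)]))
```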
Max ERC Funding
1 155 104 €
Duration
Start date: 2018-03-01, End date: 2023-02-28
Project acronym Epiherigans
Project Writing, reading and managing stress with H3K9me
Researcher (PI) Susan GASSER
Host Institution (HI) FRIEDRICH MIESCHER INSTITUTE FOR BIOMEDICAL RESEARCH FONDATION
Call Details Advanced Grant (AdG), LS2, ERC-2016-ADG
Summary Epigenetic inheritance is the transmission of information, generally in the form of DNA methylation or post-translational modifications on histones that regulate the availability of underlying genetic information for transcription. RNA itself feeds back to contribute to histone modification. Sequence accessibility is both a matter of folding the chromatin fibre to alter access to recognition motifs, and the local concentration of factors needed for efficient transcriptional initiation, elongation, termination or mRNA stability. In heterochromatin we find a subset of regulatory factors in carefully balanced concentrations that are maintained in part by the segregation of active and inactive domains. Histone H3 K9 methylation is key to this compartmentation.
C. elegans provides an ideal system in which to study chromatin-based gene repression. We have demonstrated that histone H3 K9 methylation is the essential signal for the sequestration of heterochromatin at the nuclear envelope in C. elegans. The recognition of H3K9me1/2/3 by an inner nuclear envelope-bound chromodomain protein, CEC-4, actively sequesters heterochromatin in embryos, and contributes redundantly in adult tissues.
Epiherigans has the ambitious goal to determine definitively what targets H3K9 methylation, and identify its physiological roles. We will examine how this mark contributes to the epigenetic recognition of repeat vs non-repeat sequence, and mediates a stress-induced response to oxidative damage. We will examine the link between these and the spatial clustering of heterochromatic domains. Epiherigans will develop an integrated approach to identify in vivo the factors that distinguish repeats from non-repeats, self from non-self within genomes and will examine how H3K9me contributes to a persistent ROS or DNA damage stress response. It represents a crucial step towards understanding of how our genomes use heterochromatin to modulate, stabilize and transmit chromatin organization.
Max ERC Funding
2 500 000 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym EpiScope
Project Epigenomics and chromosome architecture one cell at a time
Researcher (PI) Marcelo NOLLMANN MARTINEZ
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), LS2, ERC-2016-COG
Summary In Eukaryotes, cellular identity and tissue-specific functions are linked to the epigenetic landscape and the multi-scale architecture of the genome. The packing of DNA into nucleosomes at the ~100 bp scale and the organization of whole chromosomes into functional territories within the nucleus are well documented. At an intermediate scale, chromosomes are organised in megabase to sub-megabase structures called Topologically Associating Domains (TADs). Critically, TADs are highly correlated to patterns of epigenetic marks determining the transcriptional state of the genes they encompass. Until now, the lack of efficient technologies to map chromosome architecture and epigenetic marks at the single-cell level has limited our understanding of the molecular actors and mechanisms implicated in the establishment and maintenance of the multi-scale architecture of chromosomes and epigenetic states, and the interplay between this architecture and other nuclear functions such as transcription.
The overall aim of EpiScope is to unveil the functional, multi-scale, 3D architecture of chromatin at the single-cell level while preserving cellular context, with a toolbox of groundbreaking high-performance microscopies (Hi-M). Hi-M will use unique combinations of multi-focus and single-molecule localization microscopies with novel DNA labeling methods and microfluidics. Hi-M will enable the study of structure-function relationships within TADs of different chromatin types and correlate single-cell variations in epigenomic patterns to 3D conformations with genomic specificity and at the nanoscale. Finally, Hi-M will be used to develop a novel high-throughput, high-content method to unveil the full pairwise distance distribution between thousands of genomic loci at the single cell level and at multiple length-scales. Our findings and technologies will shed new light into the mechanisms responsible for cellular memory, identity and differentiation.
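The pairwise distance distributions mentioned above reduce, per cell, to a distance matrix over the imaged loci; a minimal sketch with hypothetical 3D localisations (missing loci kept as NaN so distributions can be pooled across cells) is given below. It is an illustration of the bookkeeping only, not of the project's imaging pipeline.

```python
import numpy as np

def pairwise_locus_distances(coords):
    """Pairwise 3D distances between imaged genomic loci in one nucleus.

    coords : (n_loci, 3) array of x, y, z positions (e.g. in nm); rows of NaN mark
    loci not detected in this cell. Entries involving a missing locus stay NaN,
    so per-pair distance distributions can be pooled across many cells.
    """
    coords = np.asarray(coords, float)
    diff = coords[:, None, :] - coords[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))   # NaNs propagate automatically

# Hypothetical single-cell measurement of 4 loci, one undetected
cell = np.array([[0.0, 0.0, 0.0],
                 [120.0, 30.0, -10.0],
                 [np.nan, np.nan, np.nan],
                 [300.0, 250.0, 90.0]])
print(np.round(pairwise_locus_distances(cell), 1))
```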
Max ERC Funding
1 999 780 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym EPP
Project Econometrics for Public Policy: Sampling, Estimation, Decision, and Applications
Researcher (PI) Toru KITAGAWA
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Starting Grant (StG), SH1, ERC-2016-STG
Summary One of the ultimate goals of economics is to inform a policy that improves welfare. Although a vast amount of empirical work in economics aims to achieve this goal, the current state of the art in econometrics is silent on concrete recommendations for how to estimate the welfare-maximizing policy. This project addresses statistically optimal and practically useful ways to learn the welfare-maximizing policy from data by developing novel econometric frameworks, sampling designs, and estimation approaches that can be applied to a wide range of real-world policy design problems.
Development of econometric methods for optimal empirical policy design proceeds by answering the following open questions. First, given a sampling process, how do we define optimal estimation for the welfare-maximizing policy? Second, what estimation method achieves this statistical optimality? Third, how do we solve the policy decision problem when the sampling process only set-identifies the social welfare criterion? Fourth, how can we integrate the sampling step and the estimation step to develop a package of optimal sampling and optimal estimation procedures?
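For the first of these questions, one way to estimate a welfare-maximizing assignment rule from experimental data is to maximize an inverse-propensity-weighted estimate of welfare over a candidate policy class; the sketch below illustrates this idea with hypothetical data and a simple threshold class, and is not a description of the project's eventual methods.

```python
import numpy as np

def empirical_welfare_maximization(x, d, y, propensity, thresholds):
    """Pick the threshold policy maximizing an empirical welfare estimate.

    x : covariate used for assignment, d : realized treatment (0/1),
    y : observed outcome, propensity : P(d=1 | x) in the experiment.
    Candidate policies treat individuals with x >= threshold; welfare under each
    policy is estimated by inverse-propensity weighting.
    """
    x, d, y = map(np.asarray, (x, d, y))
    best_t, best_w = None, -np.inf
    for t in thresholds:
        treat = (x >= t).astype(float)
        # IPW estimate of the mean outcome if policy `treat` were applied to everyone
        w = np.mean(y * (d * treat / propensity +
                         (1 - d) * (1 - treat) / (1 - propensity)))
        if w > best_w:
            best_t, best_w = t, w
    return best_t, best_w

# Hypothetical experiment: treatment helps only when x > 0
rng = np.random.default_rng(2)
n, p = 2000, 0.5
x = rng.normal(size=n)
d = rng.binomial(1, p, size=n)
y = 1.0 + d * np.where(x > 0, 1.0, -0.5) + rng.normal(0, 1, size=n)
print(empirical_welfare_maximization(x, d, y, p, np.linspace(-2, 2, 41)))
```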
I divide the project into the following four parts. Each part is motivated by important empirical applications and has methodological challenges related to these four questions.
1) Estimation of treatment assignment policy
2) Estimation of optimal policy in other public policy applications
3) Policy design with set-identified social welfare
4) Sampling design for empirical policy design
Max ERC Funding
1 291 064 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym EUROPOPULISM
Project European Integration, Populism and European Cities
Researcher (PI) Guido Enrico TABELLINI
Host Institution (HI) UNIVERSITA COMMERCIALE LUIGI BOCCONI
Call Details Advanced Grant (AdG), SH1, ERC-2016-ADG
Summary Why is it so difficult to achieve further European political integration? This question motivates the first part of the project. The standard approach in economics presumes that integration of countries reflects a tradeoff between economic benefits and the cost of cultural heterogeneity. To assess this tradeoff, we exploit survey data to quantify cultural heterogeneity within and between EU countries, comparing it to the US. We also investigate time variation, to assess whether economic integration led to cultural convergence. Finally, exploiting regional variation, we seek to identify a cultural core and compare it to the economic core of the EU. We conjecture the following conclusion: although European economic integration has not led to cultural convergence, the primary obstacle to integration is not cultural heterogeneity per se, but the presence of other barriers, such as national identities or national institutions, which amplify its effects.
The second part of the project studies the causes and implications of two related phenomena: the diffusion of nationalism and of political populism, with behavioral voters. We study nationalism as endogenous identification with one’s nation, and analyze how it interacts with political institutions and political processes in a setting of international policy coordination. We study populism as due to the reaction of disappointed voters who behave according to Prospect theory. Our main goal is to explain these behavioral phenomena, and to derive predictions about the effect of institutional reforms.
The third part of the project examines Europe in the very long run. It studies the formation of clusters of creative élites within Europe from a historical perspective. The main goal is to explain how local self-government institutions and the migration of upper-tail human capital between European cities contributed to the formation of clusters of innovation and creativity from the 11th to the 19th century.
Max ERC Funding
1 276 250 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym F-IMAGE
Project Seismic Functional Imaging of the Brittle Crust
Researcher (PI) Michel CAMPILLO
Host Institution (HI) UNIVERSITE GRENOBLE ALPES
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary Despite the dramatic impact of earthquakes, the physics of their onset and the short-term behavior of faults are still poorly understood. Using existing high-quality seismic observations, we propose to develop novel functional imaging of the brittle crust to clarify not only structural properties but also the dynamics of faults. We will analyze spatio-temporal changes of elastic properties around fault zones to highlight the interplay between changes in the host rocks and fault slip. Imaging the damage structure around faults and its evolution requires new seismological methods. With novel methods to image the highly heterogeneous fault regions, we will provide multi-scale descriptions of fault zones, including their laterally variable thicknesses and depth dependence. In parallel we will image temporal changes of seismic velocities and scattering strength. External natural forcing terms (e.g. tides, seasonal hydrologic loadings) will be modeled to isolate the signals of tectonic origin. This will also allow us to monitor the evolving seismic susceptibility, i.e. a measure of the proximity to a critical state of failure. Improved earthquake detection techniques using ‘deep machine learning’ methods will facilitate tracking the evolution of rock damage. The imaging and monitoring will provide time-lapse images of elastic moduli, susceptibility and seismicity. The observed short-term changes of the materials will be included in slip initiation models coupling frictional weakening with weakening of the damaged host rocks. Laboratory experiments will shed light on the transition of behavior from granular (shallow fault core) to cohesive (distant host rock) materials. Our initial data cover two well-studied fault regions of high earthquake probability (Southern California and the Marmara region, Turkey) and an area of induced seismicity (Groningen). The derived results and new versatile imaging and monitoring techniques can have fundamental social and economic impacts.
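One widely used way to measure small temporal changes of seismic velocity from repeated waveforms is the stretching technique of coda-wave interferometry, in which a homogeneous relative velocity change dv/v shifts lapse times by dt = -(dv/v) t. The Python sketch below illustrates that generic idea on synthetic data; it is not the project's processing chain, and all names and parameter values are hypothetical.

# Illustrative sketch: estimating a small relative velocity change (dv/v)
# between a reference waveform and a "current" waveform with the stretching
# technique. A homogeneous change dv/v shifts lapse times by dt = -(dv/v)*t,
# so the current trace looks like the reference resampled on a stretched
# time axis. Synthetic data only.
import numpy as np

t = np.linspace(0.0, 50.0, 5001)                 # lapse time (s)
reference = np.sin(2 * np.pi * 1.0 * t) * np.exp(-t / 20.0)

true_dvv = 0.002                                 # 0.2% velocity increase
current = np.interp(t * (1 + true_dvv), t, reference)  # compressed/stretched copy

def stretching_dvv(ref, cur, t, candidates):
    # Grid-search the stretch factor that maximizes correlation with ref.
    best_cc, best_eps = -np.inf, 0.0
    for eps in candidates:
        stretched = np.interp(t * (1 + eps), t, ref)
        cc = np.corrcoef(stretched, cur)[0, 1]
        if cc > best_cc:
            best_cc, best_eps = cc, eps
    return best_eps, best_cc

candidates = np.linspace(-0.01, 0.01, 401)
dvv, cc = stretching_dvv(reference, current, t, candidates)
print(f"estimated dv/v = {dvv:.4%}, correlation = {cc:.3f}")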
Max ERC Funding
2 434 743 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym FastBio
Project A genomics and systems biology approach to explore the molecular signature and functional consequences of long-term, structured fasting in humans
Researcher (PI) Antigoni DIMA
Host Institution (HI) BIOMEDICAL SCIENCES RESEARCH CENTER ALEXANDER FLEMING
Call Details Starting Grant (StG), LS2, ERC-2016-STG
Summary Dietary intake has an enormous impact on aspects of human health, yet scientific consensus about how what we eat affects our biology remains elusive. To address the complex biological impact of diet, I propose to apply an unconventional, ‘humans-as-model-organisms’ approach to compare the molecular and functional effects of a highly structured dietary regime, specified by the Eastern Orthodox Christian Church (EOCC), to the unstructured diet followed by the general population. Individuals who follow the EOCC regime abstain from meat, dairy products and eggs for 180-200 days annually, in a temporally-structured manner initiated in childhood. I aim to explore the biological signatures of structured vs. unstructured diet by addressing three objectives. First, I will investigate the effects of the two regimes, and of genetic variation, on higher-level phenotypes including anthropometric, physiological and biomarker traits. Second, I will carry out a comprehensive set of omics assays (metabolomics, transcriptomics, epigenomics and investigation of the gut microbiome), will associate omics phenotypes with genetic variation, and will integrate data across biological levels to uncover complex molecular signatures. Third, I will interrogate the functional consequences of dietary regimes at the cellular level through primary cell culture. Acute and long-term effects of dietary intake will be explored for all objectives through a two-timepoint sampling strategy. This proposal therefore constitutes a unique opportunity to study a specific perturbation (EOCC structured diet) introduced to a steady-state system (unstructured diet followed by the general population) in a ground-breaking human systems-biology study. This approach brings together expertise from genomics, computational biology, statistics, medicine and epidemiology. It will lead to novel insights regarding the potent signalling nature of nutrients and is likely to yield results of high translational value.
Max ERC Funding
1 500 000 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym FIRMNET
Project Firms and Their Networks
Researcher (PI) Francis KRAMARZ
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), SH1, ERC-2016-ADG
Summary There is mounting evidence that firms are becoming more fragmented; production is less often made “in-house”. Firms buy inputs from abroad. Tasks are often split into parts. Some are offshored, others are subcontracted. Hence, firms buy services from other, local or international, firms. But they also supply inputs to other firms. Technical change, the internet, and globalization all facilitate this transformation.
In order to better understand how firms thrive in the new global environment, the proposed research aims to construct a networks view of the firm. Fragmentation offers new opportunities: firms may specialize in what they make best, hence creating a business network of customers and suppliers. Networks are also useful to secure provision of fragmented tasks. The firms’ suppliers of goods and services – accountants, logisticians, consultants… – may well be related to the firm through its workers’ social networks: family ties, boardroom relations… These social networks should be useful when times are tough – board members could help find financing in banks where their schoolmates have a job – or when times are unusually good – employees could help in spotting the right hires among their former co-workers.
The proposed research will focus on how firms’ social and business networks help them remain resilient in the face of shocks. Resilience will be measured using the firms’ and workers’ outcomes – value-added, wages, employment, or occupations. The research will have a theoretical component using general equilibrium models with heterogeneous firms, an empirical component with unique data sources from at least two countries (France, Sweden), and an “econometric theory” component which will seek to develop techniques for the study of many-to-one matches in the presence of networks. The research will speak not only to the labor economics community but also to the international trade community, the management community, as well as the econometrics community.
Max ERC Funding
1 753 288 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym FORENSICS
Project Illicit Markets, Unobserved Competitors, and Illegal Behavior
Researcher (PI) Michelle SOVINSKY
Host Institution (HI) UNIVERSITAET MANNHEIM
Call Details Consolidator Grant (CoG), SH1, ERC-2016-COG
Summary Many markets are characterized by illicit or illegal behavior by agents. To the extent that the empirical economic framework does not incorporate these unobserved actions or control for them in estimation, the resulting models are likely to be misspecified. Naturally, if the models do not contain all elements relevant for decision making, then predictions based on the estimates will be misleading, which could result in incorrect policy recommendations. This project directly addresses three situations in which unobserved behavior plays a crucial role. The first concerns markets where consumers engage in illicit behavior. These markets are prevalent in society as they constitute the market for illicit drugs, which is estimated at more than $300 billion per year (UN, 2012). The second concerns markets where firms make strategic decisions in the presence of an unidentified competitor, a counterfeiter; the global value of counterfeit products rivals that of illegal drugs (OECD, 2007). The third concerns situations where firms use legal tools for illegal purposes, whose impact is challenging to quantify; quantifying it is one goal of this project. In each area, the project (i) develops state-of-the-art empirical models that incorporate illicit behaviors, (ii) proposes novel estimation methods that can be used to detect illegal behavior, and (iii) provides evidence that the proposed methodology is feasible and the data are sufficient to estimate the models. Incorporating and estimating unobserved behavior in a variety of settings is an ambitious undertaking. However, it is vital, as a key objective of the proposal is to provide policy makers with tangible tools that accurately reflect the unobserved nature of these markets. Given the global significance of illicit markets, the novel concepts proposed, and the focus on policy, this project has the potential to make a sizable impact, both in and beyond academia, representing an ambitious but worthwhile pursuit.
Max ERC Funding
1 212 934 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym GASPARCON
Project Molecular steps of gas-to-particle conversion: From oxidation to precursors, clusters and secondary aerosol particles.
Researcher (PI) Mikko SIPILÄ
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Starting Grant (StG), PE10, ERC-2016-STG
Summary Atmospheric aerosol particles impact Earth’s climate, by directly scattering sunlight and indirectly by affecting cloud properties. The largest uncertainties in climate change projections are associated with the atmospheric aerosol system that has been altered by anthropogenic activities. A major source of that uncertainty involves the formation of secondary particles and cloud condensation nuclei from natural and anthropogenic emissions of volatile compounds. This research challenge persists despite significant efforts within recent decades.
I will build a research group that aims to resolve the atmospheric oxidation processes that convert volatile trace gases to particle precursor vapours, clusters and new aerosol particles. We will create novel measurement techniques and utilize the tremendous potential of mass spectrometry for the detection of i) particle precursor vapours, ii) oxidants, both conventional and the recently discovered stabilized Criegee intermediates, and, most importantly, iii) newly formed clusters. These methods and instrumentation will be applied to resolve the initial steps of new particle formation at the molecular level, from oxidation to clusters and stable aerosol particles. To reach these goals, targeted laboratory and field experiments, together with long-term field measurements, will be performed employing the state-of-the-art instrumentation developed.
Principal outcomes of this project include i) new experimental methods and techniques vital for atmospheric research and a deep understanding of ii) oxidation pathways producing aerosol particle precursors, iii) the initial molecular steps of new particle formation and iv) mechanisms of growth of freshly formed clusters toward larger sizes, particularly in the crucial size range of a few nanometers. The conceptual understanding obtained during this project will open multiple new research horizons from oxidation chemistry to Earth system modeling.
Max ERC Funding
1 953 790 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym GENOMIS
Project Illuminating GENome Organization through integrated MIcroscopy and Sequencing
Researcher (PI) Marzena Magda BIENKO
Host Institution (HI) KAROLINSKA INSTITUTET
Call Details Starting Grant (StG), LS2, ERC-2016-STG
Summary In human cells, two meters of DNA sequence are compressed into a nucleus whose linear size is five orders of magnitude smaller. Deciphering how this amazing structural organization is achieved and how DNA functions can ensue in the environment of a cell’s nucleus represent central questions for contemporary biology.
Here, I embrace this challenge by establishing a comprehensive framework of microscopy and sequencing technologies coupled with advanced analytical approaches, aimed at addressing three fundamental highly-interconnected questions: 1) What are the design principles that govern DNA compaction? 2) How does genome structure vary between different cell types as well as among cells of the same type? 3) What is the link between genome structure and function? In preliminary experiments, we have devised a powerful method for Genomic loci Positioning by Sequencing (GPSeq) in fixed cells with optimally preserved nuclear morphology. In parallel, we are developing high-end microscopy tools for simultaneous localization of dozens of genomic locations at high resolution in thousands of single cells.
We will obtain first-ever genome-wide maps of radial positioning of DNA loci in the nucleus, and combine them with available DNA contact probability maps in order to build 3D models of the human genome structure in different cell types. Using microscopy, we will visualize chromosomal shapes at unprecedented resolution, and use these rich datasets to discover general DNA folding principles. Finally, by combining high-resolution chromosome visualization with gene expression profiling in single cells, we will explore the link between DNA structure and function. Our study shall illuminate the design principles that dictate how genetic information is packed and read in the human nucleus, while providing a comprehensive repertoire of tools for studying genome organization.
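As a generic illustration of how contact-probability maps can constrain 3D models (not the GPSeq/GENOMIS pipeline itself), the sketch below converts a contact-frequency matrix into target distances through an assumed power law and embeds them in three dimensions with classical multidimensional scaling; the exponent alpha and all names are hypothetical choices.

# Illustrative sketch only: turn a symmetric contact-frequency matrix into
# pairwise target distances with an assumed power law d_ij ~ c_ij^(-alpha),
# then recover 3D coordinates by classical MDS.
import numpy as np

def contacts_to_coords(contacts, alpha=0.5, eps=1e-6):
    c = np.asarray(contacts, dtype=float)
    d = (c + eps) ** (-alpha)                 # assumed distance law (hypothetical alpha)
    np.fill_diagonal(d, 0.0)

    # Classical MDS: double-center the squared distance matrix, then keep
    # the top three eigencomponents as 3D coordinates.
    n = d.shape[0]
    j = np.eye(n) - np.ones((n, n)) / n
    b = -0.5 * j @ (d ** 2) @ j
    w, v = np.linalg.eigh(b)                  # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:3]
    return v[:, idx] * np.sqrt(np.clip(w[idx], 0, None))

# Toy example: loci on a circle produce a distance-dependent contact signal.
n = 40
theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
true = np.c_[np.cos(theta), np.sin(theta), np.zeros(n)]
dist = np.linalg.norm(true[:, None] - true[None, :], axis=-1)
contacts = 1.0 / (dist + 0.1) ** 2
coords = contacts_to_coords(contacts, alpha=0.5)
print(coords.shape)   # (40, 3), recovered up to rotation and reflection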
Max ERC Funding
1 499 808 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym GENSURGE
Project Designer recombinases for efficient and safe genome surgery
Researcher (PI) Frank Buchholz
Host Institution (HI) TECHNISCHE UNIVERSITAET DRESDEN
Call Details Advanced Grant (AdG), LS2, ERC-2016-ADG
Summary Recent breakthroughs in the field of genome editing provide a genuine opportunity to establish innovative approaches to repair DNA mutations to replace, engineer or regenerate malfunctioning cells in vitro or in vivo. However, most of the recently developed technologies introduce double-strand DNA breaks at a target locus as the first step to gene correction. These breaks are subsequently repaired by one of the cell-intrinsic DNA repair pathways, typically inducing an abundance of insertions and deletions (indels). For many applications, however, genome editing should ideally be efficient and specific, without the introduction of indels.
Site-specific recombinases (SSRs) allow precise genome editing without triggering endogenous DNA repair pathways and possess the unique ability to fulfill both cleavage and immediate resealing of the processed DNA in vivo. However, customizing the DNA binding specificity of SSRs is not straightforward. With this project, we propose to overcome this shortcoming. We have already demonstrated that, by applying substrate-linked directed evolution, SSRs can be generated that specifically recognize therapeutic targets. The objective of this project is the development of a universal genome editing platform that allows flexible, efficient and safe gene corrections in cells of any origin without triggering cell-intrinsic DNA repair.
GenSurge aims to: i) sequence an unprecedented, comprehensive compendium of evolved SSRs to understand the directed molecular evolution process at nucleotide resolution; ii) integrate the knowledge obtained in i) to develop a unique SSR-based approach to correct genomic inversions; iii) develop a universal SSR-based strategy that allows flawless, precise and safe genome editing to correct any gene defect in human, animal or plant cells. The successful implementation of this project will deliver a comprehensive, safe and efficient platform from which genome surgery-based cure strategies can be initiated.
Max ERC Funding
2 380 425 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym GEOSTICK
Project Morphodynamic Stickiness: the influence of physical and biological cohesion in sedimentary systems
Researcher (PI) Daniel Roy PARSONS
Host Institution (HI) UNIVERSITY OF HULL
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary Our coasts, estuaries, & low-land river environments are some of the most sensitive systems to sea-level rise & environmental change. In order to manage these systems, & adapt to future changes, we desperately need to be able to predict how they will alter under various scenarios. However, our models for these environments are not yet robust enough to predict, with confidence, very far into the future. Moreover, we also need to improve how we use our understanding of modern environments in reconstructing paleo-environments, where significant assumptions have been made in the way in which relationships derived from the modern have been applied to ancient rocks.
One of the main reasons our models & geological interpretations of these environments are not yet good enough is that their formulations are based on the assumption that these systems are composed only of non-cohesive sands. However, mud is the most common sediment on Earth & many of these systems are actually dominated by biologically-active muds & complex sediment mixtures. We therefore need to find ways to incorporate the effect of sticky mud & sticky biological components into our predictions. Recent work my colleagues & I have published shows just how important such abiotic-biotic interactions can be: inclusion of only relatively small (<0.1% by mass) quantities of biological material into sediment mixtures can reduce alluvial bedform size by an order of magnitude.
However, this is just a start & there is much to do in order to advance our fundamental understanding & develop robust models that predict the combined effects of abiotic & biotic processes on the morphological evolution of these environments under changing drivers & conditions. GEOSTICK will deliver this advance, allowing us to test how sensitive these environments are, assess whether there are tipping points in their resilience & examine evidence for the evolution of life in the ancient sediments of early Earth and Mars.
Max ERC Funding
2 581 155 €
Duration
Start date: 2017-05-01, End date: 2022-04-30
Project acronym GeroProtect
Project Developing Geroprotectors to Prevent Polymorbidity
Researcher (PI) Linda PARTRIDGE
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Advanced Grant (AdG), LS2, ERC-2016-ADG
Summary Advancing age is the major risk factor for disability and illness, including cardiovascular, metabolic and neurodegenerative disease and cancer. The growing proportion of older people in European countries is posing major medical, social and economic challenges, and there is an urgent need to find ways of compressing late-life morbidity. Ageing has proved malleable to genetic and pharmacological interventions in laboratory animals, and at least some of the mechanisms are conserved over large evolutionary distances. Reduced activity of the nutrient-sensing insulin/insulin-like growth factor/TOR signalling network can increase health and combat ageing-related disease in laboratory animals, with increasing evidence of its importance in human ageing. There is thus a prospect for pharmacological intervention to prevent more than one ageing-related condition, rather than tackling diseases one by one and as they arise. The aim of this research programme is to evaluate the potential for pharmacological prevention of ageing-related decline in humans with a polypill targeting the nutrient-sensing network. We find that three licensed drugs, lithium, rapamycin and trametinib, act independently, at different nodes in the network, to increase lifespan in the fruitfly Drosophila, implying that the network controls more than one underlying mechanism of ageing, and that a polypill of these drugs could be particularly effective. We shall test this idea in mice, and assess the underlying mechanisms in Drosophila and mice. We have found that suppression of the Ras signalling branch of the network, which has a well-known role in human cancer, can extend lifespan in both the fruitfly Drosophila and mice, and we shall assess its role in humans. Interventions that ameliorate ageing often have sex-specific effects, and we shall investigate the mechanisms leading to these for the nutrient-sensing network. The outputs of the work will inform future clinical trials in humans.
Max ERC Funding
2 500 000 €
Duration
Start date: 2017-12-01, End date: 2022-11-30
Project acronym GLOBALFIRMS
Project Global Firms and Global Value Chains: Measurement and Mechanisms
Researcher (PI) Kalina MANOVA
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Consolidator Grant (CoG), SH1, ERC-2016-COG
Summary The growing fragmentation of production across firms and countries has revolutionized international trade in recent decades. Firms today choose which production stages to conduct themselves and which to outsource to other parties, which to complete at home and which to offshore abroad. Known as global value chains (GVCs), this phenomenon creates new challenges and opportunities for individual firms and aggregate economies. Of primary policy interest are the implications of GVCs for growth, the transmission of shocks across firms and borders, and the design of economic policies. Yet academic research has faced two major challenges: poor measurement and poorly understood mechanisms.
I propose an ambitious research program that will use exceptional new data and novel GVC measures for path-breaking GVC analysis. First, I will exploit unique panel data on firm production, management practices, export and import transactions for the world’s two largest export economies, China and the US; and unique panel data on firm production, export and import transactions, and the network of domestic firm-to-firm transactions for one of the most open economies, Belgium. Second, I will develop measures that comprehensively characterize three dimensions of firms’ GVC activity: value added (total/domestic/foreign), production line position (upstreamness), and network position (centrality). Third, I will empirically and theoretically examine the impact of GVCs on firm growth, shock transmission, and export-finance policy through six synergistic projects. Each project will make a distinct contribution by investigating new economic mechanisms, establishing new empirical facts, and combining theory and data for informative welfare calculations.
The novelty of the data and the complex mechanisms driving GVCs make this research program highly ambitious. At the same time, the importance of understanding GVCs for economic policy and academic research make this agenda extraordinarily high-return.
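For illustration only, the sketch below computes two of the named network measures on a toy input-output matrix: an upstreamness index of the form U = (I - Delta)^(-1) 1, with Delta_ij = d_ij * Y_j / Y_i, which is one standard formulation in the input-output literature, and eigenvector centrality obtained by power iteration. The data, names and normalizations are hypothetical and are not the project's measures.

# Illustrative sketch (hypothetical data, not the project's measures):
# (i) upstreamness U = (I - Delta)^(-1) * 1, where d[i, j] is input from i
#     per unit of j's output and Y are gross outputs;
# (ii) eigenvector centrality of the transaction network via power iteration.
import numpy as np

# Toy 4-"firm" input-output structure.
d = np.array([
    [0.0, 0.3, 0.1, 0.0],
    [0.0, 0.0, 0.4, 0.1],
    [0.0, 0.0, 0.0, 0.5],
    [0.0, 0.0, 0.0, 0.0],
])
y = np.array([100.0, 120.0, 150.0, 200.0])      # gross outputs

delta = d * y[None, :] / y[:, None]             # Delta_ij = d_ij * Y_j / Y_i
upstreamness = np.linalg.solve(np.eye(4) - delta, np.ones(4))
print("upstreamness:", np.round(upstreamness, 2))   # average distance from final use

def eigenvector_centrality(adj, iters=200):
    # Power iteration on the (weighted) adjacency matrix.
    x = np.ones(adj.shape[0])
    for _ in range(iters):
        x = adj @ x
        x = x / np.linalg.norm(x)
    return x

# Symmetrize the transaction network for a simple centrality illustration.
centrality = eigenvector_centrality(d + d.T)
print("centrality:", np.round(centrality, 2))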
Max ERC Funding
1 462 304 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym GLOBALMACRO
Project Global Production Networks and Macroeconomic Interdependence
Researcher (PI) Julian DI GIOVANNI
Host Institution (HI) UNIVERSIDAD POMPEU FABRA
Call Details Consolidator Grant (CoG), SH1, ERC-2016-COG
Summary Researchers and policymakers alike have highlighted the potential efficiency gains of a global production structure. However, such linkages also raise the possibility of risks. This proposal tackles both empirical and theoretical challenges in incorporating the microeconomic structure of trade and international production networks in the study of the propagation of shocks internationally, and their impact on macroeconomic interdependence. Using newly constructed micro-level datasets, I provide a quantitative analysis of the importance of the linkages in multicountry general equilibrium models of trade. First, using firm export and imported-input linkages, I provide a novel model-based estimation strategy to identify the role of country and firm-level shocks, and the implications of these estimates for the transmission of shocks across borders. By using structural trade models to estimate shocks at the firm level and studying the implications for the transmission of shocks across borders, I help bridge the micro-macro nexus in international economics. Second, I take an even more granular focus by studying the role of firm-to-firm production linkages in transmitting shocks across countries. To do so, I exploit a novel matching procedure between a country’s administrative dataset and cross-country firm-level data. I further build on these data by adding in domestic bank-firm relationships. This strategy allows for the study of how financial shocks are exported abroad via firms’ trade and multinational linkages. Third, I incorporate the insights from the empirical work into a full-scale multicountry general equilibrium model of trade, which allows for firm-level heterogeneity and microeconomic and macroeconomic shocks. I use the model for a quantitative study of the cross-country transmission of the different shocks via trade. This allows me to perform counterfactuals and examine the impact of policies, such as how opening to trade impacts macroeconomic interdependence.
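As a minimal, purely hypothetical illustration of separating common from firm-level shocks (not the project's model-based estimator), the sketch below simulates firm growth rates, estimates the common component as the cross-firm mean in each period, and splits share-weighted aggregate growth into a common part and a granular residual.

# Illustration only: decompose firm growth g_it into a common component and
# a firm-specific residual, and compare their contributions to aggregate
# (share-weighted) growth. Simulated data, hypothetical names.
import numpy as np

rng = np.random.default_rng(1)
n_firms, n_periods = 200, 40

macro = rng.normal(0, 0.02, n_periods)                 # common shock per period
firm = rng.normal(0, 0.10, (n_firms, n_periods))       # idiosyncratic shocks
growth = macro[None, :] + firm                         # g_it = delta_t + eps_it

shares = rng.pareto(1.5, n_firms) + 1                  # fat-tailed firm sizes
shares = shares / shares.sum()

delta_hat = growth.mean(axis=0)                        # estimated common component
eps_hat = growth - delta_hat[None, :]

agg = shares @ growth                                  # aggregate growth
agg_common = delta_hat                                 # shares sum to one
agg_granular = shares @ eps_hat

print("var(aggregate)        :", np.round(agg.var(), 6))
print("var(common component) :", np.round(agg_common.var(), 6))
print("var(granular residual):", np.round(agg_granular.var(), 6))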
Max ERC Funding
1 381 250 €
Duration
Start date: 2017-05-01, End date: 2022-04-30
Project acronym GLOBALPROD
Project The Global and Local Organization of Production
Researcher (PI) Andreas MOXNES
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), SH1, ERC-2016-STG
Summary A defining feature of the global economy is the gradual fragmentation of production across firms and borders, a phenomenon that has been termed outsourcing or global value chains.
State-of-the-art empirical economic analysis on value chains has mostly been limited to the study of aggregate data because there is limited data on actual firm-to-firm linkages in the global economy. Even less is currently known about which products are typically outsourced, and which workers are affected.
This project will change that. I will bring together four unique firm-to-firm datasets on local and global value chains that will push the research frontier forward in two main directions:
- Previous research has shown that economic integration encourages growth. Due to data limitations, however, we know little about the origins of growth, and to what extent the emergence of value chains can explain the growth response. New theory is needed, where firm-to-firm connections are endogenously formed in response to economic integration. I will confront theory with data and directly test whether integration facilitates new buyer-supplier relationships and growth.
- Previous research has found that economic integration has large negative effects on wages for low-skill workers. But again, due to data limitations, it is unclear to what extent value chains are responsible for this. Simply put, the impact of outsourcing on wages will depend on which workers are displaced by outsourcing. Until now, researchers have not been able to observe which workers, along with their occupations and skills, are employed in both the supplying and the outsourcing firm. For the first time, this information will be available, allowing for a rich analysis of labor market effects for different skill groups.
GLOBALPROD will inform policymakers about how wages for different types of skills change in response to globalization, but also about how economic integration can promote efficiency and competitiveness.
Max ERC Funding
1 476 948 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym GLYCONOISE
Project Emergent properties of cell surface glycosylation in cell-cell communication
Researcher (PI) Christoph Johannes Heinrich RADEMACHER
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Starting Grant (StG), LS2, ERC-2016-STG
Summary The surface of every living cell is covered with a dense matrix of glycans. Its particular composition and structure encode important messages in cell-cell communication, influencing development, differentiation, and immunological processes. The matrix is formed by highly complex biopolymers whose compositions vary from cell to cell, even between genetically identical cells. This gives rise to population noise in cell-cell communication. A second level of noise stems from glycans present on the same cell that disturb the decoding of the message by glycan-binding receptors through competitive binding. Glycan-based communication is characterized by a high redundancy of both glycans and their receptors. Thus, noise and redundancy emerge as key properties of glycan-based cell-cell communication, but their extent and function are poorly understood.
By adapting a transmitter-receiver model from communication sciences and combining it with state-of-the-art experimental techniques from biophysics and cell biology, we will address two fundamental questions: What is the role of redundancy in glycan-based communication? How much ‘noise’ can it tolerate before the message is lost?
To do so, we first establish a simplified model system for glycan-based communication. Biophysical rate constants are determined for lectin-glycan interactions and expanded to glycosylated microparticles that trigger a biological response in lectin-expressing receiver cells. Next, single-cell glycomes are reconstructed from ultra-high-dimensional flow cytometry data using lectin mixtures, enabled by recent advancements in instrumentation and glycobioinformatics software. Glycomes accessible at the single-cell level allow us to replace the microparticles with transmitter cells and to employ a cell-cell interaction model. Our transmitter-receiver model will be used to quantify the noise and to reveal how redundancy provides robustness of messaging by cell surface glycans in cellular communication.
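Purely as an illustration of the communication-theoretic framing invoked above, and not of the project's own model, a textbook noisy-channel calculation shows how redundancy buys robustness against noise:
\[
  C_{\mathrm{BSC}} = 1 - H_2(p), \qquad H_2(p) = -p\log_2 p - (1-p)\log_2(1-p),
\]
where $p$ is the probability that a transmitted symbol is corrupted. A three-fold repetition code decoded by majority vote turns the per-symbol error $p$ into
\[
  p_{\mathrm{err}} = 3p^2(1-p) + p^3 = 3p^2 - 2p^3,
\]
so for $p = 0.1$ the error falls from $0.1$ to $0.028$: redundancy trades rate for robustness, a trade-off analogous to the noise-redundancy balance the project seeks to quantify for glycan-lectin signalling.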
Max ERC Funding
1 499 813 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym GOCART
Project Gauging Ocean organic Carbon fluxes using Autonomous Robotic Technologies
Researcher (PI) Stephanie Anne HENSON
Host Institution (HI) NATURAL ENVIRONMENT RESEARCH COUNCIL
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary Climate change driven by CO2 emissions from human activities is a significant challenge facing mankind. An important component of Earth’s carbon (C) cycle is the ocean’s biological C pump; without it atmospheric CO2 would be ~50% higher than it is now. The pump consists of sinking organic matter which is remineralised back into CO2 in the deep ocean. The depth at which remineralisation occurs is the main factor affecting the amount of organic C stored in the ocean. Currently we do not understand how or why remineralisation depth varies in time, which limits our ability to make robust predictions of how the C cycle, and hence our climate, will change in the future. This is mainly due to the challenges of measuring remineralisation depth using conventional methods, a barrier which autonomous underwater vehicles are poised to overcome by providing high-frequency data over long periods. This technological innovation will revolutionise our understanding of this important planetary C flux.
I propose an ambitious project to address current uncertainties in remineralisation depth. GOCART encompasses new observations, obtained using cutting-edge technology and novel methodology, through to global climate modelling. Underwater glider deployments will be used to establish the characteristics and significance of temporal variability in organic C flux and remineralisation depth during the most dynamic period of the year. This will enable new insights into the factors driving variability in remineralisation depth, ultimately leading to development of a new model parameterisation incorporating temporal variability. Using an innovative modelling framework, this parameterisation will be tested for its potential to improve predictions of ocean C storage. GOCART represents a significant advance in quantifying temporal variability in remineralisation depth, which is key to reducing uncertainty in model predictions of ocean C storage, and yet currently almost entirely unknown.
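For orientation only, the Python sketch below uses the classical Martin et al. (1987) power-law attenuation curve, a standard but not project-specific parameterisation of remineralisation depth, to show how the attenuation exponent b controls how much sinking organic carbon survives to depth; the reference flux and depths are illustrative values.

import numpy as np

def martin_flux(z, flux_ref=100.0, z_ref=100.0, b=0.858):
    # organic carbon flux at depth z (m) relative to the flux at z_ref, following
    # F(z) = F(z_ref) * (z / z_ref) ** (-b); a larger b means shallower
    # remineralisation and less carbon reaching the deep ocean
    return flux_ref * (np.asarray(z, dtype=float) / z_ref) ** (-b)

depths = np.array([100.0, 500.0, 1000.0, 2000.0])
for b in (0.6, 0.858, 1.1):              # plausible range of attenuation exponents
    print("b =", b, "->", np.round(martin_flux(depths, b=b), 1))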
Max ERC Funding
1 999 110 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym HemTree2.0
Project Single cell genomic analysis and perturbations of hematopoietic progenitors: Towards a refined model of hematopoiesis
Researcher (PI) Ido AMIT
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Consolidator Grant (CoG), LS2, ERC-2016-COG
Summary Hematopoiesis is an important model for stem cell differentiation with great medical significance.
Heterogeneity within hematopoietic progenitor populations has considerably limited characterization and
molecular understanding of lineage commitment in both health and disease. Advances in single-cell genomic
technologies provide an extraordinary opportunity for unbiased and high resolution mapping of biological
function and regulation. Recently we have developed an experimental and analytical method, termed
massively parallel single-cell RNA-Seq (MARS-Seq), for unbiased classification of individual cells from
their native context and successfully applied it for characterization of immune and hematopoietic
progenitors.
Here, we propose to uncover the hierarchy and regulatory mechanisms controlling hematopoiesis by
combining comprehensive single-cell RNA-Seq analyses, modelling approaches, advanced functional assays,
single-cell CRISPR screens, knockout models and epigenetic profiling. Exciting preliminary results show
that our approach is indeed starting to uncover the complexity of hematopoietic progenitors and the regulatory
circuits driving hematopoietic decisions. We will pursue the following aims: (i) Generate a refined model of
hematopoiesis by comprehensive single-cell RNA-Seq profiling of hematopoietic progenitors, (ii) validate
the predicted model by in vivo functional developmental assays and then (iii) test candidate transcription and
chromatin factors uncovered by our model for their role in controlling progression towards various lineages
using single-cell measurements combined with CRISPR screens. Together, our study is expected to generate
a revised and high-resolution hematopoietic model and decipher the regulatory networks that control
hematopoiesis. Our methods and models may provide a platform for future medical advancements including
a large-scale European collaborative project to discover a comprehensive human hematopoietic tree.
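As a generic illustration of unbiased single-cell classification, and not of the MARS-Seq pipeline itself, the Python sketch below normalises a simulated cell-by-gene count matrix, reduces it with PCA and clusters cells into candidate progenitor states; the counts, dimensions and cluster number are arbitrary assumptions.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# simulate 300 cells x 50 genes drawn from three hypothetical progenitor states
state_means = rng.gamma(2.0, 2.0, size=(3, 50))
counts = np.vstack([rng.poisson(m, size=(100, 50)) for m in state_means])

# library-size normalisation and log transform, then PCA and k-means clustering
norm = np.log1p(counts / counts.sum(axis=1, keepdims=True) * 1e4)
pcs = PCA(n_components=10, random_state=0).fit_transform(norm)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(pcs)
print(np.bincount(labels))               # cells assigned to each candidate state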
Max ERC Funding
2 000 000 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym HETEROPOLIS
Project The Design of Social Policy in a Heterogeneous World
Researcher (PI) Johannes SPINNEWIJN
Host Institution (HI) LONDON SCHOOL OF ECONOMICS AND POLITICAL SCIENCE
Call Details Starting Grant (StG), SH1, ERC-2016-STG
Summary Modern societies are characterized by tremendous heterogeneity in economic outcomes: from heterogeneity in wages and employment, to heterogeneity in capital income, wealth and health outcomes. It is unclear, however, how to map heterogeneity in these outcomes to heterogeneity in welfare. This mapping is crucial for the design of tax and benefit systems, providing insurance against individual risk and redistributing income between individuals, while maintaining proper incentives.
The main objectives of HETEROPOLIS are: 1) to provide new insights on the relation between inequality in earnings, wealth and consumption, 2) to develop a new consumption-based method to measure welfare inequality and heterogeneity in the marginal value of social transfers, 3) to provide and implement a simple, but general evidence-based framework to evaluate the differential design of social insurance based on observable heterogeneity, 4) to analyse selection effects due to unobservable heterogeneity and how they affect social insurance design, 5) to analyse heterogeneity in behavioural “biases” and their consequences for policy design.
The first part of HETEROPOLIS analyses the use of registry-based consumption measures to evaluate heterogeneity in welfare and exploits a newly developed data set based on administrative registers for the universe of Swedish households providing comprehensive and detailed information on income, wealth, labour market outcomes and other variables. The second part develops and implements a general evidence-based framework to evaluate the design of multi-faceted social insurance programs in a heterogeneous world. The final part of HETEROPOLIS analyses and estimates different sources of heterogeneity that affect market efficiency and justify further government interventions.
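As a toy illustration of why the marginal value of a transfer differs across the consumption distribution, the snippet below uses a textbook CRRA utility assumption rather than the project's own consumption-based method; the consumption levels and risk-aversion parameter are invented for the example.

def marginal_utility(c, gamma=2.0):
    # CRRA marginal utility u'(c) = c ** (-gamma); gamma is an assumed
    # coefficient of relative risk aversion
    return c ** (-gamma)

c_low, c_high = 15_000.0, 60_000.0       # hypothetical annual consumption levels
ratio = marginal_utility(c_low) / marginal_utility(c_high)
print(ratio)   # 16.0: a euro of transfers is worth 16x more to the low-consumption household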
Max ERC Funding
1 497 505 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym HOPE
Project Humans On Planet Earth - Long-term impacts on biosphere dynamics
Researcher (PI) Harry John Betteley BIRKS
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary A critical question in Earth system science is what was the impact of prehistoric people on the biosphere and climate? There is much information about human impact through clearance, agriculture, erosion, and modifying water and nutrient budgets. Humans have greatly changed the Earth in the last 8000 years, but did humans modify the major ecological processes (e.g. assembly rules) that shape community assembly and dynamics? Did inter-relationships between processes change in response to human impact? Lyons et al. & Dietl (2016 Nature) suggest that human activities in the last 6000 years had such impacts. Dietl proposes that using past ‘natural experiments’ to predict future changes is “flawed” and “out is the use of uniformitarianism”. As using natural experiments is a common strategy and uniformitarianism is the major working concept in Earth sciences, it is imperative to test whether prehistoric human activity changed major ecological processes determining community development. To test this hypothesis, patterns in pollen-stratigraphical data for the past 11,500 years from over 2000 sites across the globe will be explored consistently using numerical techniques to discern changes in 25 ecosystem properties (richness, evenness, and diversity; turnover; rates of change; taxon co-occurrences, etc.). Patterns in these properties will be compared statistically at sites within biomes, between biomes, within continents, and between continents to test the hypotheses that prehistoric human activities changed the basic ecological processes of community assembly and that their inter-relationships changed through time. These areas provide major contrasts in human prehistory and biomes. HOPE is interdisciplinary: pollen analysis, databases, multivariate analysis, ecology, new statistical methods, numerical simulations, statistical modelling. HOPE’s impact goes beyond human effects on the biosphere and extends to the very core of Earth science’s basic conceptual framework.
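For concreteness, the Python sketch below computes a few of the ecosystem properties named above (richness, evenness and compositional turnover) from two hypothetical pollen-count samples using standard formulas; it is an illustration, not the project's numerical toolkit.

import numpy as np

def richness(counts):
    # number of taxa present in a sample
    return int(np.count_nonzero(counts))

def shannon(counts):
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

def pielou_evenness(counts):
    s = richness(counts)
    return shannon(counts) / np.log(s) if s > 1 else 0.0

def bray_curtis(a, b):
    # compositional turnover between two samples (0 = identical, 1 = no taxa shared)
    return float(np.abs(a - b).sum() / (a + b).sum())

# two hypothetical pollen-count samples (columns are taxa)
s1 = np.array([40, 25, 10, 5, 0, 0])
s2 = np.array([10, 30, 20, 10, 5, 5])
print(richness(s1), round(pielou_evenness(s1), 3), round(bray_curtis(s1, s2), 3))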
Max ERC Funding
2 278 884 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym HYDROCARB
Project Hydrogen isotopes in plant-derived organic compounds as new tool to identify changes in the carbon metabolism of plants and ecosystems during the anthropocene
Researcher (PI) Ansgar KAHMEN
Host Institution (HI) UNIVERSITAT BASEL
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary HYDROCARB is motivated by the enormous potential that stable hydrogen isotope ratios (δ2H values) in plant compounds have as a hydrological proxy and, in particular, as a new proxy for the carbon metabolism of plants. Current conceptual models suggest that δ2H values in plant organic compounds are composed of (i) hydrological and (ii) metabolic signals. The hydrological information that is contained in δ2H values of plant material is now well understood and is often applied in (paleo-) hydrological research. In contrast, the metabolic information that is contained in plant δ2H values is mostly unknown. Intriguing recent research suggests, however, that metabolic signals in the δ2H values of plant organic compounds reflect the balance of autotrophic and heterotrophic processes in plants. This suggests that exciting and previously unknown opportunities exist to exploit δ2H values in plant compounds for information on the carbohydrate metabolism of plants, which would be relevant for a broad range of biological and biogeochemical disciplines.
The goal of HYDROCARB is to perform the experimental work that is now needed to identify the key biochemical and physiological processes that determine the metabolic information that is recorded in the δ2H values of plant organic compounds such as leaf wax lipids, lignin and cellulose. With this, HYDROCARB will provide the basis for semi-mechanistic models that will allow (i) disentangling hydrological from metabolic signals in plant δ2H values and (ii) identifying the precise physiological processes of a plant's carbohydrate metabolism that can be deduced from the δ2H values of different plant compounds. If successful, HYDROCARB will establish δ2H values in plant organic compounds as a powerful new proxy enabling ground-breaking and innovative research on plant and ecosystem carbon cycling, with implications for plant biology, biogeochemistry and the Earth system sciences.
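For reference, δ2H values are reported in per mil relative to the VSMOW standard (2H/1H = 155.76 x 10^-6); the short Python sketch below applies the standard delta definition to a hypothetical leaf-wax isotope ratio chosen for illustration.

R_VSMOW_2H = 155.76e-6                     # 2H/1H isotope ratio of the VSMOW standard

def delta_2H(r_sample, r_standard=R_VSMOW_2H):
    # delta notation in per mil: (R_sample / R_standard - 1) * 1000
    return (r_sample / r_standard - 1.0) * 1000.0

# hypothetical leaf-wax n-alkane with a 2H/1H ratio of 1.30e-4
print(round(delta_2H(1.30e-4), 1))         # about -165 per mil, i.e. strongly 2H-depleted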
Max ERC Funding
1 999 941 €
Duration
Start date: 2017-11-01, End date: 2022-10-31
Project acronym IASI-FT
Project IASI - Flux and temperature
Researcher (PI) Cathy CLERBAUX
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary IASI - Flux and temperature
July 2016 was Earth's warmest month on record. The first six months of 2016 were also the warmest six-month period since modern meteorological observations began. This, along with the recent so-called “hiatus” in the warming trend and the Paris climate agreement, has drawn scientific and public attention to how reliable the historical temperature record is and to the level of confidence in future climate model projections. Although the role of satellites in observing the variability and change of the Earth system has increased in recent decades, remotely-sensed observations remain underexploited for accurately assessing climate change fingerprints. The IASI - Flux and Temperature (IASI-FT) project aims to provide new benchmarks for top-of-atmosphere radiative flux and temperature observations using the calibrated radiances measured twice a day at any location by the IASI instrument on the suite of MetOp satellites.
The main challenge is to achieve the stringent accuracy and stability necessary for climate studies, particularly for climate trends. Building upon the expertise accumulated by my group during the last 10 years, I propose the development of innovative algorithms and statistical tools to generate climate data records at the global scale, of (1) spectrally resolved outgoing radiances, (2) land and sea skin surface temperatures, and (3) temperatures at selected altitudes. Time series of these quantities will be compared with in situ and other satellite observations if available, atmospheric reanalyses, and climate model simulations. The observed trends will be analyzed at seasonal and regional scales in order to disentangle natural (weather/dynamical) variability and human-induced climate forcings. This project, while clearly research-oriented, will lead towards an operational integrated observational strategy for the Earth climate system, given that the IASI program started in 2006 and will last until 2040 at least.
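As a schematic of trend extraction only, and not of the project's algorithms, the Python sketch below fits a linear trend plus an annual harmonic to a synthetic monthly temperature series by least squares, separating the seasonal cycle from the long-term signal; the trend, noise level and record length are invented for illustration.

import numpy as np

rng = np.random.default_rng(2)
t = np.arange(120) / 12.0                          # ten years of monthly time steps, in years
true_trend = 0.02                                  # K per year, assumed for the example
y = true_trend * t + 0.5 * np.sin(2 * np.pi * t) + 0.1 * rng.standard_normal(t.size)

# least-squares fit of an intercept, a linear trend and an annual harmonic
X = np.column_stack([np.ones_like(t), t, np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted trend (K per year):", round(coef[1], 3))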
Max ERC Funding
2 200 000 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym IllegalPharma
Project Competitive Dynamics in the Informal Economy: The case of Illegal Pharmaceutical Drugs
Researcher (PI) LUIS FRANCISCO DIESTRE MARTIN
Host Institution (HI) INSTITUTO DE EMPRESA SL
Call Details Starting Grant (StG), SH1, ERC-2016-STG
Summary This project aims to develop a competitive dynamics theory of the informal economy, which is currently lacking in academic research. Specifically, this project will adopt an institutional theory perspective to better understand three fundamental outcomes in the informal economy: market entry (illegal businesses’ decision to be active in a specific niche), price competition (price differentials between legal and illegal products), and product quality (quality of products sold in illegal businesses). The main conceptual proposition suggested in this project is that selling products through illegal means may still be perceived as a legitimate activity. Building on this statement, it will be proposed that the degree to which actors perceive the sale of an illegal product as a more or less legitimate activity will influence (1) entrepreneurs’ decision to illegally enter such a market, (2) consumers’ willingness to pay for such an illegal product (i.e., price differential versus the legal version of the product) and (3) manufacturers’ motivation to keep quality standards for that illegal product. The empirical setting for this study will be the illegal sale of pharmaceutical drugs. The sale of illegal pharmaceuticals accounts for more than 10% of the medicines market and over €30 billion in annual earnings (World Health Organization, 2003). It represents one of the biggest challenges for societies in that, according to the WHO’s Department of Essential Medicines and Health Products, anywhere from 100,000 to a million people die every year due to falsified drugs. Accordingly, this study aims to provide two main contributions: (1) an academic contribution by developing a radically new theory of the competitive dynamics in the informal economy, and (2) a practical contribution by providing a better understanding of the determinants of the informal economy that could help policy makers and regulators in their goal of fighting the trading of illegal medicines.
Max ERC Funding
1 374 185 €
Duration
Start date: 2017-05-01, End date: 2022-04-30
Project acronym IntScOmics
Project A single-cell genomics approach integrating gene expression, lineage, and physical interactions
Researcher (PI) Alexander VAN OUDENAARDEN
Host Institution (HI) KONINKLIJKE NEDERLANDSE AKADEMIE VAN WETENSCHAPPEN - KNAW
Call Details Advanced Grant (AdG), LS2, ERC-2016-ADG
Summary From populations of unicellular organisms to complex tissues, cell-to-cell variability in phenotypic traits seems to be universal. To study this heterogeneity and its biological consequences, researchers have used advanced microscopy-based approaches that provide exquisite spatial and temporal resolution, but these methods are typically limited to measuring a few properties in parallel. On the other hand, next generation sequencing technologies allow for massively parallel genome-wide approaches but have, until recently, relied on studying population averages obtained from pooling thousands to millions of cells, precluding genome-wide analysis of cell-to-cell variability. Very excitingly, in the last few years there has been a revolution in single-cell sequencing technologies allowing genome-wide quantification of mRNA and genomic DNA in thousands of individual cells leading to the convergence of genomics and single-cell biology. However, during this convergence the spatial and temporal information, easily accessed by microscopy-based approaches, is often lost in a single-cell sequencing experiment. The overarching goal of this proposal is to develop single-cell sequencing technology that retains important aspects of the spatial-temporal information. In particular I will focus on integrating single-cell transcriptome and epigenome measurements with the physical cell-to-cell interaction network (spatial information) and lineage information (temporal information). These tools will be utilized to (i) explore the division symmetry of intestinal stem cells in vivo; (ii) to reconstruct the cell lineage history during zebrafish regeneration; and (iii) to determine lineage relations and the physical cell-to-cell interaction network of progenitor cells in the murine bone marrow.
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym ITHACA
Project An Information Theoretic Approach to Improving the Reliability of Weather and Climate Simulations
Researcher (PI) Timothy PALMER
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary The aim of this project is to develop a new synergy between climate and computer science to increase the accuracy and hence reliability of comprehensive weather and climate models. The scientific basis for this project lies in the PI’s pioneering research on stochastic sub-grid parametrisations for climate models. These parametrisations provide estimates of irreducible uncertainty in weather and climate models, and will be used to determine where numerical precision for model variables can be reduced without degradation. By identifying those bits that carry negligible information – typically in high-wavenumber components of the dynamical core and within parametrisation and Earth-System modules – computational resources can be reinvested into areas (resolution, process representation, ensemble size) where they are sorely needed. This project will determine scale-dependent estimates of information content as rigorously as possible based on a variety of new tools, which include information-theoretic diagnostics and emulators of imprecision, and in a variety of models, from idealised to comprehensive. The project will contribute significantly to the development of next-generation weather and climate models and is well timed for the advent of exascale supercomputing where energy efficiency is paramount and where movement of bits, being the single biggest determinant of power consumption, must be minimised. The ideas will be tested on emerging hardware capable of exploiting the benefits of mixed-precision arithmetic. A testable scientific hypothesis is presented: a proposed increase in forecast reliability arising from an increase in the forecast model’s vertical resolution, the cost being paid for by a reduction in precision of small-scale variables. This project can be expected to provide new scientific understanding of how different scales interact in the nonlinear climate system, for example in maintaining persistent atmospheric flow regimes.
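To make the precision-reduction idea concrete, the Python sketch below truncates the significand of double-precision numbers to a chosen number of bits, one simple stand-in for reduced-precision arithmetic rather than the project's actual information-theoretic diagnostics, and reports the resulting relative error; the field is a random stand-in for a model variable.

import numpy as np

def round_significand(x, keep_bits):
    # keep only `keep_bits` of the 52 explicit significand bits of float64
    # values by zeroing the trailing bits (truncation rather than rounding)
    bits = np.asarray(x, dtype=np.float64).view(np.uint64)
    mask = np.uint64((0xFFFFFFFFFFFFFFFF << (52 - keep_bits)) & 0xFFFFFFFFFFFFFFFF)
    return (bits & mask).view(np.float64)

rng = np.random.default_rng(3)
field = rng.standard_normal(100000)      # stand-in for a high-wavenumber model field
for k in (10, 5, 2):
    err = np.max(np.abs(round_significand(field, k) - field) / np.abs(field))
    print(k, "significand bits kept -> max relative error", f"{err:.1e}")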
Max ERC Funding
2 494 117 €
Duration
Start date: 2017-10-01, End date: 2022-09-30