Project acronym ACCLAIM
Project Aerosols effects on convective clouds and climate
Researcher (PI) Philip Stier
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Starting Grant (StG), PE10, ERC-2011-StG_20101014
Summary Clouds play a key role in the climate system. Small anthropogenic perturbations of the cloud system potentially have large radiative effects. Aerosols perturb the global radiation budget directly, by scattering and absorption, as well as indirectly, by the modification of cloud properties and occurrence. The applicability of traditional conceptual models of indirect aerosol effects to convective clouds is disputed as cloud dynamics complicates the picture.
Strong evidence for numerous aerosol effects on convection has been established in individual disciplines: through remote sensing and in-situ observations as well as by cloud resolving and global modelling. However, a coherent scientific view of the effects of aerosols on convection has yet to be established.
The primary objective of ACCLAIM is to recast the effects of aerosols on convective clouds as a basis for improved global estimates of anthropogenic climate effects. Specific objectives include: i) to unravel the governing principles of aerosol effects on convective clouds; ii) to provide quantitative constraints on satellite-retrieved relationships between convective clouds and aerosols; and ultimately iii) to enable global climate models to represent the full range of anthropogenic climate perturbations and quantify the climate response to aerosol effects on convective clouds.
I have developed the research strategy of ACCLAIM to overcome disciplinary barriers in this frontier research area and seek five years of funding to establish an interdisciplinary, physics-focused research group consisting of two PostDocs, two PhD students and myself. ACCLAIM will be centred around global aerosol-convection climate modelling studies, complemented by research constraining aerosol-convection interactions through remote sensing and by a process-focused research strand, advancing fundamental understanding and global model parameterisations through high-resolution aerosol-cloud modelling in synergy with in-situ observations.
Max ERC Funding
1 429 243 €
Duration
Start date: 2011-09-01, End date: 2017-02-28
Project acronym ALEXANDRIA
Project Large-Scale Formal Proof for the Working Mathematician
Researcher (PI) Lawrence PAULSON
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Mathematical proofs have always been prone to error. Today, proofs can be hundreds of pages long and combine results from many specialisms, making them almost impossible to check. One solution is to deploy modern verification technology. Interactive theorem provers have demonstrated their potential as vehicles for formalising mathematics through achievements such as the verification of the Kepler Conjecture. Proofs done using such tools reach a high standard of correctness.
However, existing theorem provers are unsuitable for mathematics. Their formal proofs are unreadable. They struggle to do simple tasks, such as evaluating limits. They lack much basic mathematics, and the material they do have is difficult to locate and apply.
ALEXANDRIA will create a proof development environment attractive to working mathematicians, utilising the best technology available across computer science. Its focus will be the management and use of large-scale mathematical knowledge, both theorems and algorithms. The project will employ mathematicians to investigate the formalisation of mathematics in practice. Our already substantial formalised libraries will serve as the starting point. They will be extended and annotated to support sophisticated searches. Techniques will be borrowed from machine learning, information retrieval and natural language processing. Algorithms will be treated similarly: ALEXANDRIA will help users find and invoke the proof methods and algorithms appropriate for the task.
ALEXANDRIA will provide (1) comprehensive formal mathematical libraries; (2) search within libraries, and the mining of libraries for proof patterns; (3) automated support for the construction of large formal proofs; (4) sound and practical computer algebra tools.
ALEXANDRIA will be based on legible structured proofs. Formal proofs should be not mere code, but a machine-checkable form of communication between mathematicians.
Max ERC Funding
2 430 140 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ASAP
Project Adaptive Security and Privacy
Researcher (PI) Bashar Nuseibeh
Host Institution (HI) THE OPEN UNIVERSITY
Call Details Advanced Grant (AdG), PE6, ERC-2011-ADG_20110209
Summary With the prevalence of mobile computing devices and the increasing availability of pervasive services, ubiquitous computing (Ubicomp) is a reality for many people. This reality is generating opportunities for people to interact socially in new and richer ways, and to work more effectively in a variety of new environments. More generally, Ubicomp infrastructures – controlled by software – will determine users’ access to critical services.
With these opportunities come higher risks of misuse by malicious agents. Therefore, the role and design of software for managing use and protecting against misuse is critical, and the engineering of software that is functionally effective while safeguarding user assets from harm is a key challenge. Indeed, the very nature of Ubicomp means that software must adapt to the changing needs of users and their environment, and, more critically, to the different threats to users’ security and privacy.
ASAP proposes to radically re-conceptualise software engineering for Ubicomp in ways that are cognisant of the changing functional needs of users, of the changing threats to user assets, and of the changing relationships between them. We propose to deliver adaptive software capabilities for supporting users in managing their privacy requirements, and adaptive software capabilities to deliver secure software that underpins those requirements. A key novelty of our approach is its holistic treatment of security and human behaviour. To achieve this, it draws upon contributions from requirements engineering, security & privacy engineering, and human-computer interaction. Our aim is to contribute to software engineering that empowers and protects Ubicomp users. Underpinning our approach will be the development of representations of security and privacy problem structures that capture user requirements, the context in which those requirements arise, and the adaptive software that aims to meet those requirements.
Max ERC Funding
2 499 041 €
Duration
Start date: 2012-10-01, End date: 2018-09-30
Project acronym BIONET
Project Network Topology Complements Genome as a Source of Biological Information
Researcher (PI) Natasa Przulj
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary Genetic sequences have had an enormous impact on our understanding of biology. The expectation is that biological network data will have a similar impact. However, progress is hindered by a lack of sophisticated graph theoretic tools that will mine these large networked datasets.
In recent breakthrough work at the boundary of computer science and biology supported by my USA NSF CAREER award, I developed sensitive network analysis, comparison and embedding tools which demonstrated that protein-protein interaction networks of eukaryotes are best modeled by geometric graphs. Also, they established a phenotypically validated, unprecedented link between network topology and biological function and disease. Now I propose to substantially extend these preliminary results and design sensitive and robust network alignment methods that will lead to uncovering unknown biology and evolutionary relationships. The potential ground-breaking impact of such network alignment tools could parallel that of the BLAST family of sequence alignment tools, which has revolutionized our understanding of biological systems and therapeutics. Furthermore, I propose to develop additional sophisticated graph theoretic techniques to mine network data and hence complement the biological information that can be extracted from sequence. I propose to exploit these new techniques for biological applications in collaboration with experimentalists at Imperial College London: 1. aligning biological networks of species whose genomes are closely related, but that have very different phenotypes, in order to uncover systems-level factors that contribute to pronounced differences; 2. comparing and contrasting stress response pathways and metabolic pathways in bacteria in a unified systems-level framework and exploiting the findings for: (a) bioengineering of micro-organisms for industrial applications (production of bio-fuels, bioremediation, production of biopolymers); (b) biomedical applications.
Max ERC Funding
1 638 175 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym BYONIC
Project Beyond the Iron Curtain
Researcher (PI) Alessandro TAGLIABUE
Host Institution (HI) THE UNIVERSITY OF LIVERPOOL
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary As one of the largest carbon reservoirs in the Earth system, the ocean is central to understanding past, present and future fluctuations in atmospheric carbon dioxide. In this context, microscopic plants called phytoplankton are key as they consume carbon dioxide during photosynthesis and transfer part of this carbon to the ocean’s interior and ultimately the lithosphere. The overall abundance of phytoplankton also forms the foundation of ocean food webs and drives the richness of marine fisheries.
It is key that we understand the drivers of variations in phytoplankton growth, so we can explain changes in ocean productivity and the global carbon cycle, as well as project future trends with confidence. The numerical models we rely on for these tasks are prevented from doing so at present, however, due to a major theoretical gap concerning the role of trace metals in shaping phytoplankton growth in the ocean. This gap is particularly acute at regional scales, where subtle interactions can lead to their co-limitation of biological activity. While we have long known that trace metals are fundamentally important to the photosynthesis and respiration of phytoplankton, it is only very recently that the necessary large-scale oceanic datasets required by numerical models have become available. I am leading such efforts with the trace metal iron, but we urgently need to expand our approach to other essential trace metals such as cobalt, copper, manganese and zinc.
This project will combine knowledge of biological requirements for trace metals with these newly emerging datasets to move ‘beyond the iron curtain’ and develop the first complete numerical model of resource limitation of phytoplankton growth, accounting for co-limiting interactions. Via a progressive combination of data synthesis and state-of-the-art modelling, I will deliver a step-change in how we think resource availability controls life in the ocean.
Max ERC Funding
1 668 418 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym CACH
Project Reconstructing abrupt Changes in Chemistry and Circulation of the Equatorial Atlantic Ocean: Implications for global Climate and deep-water Habitats
Researcher (PI) Laura Frances Robinson
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Starting Grant (StG), PE10, ERC-2011-StG_20101014
Summary "Ice-core records show that glacials had lower atmospheric pCO2 and cooler temperatures than today and that the last deglaciation was punctuated by large, abrupt millennial-scale climate events. Explaining the mechanism controlling these oscillations remains an outstanding puzzle. The ocean is a key player, and the Atlantic is particularly dynamic as it transports heat, carbon and nutrients across the equator. This project proposes to consolidate my research through a focused study of present and past ocean chemistry in the Equatorial Atlantic and to assess the impact of ocean chemistry on fragile deep-sea ecosystems. Despite decades of research there are distinct gaps in our knowledge of the history of the deep and intermediate ocean. Major hurdles include access to suitable archives, development of geochemical proxies and analyses that are sufficiently precise to test climate hypotheses. Through a combination of ship board field work, modern calibrations and cutting-edge geochemical analyses this project will produce samples and data that address each of these gaps. A particular focus will be on using the skeletons of deep-sea corals. Research using deep-sea corals as climate archives, and indeed research into their habitats, environmental controls and potential threats to their survival are still fields in their infancy. The expense and logistics of working in the deep ocean, the complexity of the ecosystem and the biogeochemistry of the coral skeletons have all proved to be significant challenges. The potential payoffs of high-resolution, dateable archives, however, make the effort worthwhile. There have been no studies that attempt to match up co-located deep-sea coral, seawater and sediment samples in a single program, so this would be the first directed study of its type, and as such promises to provide a substantial step in quantifying the fluxes and transport of mass, heat and nutrients across the equator in the past."
Max ERC Funding
1 998 833 €
Duration
Start date: 2011-10-01, End date: 2017-09-30
Project acronym CASCAde
Project Confidentiality-preserving Security Assurance
Researcher (PI) Thomas GROSS
Host Institution (HI) UNIVERSITY OF NEWCASTLE UPON TYNE
Call Details Starting Grant (StG), PE6, ERC-2016-STG
Summary "This proposal aims to create a new generation of security assurance. It investigates whether one can certify an inter-connected dynamically changing system in such a way that one can prove its security properties without disclosing sensitive information about the system's blueprint.
This has several compelling advantages. First, the security of large-scale dynamically changing systems will be significantly improved. Second, we can prove properties of topologies, hosts and users who participate in transactions in one go, while keeping sensitive information confidential. Third, we can prove the integrity of graph data structures to others, while maintaining their their confidentiality. This will benefit EU governments and citizens through the increased security of critical systems.
The proposal pursues the main research hypothesis that usable confidentiality-preserving security assurance will trigger a paradigm shift in security and dependability. It will pursue this objective by the creation of new cryptographic techniques to certify and prove properties of graph data structures. A preliminary investigation in 2015 showed that graph signature schemes are indeed feasible. The essence of this solution can be traced back to my earlier research on highly efficient attribute encodings for anonymous credential schemes in 2008.
However, the invention of graph signature schemes only clears one obstacle in a long journey to create a new generation of security assurance systems. There are still many complex obstacles, first and foremost, assuring ""soundness"" in the sense that integrity proofs a verifier accepts translate to the state of the system at that time. The work program involves six WPs: 1) to develop graph signatures and new cryptographic primitives; 2) to establish cross-system soundness; 3) to handle scale and change; 4) to establish human trust and usability; 5) to create new architectures; and 6) to test prototypes in practice."
Max ERC Funding
1 485 643 €
Duration
Start date: 2017-11-01, End date: 2022-10-31
Project acronym CO2VOLC
Project CO2VOLC: Quantifying the global volcanic CO2 cycle
Researcher (PI) Michael Burton
Host Institution (HI) THE UNIVERSITY OF MANCHESTER
Call Details Starting Grant (StG), PE10, ERC-2011-StG_20101014
Summary "Global climate change induced by anthropogenic emissions of CO2 is currently a major issue facing humanity, but uncertainties in the magnitude and rate of climate change remain, and deterministic predictions are beyond our capacity. In this context, the study of how the geochemical carbon cycle established a relatively narrow band of variability in atmospheric CO2 concentrations over the last 400 ka is of great interest. However, large uncertainties in both weathering and volcanic CO2 fluxes prevent a truly quantitative assessment of this critical cycle. Measuring the global volcanic CO2 flux, GVFCO2, would allow us to better understand the likely impact large eruptions have had in Earth’s history, and constrain the natural vs. anthropogenic CO2 flux.
We propose a truly innovative project to address head on the problem of determining GVFCO2. We will create new, compact instruments, utilising cutting-edge laser technologies, which will allow us to measure volcanic CO2, H2O, SO2 and HCl fluxes from aircraft. By flying below and through the volcanic plumes created by ~50 active volcanoes (~10% of all active volcanoes) of the Banda-Sunda arc in Indonesia, the majority of which have never been measured before, we will dramatically increase our understanding of GVFCO2 and geochemical cycles for all these species.
Measuring the volcanic emissions from an entire subduction arc is an unprecedented experiment, providing insight into the slab and mantle heterogeneity and volatile mass balance. Perhaps the most important breakthrough that we will pursue will be the determination of the 37Cl/35Cl ratio from HCl emitted from each volcano. This ratio reflects the mantle/slab source proportion, and allows the input rate of volatiles to the mantle to be measured.
The application of innovative new technology we propose here will produce ground-breaking insights into volcanology, isotope and gas geochemistry, volatile cycles, subduction and climate change."
Max ERC Funding
1 721 000 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym CODITA
Project Cosmic Dust in the Terrestrial Atmosphere
Researcher (PI) John Maurice Campbell Plane
Host Institution (HI) UNIVERSITY OF LEEDS
Call Details Advanced Grant (AdG), PE10, ERC-2011-ADG_20110209
Summary "This project addresses a fundamental problem – the size of the cosmic dust input to the Earth's atmosphere. Zodiacal cloud observations and spaceborne dust detection indicate a daily input of 100–300 tonnes, in agreement with the accumulation rates of cosmic elements (e.g. Ir, Pt) in polar ice cores and deep-sea sediments. In contrast, measurements in the middle atmosphere – by radar, lidar, high-flying aircraft and satellite remote sensing – indicate that the input is only 5–50 tonnes. The aim of CODITA is to resolve this huge discrepancy.
There are two reasons why this matters. First, if the upper range of estimates is correct, then vertical transport in the middle atmosphere must be considerably faster than generally believed; whereas if the lower range is correct, then our understanding of dust evolution in the solar system, and transport from the middle atmosphere to the surface, will need substantial revision. Second, cosmic dust particles enter the atmosphere at high speeds and in most cases completely ablate. The resulting metals injected into the atmosphere are involved in a diverse range of phenomena, including: formation of layers of metal atoms and ions; nucleation of noctilucent clouds; impacts on stratospheric aerosols and O3 chemistry (which need to be evaluated against the background of a cooling stratosphere and geo-engineering plans to increase sulphate aerosol); and fertilization of the ocean with bio-available Fe, which has potential climate feedbacks.
CODITA will use laboratory studies to target poorly understood aspects of this problem, such as the nature of the ablation process itself, the formation of meteoric smoke particles, and their role in ice nucleation and the freezing of polar stratospheric clouds. The results will be incorporated into a chemistry-climate model of the whole atmosphere, so that it will be possible, for the first time, to model the effects of cosmic dust self-consistently from the thermosphere to the surface."
Max ERC Funding
2 484 369 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym COMITAC
Project An integrated geoscientific study of the thermodynamics and composition of the Earth's core-mantle interface
Researcher (PI) James Wookey
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Starting Grant (StG), PE10, ERC-2009-StG
Summary The core-mantle interface is the central cog in the Earth's titanic heat engine. As the boundary between the two major convecting parts of the Earth system (the solid silicate mantle and the liquid iron outer core), the properties of this region have a profound influence on the thermochemical and dynamic evolution of the entire planet, including tectonic phenomena at the surface. Evidence from seismology shows that D" (the lowermost few hundred kilometres of the mantle) is strongly heterogeneous in temperature, chemistry, structure and dynamics; this may dominate, for example, the long-term evolution of the Earth's magnetic field and the morphology of mantle convection and chemical stratification. Mapping and characterising this heterogeneity requires a detailed knowledge of the properties of the constituents and dynamics of D"; this is achievable by resolving its seismic anisotropy. The observation of anisotropy in the shallow lithosphere was an important piece of evidence for the theory of plate tectonics; now such a breakthrough is possible for the analogous deep boundary. We are at a critical juncture where developments in modelling strain in the mantle, petrofabrics and seismic wave propagation can be combined to produce a new generation of integrated models of D", embodying more complete information than any currently available. I propose a groundbreaking project to build such multidisciplinary models and to produce the first complete image of lowermost mantle anisotropy using the best available global, high-resolution seismic dataset. The comparison of the models with these data is the key to making a fundamental improvement in our understanding of the thermodynamics and composition of the core-mantle interface, and to illuminating its role in the wider Earth system.
Max ERC Funding
1 639 615 €
Duration
Start date: 2009-09-01, End date: 2015-08-31