Project acronym Active-DNA
Project Computationally Active DNA Nanostructures
Researcher (PI) Damien WOODS
Host Institution (HI) NATIONAL UNIVERSITY OF IRELAND MAYNOOTH
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary During the 20th century, computer technology evolved from bulky, slow, special-purpose mechanical engines to the now-ubiquitous silicon chips and software that are among the pinnacles of human ingenuity. The goal of the field of molecular programming is to take the next leap and build a new generation of matter-based computers using DNA, RNA and proteins. This will be accomplished by computer scientists, physicists and chemists designing molecules to execute "wet" nanoscale programs in test tubes. The workflow includes proposing theoretical models, mathematically proving their computational properties, physical modelling and implementation in the wet-lab.
The past decade has seen remarkable progress in building static 2D and 3D DNA nanostructures. However, unlike biological macromolecules and complexes, which are built via specified self-assembly pathways, execute robotic-like movements and undergo evolution, the activity of human-engineered nanostructures is severely limited. We will need sophisticated algorithmic ideas to build structures that rival active living systems. Active-DNA aims to address this challenge by achieving a number of objectives on computation, DNA-based self-assembly and molecular robotics. Active-DNA research will range from defining models and proving theorems that characterise the computational and expressive capabilities of such active programmable materials to experimental work implementing active DNA nanostructures in the wet-lab.
Max ERC Funding
2 349 603 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym ACUITY
Project Algorithms for coping with uncertainty and intractability
Researcher (PI) Nikhil Bansal
Host Institution (HI) TECHNISCHE UNIVERSITEIT EINDHOVEN
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary The two biggest challenges in solving practical optimization problems are computational intractability and the presence of uncertainty: most problems are either NP-hard or have incomplete input data, which makes an exact computation impossible.
Recently, there has been huge progress in our understanding of intractability, based on spectacular algorithmic and lower-bound techniques. For several problems, especially those with only local constraints, we can design optimal approximation algorithms that are provably the best possible.
However, typical optimization problems usually involve complex global constraints and are much less understood. The situation is even worse for coping with uncertainty. Most of the algorithms are based on ad-hoc techniques, and there is no deeper understanding of what makes various problems easy or hard.
This proposal describes several new directions, together with concrete intermediate goals, that will break important new ground in the theory of approximation and online algorithms. The particular directions we consider are (i) extending the primal-dual method to systematically design online algorithms, (ii) building a structural theory of online problems based on work functions, (iii) developing new tools to exploit the power of strong convex relaxations and (iv) designing new algorithmic approaches based on non-constructive proof techniques.
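To make direction (i) concrete: the primal-dual method already underlies classic online algorithms such as fractional set cover, where each arriving element triggers a multiplicative update of the primal variables. The following is a minimal illustrative sketch in that classic Buchbinder-Naor style; it is background for the reader, not code from the proposal, and the instance at the end is hypothetical.

# Minimal sketch of the primal-dual / multiplicative-update scheme for
# ONLINE fractional set cover (Buchbinder-Naor style). Elements arrive
# one at a time and the fractional variables x[S] may only increase.

def online_fractional_set_cover(sets, costs, element_stream):
    """sets: dict mapping set id -> frozenset of elements it covers
    costs: dict mapping set id -> positive cost c_S
    element_stream: iterable of arriving elements
    Returns a monotonically grown fractional cover x."""
    x = {s: 0.0 for s in sets}
    for e in element_stream:
        covering = [s for s in sets if e in sets[s]]
        if not covering:
            raise ValueError("element %r cannot be covered" % (e,))
        d = len(covering)
        # Multiplicative update until e is fractionally covered.
        while sum(x[s] for s in covering) < 1.0:
            for s in covering:
                x[s] = x[s] * (1.0 + 1.0 / costs[s]) + 1.0 / (d * costs[s])
    return x

# Hypothetical toy instance.
sets = {"A": frozenset({1, 2}), "B": frozenset({2, 3}), "C": frozenset({3})}
costs = {"A": 1.0, "B": 2.0, "C": 1.0}
print(online_fractional_set_cover(sets, costs, [1, 3, 2]))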
The proposed research is at the cutting edge of algorithm design, and builds upon the recent success of the PI in resolving several longstanding questions in these areas. Any progress is likely to be a significant contribution to theoretical computer science and combinatorial optimization.
Max ERC Funding
1 519 285 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym ADDICTION
Project Beyond the Genetics of Addiction
Researcher (PI) Jacqueline Mignon Vink
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Starting Grant (StG), SH4, ERC-2011-StG_20101124
Summary My proposal seeks to explain the complex interplay between genetic and environmental causes of individual variation in substance use and the risk for abuse. Substance use is common. Substances like nicotine and cannabis have well-known negative health consequences, while alcohol and caffeine use may be both beneficial and detrimental, depending on quantity and frequency of use. Twin studies (including my own) demonstrated that both heritable and environmental factors play a role.
My proposal on substance use (nicotine, alcohol, cannabis and caffeine) is organized around several key objectives: 1. To unravel the complex contribution of genetic and environmental factors to substance use by using extended twin-family designs; 2. To identify and confirm genes and gene networks involved in substance use by using DNA-variant data; 3. To explore gene expression patterns with RNA data in substance users versus non-users; 4. To investigate biomarkers in substance users versus non-users using blood or urine; 5. To unravel the relation between substance use and health by linking twin-family data to national medical databases.
To realize these aims I will use the extensive resources of the Netherlands Twin Register (NTR), including both the longitudinal phenotype database and the biological samples. I have been involved in data collection, the coordination of data collection and the analysis of NTR data since 1999. With my comprehensive experience in data collection and data analysis, and my knowledge in the fields of behavior genetics and addiction research, I will be able to successfully lead this cutting-edge project. Additional data crucial for the project will be collected by my team. Large samples will be available for this study, and state-of-the-art methods will be used to analyze the data. Altogether, my project will offer powerful approaches to unravel the complex interaction between genetic and environmental causes of individual differences in substance use and the risk for abuse.
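As background for objective 1 (standard behavior-genetics methodology, not taken from the proposal itself): classical twin designs decompose phenotypic variance by comparing monozygotic (MZ) and dizygotic (DZ) twin correlations, for instance via the ACE model and Falconer's formula:

\[
  V_P = V_A + V_C + V_E, \qquad
  r_{MZ} = a^2 + c^2, \qquad
  r_{DZ} = \tfrac{1}{2}a^2 + c^2,
\]
\[
  \text{so that}\quad h^2 \approx a^2 = 2\,(r_{MZ} - r_{DZ}),
\]

where $a^2$, $c^2$ and $e^2$ are the standardized additive-genetic, common-environment and unique-environment variance components. Extended twin-family designs, as proposed here, add further relatives to separate these components with greater power.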
Max ERC Funding
1 491 964 €
Duration
Start date: 2011-12-01, End date: 2017-05-31
Project acronym AFFORDS-HIGHER
Project Skilled Intentionality for 'Higher' Embodied Cognition: Joining forces with a field of affordances in flux
Researcher (PI) Dirk Willem Rietveld
Host Institution (HI) ACADEMISCH MEDISCH CENTRUM BIJ DE UNIVERSITEIT VAN AMSTERDAM
Call Details Starting Grant (StG), SH4, ERC-2015-STG
Summary In many situations experts act adequately, yet without deliberation. Architects, for example, immediately sense opportunities offered by the site of a new project. One could label these manifestations of expert intuition as ‘higher-level’ cognition, but still these experts act unreflectively. The aim of my project is to develop the Skilled Intentionality Framework (SIF), a new conceptual framework for the field of embodied/enactive cognitive science (Chemero, 2009; Thompson, 2007). I argue that affordances - possibilities for action provided by our surroundings - are highly significant in cases of both unreflective and reflective ‘higher’ cognition. Skilled Intentionality is skilled coordination with multiple affordances simultaneously.
The two central ideas behind this proposal are (a) that episodes of skilled ‘higher’ cognition can be understood as responsiveness to affordances for ‘higher’ cognition and (b) that our surroundings are highly resourceful and contribute to skillful action and cognition in a far more fundamental way than is generally acknowledged. I use embedded philosophical research in a particular practice of architecture to shed new light on the ways in which affordances for ‘higher’ cognition support creative imagination, anticipation, explicit planning and self-reflection.
The Skilled Intentionality Framework is groundbreaking in relating findings established at several complementary levels of analysis: philosophy/phenomenology, ecological psychology, affective science and neurodynamics.
Empirical findings thought to be exclusively valid for everyday unreflective action can now be used to explain skilled ‘higher’ cognition as well. Moreover, SIF brings both the context and the social back into cognitive science. I will show SIF’s relevance for Friston’s work on the anticipating brain, and apply it in the domain of architecture and public health. SIF will radically widen the scope of the increasingly influential field of embodied cognitive science.
Max ERC Funding
1 499 850 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym ALGOCom
Project Novel Algorithmic Techniques through the Lens of Combinatorics
Researcher (PI) Parinya Chalermsook
Host Institution (HI) AALTO KORKEAKOULUSAATIO SR
Call Details Starting Grant (StG), PE6, ERC-2017-STG
Summary Real-world optimization problems pose major challenges to algorithmic research. For instance, (i) many important problems are believed to be intractable (i.e. NP-hard) and (ii) with the growth of data size, modern applications often require decision making under incomplete and dynamically changing input data. After several decades of research, central problems in these domains have remained poorly understood (e.g. is there an asymptotically most efficient binary search tree?). Existing algorithmic techniques either reach their limitations or are inherently tailored to special cases.
This project attempts to bridge this gap in the state of the art and seeks new interplay across multiple areas of algorithms, such as approximation algorithms, online algorithms, fixed-parameter tractable (FPT) algorithms, exponential-time algorithms, and data structures. We propose new directions from structural perspectives that connect the aforementioned algorithmic problems to basic questions in combinatorics.
Our approaches fall into one of three broad schemes: (i) new structural theory, (ii) intermediate problems, and (iii) transfer of techniques. These directions partially build on the PI's successes in resolving more than ten classical problems in this context.
Resolving the proposed problems would revolutionize our understanding of algorithms and data structures and potentially unify techniques across multiple algorithmic regimes. Any progress is, in fact, already a significant contribution to the algorithms community. We suggest concrete intermediate goals that are of independent interest and carry lower risk, making them suitable for PhD students.
Max ERC Funding
1 411 258 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym ALGSTRONGCRYPTO
Project Algebraic Methods for Stronger Crypto
Researcher (PI) Ronald John Fitzgerald CRAMER
Host Institution (HI) STICHTING NEDERLANDSE WETENSCHAPPELIJK ONDERZOEK INSTITUTEN
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Our field is cryptology. Our overarching objective is to significantly advance the frontiers in the design and analysis of high-security cryptography for the future generation. In particular, we wish to enhance the efficiency, functionality and, last but not least, the fundamental understanding of cryptographic security against very powerful adversaries.
Our approach is to develop completely novel methods by deepening, strengthening and broadening the algebraic foundations of the field.
Concretely, our lens builds on the arithmetic codex. This is a general, abstract cryptographic primitive whose basic theory we recently developed and whose asymptotic part, which relies on algebraic geometry, enjoys crucial applications in surprising foundational results on constant-communication-rate two-party cryptography. A codex is a linear (error-correcting) code that, when its ambient vector space is endowed with just coordinate-wise multiplication, can be viewed as simulating, up to some degree, richer arithmetical structures such as finite fields (or products thereof) or, more generally, finite-dimensional algebras over finite fields. Besides this degree, the coordinate-localities for which simulation holds, and those for which it fails entirely, are also captured.
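To fix notation for the reader (standard coding-theoretic background, not a definition lifted from the proposal): the coordinate-wise multiplication in question is the Schur product, and the degree of simulation is governed by how well the powers of the code behave:

\[
  x * y := (x_1 y_1,\, x_2 y_2,\, \dots,\, x_n y_n)
  \quad\text{for } x, y \in C \subseteq \mathbb{F}_q^n,
  \qquad
  C^{*2} := \operatorname{span}_{\mathbb{F}_q}\{\, x * y : x, y \in C \,\}.
\]

A code is useful as a degree-2 codex when both $C$ and $C^{*2}$ retain good dimension and minimum-distance parameters, so that products of encoded values can still be recovered from products of codewords.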
Our method is based on novel perspectives on codices which significantly widen their scope and strengthen their utility. In particular, we bring symmetries, computational and complexity-theoretic aspects, and connections with algebraic number theory, algebraic geometry and algebraic combinatorics into play in novel ways. Our applications range from public-key cryptography to secure multi-party computation.
Our proposal is subdivided into 3 interconnected modules:
(1) Algebraic and Number-Theoretical Cryptanalysis
(2) Construction of Algebraic Crypto Primitives
(3) Advanced Theory of Arithmetic Codices
Max ERC Funding
2 447 439 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ASICA
Project New constraints on the Amazonian carbon balance from airborne observations of the stable isotopes of CO2
Researcher (PI) Wouter Peters
Host Institution (HI) WAGENINGEN UNIVERSITY
Call Details Consolidator Grant (CoG), PE10, ERC-2014-CoG
Summary Severe droughts in Amazonia in 2005 and 2010 caused widespread loss of carbon from the terrestrial biosphere. This loss, almost twice the annual fossil-fuel CO2 emissions of the EU, suggests a large sensitivity of the Amazonian carbon balance to the more intense drought regime predicted for the coming decades. This is a dangerous inference, though, as there is no scientific consensus on the most basic metrics of Amazonian carbon exchange: the gross primary production (GPP) and its response to moisture deficits in the soil and atmosphere. Measuring these on scales that span the whole Amazon forest was thus far impossible, but in this project I aim to deliver the first observation-based estimate of pan-Amazonian GPP and its drought-induced variations.
My program builds on two recent breakthroughs in our use of stable isotopes (13C, 17O, 18O) in atmospheric CO2: (1) our discovery that observed δ¹³C in atmospheric CO2 is a quantitative measure of vegetation water-use efficiency over millions of square kilometers, integrating the drought response of individual plants; and (2) the possibility of precisely measuring the relative ratios of 18O/16O and 17O/16O in CO2, expressed as Δ17O. Anomalous Δ17O values are present in air coming down from the stratosphere, but this anomaly is removed upon contact of CO2 with leaf water inside plant stomata. Hence, observed Δ17O values depend directly on the magnitude of GPP. Both δ¹³C and Δ17O measurements are scarce over the Amazon basin, and I propose more than 7000 new measurements leveraging an established aircraft monitoring program in Brazil. Quantitative interpretation of these observations will break new ground in our use of stable isotopes to understand climate variations, and is facilitated by our renowned numerical modeling system “CarbonTracker”. My program will answer two burning questions in carbon cycle science today: (a) What is the magnitude of GPP in Amazonia? and (b) How does it vary under different intensities of drought?
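For readers unfamiliar with the notation, one commonly used definition of the triple-oxygen-isotope anomaly is the following (background only, not taken from the proposal; the reference slope $\lambda$ varies somewhat between studies, with $\lambda \approx 0.528$ a frequent choice):

\[
  \Delta^{17}\mathrm{O}
  = \ln\!\left(1 + \delta^{17}\mathrm{O}\right)
  - \lambda\,\ln\!\left(1 + \delta^{18}\mathrm{O}\right),
  \qquad \lambda \approx 0.528,
\]

so that ordinary mass-dependent fractionation leaves $\Delta^{17}\mathrm{O} \approx 0$, while stratospheric CO2 carries a distinctly non-zero anomaly until leaf-water exchange resets it.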
Max ERC Funding
2 269 689 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym ATM-GTP
Project Atmospheric Gas-to-Particle conversion
Researcher (PI) Markku KULMALA
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary Atmospheric Gas-to-Particle conversion (ATM-GTP) is a 5-year project focusing on one of the most critical atmospheric processes relevant to global climate and air quality: the first steps of atmospheric aerosol particle formation and growth. The project will concentrate on the currently lacking, environmentally specific knowledge about the interacting, non-linear, physical and chemical atmospheric processes associated with nano-scale gas-to-particle conversion (GTP). The main scientific objective of ATM-GTP is to create a deep understanding of atmospheric GTP taking place in the sub-5 nm size range, particularly in heavily polluted Chinese megacities like Beijing and in pristine environments like Siberia and the Nordic high-latitude regions. We also aim to find out how nano-GTP is associated with air quality-climate interactions and feedbacks. We are interested in quantifying the effect of nano-GTP on the COBACC (Continental Biosphere-Aerosol-Cloud-Climate) feedback loop that is important in Arctic and boreal regions. Our approach will enable us to identify mechanisms that can reduce secondary air pollution by a factor of 5-10 and to make reliable estimates of the global and regional aerosol loads, including the anthropogenic and biogenic contributions to these loads. We can estimate the future role of the Northern Hemisphere biosphere in reducing global radiative forcing via the quantified feedbacks. The project is carried out by the PI, a world-leading scientist in atmospheric aerosol science and one of the founders of terrestrial ecosystem meteorology, together with his research team. The project uses novel infrastructures, including SMEAR (Stations Measuring Ecosystem Atmospheric Relations) stations, related modelling platforms and regional data from Russia and China. The work will be carried out in synergy with several national, Nordic and EU research-innovation projects: the Finnish Center of Excellence-ATM, Nordic CoE-CRAICC and EU-FP7-BACCHUS.
Max ERC Funding
2 500 000 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym ATMNUCLE
Project Atmospheric nucleation: from molecular to global scale
Researcher (PI) Markku Tapio Kulmala
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Advanced Grant (AdG), PE10, ERC-2008-AdG
Summary Atmospheric aerosol particles and trace gases affect the quality of our lives in many ways (e.g. health effects, changes in climate and the hydrological cycle). Trace gases and atmospheric aerosols are tightly connected via physical, chemical, meteorological and biological processes occurring in the atmosphere and at the atmosphere-biosphere interface. One important phenomenon is atmospheric aerosol formation, which involves the production of nanometer-size particles by nucleation and their growth to detectable sizes. The main scientific objectives of this project are 1) to quantify the mechanisms responsible for atmospheric new particle formation and 2) to find out how important this process is for the behaviour of the global aerosol system and, ultimately, for the whole climate system. Our scientific plan is designed as a research chain that aims to advance our understanding of climate and air quality through a series of connected activities. We start from molecular simulations and laboratory measurements to understand nucleation and aerosol thermodynamic processes. We measure nanoparticles and atmospheric clusters at 15-20 sites all around the world using state-of-the-art instrumentation, and study feedbacks and interactions between climate and the biosphere. With these atmospheric boundary layer studies we form a link to regional-scale processes and further to global-scale phenomena. In order to be able to simulate global climate and air quality, the most recent progress on this chain of processes must be compiled, integrated and implemented in Climate Change and Air Quality numerical models via novel parameterizations.
Max ERC Funding
2 000 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym ATUNE
Project Attenuation Tomography Using Novel observations of Earth's free oscillations
Researcher (PI) Arwen Fedora Deuss
Host Institution (HI) UNIVERSITEIT UTRECHT
Call Details Consolidator Grant (CoG), PE10, ERC-2015-CoG
Summary Tectonic phenomena at the Earth's surface, like volcanic eruptions and earthquakes, are driven by convection deep in the mantle. Seismic tomography has been very successful in elucidating the Earth's internal velocity structure. However, seismic velocity is insufficient to obtain robust estimates of temperature and composition, and to make direct links with mantle convection. Thus, fundamental questions remain unanswered: Do subducting slabs bring water into the lower mantle? Are the large low-shear-velocity provinces under the Pacific and Africa mainly thermal or compositional? Is there any partial melt or water near the mantle transition zone or the core-mantle boundary?
Seismic attenuation, or loss of energy, is key to mapping melt, water and temperature variations, and to answering these questions. Unfortunately, attenuation has so far only been imaged using short- and intermediate-period seismic data, with the resulting models showing little similarity even for the upper mantle, and no reliable lower-mantle models exist. The aim of ATUNE is to develop novel full-spectrum techniques and apply them to Earth's long-period free oscillations to observe global-scale regional variations in seismic attenuation from the lithosphere to the core-mantle boundary. Scattering and focussing - problematic for shorter-period techniques - are easily included using cross-coupling (or resonance) between free oscillations, without requiring approximations. The recent occurrence of large earthquakes, the increase in computer power and my world-leading expertise in free oscillations now make it possible to measure the frequency dependence of attenuation over a much wider frequency band, allowing us to distinguish between scattering (redistribution of energy) and intrinsic attenuation. ATUNE will deliver the first-ever full-waveform global tomographic model of 3D attenuation variations in the lower mantle, providing essential constraints on melt, water and temperature for understanding the complex dynamics of our planet.
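As background (a standard seismological definition, not specific to this proposal), intrinsic attenuation is conventionally quantified by the quality factor $Q$, with lower $Q$ meaning stronger attenuation:

\[
  \frac{1}{Q} = -\,\frac{\Delta E}{2\pi E},
\]

where $\Delta E$ is the elastic energy lost per oscillation cycle and $E$ is the peak energy stored in that cycle; tomographic attenuation models such as those proposed here map the spatial variation of $Q^{-1}$.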
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-06-01, End date: 2021-05-31