Project acronym AGENSI
Project A Genetic View into Past Sea Ice Variability in the Arctic
Researcher (PI) Stijn DE SCHEPPER
Host Institution (HI) NORCE NORWEGIAN RESEARCH CENTRE AS
Call Details Consolidator Grant (CoG), PE10, ERC-2018-COG
Summary Arctic sea ice decline is the clearest expression of the rapidly transforming Arctic climate. The ensuing local and global implications can be understood by studying past climate transitions, yet few methods are available to examine past Arctic sea ice cover, severely restricting our understanding of sea ice in the climate system. The decline in Arctic sea ice cover is a ‘canary in the coal mine’ for the state of our climate, and if greenhouse gas emissions remain unchecked, summer sea ice loss may pass a critical threshold that could drastically transform the Arctic. Because historical observations are limited, it is crucial to have reliable proxies for assessing natural sea ice variability, its stability, and its sensitivity to climate forcing on different time scales. Current proxies address aspects of sea ice variability, but are limited by a selective fossil record, preservation effects, restricted regional applicability, or being only semi-quantitative. With such constraints on our knowledge of natural variations and drivers, major uncertainties about the future remain.
I propose to develop and apply a novel sea ice proxy that exploits genetic information stored in marine sediments, sedimentary ancient DNA (sedaDNA). This innovation uses the genetic signature of phytoplankton communities from surface waters and sea ice as it becomes archived in sediments. This wealth of information has not previously been explored for reconstructing sea ice conditions. Preliminary results from my cross-disciplinary team indicate that our unconventional approach can provide a detailed, qualitative account of past sea ice ecosystems and quantitative estimates of sea ice parameters. I will address fundamental questions about past Arctic sea ice variability on different timescales, information essential for providing a framework upon which to assess the ecological and socio-economic consequences of a changing Arctic. This new proxy is not limited to sea ice research and can transform the field of paleoceanography.
Max ERC Funding
2 615 858 €
Duration
Start date: 2019-08-01, End date: 2024-07-31
Project acronym Bits2Cosmology
Project Time-domain Gibbs sampling: From bits to inflationary gravitational waves
Researcher (PI) Hans Kristian ERIKSEN
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The detection of primordial gravitational waves created during the Big Bang ranks among the greatest potential intellectual achievements in modern science. During the last few decades, the instrumental progress necessary to achieve this has been nothing short of breathtaking, and today we are able to measure the microwave sky with better than one-in-a-million precision. However, the latest ultra-sensitive experiments such as BICEP2 and Planck make clear that instrumental sensitivity alone will not be sufficient for a robust detection of gravitational waves. Contamination in the form of astrophysical radiation from the Milky Way, for instance thermal dust and synchrotron radiation, obscures the cosmological signal by orders of magnitude. Even more critical, though, are second-order interactions between this radiation and the instrument characterization itself, which lead to a highly non-linear and complicated problem.
I propose a ground-breaking solution to this problem that allows for joint estimation of cosmological parameters, astrophysical components, and instrument specifications. The engine of this method is Gibbs sampling, which I have already applied with great success to basic CMB component separation. The new and critical step is to apply this method to the raw time-ordered observations recorded directly by the instrument, as opposed to pre-processed frequency maps. While representing a ~100-fold increase in input data volume, this step is unavoidable in order to break through the current foreground-induced systematics floor. I will apply this method to the best currently available and future data sets (WMAP, Planck, SPIDER and LiteBIRD), and thereby derive the world's tightest constraint on the amplitude of inflationary gravitational waves. Additionally, the resulting ancillary science in the form of robust cosmological parameters and astrophysical component maps will represent the state of the art in observational cosmology for years to come.
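For illustration of the alternating-conditional structure that Gibbs sampling relies on, the sketch below is a minimal toy example in Python, not the project's actual time-domain sampler: it estimates a constant signal level and an unknown noise variance from simulated data by drawing each parameter from its conditional distribution in turn (flat prior on the signal, Jeffreys prior on the variance).

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: a constant signal s_true observed with Gaussian noise.
s_true, sigma_true, N = 2.0, 0.5, 1000
d = s_true + sigma_true * rng.standard_normal(N)

# Gibbs sampler: alternate draws from p(s | sigma^2, d) and p(sigma^2 | s, d).
n_steps = 5000
s, sigma2 = 0.0, 1.0          # arbitrary starting point
samples = np.empty((n_steps, 2))

for i in range(n_steps):
    # Signal given the noise level: Gaussian centred on the sample mean.
    s = rng.normal(d.mean(), np.sqrt(sigma2 / N))
    # Noise variance given the signal: inverse-gamma under a Jeffreys prior.
    resid2 = np.sum((d - s) ** 2)
    sigma2 = 1.0 / rng.gamma(shape=N / 2.0, scale=2.0 / resid2)
    samples[i] = s, np.sqrt(sigma2)

burn = 500
print("posterior mean signal:", samples[burn:, 0].mean())
print("posterior mean noise :", samples[burn:, 1].mean())
```

In the real problem the two blocks are replaced by draws of sky-signal maps, spectral parameters and instrument parameters conditioned on time-ordered data, but the alternation between conditionals is the same.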
Max ERC Funding
1 999 205 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym BODY-UI
Project Using Embodied Cognition to Create the Next Generations of Body-based User Interfaces
Researcher (PI) Kasper Anders Soren Hornbæk
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Consolidator Grant (CoG), PE6, ERC-2014-CoG
Summary Recent advances in user interfaces (UIs) allow users to interact with computers using only their body, so-called body-based UIs. Instead of moving a mouse or tapping a touch surface, people can use whole-body movements to navigate in games, gesture in mid-air to interact with large displays, or scratch their forearm to control a mobile phone. Body-based UIs are attractive because they free users from having to hold or touch a device and because they allow always-on, eyes-free interaction. Currently, however, research on body-based UIs proceeds in an ad hoc fashion, and when body-based UIs are compared to device-based alternatives, they perform poorly. This is likely because little is known about the body as a user interface and because it is unclear whether theory and design principles from human-computer interaction (HCI) can be applied to body-based UIs. While body-based UIs may well be the next interaction paradigm for HCI, results so far are mixed.
This project aims at establishing the scientific foundation for the next generations of body-based UIs. The main novelty in my approach is to use results and methods from research on embodied cognition. Embodied cognition suggests that thinking (including reasoning, memory, and emotion) is shaped by our bodies, and conversely, that our bodies reflect thinking. We use embodied cognition to study how body-based UIs affect users, and to increase our understanding of similarities and differences to device-based input. From those studies we develop new body-based UIs, both for input (e.g., gestures in mid-air) and output (e.g., stimulating users’ muscles to move their fingers), and evaluate users’ experience of interacting through their bodies. We also show how models, evaluation criteria, and design principles in HCI need to be adapted for embodied cognition and body-based UIs. If successful, the project will show how to create body-based UIs that are usable and orders of magnitude better than current UIs.
Max ERC Funding
1 853 158 €
Duration
Start date: 2015-05-01, End date: 2020-04-30
Project acronym BTVI
Project First Biodegradable Biocatalytic VascularTherapeutic Implants
Researcher (PI) Alexander Zelikin
Host Institution (HI) AARHUS UNIVERSITET
Call Details Consolidator Grant (CoG), PE8, ERC-2013-CoG
Summary "We aim to perform academic development of a novel biomedical opportunity: localized synthesis of drugs within biocatalytic therapeutic vascular implants (BVI) for site-specific drug delivery to target organs and tissues. Primary envisioned targets for therapeutic intervention using BVI are atherosclerosis, viral hepatitis, and hepatocellular carcinoma: three of the most prevalent and debilitating conditions which affect hundreds of millions worldwide and which continue to increase in their importance in the era of increasingly aging population. For hepatic applications, we aim to develop drug eluting beads which are equipped with tools of enzyme-prodrug therapy (EPT) and are administered to the liver via trans-arterial catheter embolization. Therein, the beads perform localized synthesis of drugs and imaging reagents for anticancer combination therapy and theranostics, antiviral and anti-inflammatory agents for the treatment of hepatitis. Further, we conceive vascular therapeutic inserts (VTI) as a novel type of implantable biomaterials for treatment of atherosclerosis and re-endothelialization of vascular stents and grafts. Using EPT, inserts will tame “the guardian of cardiovascular grafts”, nitric oxide, for which localized, site specific synthesis and delivery spell success of therapeutic intervention and/or aided tissue regeneration. This proposal is positioned on the forefront of biomedical engineering and its success requires excellence in polymer chemistry, materials design, medicinal chemistry, and translational medicine. Each part of this proposal - design of novel types of vascular implants, engineering novel biomaterials, developing innovative fabrication and characterization techniques – is of high value for fundamental biomedical sciences. The project is target-oriented and once successful, will be of highest practical value and contribute to increased quality of life of millions of people worldwide."
Max ERC Funding
1 996 126 €
Duration
Start date: 2014-04-01, End date: 2019-09-30
Project acronym CLIMSEC
Project Climate Variability and Security Threats
Researcher (PI) Halvard Buhaug
Host Institution (HI) INSTITUTT FOR FREDSFORSKNING STIFTELSE
Call Details Consolidator Grant (CoG), SH2, ERC-2014-CoG
Summary Recent uprisings across the world have accentuated claims that food insecurity is an important trigger of political violence. Is the Arab Spring representative of a general climate-conflict pattern, where severe droughts and other climate anomalies are a key driving force? Research to date has failed to establish a robust relationship, and several notable theoretical and methodological shortcomings limit inference. CLIMSEC will address these research gaps. It asks: How does climate variability affect dynamics of political violence? This overarching research question will be addressed through the accomplishment of four key objectives: (1) Investigate how food security impacts of climate variability affect political violence; (2) Investigate how economic impacts of climate variability affect political violence; (3) Conduct short-term forecasts of political violence in response to food and economic shocks; and (4) Develop a comprehensive, testable theoretical model of the security implications of climate variability. To achieve these objectives, CLIMSEC will advance the research frontier both theoretically and analytically. Central in this endeavor is conceptual and empirical disaggregation. Instead of treating states and calendar years as unitary and fixed entities, the project proposes causal processes that act at multiple temporal and spatial scales, involve various types of actors, and lead to very different forms of outcomes depending on the context. The empirical component will make innovative use of new geo-referenced data and methods; focus on a broad range of insecurity outcomes, including non-violent resistance; and combine rigorous statistical models with out-of-sample simulations and qualitative case studies for theorizing and validation of key findings; an illustrative sketch of the out-of-sample evaluation logic is given below. Based at PRIO, the project will be led by Research Professor Halvard Buhaug, a leading scholar on climate change and security with a strong publication record and project management experience.
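As a purely illustrative sketch of out-of-sample forecast evaluation (synthetic data and hypothetical covariates, not the project's actual data or model specification), the Python snippet below fits a simple logistic model on an "early" period and scores its forecasts on a held-out "late" period.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Synthetic country-year panel: two hypothetical covariates (e.g. a food-price
# shock index and an income shock index) and a binary conflict-onset outcome.
n = 4000
X = rng.standard_normal((n, 2))
logit = -2.0 + 0.8 * X[:, 0] + 0.5 * X[:, 1]
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))

# Temporal split: train on the "early" period, forecast the "late" period.
split = 3000
model = LogisticRegression().fit(X[:split], y[:split])
pred = model.predict_proba(X[split:])[:, 1]

# Out-of-sample discrimination of the forecasts.
print("out-of-sample AUC:", round(roc_auc_score(y[split:], pred), 3))
```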
Max ERC Funding
1 996 945 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym ConTExt
Project Connecting the Extreme
Researcher (PI) Sune Toft
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Consolidator Grant (CoG), PE9, ERC-2014-CoG
Summary Advances in technology and methodology over the last decade have enabled the study of galaxies to the highest redshifts. This has revolutionized our understanding of the origin and evolution of galaxies. I have played a central role in this revolution by discovering that at z=2, when the universe was only 3 Gyr old, half of the most massive galaxies were extremely compact and had already completed their star formation. During the last five years I have led a successful group of postdocs and students dedicated to investigating the extreme properties of these galaxies and placing them into cosmological context. Combining a series of high-profile observational studies published by my group and others, I recently proposed an evolutionary sequence that ties together the most extreme galaxies in the universe: from the most intense dusty starbursts at cosmic dawn, through quasars, the brightest sources in the universe, driven by feedback from supermassive black holes, and galaxy cores hosting the densest conglomerations of stellar mass known, to the sleeping giants of the local universe, the giant ellipticals. The proposed research program will explore whether such an evolutionary sequence exists, with the ultimate goal of reaching, for the first time, a coherent physical understanding of how the most massive galaxies in the universe formed. While there is a chance that the rigorous tests may ultimately reveal the proposed sequence to be too simplistic, a guaranteed outcome of the program is a significantly improved understanding of the physical mechanisms that shape galaxies and drive their star formation and quenching.
Max ERC Funding
1 999 526 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym CoreSat
Project Dynamics of Earth’s core from multi-satellite observations
Researcher (PI) Christopher FINLAY
Host Institution (HI) DANMARKS TEKNISKE UNIVERSITET
Call Details Consolidator Grant (CoG), PE10, ERC-2017-COG
Summary Earth's magnetic field plays a fundamental role in our planetary habitat, controlling interactions between the Earth and the solar wind. Here, I propose to use magnetic observations, made simultaneously by multiple satellites, along with numerical models of outer core dynamics, to test whether convective processes can account for ongoing changes in the field. The geomagnetic field is generated by a dynamo process within the core converting kinetic energy of the moving liquid metal into magnetic energy. Yet observations show a region of persistently weak field in the South Atlantic that has grown in size in recent decades. Pinning down the core dynamics responsible for this behaviour is essential if we are to understand the detailed time-dependence of the geodynamo, and to forecast future field changes.
Global magnetic observations from the Swarm constellation mission, with three identical satellites now carrying out the most detailed ever survey of the geomagnetic field, provide an exciting opportunity to probe the dynamics of the core in exquisite detail. To exploit this wealth of data, it is urgent that contaminating magnetic sources in the lithosphere and ionosphere are better separated from the core-generated field. I propose to achieve this, and to test the hypothesis that core convection has controlled the recent field evolution in the South Atlantic, via three interlinked projects. First I will co-estimate separate models for the lithospheric and core fields, making use of prior information from crustal geology and dynamo theory. In parallel, I will develop a new scheme for isolating and removing the signature of polar ionospheric currents, better utilising ground-based data. Taking advantage of these improvements, data from Swarm and previous missions will be reprocessed and then assimilated into a purpose-built model of quasi-geostrophic core convection.
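For context, a standard framework in geomagnetic field modelling (rather than a detail taken from the proposal itself) represents the internal field as the gradient of a scalar potential expanded in spherical harmonics, whose Gauss coefficients are the quantities estimated from satellite and ground-based data:

\[
V(r,\theta,\phi) = a \sum_{\ell=1}^{L} \sum_{m=0}^{\ell}
  \left(\frac{a}{r}\right)^{\ell+1}
  \left[ g_\ell^m \cos m\phi + h_\ell^m \sin m\phi \right]
  P_\ell^m(\cos\theta),
\qquad
\mathbf{B} = -\nabla V ,
\]

where $a$ is the Earth's reference radius, $P_\ell^m$ are associated Legendre functions, and the Gauss coefficients $g_\ell^m, h_\ell^m$ (and their time derivatives) describe the superposed core and lithospheric fields that the proposed co-estimation seeks to separate.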
Max ERC Funding
1 828 708 €
Duration
Start date: 2018-03-01, End date: 2023-02-28
Project acronym Cosmoglobe
Project Cosmoglobe -- mapping the universe from the Milky Way to the Big Bang
Researcher (PI) Ingunn Kathrine WEHUS
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary In the aftermath of the high-precision Planck and BICEP2 experiments, cosmology has undergone a critical transition. Before 2014, most breakthroughs came as direct results of improved detector technology and lower instrumental noise. After 2014, the main source of uncertainty will be astrophysical foregrounds, typically in the form of dust or synchrotron emission from the Milky Way. Indeed, this holds as true for the study of reionization and the cosmic dawn as it does for the hunt for inflationary gravitational waves. To break through this obscuring veil, it is of utmost importance to optimally exploit every piece of available information, merging the world's best observational data with the world's most advanced theoretical models. A first step toward this ultimate goal was recently published as the Planck 2015 Astrophysical Baseline Model, an effort led and conducted by me.
Here I propose to build Cosmoglobe, a comprehensive model of the radio, microwave and sub-mm sky, covering 100 MHz to 10 THz in both intensity and polarization, extending existing models by three orders of magnitude in frequency and a factor of five in angular resolution. I will leverage a recent algorithmic breakthrough in multi-resolution component separation to jointly analyze some of the world's best data sets, including C-BASS, COMAP, PASIPHAE, Planck, SPIDER, WMAP and many more. This will result in the best cosmological (CMB, SZ, CIB, etc.) and astrophysical (thermal and spinning dust, synchrotron and free-free emission, etc.) component maps published to date. I will then use this model to derive the world's strongest limits on, and potentially detect, inflationary gravitational waves using SPIDER observations; forecast, optimize and analyze observations from the leading next-generation CMB experiments, including LiteBIRD and S4; and derive the first 3D large-scale structure maps from CO intensity mapping with COMAP, potentially opening up a new window on the cosmic dawn.
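As a schematic illustration of the kind of parametric sky model used in CMB component separation (the exact parametrisation adopted in Cosmoglobe may differ), the signal at frequency $\nu$ in a given pixel can be written as a sum of the CMB and foreground components, each with its own spectral law:

\[
d(\nu) = s_{\mathrm{CMB}}
 + A_{\mathrm{s}} \left(\frac{\nu}{\nu_{0,\mathrm{s}}}\right)^{\beta_{\mathrm{s}}}
 + A_{\mathrm{d}} \left(\frac{\nu}{\nu_{0,\mathrm{d}}}\right)^{\beta_{\mathrm{d}}}
   \frac{B_\nu(T_{\mathrm{d}})}{B_{\nu_{0,\mathrm{d}}}(T_{\mathrm{d}})}
 + \ldots + n(\nu),
\]

where the synchrotron term is a power law with index $\beta_{\mathrm{s}}$, the thermal dust term is a modified blackbody with index $\beta_{\mathrm{d}}$ and temperature $T_{\mathrm{d}}$, $B_\nu$ is the Planck function, and $n(\nu)$ is instrumental noise; component separation amounts to estimating the amplitudes and spectral parameters of all such terms jointly from multi-frequency data.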
Max ERC Funding
1 999 382 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym F-BioIce
Project Fundamentals of Biological Ice Nucleation
Researcher (PI) Tobias WEIDNER
Host Institution (HI) AARHUS UNIVERSITET
Call Details Consolidator Grant (CoG), PE4, ERC-2018-COG
Summary Ice-active bacteria can promote the growth of ice more effectively than any other known material. Using specialized ice-nucleating proteins (INPs), they attack plants through frost damage and, when airborne in the atmosphere, they drive ice nucleation within clouds and control global precipitation patterns. The control INPs exert over water phase transitions is relevant to disciplines as diverse as climatology, plant pathology, biomedicine and materials science. Despite this apparent importance, the molecular mechanisms behind INP-induced freezing have remained largely elusive. This gap in our knowledge can be traced back to the challenges of studying protein and water structure and dynamics at the very interface between protein monolayers and water.
With F-BioIce, my team and I want to reveal the molecular details of INP function. We ask: What is the structural basis for protein control of freezing? What structural motifs do proteins use to interact with water, and what configuration of water molecules do INPs imprint into interfacial water layers? What is the role of structural dynamics in surface freezing? We will develop new methods based on sum frequency generation (SFG) spectroscopy to determine the mode of action by which INPs interact with and manipulate water. The INP and water structures will be obtained by combining three rising methods in the field: SFG techniques that I have been spearheading, computer simulations, and cryo-electron microscopy. We will study model water surfaces and, for the first time, realistic water aerosols interacting with INPs. These new strategies could lead to a paradigm shift in the entire field of ice nucleation and prompt a search for similar processes in ice-active fungi and pollen and in abiotic ice nucleators such as feldspar, silica and soot. The information obtained will provide critical input for climate models and for revolutionary new freezing technologies for food preservation, cryomedicine and cloud seeding.
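For orientation, classical nucleation theory (a standard reference framework, not a result of this project) describes the nucleation rate as an attempt frequency times a Boltzmann factor of the free-energy barrier; ice-nucleating surfaces such as INPs are understood to act by lowering this barrier relative to homogeneous freezing:

\[
J = A \exp\!\left(-\frac{\Delta G^{*}}{k_B T}\right),
\qquad
\Delta G^{*}_{\mathrm{het}} = f(\theta)\, \Delta G^{*}_{\mathrm{hom}},
\qquad
f(\theta) = \frac{(2+\cos\theta)(1-\cos\theta)^{2}}{4},
\]

where $\Delta G^{*}$ is the free-energy barrier for forming a critical ice embryo and the geometric factor $f(\theta)\le 1$ expresses how a nucleating surface, characterised here by an effective contact angle $\theta$, reduces the homogeneous barrier. The molecular-level question posed above is what structural and dynamical features of INPs and interfacial water produce this reduction.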
Max ERC Funding
1 999 936 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym FRECOM
Project Nonlinear-Distortion Free Communication over the Optical Fibre Channel
Researcher (PI) Darko ZIBAR
Host Institution (HI) DANMARKS TEKNISKE UNIVERSITET
Call Details Consolidator Grant (CoG), PE7, ERC-2017-COG
Summary Motivation
The enormous growth in the Internet of Things and server farms for cloud services has increased the strain on the optical communication infrastructure. By 2025, our society will require data rates that are physically impossible to implement using current state-of-the-art optical communication technologies. This is because fibre-optic communication systems are rapidly approaching their fundamental capacity limits imposed by the Kerr nonlinearity of the fibre. Nonlinear distortion limits the ability to transport and detect the information stream. This is a very critical problem for increasing the data rates of any optical fibre communication system.
Proposed research
The only physical quantities not affected by the nonlinearity are the eigenvalues associated with the optical fibre propagation equation. Eigenvalues are thereby ideal candidates for information transport. However, the concept of eigenvalues is derived under the assumption that the fibre is lossless and that there is no noise in the system, which is not strictly correct. Therefore, novel methodologies and concepts for the design of a noise-mitigating receiver and a noise-robust transmitter are needed to reap the full benefits of optical communication systems employing eigenvalues. This proposal will develop such strategies. This will be achieved by combining, for the first time, the fields of nonlinear optics, optical communication and nonlinear digital signal processing. The results from the project will be verified experimentally and will form the basis for a new generation of commercial optical communication systems.
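To make the notion of eigenvalues concrete (in one common normalisation; the project's own formulation may differ in detail), signal propagation in an idealised lossless, noiseless fibre is governed by the nonlinear Schrödinger equation, and the eigenvalues in question are those of the associated Zakharov-Shabat scattering problem, which do not change during propagation:

\[
i\frac{\partial q}{\partial z} + \frac{1}{2}\frac{\partial^{2} q}{\partial t^{2}} + |q|^{2} q = 0,
\qquad
\frac{\partial}{\partial t}
\begin{pmatrix} v_1 \\ v_2 \end{pmatrix}
=
\begin{pmatrix} -i\lambda & q(t,z) \\ -q^{*}(t,z) & i\lambda \end{pmatrix}
\begin{pmatrix} v_1 \\ v_2 \end{pmatrix},
\]

where $q(t,z)$ is the normalised signal envelope and the discrete eigenvalues $\lambda$ are invariant in $z$, which is why information encoded on them is unaffected by the Kerr nonlinearity in the idealised channel; fibre loss and amplifier noise break this invariance, motivating the receiver and transmitter concepts developed here.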
Preliminary results
Our proof-of-concept results demonstrate, for the first time, that noise can be handled by employing novel receiver concepts. An order of magnitude improvement compared to the state-of-the-art is demonstrated.
Environment
The research will be carried out in close cooperation with leading groups at Stanford University and Technical University of Munich.
Max ERC Funding
2 000 000 €
Duration
Start date: 2018-03-01, End date: 2023-02-28
Project acronym GRANN
Project Graphene Coated Nanoparticles and Nanograins
Researcher (PI) Liv Haahr Hornekaer
Host Institution (HI) AARHUS UNIVERSITET
Call Details Consolidator Grant (CoG), PE4, ERC-2014-CoG
Summary In a truly cross-disciplinary research project encompassing surface science, optics, nanoscience, astrophysics and chemistry, we will synthesize a novel family of high-quality monolayer-graphene-coated nanoparticles and graphene nanograins with new chemical and optical properties, and investigate their catalytic activity, chemical stability and optical characteristics to gauge their relevance for and applicability in industrial catalysis, solar cells, and interstellar chemistry.
This will be accomplished by extending existing expertise, knowledge and methods developed by us and by international colleagues for graphene synthesis, graphene reactivity and chemical functionalization, graphene coatings on industrially relevant samples and interstellar surface astrochemistry on carbonaceous materials, into the nanoparticle regime. Combined with state-of-the-art surface science characterization methods with emphasis on scanning tunnelling microscopy and spectroscopy, high resolution transmission electron microscopy, X-ray photoelectron spectroscopy, and thermal desorption mass spectrometry, complemented by Raman and transmission spectroscopy, this will enable us to design, characterize, and understand the properties of this new family of particles at the atomic level.
The vision is to harness and combine the remarkable properties of graphene and nanoparticles to create systems with entirely new and unexplored characteristics, to tune these characteristics to be useful for real-world applications, and to exploit the new systems as the first realistic laboratory models of catalytic nanoparticles for interstellar surface chemistry.
This ambitious and cross-disciplinary research program will predominantly take place at the Surface Dynamics Laboratory at Aarhus University which is headed by the applicant, but will also involve local, national and international collaborators.
Max ERC Funding
1 996 147 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym INTERACTION
Project Cloud-cloud interaction in convective precipitation
Researcher (PI) Jan Olaf Mirko Härter
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Consolidator Grant (CoG), PE10, ERC-2017-COG
Summary State-of-the-art simulations and observations highlight the self-organization of convective clouds. Our recent work shows two aspects: these clouds are capable of an unexpected increase in extreme precipitation when temperature rises; and interactions between clouds produce the extremes. As clouds interact, they organize in space and carry a memory of past interaction and precipitation events. This evidence reveals a severe shortcoming of the conventional separation into "forcing" and "feedback" in climate model parameterizations, namely that the "feedback" develops a dynamics of its own, thus driving the extremes. The major scientific challenge tackled in INTERACTION is to make a ground-breaking departure from the established paradigm of "quasi-equilibrium" and instantaneous convective adjustment, traditionally used for the parameterization of "sub-grid-scale processes" in general circulation models. To capture convective self-organization and extremes, the out-of-equilibrium cloud field must be described. In INTERACTION, I will produce a conceptual model for the out-of-equilibrium system of interacting clouds. Once triggered, clouds precipitate on a short timescale, but then relax into a "recovery" state where further precipitation is suppressed. Interaction with the surroundings occurs through cold pool outflow, facilitating the onset of new events in the wake; a toy illustration of this three-state logic is sketched below. I will perform tailored numerical experiments using cutting-edge large-eddy simulations and very-high-resolution observational analysis to determine the effective interactions in the cloud system. Going beyond traditional forcing-and-feedback descriptions, I emphasize gradual self-organization with explicit temperature dependence. The list of key variables of atmospheric water vapor, temperature and precipitation must therefore be amended with variables describing organization. Capturing the self-organization of convection is essential for understanding the risk of precipitation extremes today and in a future climate.
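The following Python sketch is a purely illustrative toy, not the conceptual model to be developed in INTERACTION: each cell on a ring precipitates when triggered, then enters a recovery state in which further precipitation is suppressed, while its outflow raises the triggering probability of its neighbours.

```python
import numpy as np

rng = np.random.default_rng(2)

# States on a 1-D ring of cells: 0 = quiescent, 1 = precipitating, 2 = recovering.
n_cells, n_steps = 200, 500
state = np.zeros(n_cells, dtype=int)
recovery_clock = np.zeros(n_cells, dtype=int)

P_SPONTANEOUS = 0.002   # background triggering probability (illustrative value)
P_TRIGGERED = 0.3       # extra probability if a neighbour precipitated last step
RECOVERY_TIME = 10      # steps during which a cell cannot precipitate again

rain_fraction = []
for _ in range(n_steps):
    precipitating = state == 1
    # Cold-pool-like interaction: neighbours of precipitating cells are more
    # likely to be triggered in the next step.
    neighbour_rain = np.roll(precipitating, 1) | np.roll(precipitating, -1)
    p_trigger = P_SPONTANEOUS + P_TRIGGERED * neighbour_rain

    new_state = state.copy()
    # Quiescent cells may be triggered.
    triggered = (state == 0) & (rng.random(n_cells) < p_trigger)
    new_state[triggered] = 1
    # Precipitating cells move into recovery, where triggering is suppressed.
    new_state[precipitating] = 2
    recovery_clock[precipitating] = RECOVERY_TIME
    # Recovering cells count down and eventually become quiescent again.
    recovering = state == 2
    recovery_clock[recovering] -= 1
    new_state[recovering & (recovery_clock <= 0)] = 0

    state = new_state
    rain_fraction.append(precipitating.mean())

print("mean precipitating fraction:", round(float(np.mean(rain_fraction)), 3))
```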
Max ERC Funding
1 314 800 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym ISLAS
Project Isotopic links to atmospheric water's sources
Researcher (PI) Harald SODEMANN
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Consolidator Grant (CoG), PE10, ERC-2017-COG
Summary The hydrological cycle, with its feedbacks related to water vapour and clouds, is the largest source of uncertainty in weather prediction and climate models. In particular, processes that occur on scales smaller than the model grid lead to errors, which can compensate one another, making them difficult to detect and correct for. Undetectable compensating errors critically limit the understanding of hydrological extremes, the response of the water cycle to a changing climate, and the interpretation of paleoclimate records. Stable water isotopes have a unique potential to serve as the needed constraints, as they provide measures of moisture origin and of the phase-change history. We have recently spearheaded a revised view of the atmospheric water cycle which highlights the importance of connections on a regional scale. This implies that in some areas all relevant processes can be studied on a regional scale. The Nordic Seas are an ideal example of such a natural laboratory, with distinct evaporation events, shallow transport processes, and swift precipitation formation. Together with recent technological advances in isotope measurements and in-situ sample collection, this will allow us to acquire a new kind of observational data set that follows the history of water vapour from source to sink. The high-resolution, high-precision isotope data will provide a combined view of established and novel natural isotopic source tracers and set new benchmarks for climate models. A unique palette of sophisticated model tools will allow us to decipher, synthesize and exploit these observations, and to identify compensating errors between water cycle processes in models. In ISLAS, my team and I will thus make unprecedented use of stable isotopes to provide the sought-after constraints for an improved understanding of the hydrological cycle in nature and in climate models, leading towards improved predictions of future climate.
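For reference, two standard relations from isotope hydrology (textbook definitions rather than project-specific results) underpin such tracer work: isotopic composition is reported in delta notation relative to a standard, and the first-order evolution of a moist air mass during progressive rain-out follows Rayleigh distillation:

\[
\delta = \left(\frac{R_{\mathrm{sample}}}{R_{\mathrm{standard}}} - 1\right) \times 10^{3},
\qquad
R = R_0\, f^{\,\alpha - 1},
\]

where $R$ is the heavy-to-light isotope ratio (e.g. $^{18}\mathrm{O}/^{16}\mathrm{O}$ or D/H), $\delta$ is expressed in per mil, $f$ is the fraction of the initial vapour remaining in the air mass, and $\alpha$ is the equilibrium fractionation factor; deviations from this simple closed-system behaviour carry part of the source and process information that the measured tracers are meant to constrain.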
Max ERC Funding
1 999 054 €
Duration
Start date: 2018-08-01, End date: 2023-07-31
Project acronym LIMA
Project Controlling light-matter interactions by quantum designed 2D materials
Researcher (PI) Kristian Sommer THYGESEN
Host Institution (HI) DANMARKS TEKNISKE UNIVERSITET
Call Details Consolidator Grant (CoG), PE3, ERC-2017-COG
Summary Progress within many contemporary or emergent technologies, including photovoltaics, single-photon light sources, and plasmonics, depends crucially on our ability to control the interactions between light and matter. The complexity of light-matter interactions has made the development of photonic materials a slow, expensive, and largely empirical science. Of particular importance are the detrimental non-radiative processes mediated by defects and phonons, which lead to efficiency losses in photovoltaics, reduce the quantum efficiency of single-photon emitters, and cause Ohmic losses in the metallic components of plasmonic devices. LIMA will develop ground-breaking methods for calculating non-radiative relaxation rates in real materials from first principles. These will be used to evaluate key performance parameters such as photo-carrier lifetimes and plasmon propagation lengths, and will thus facilitate a realistic computational assessment of the application potential of photonic materials. In terms of materials, LIMA will focus on the emergent class of atomically thin two-dimensional (2D) materials. The possibility of combining different 2D materials into van der Waals heterostructures (vdWHs) provides a unique platform for controlling light-matter interactions with atomic-scale precision. Multi-scale methods for predicting quasiparticle band structures of general, incommensurable vdWHs will be developed and used to design novel photonic materials with tailored light dispersion and multi-junction solar cells with high absorption and low thermalization losses. High-throughput computational screening will be used to identify novel color centers in 2D materials with the potential to act as single-photon sources with high quantum yield and narrow linewidths, which are urgently needed by leading quantum technologies. The possibilities of controlling the color centers via strain engineering and light management will be explored in close collaboration with experimentalists.
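As a point of reference (the standard perturbative starting point rather than the new methodology to be developed in LIMA), first-order non-radiative transition rates between electronic states are commonly expressed through Fermi's golden rule, with phonons or defect states supplying the coupling and energy conservation selecting the final states:

\[
\Gamma_{i \to f} = \frac{2\pi}{\hbar} \sum_{f} \bigl|\langle f | \hat{H}' | i \rangle\bigr|^{2}\, \delta(E_f - E_i),
\]

where $\hat{H}'$ is the perturbation coupling the initial and final states (for example the electron-phonon or electron-defect interaction); evaluating such matrix elements and the relevant vibrational spectra from first principles for realistic materials is what makes the rates challenging to compute.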
Max ERC Funding
1 951 354 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym Loops and groups
Project Loops and groups: Geodesics, moduli spaces, and infinite discrete groups via string topology and homological stability
Researcher (PI) Nathalie Anne M. Wahl
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Consolidator Grant (CoG), PE1, ERC-2017-COG
Summary This proposal lies at the intersection of algebra, topology, and geometry, with the scientific goal of answering central questions about homological stability, geodesics on manifolds, and the moduli space of Riemann surfaces. Homological stability is a subject that has seen spectacular progress in recent years, and recent work of the PI has opened up new perspectives on this field, through, among other things, associating a canonical family of spaces to any stability problem. The first two goals of the proposal are to give conditions under which this family of spaces is highly connected, and to use this to prove homological and representation stability theorems, with determination of the stable homology. Particular attention is given to Thompson-like groups, building on a recent breakthrough of the PI with Szymik. The last two goals concern geodesics and moduli spaces via string topology: The third goal seeks a geometric construction of compactified string topology, which we propose to use to address counting problems for geodesics on manifolds. Finally, our fourth goal is to use compactified string topology to study the harmonic compactification itself, and give a new approach to finding families of unstable homology classes in the moduli space of Riemann surfaces. The feasibility of the last goals is demonstrated by the PI's earlier algebraic work in this direction; the proposal is to incorporate geometry in a much more fundamental way.
The project combines breakthrough methods from homotopy theory with methods from algebraic, differential and geometric topology. Some of the goals are high risk, but we note that in those cases even partial results will be of significant interest. The PI has a proven track record at the international forefront of research, and as a research leader, e.g., through a previous ERC Starting Grant. The research team will consist of the PI together with 3 PhD students and 3 postdocs in total during the 5 years.
Max ERC Funding
1 864 419 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym LOPRE
Project Lossy Preprocessing
Researcher (PI) Saket SAURABH
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary A critical component of computational processing of data sets is the 'preprocessing' or 'compression' step, which is the computation of a succinct, sufficiently accurate representation of the given data. Preprocessing is ubiquitous, and a rigorous mathematical understanding of preprocessing algorithms is crucial in order to reason about and understand the limits of preprocessing. Unfortunately, there is no mathematical framework to analyze and objectively compare two preprocessing routines while simultaneously taking into account all three dimensions:
- the efficiency of computing the succinct representation,
- the space required to store this representation, and
- the accuracy with which the original data is captured in the succinct representation.
The overarching goal of this proposal is the development of a mathematical framework for the rigorous analysis of preprocessing algorithms. We will achieve this goal by designing new algorithmic techniques for preprocessing, by developing a framework of analysis that allows qualitative comparisons between preprocessing routines along the three dimensions above, and by developing lower-bound tools for understanding the limitations of preprocessing for concrete problems.
This project will lift our understanding of algorithmic preprocessing to new heights and lead to a groundbreaking shift in the set of basic research questions attached to the study of preprocessing for specific problems. It will significantly advance the analysis of preprocessing and yield substantial technology transfer between adjacent subfields of computer science such as dynamic algorithms, streaming algorithms, property testing and graph theory.
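To make the three dimensions concrete, here is a minimal, hypothetical Python sketch (illustrative background only, not one of the techniques proposed in LOPRE): a uniform random sample acts as a lossy summary of the input, with build time linear in the data, space proportional to the sample size k, and accuracy of the answers it supports improving as k grows.

    import random

    def build_summary(stream, k, seed=0):
        """Lossy preprocessing: keep a uniform reservoir sample of size k (space O(k), time O(n))."""
        rng = random.Random(seed)
        sample, n = [], 0
        for item in stream:
            n += 1
            if len(sample) < k:
                sample.append(item)
            else:
                j = rng.randrange(n)          # keep the new item with probability k/n
                if j < k:
                    sample[j] = item
        return sample, n

    def estimate_frequency(summary, item):
        """Approximate the number of occurrences of `item` using only the summary."""
        sample, n = summary
        return sample.count(item) * n / len(sample)

    data = [1, 2, 2, 3, 2, 1, 2, 4] * 1000
    summary = build_summary(data, k=200)
    print(estimate_frequency(summary, 2))     # close to 4000, up to sampling error

Any such routine exposes exactly the trade-off named above: faster construction or smaller summaries generally mean less accurate answers, and the proposed framework aims to compare routines rigorously along all three axes at once.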
Max ERC Funding
2 000 000 €
Duration
Start date: 2019-05-01, End date: 2024-04-30
Project acronym MSMA
Project Moduli Spaces, Manifolds and Arithmetic
Researcher (PI) Søren Galatius
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary This proposal concerns the application of homotopy theoretic methods to multiple questions of geometric nature, and in particular the study of moduli spaces. Firmly based in topology, the research proposed here is strongly motivated by applications and potential applications to differential geometry, algebraic geometry and especially number theory.
Any “moduli space” parametrizes how certain objects may vary in families. The moduli spaces of manifolds parametrize how smooth manifolds may vary in families (smooth fiber bundles), and the representation varieties studied in the second major component parametrize how linear representations of a group may vary in algebraic families.
The homotopy theoretic study of moduli spaces of manifolds has seen spectacular successes in the last 15 years, kickstarted by a theorem of Madsen and Weiss concerning the topology of moduli spaces of 2-dimensional manifolds. Very recently, an ongoing collaboration between O. Randal-Williams and myself promises to establish analogous results for manifolds of higher dimension. If funded, the research proposed here will bring this research program to a point where all major results about surface moduli spaces have proven analogues for manifolds of higher dimension.
The second major component of this proposal has strong number-theoretic origins, but is essentially homotopy theoretic. It concerns the study of universal deformations of representations of (Galois) groups. If funded, the research in this component of the proposal, joint with Akshay Venkatesh, will develop derived (simplicial) deformation rings. Classical deformation rings have had spectacular applications in number theory (starting with Wiles’ work) and we also propose to begin the study of applications of derived deformation rings.
Finally, the proposal contains smaller or more speculative projects, and points out many questions which might be suitable for the Ph.D. students and postdocs also applied for in this proposal.
Max ERC Funding
1 991 061 €
Duration
Start date: 2016-06-01, End date: 2021-11-30
Project acronym PAW
Project Automated Program Analysis for Advanced Web Applications
Researcher (PI) Anders Møller
Host Institution (HI) AARHUS UNIVERSITET
Call Details Consolidator Grant (CoG), PE6, ERC-2014-CoG
Summary Web applications that execute in the user's web browser constitute a substantial part of modern software. JavaScript is the main programming language of the web, although alternatives are emerging, in particular, TypeScript and Dart. Despite the advances in design of languages and libraries, it is difficult to prevent errors when programming such web applications. Although the basic principles of software verification have been known for decades and researchers have developed an abundance of techniques for formal reasoning about programs, modern software has lots of errors, as everyday users can testify.
The PAW project will create novel automated program analysis algorithms for preventing errors and improving performance of advanced web applications. The project hypothesis is that a scientific breakthrough is within reach, due to recent results in static and dynamic program analysis for JavaScript. The central idea is to combine static and dynamic analysis in new ways. In addition, the project will make program analysis algorithms and infrastructure available in a form that embraces reusability.
Max ERC Funding
1 977 382 €
Duration
Start date: 2015-08-01, End date: 2021-07-31
Project acronym PHOENEEX
Project Pyrolytic Hierarchical Organic Electrodes for sustaiNable Electrochemical Energy Systems
Researcher (PI) Stephan Sylvest Keller
Host Institution (HI) DANMARKS TEKNISKE UNIVERSITET
Call Details Consolidator Grant (CoG), PE7, ERC-2017-COG
Summary The demand for compact energy systems for portable devices such as wearable sensors or mobile phones is increasing. Electrochemical systems are promising candidates for sustainable energy conversion and storage on miniaturised platforms. A recent approach to harvest green energy is biophotovoltaic systems (BPVs), where photosynthetic microorganisms are used to transform light into electrical energy. However, BPVs still provide a relatively low efficiency and are yet unable to deliver the high peak power required for sensor operation or wireless signal transmission in portable systems. In PHOENEEX, I will address these limitations by i) improving the efficiency of BPVs and ii) combining the BPVs with microsupercapacitors (µSCs) which can temporarily store the harvested electrical energy and provide a higher peak power output upon request. More specifically, I will develop highly optimised 3D carbon microelectrodes (3DCMEs) to enhance electron harvesting from cyanobacteria in BPVs and for increased energy density in µSCs. Finally, the improved BPVs and the optimised µSCs will be integrated on the BioCapacitor Microchip - a compact sustainable energy platform for portable systems.
The fabrication of 3DCMEs with highly tailored material properties, large surface area and hierarchical architecture is achieved by pyrolysis of polymer templates in an inert atmosphere. The fundamental hypothesis of PHOENEEX is that the combination of novel precursor materials, new methods for 3D polymer microfabrication and optimised pyrolysis processes will allow for the fabrication of 3DCMEs whose combination of tailored material properties, large surface area and hierarchical architecture is impossible to obtain with any other method.
Max ERC Funding
2 745 500 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym PLEDGEDEM
Project Pledges in democracy
Researcher (PI) Carsten JENSEN
Host Institution (HI) AARHUS UNIVERSITET
Call Details Consolidator Grant (CoG), SH2, ERC-2018-COG
Summary Election pledges are supposedly a vital part of representative democracy. Yet we do not in fact know whether and how pledges matter for vote choice and accountability. This project thus asks: Do election pledges matter for voters’ democratic behavior and beliefs?
The role of pledges in citizens’ democratic behavior and beliefs is, surprisingly, virtually unexplored. This project’s ambition is therefore to create a new research agenda that redefines how political scientists think about the link between parties and voters. The project not only advances the research frontier by introducing a new, crucial phenomenon for political scientists to study; it also breaks new ground because it provides original theoretical and methodological tools for this new research agenda.
The key empirical contribution of this project is to collect two path-breaking datasets in the United States, France, and Norway that produce an unbiased estimate of voters’ awareness and use of pledges. The first consists of a set of innovative panel surveys with embedded conjoint experiments conducted both before and after national elections. The second dataset codes all pledges; whether or not they are broken; and how the mass media report on them.
This project is unique in its scientific ambition: It studies the core mechanism of representative democracy as it happens in real time, and does so in several countries. If successful, we will have much firmer knowledge about how voters select parties that best represent them and sanction those that betray their trust – and what this all implies for people’s trust in democracy.
Max ERC Funding
1 999 255 €
Duration
Start date: 2019-08-01, End date: 2024-07-31
Project acronym POLICYAID
Project Policy, practice and patient experience in the age of intensified data sourcing
Researcher (PI) Klaus Lindgaard Hoeyer
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Consolidator Grant (CoG), SH2, ERC-2015-CoG
Summary The European healthcare services have begun collecting tissue samples and healthcare data from patients on an unprecedented scale. With POLICYAID we suggest the term 'intensified data sourcing' to describe these attempts at getting more data, on more people, of better quality while simultaneously making the data available for multiple uses. Data are used for research, for financial remuneration purposes, for quality assurance, to attract capital and even for police work. POLICYAID investigates how the diverse agendas interact in the making of a new infrastructure for healthcare.
POLICYAID ambitiously aims to understand the drivers for and implications of intensified data sourcing in the biomedical realm across three levels: 1) policymaking, 2) everyday clinical practices, and 3) citizen experiences of health, illness, rights and duties. To achieve this aim we compare four different forms of intensified data sourcing, and analyze the regulatory frameworks guiding the data procurement and use in Denmark, the EU and beyond.
Based on the PI's strong interdisciplinary background and experience, we fuse legal, sociological, anthropological and public health scholarship and develop new methodologies for policy analysis by combining document analysis, interviews, participant observation and register-based methodologies. Instead of simply assuming that data sourcing can be reduced to matters of surveillance, we open up the black box of data sourcing by describing how data are selected and financed; what they are used for; how data practices relate to the involved stakeholders' hopes and concerns; and who gains which rights to the data. We can thereby explore how intensified data sourcing affects clinical routines and patient experience, as well as understand how Big Data for medical research emerges. POLICYAID thereby arrives at novel understandings of both policymaking and what it means to be a patient in the age of intensified data sourcing.
Max ERC Funding
1 972 860 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym RESOURCE Q
Project Efficient Conversion of Quantum Information Resources
Researcher (PI) Matthias Christandl
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary This proposal explores the power of quantum information in two respects. The first is the technological power of quantum information in a communication infrastructure, and the second is its descriptive power in many-particle quantum systems. My point of departure is to view quantum information as a resource that can be processed and converted.
In quantum communication, a famous resource conversion is provided by the quantum teleportation protocol, which allows us to send one quantum bit (1 qubit) through the transmission of two classical bits (2 cbits) and the use of one entangled pair of quantum bits (1 ebit):
1 ebit + 2 cbits ≥ 1 qubit.
Casting quantum protocols in such resource inequalities has proven useful, since the algebraic manipulation of inequalities results in new protocols, but this approach has hitherto largely been limited to point-to-point communication. It is the first goal of this project to overcome this limitation and characterise resource conversion in larger quantum networks. This will result in more efficient communication protocols that will have an impact on the use and design of quantum communication networks, which are currently being built around the globe.
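For context, the textbook counterpart of teleportation, superdense coding, can be written in the same resource notation (standard background, not a result of this proposal):
1 qubit + 1 ebit ≥ 2 cbits.
A simple worked consequence of combining the two: using one qubit channel plus one ebit to generate two classical bits (dense coding), and then spending those bits plus a second ebit to teleport a qubit, shows that 1 qubit + 2 ebits ≥ 1 qubit, i.e. the two protocols invert each other only at the cost of consuming entanglement. Manipulations of this kind, extended from point-to-point links to networks, are what the first goal of the project targets.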
A quantum network involving distant communicating labs is mirrored at the small scale by a set of interacting quantum particles. The quantum state arising from pairwise interactions can be strongly entangled, with an underlying entanglement structure given by a graph with entangled pairs along the edges. There is a surprising and close connection between such entanglement structures and tensor research in the context of algebraic complexity theory. The second goal of the project is to exploit this connection and characterise the resource conversion of entanglement structures. The research will lead to more efficient tensor network representations of many-particle quantum states, and to progress on the computational complexity of matrix multiplication, a long-standing unsolved problem.
Max ERC Funding
1 953 750 €
Duration
Start date: 2019-12-01, End date: 2024-11-30
Project acronym RYD-QNLO
Project Quantum nonlinear optics through Rydberg interaction
Researcher (PI) Sebastian HOFFERBERTH
Host Institution (HI) SYDDANSK UNIVERSITET
Call Details Consolidator Grant (CoG), PE2, ERC-2017-COG
Summary Optical photons, for all practical purposes, do not interact. This fundamental property of light forms the basis of modern optics and enables a multitude of technical applications in our every-day life, such as all-optical communication and microscopy. On the other hand, an engineered interaction between individual photons would allow the creation and control of light photon by photon, providing fundamental insights into the quantum nature of light and allowing us to harness non-classical states of light as resource for future technology. Mapping the strong interaction between Rydberg atoms onto individual photons has emerged as a highly promising approach towards this ambitious goal. In this project, we will advance and significantly broaden the research field of Rydberg quantum optics to develop new tools for realizing strongly correlated quantum many-body states of photons. Building on our successful work over recent years, we will greatly expand our control over Rydberg slow-light polaritons to implement mesoscopic systems of strongly interacting photons in an ultracold ytterbium gas. In parallel, we will explore a new approach to strong light-matter coupling, utilizing Rydberg superatoms made out of thousands of individual atoms, strongly coupled to a propagating light mode. This free-space QED system enables strong coupling between single photons and single artificial atoms in the optical domain without any confining structures for the light. Finally, we will experimentally realize a novel quantum hybrid system exploiting the strong electric coupling between single Rydberg atoms and piezo-electric micro-mechanical oscillators. Building on this unique coupling scheme, we will explore Rydberg-mediated cooling of a mechanical system and dissipative preparation of non-classical phonon states. The three complementary parts ultimately unite into a powerful Rydberg quantum optics toolbox which will provide unprecedented control over single photons and single phonons.
Max ERC Funding
1 993 793 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym S4F
Project Setting the Stage for Solar System Formation
Researcher (PI) Jes Kristian Jørgensen
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Consolidator Grant (CoG), PE9, ERC-2014-CoG
Summary Low-mass stars like our Sun are formed in the centers of dark clouds of dust and gas that obscure their visible light. Deep observations at infrared and submillimeter wavelengths are uniquely suited to probe the inner regions of these young stellar objects and unravel their structures, as well as the physical and chemical processes involved. These earliest stages are particularly interesting because the properties of the deeply embedded objects reflect the star formation process itself and how it relates to its environment. It is for example during this stage that the final mass of the star and the properties of its disk – and thus ability to form planets – are determined. It is also during these stages that the first seeds for the chemical evolution of the protoplanetary disk are planted and where some complex organic, possibly prebiotic, molecules may be formed. I here apply for an ERC Consolidator Grant that will support an ambitious program to map the physics and chemistry of the early Solar System. The proposed research program intends to use new high resolution, high sensitivity observations from the Atacama Large Millimeter Array (ALMA) - including a number of recently approved large programs – coupled to state-of-the-art radiative transfer tools and theoretical simulations to address some of the key questions concerning the physics and chemistry of the earliest stages of the Solar System: How is the chemistry of the earliest protostellar stages related to the physical structure and evolution of the young stellar object and its surrounding environment? Which complex organic molecules are present in the inner regions of low-mass protostars? What are the chances the rich chemistry of the earliest stages is incorporated into planetary systems such as our own?
Max ERC Funding
1 999 659 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym SEEWHI
Project Solar Energy Enabled for the World by High-resolution Imaging
Researcher (PI) Jens Wenzel Andreasen
Host Institution (HI) DANMARKS TEKNISKE UNIVERSITET
Call Details Consolidator Grant (CoG), PE5, ERC-2015-CoG
Summary THE GOAL
We will derive new and fundamental insight in the relation between nano-scale structure and the performance of 3rd generation solar cells, and determine how to apply this in large-scale processing.
THE CHALLENGES
We currently have a superficial understanding of the correlations between structure and performance of photovoltaic heterojunctions, based on studies of small-scale devices and model systems with characterization techniques that indirectly probe their internal structure. The real structures of optimized devices have never been “seen”, and in devices manufactured by large-scale processing, almost nothing is known about the formation of structures and interfaces.
THE SCIENCE
We will take a ground-breaking new approach by combining imaging techniques where state of the art is moving in time spans on the order of months, with ultrafast scattering experiments and modelling. The techniques include high resolution X-ray phase contrast and X-ray dark-field tomography, in situ small and wide angle X-ray scattering, resonant scattering and imaging and time resolved studies of charge transport and transfer. To relate our findings to device performance, we will establish full 3D models of charge generation and transport in nano-structured solar cells.
THE FOCUS
Solution-cast solar cells are the only technology that promises fast and cheap industrial scaling, and they are consequently the focus of our efforts. They require tight control of processing conditions to ensure that the proper nano-structure is formed in the photoactive layers, with optimal contacts to charge transport layers and interfaces. The prime contenders are non-toxic polymer and kesterite solar cells.
THE IMPACT
Our results may advance 3rd generation, solution-cast solar cells to meet the “unification challenge” where high efficiency, stability and cheap processing combine in a single technology, scalable to the level of gigawatts per day, thus becoming a centrepiece in global energy supply.
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym SIRFUNCT
Project Chemical Tools for Unravelling Sirtuin Function
Researcher (PI) Christian Adam OLSEN
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Consolidator Grant (CoG), PE5, ERC-2016-COG
Summary It was recently realized that lysine acetylation affects a wide variety of cellular processes in addition to the initially recognized histone-related gene regulation. Together with recent groundbreaking results revealing the presence of additional acyllysine modifications, this formed the basis for a paradigm shift in the area. Enzymes formerly thought to be lysine deacetylases have been shown to cleave these new types of lysine modification, and members of the sirtuin class of enzymes play a central role.
Development of new tools to investigate the importance of these new modifications as well as the sirtuins that cleave them is required. We therefore propose to adopt an interdisciplinary approach by developing selective inhibitors and so-called activity-based probes (ABPs) and applying these to the investigation of proteins recognizing novel post-translational acylations of lysine residues in cells. Such ABPs will be powerful tools for providing insight into this rapidly evolving area of biochemistry; however, the current state of the art in ABP design is subject to severe limitations because the modifications are inherently cleaved by various hydrolases in human cells. Thus, in the present project, I propose that novel designs accommodating non-cleavable modifications are warranted to maintain structural integrity during experiments.
Furthermore, I propose to apply similar mechanism-based designs to develop potent and isoform-selective sirtuin inhibitors, which will serve as chemical probes to investigate links between cancer and metabolism, and may ultimately serve as lead compounds for pre-clinical pharmaceutical development.
AIM-I. (a) Development and (b) application of collections of chemical probes for activity-based investigation of enzymes that interact with post-translationally acylated proteins.
AIM-II. Utilization of structural and mechanistic insight to design potent and selective inhibitors of sirtuin enzymes.
Max ERC Funding
1 758 742 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym SolarALMA
Project ALMA – The key to the Sun’s coronal heating problem.
Researcher (PI) Sven Wedemeyer
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), PE9, ERC-2015-CoG
Summary How are the outer layers of the Sun heated to temperatures in excess of a million kelvin? A large number of heating mechanisms have been proposed to explain this so-called coronal heating problem, one of the fundamental questions in contemporary solar physics. It is clear that the required energy is transported from the solar interior through the chromosphere into the outer layers but it remains open by which physical mechanisms and how the provided energy is eventually dissipated. The key to solving the chromospheric/coronal heating problem lies in accurate observations at high spatial, temporal and spectral resolution, facilitating the identification of the mechanisms responsible for the transport and dissipation of energy. This has so far been impeded by the small number of accessible diagnostics and the challenges with their interpretation. The interferometric Atacama Large Millimeter/submillimeter Array (ALMA) now offers impressive capabilities. Due to the properties of the solar radiation at millimeter wavelengths, ALMA serves as a linear thermometer, mapping narrow layers at different heights. It can measure the thermal structure and dynamics of the solar chromosphere and thus sources and sinks of atmospheric heating. Radio recombination and molecular lines (e.g., CO) potentially provide complementary kinetic and thermal diagnostics, while the polarisation of the continuum intensity and the Zeeman effect can be exploited for valuable chromospheric magnetic field measurements.
I will develop the necessary diagnostic tools and use them for solar observations with ALMA. The preparation, optimisation and interpretation of these observations will be supported by state-of-the-art numerical simulations. A key objective is the identification of the dominant physical processes and their contributions to the transport and dissipation of energy. The results will be a major step towards solving the coronal heating problem with general implications for stellar activity.
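The 'linear thermometer' property follows from the Rayleigh-Jeans limit, stated here only as background (not a result of this proposal): at millimetre wavelengths h\nu \ll k_B T, so the emergent continuum intensity is
I_\nu \approx \frac{2 \nu^2 k_B}{c^2}\, T_b,
i.e. the measured brightness is directly proportional to the brightness temperature T_b, which closely tracks the gas temperature of the atmospheric layer where the millimetre continuum forms. Different ALMA bands therefore sample the thermal structure at different chromospheric heights.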
Max ERC Funding
1 995 964 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym SSS
Project Scalable Similarity Search
Researcher (PI) Rasmus Pagh
Host Institution (HI) IT-UNIVERSITETET I KOBENHAVN
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary Similarity search is the task of identifying, in a collection of items, the ones that are “similar” to a given query item. This task has a range of important applications (e.g. in information retrieval, pattern recognition, statistics, and machine learning) where data sets are often big, high dimensional, and possibly noisy. State-of-the-art methods for similarity search offer only weak guarantees when faced with big data. Either the space overhead is excessive (1000s of times larger than the space for the data itself), or the work needed to report the similar items may be comparable to the work needed to go through all items (even if just a tiny fraction of the items are similar). As a result, many applications have to resort to the use of ad-hoc solutions with only weak theoretical guarantees.
This proposal aims at strengthening the theoretical foundation of scalable similarity search, and developing novel practical similarity search methods backed by theory. In particular we will:
- Leverage new types of embeddings that are kernelized, asymmetric, and complex-valued.
- Consider statistical models of noise in data, and design similarity search data structures whose performance guarantees are phrased in statistical terms.
- Build a new theory of the communication complexity of distributed, dynamic similarity search, emphasizing the communication bottleneck present in modern computing infrastructures.
The objective is to produce new methods for similarity search that are: 1) provably robust, 2) scalable to large and high-dimensional data sets, 3) substantially more resource efficient than current state-of-the-art solutions, and 4) able to provide statistical guarantees on query answers.
The study of similarity search has been an incubator for techniques (e.g. locality-sensitive hashing and random projections) that have wide-ranging applications. The new techniques developed in this project are likely to have significant impacts beyond similarity search.
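As background on one of the techniques named above, the following is a minimal Python sketch of locality-sensitive hashing via random projections (SimHash for cosine similarity). It is illustrative only and is not the data structure proposed in SSS; all names are hypothetical.

    import numpy as np

    class SimHashIndex:
        """Locality-sensitive hashing with random hyperplanes (angular/cosine similarity)."""

        def __init__(self, dim, n_bits=32, seed=0):
            rng = np.random.default_rng(seed)
            self.planes = rng.standard_normal((dim, n_bits))

        def signatures(self, X):
            # The sign of each projection gives one bit; similar directions share most bits.
            return X @ self.planes >= 0

        def fit(self, X):
            self.sigs = self.signatures(X)
            return self

        def query(self, q, max_hamming=4):
            q_sig = self.signatures(q[None, :])[0]
            dists = (self.sigs != q_sig).sum(axis=1)    # Hamming distance between signatures
            return np.nonzero(dists <= max_hamming)[0]  # candidate set, to be verified exactly

    rng = np.random.default_rng(1)
    X = rng.standard_normal((10_000, 64))
    index = SimHashIndex(dim=64, n_bits=32).fit(X)
    candidates = index.query(X[0] + 0.05 * rng.standard_normal(64))  # should contain item 0

The space, query time and recall are governed by n_bits, max_hamming and the number of independent hash tables used in practice, which is exactly the kind of trade-off for which the project seeks provable and statistically phrased guarantees.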
Max ERC Funding
1 889 712 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym STAMFORD
Project Statistical Methods For High Dimensional Diffusions
Researcher (PI) Mark Podolskij
Host Institution (HI) AARHUS UNIVERSITET
Call Details Consolidator Grant (CoG), PE1, ERC-2018-COG
Summary In the past twenty years the availability of vast, high-dimensional data, typically referred to as big data, has given rise to exciting challenges in various fields of mathematics and computer science. The increasing need for a better understanding of such data in internet traffic, biology, genetics, and economics has led to a revolution in statistical and machine learning, optimisation and numerical analysis. Due to the high dimensionality of modern statistical models, parameter estimation is a difficult task, and statisticians typically investigate estimation methods under sparsity constraints. While an abundance of estimation algorithms is now available for high-dimensional discrete models, a rigorous mathematical investigation of estimation problems for high-dimensional continuous-time processes is completely undeveloped.
The aim of STAMFORD is to provide a concise statistical theory for the estimation of high-dimensional diffusions. Such high-dimensional processes naturally appear in the modelling of particle interactions in physics, neural networks in biology or large portfolios in economics, to name just a few. The methodological part of the project will require the development of novel, advanced techniques in mathematical statistics and probability theory. In particular, new results will be needed in parametric and non-parametric statistics and in high-dimensional probability that reach far beyond the state of the art. Hence, a successful outcome of STAMFORD will not only have a tremendous impact on statistical inference for continuous-time models in the natural and applied sciences, but will also strongly influence the field of high-dimensional statistics and probability.
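To fix ideas, an illustrative (not project-specific) formulation of the estimation problem is a d-dimensional diffusion observed at discrete times, with the drift estimated under a sparsity constraint:

dX_t = b_\theta(X_t)\,dt + \sigma\,dW_t, \qquad X_t \in \mathbb{R}^d,\ d \gg 1,

\hat{\theta} \in \arg\min_{\theta} \sum_{i=0}^{n-1} \bigl\lVert X_{t_{i+1}} - X_{t_i} - b_\theta(X_{t_i})\,\Delta \bigr\rVert_2^2 + \lambda \lVert \theta \rVert_1,

where the \ell_1 penalty enforces sparsity of the drift parameters, in analogy with Lasso-type estimators for high-dimensional discrete models.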
Max ERC Funding
1 655 048 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym stardust2asteroids
Project Stardust to asteroids: Unravelling the formation and earliest evolution of a habitable solar system
Researcher (PI) Martin Bizzarro
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Consolidator Grant (CoG), PE10, ERC-2013-CoG
Summary As far as we know, our solar system is unique. It could, in principle, be the only planetary system in the Universe to harbor intelligent life or, indeed, life at all. As such, attempting to reconstruct its history is one of the most fundamental pursuits in the natural sciences. Whereas astronomical observations of star-forming regions provide a framework for understanding the formation of low-mass stars and the early evolution of planetary systems in general, direct information about the earliest solar system can only come from primitive meteorites and their components and some differentiated meteorites that record the birth of the solar system. The main objective of this proposal is to investigate the timescales and processes – including the role of supernovas – leading to the formation of the solar system by measurement of isotopic variations in meteorites. To achieve our objectives, we will integrate long-lived and short-lived radioisotope chronometers with the presence/absence of nucleosynthetic anomalies in various meteorites and meteoritic components. Our isotopic measurements will be obtained using state-of-the-art technologies such as second-generation mass spectrometers housed in laboratories directed by the PI and fully dedicated to cosmochemistry. This will allow us to: 1) define the mechanism and timescale for the collapse of the protosolar molecular cloud and emergence of the protoplanetary disk, 2) constrain the source and locale of chondrule-forming event(s) as well as the nature of the mechanism(s) required to transport chondrules to the accretion regions of chondrites, and 3) provide robust estimates of the timing and mechanism of asteroidal differentiation. We aim to understand how the variable initial conditions imposed by the range of possible stellar environments and protoplanetary disk properties regulated the formation and assemblage of disk solids into asteroidal and planetary bodies comprising our solar system.
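As background on how short-lived radioisotope chronometers translate isotopic measurements into relative ages (a textbook relation, not a result of the project): for the ^{26}Al–^{26}Mg system, with half-life t_{1/2} ≈ 0.72 Myr, two objects differ in formation time by

\Delta t = \frac{1}{\lambda} \ln\!\left[ \frac{(^{26}\mathrm{Al}/^{27}\mathrm{Al})_1}{(^{26}\mathrm{Al}/^{27}\mathrm{Al})_2} \right], \qquad \lambda = \frac{\ln 2}{t_{1/2}} \approx 0.97\ \mathrm{Myr}^{-1},

so a factor-of-two difference in the inferred initial ^{26}Al/^{27}Al ratio corresponds to roughly one half-life, about 0.7 Myr.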
Max ERC Funding
1 910 889 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym STERCP
Project Synchronisation to enhance reliability of climate predictions
Researcher (PI) Noel Sebastian Keenlyside
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Consolidator Grant (CoG), PE10, ERC-2014-CoG
Summary Climate prediction is the next frontier in climate research. Prediction of climate on timescales from a season to a decade has shown progress, but beyond the ocean, skill remains low. And while the historical evolution of climate at global scales can be reasonably simulated, agreement at a regional level is limited and large uncertainties exist in future climate change. These large uncertainties pose a major challenge to those providing climate services and informing policy makers.
This proposal aims to investigate the potential of an innovative technique to reduce systematic model error, and hence to improve climate prediction skill and reduce uncertainties in future climate projections. The current practice to account for systematic model error, as for example adopted by the Intergovernmental Panel on Climate Change, is to perform simulations with ensembles of different models. This leads to more reliable predictions and to a better representation of climate. Instead of running the models independently, we propose to connect the different models in such a manner that they synchronise and their errors compensate, thus leading to a model superior to any of the individual models – a super model.
The concept stems from theoretical nonlinear dynamics and relies on advanced machine learning algorithms. Its application to climate modelling has so far been rudimentary. Nevertheless, our initial results show that it holds great promise for improving climate prediction. To achieve even greater gains, we will extend the approach to allow greater connectivity among multiple complex climate models and so create a true super climate model. We will assess the approach’s potential to enhance seasonal-to-decadal prediction, focusing on the Tropical Pacific and North Atlantic, and to reduce uncertainties in climate projections. Importantly, this work will improve our understanding of climate, as well as of how systematic model errors affect prediction skill and contribute to climate change uncertainties.
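Schematically, and in the general terms of the supermodelling literature rather than the project's specific design, the interconnection can be written as a set of member models nudged towards one another through trainable coupling coefficients, with the super-model trajectory taken from the synchronised ensemble:

\frac{d\mathbf{x}_i}{dt} = \mathbf{f}_i(\mathbf{x}_i) + \sum_{j \neq i} C_{ij}\,(\mathbf{x}_j - \mathbf{x}_i), \qquad \mathbf{x}_s = \frac{1}{M} \sum_{i=1}^{M} \mathbf{x}_i,

where the coefficients C_{ij} \ge 0 are learned from observations so that the coupled models synchronise and their individual errors partially cancel in \mathbf{x}_s.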
Max ERC Funding
1 999 389 €
Duration
Start date: 2015-09-01, End date: 2021-08-31
Project acronym TUVOLU
Project Tundra biogenic volatile emissions in the 21st century
Researcher (PI) Riikka Tiivi Mariisa Rinnan
Host Institution (HI) KOBENHAVNS UNIVERSITET
Call Details Consolidator Grant (CoG), PE10, ERC-2017-COG
Summary Biogenic volatile organic compounds (BVOCs) influence atmospheric oxidation, causing a climate feedback that is thought to be especially significant in remote areas with low anthropogenic emissions, such as the Arctic. Still, we do not understand the dynamics and impact of the climatic and biotic drivers of BVOC emissions in arctic and alpine tundra, which are highly temperature-sensitive BVOC sources.
TUVOLU will redefine tundra BVOC emission estimates to account for rapid and dramatic climate warming, accompanied by the effects of vegetation change, permafrost thaw, insect outbreaks and herbivory, using multidisciplinary, established and novel methodology.
We will quantify the relationships between leaf and canopy temperatures and BVOC emissions to improve BVOC emission model predictions of emission rates in low-statured tundra vegetation, which heats up efficiently. We will experimentally determine the contribution of induced BVOC emissions from insect herbivory in the warming Arctic by field manipulation experiments addressing basal herbivory and insect outbreaks, and by stable isotope labelling to identify the sources of the induced emissions. Complementary laboratory assessments will determine whether permafrost thaw leads to significant BVOC emissions from thawing processes and newly available soil processes, or whether the released BVOCs are largely taken up by soil microbes. We will also use a global network of existing climate warming experiments in alpine tundra to assess how BVOC emissions from tundra vegetation worldwide respond to climate change.
Measurement data will help develop and parameterize BVOC emission models to produce holistic, enhanced predictions of global tundra emissions. Finally, modelling will be used to estimate the impact of emissions on tropospheric ozone concentrations and secondary organic aerosol levels, producing the first assessment of arctic BVOC-mediated feedbacks on regional air quality and climate.
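As background for the temperature–emission relationship targeted by the measurements (a commonly used Guenther-type parameterisation, not the project's own model), emissions of many BVOCs are described by an exponential temperature response around a standard condition:

E(T) = E_s \exp\bigl[\beta\,(T - T_s)\bigr],

where E_s is the emission rate at the standard temperature T_s (typically 30 °C) and \beta is on the order of 0.05–0.15 K^{-1}, so even a few degrees of extra leaf or canopy warming translate into tens of percent higher emissions.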
Max ERC Funding
2 347 668 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym WILLOW
Project WIreLess LOWband communications: massive and ultra-reliable access
Researcher (PI) Petar Popovski
Host Institution (HI) AALBORG UNIVERSITET
Call Details Consolidator Grant (CoG), PE7, ERC-2014-CoG
Summary The overall objective of WILLOW is to make wireless communication a true commodity by enabling lowband communications: low-rate links for a massive number of devices and ultra-reliable connectivity. This research effort is a major endeavour in the area of wireless communications, taking a different path from the mainstream research that aims at “4G, but faster”. Lowband communication is the key to enabling new applications, such as massive sensing, ultra-reliable vehicular links and wireless cloud connectivity with a guaranteed minimal rate. The research in WILLOW is centred on two fundamental issues. The first is efficient communication with short packets, in which the data size is comparable to the size of the metadata (i.e. control information), which is not the case in broadband communication. Communication of short packets that come from a massive number of devices and/or need to meet a latency constraint requires a fundamental rethinking of the packet structure and the associated communication protocols. The second is the system architecture in which graceful rate degradation, low latency and massive access can exist simultaneously with broadband services. The principles from WILLOW will be applied to: (a) clean-slate wireless systems; (b) the reengineering of existing wireless systems. Option (b) is unique to lowband communication, which does not require high physical-layer speed but can reuse the physical layer of an existing system and redefine the metadata/data relationship to achieve massive/ultra-reliable communication. WILLOW carries high risk by conjecturing that it is possible to support an unprecedented number of connected devices and unprecedented wireless reliability levels. Considering the timeliness and relevance of the topic, the strong track record of the PI and the rich wireless research environment at Aalborg University, WILLOW is poised to make a breakthrough towards lowband communications and create the technology that will enable a plethora of new wireless usage modes.
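The short-packet regime can be quantified with the finite-blocklength normal approximation (a standard information-theoretic result, quoted here only as background): for blocklength n and target error probability \epsilon, the maximal achievable rate behaves as

R(n, \epsilon) \approx C - \sqrt{\frac{V}{n}}\, Q^{-1}(\epsilon),

where C is the channel capacity, V its dispersion and Q^{-1} the inverse Gaussian tail function. For the small n and very small \epsilon required by ultra-reliable lowband links, the correction term is substantial, which is why packet structure and metadata must be rethought rather than scaled down from broadband designs.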
Max ERC Funding
1 994 411 €
Duration
Start date: 2015-04-01, End date: 2020-03-31