Project acronym 3D-nanoMorph
Project Label-free 3D morphological nanoscopy for studying sub-cellular dynamics in live cancer cells with high spatio-temporal resolution
Researcher (PI) Krishna AGARWAL
Host Institution (HI) UNIVERSITETET I TROMSOE - NORGES ARKTISKE UNIVERSITET
Call Details Starting Grant (StG), PE7, ERC-2018-STG
Summary Label-free optical nanoscopy, free from photobleaching and photochemical toxicity of fluorescence labels and yielding 3D morphological resolution of <50 nm, is the future of live cell imaging. 3D-nanoMorph breaks the diffraction barrier and shifts the paradigm in label-free nanoscopy, providing isotropic 3D resolution of <50 nm. To achieve this, 3D-nanoMorph performs non-linear inverse scattering for the first time in nanoscopy and decodes scattering between sub-cellular structures (organelles).
3D-nanoMorph innovatively devises complementary roles for the light-measurement system and the computational nanoscopy algorithm. A novel illumination system and a novel light-collection system together enable measurement of only the most relevant intensity component and create a fresh perspective on label-free measurements. A new computational nanoscopy approach employs non-linear inverse scattering. Harnessing non-linear inverse scattering for resolution enhancement in nanoscopy opens new possibilities in label-free 3D nanoscopy.
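For orientation (standard scattering theory, not text from the proposal): the non-linearity exploited here is the usual one in inverse scattering, in which the measured field depends on the unknown refractive-index contrast both explicitly and through the total field inside the specimen, as summarised by the Lippmann–Schwinger equation

$$ E(\mathbf{r}) = E_{\mathrm{inc}}(\mathbf{r}) + k_0^2 \int_V G(\mathbf{r},\mathbf{r}')\,\chi(\mathbf{r}')\,E(\mathbf{r}')\,\mathrm{d}^3 r', $$

where $E_{\mathrm{inc}}$ is the illumination, $G$ the free-space Green's function, $k_0$ the vacuum wavenumber and $\chi$ the permittivity contrast of the sub-cellular structures. Because the unknown $\chi$ multiplies the unknown total field $E$, recovering $\chi$ is a non-linear problem; linearised (Born-type) treatments discard exactly the multiple scattering between organelles that 3D-nanoMorph proposes to decode.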
I will apply 3D-nanoMorph to study organelle degradation (autophagy) in live cancer cells over extended durations with high spatial and temporal resolution, a study presently limited by the lack of high-resolution label-free 3D morphological nanoscopy. Successful 3D mapping of the nanoscale biological process of autophagy will open new avenues for cancer treatment and showcase 3D-nanoMorph for wider applications.
My cross-disciplinary expertise of 14 years, spanning inverse problems, electromagnetism, optical microscopy, integrated optics and live-cell nanoscopy, paves the way for successful implementation of 3D-nanoMorph.
Max ERC Funding
1 499 999 €
Duration
Start date: 2019-07-01, End date: 2024-06-30
Project acronym ABACUS
Project Ab-initio adiabatic-connection curves for density-functional analysis and construction
Researcher (PI) Trygve Ulf Helgaker
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods and the inexpensive but not systematically refinable Kohn–Sham method of density-functional theory (DFT). The accuracy of Kohn–Sham calculations is determined by the quality of the exchange–correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange–correlation functional is unknown—instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed: the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, directly guided by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab initio adiabatic-connection curves, studying the exchange–correlation functional for a fixed density as the electronic interactions are turned on from zero to one. Pilot calculations have indicated the feasibility of this approach in simple cases—here, a comprehensive set of adiabatic-connection curves will be generated and utilized for calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn–Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available—for example, for understanding and modelling the behaviour of the exchange–correlation functional in electromagnetic fields.
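As background for the non-specialist (standard density-functional theory, not quoted from the abstract): at fixed density $\rho$, the adiabatic connection writes the exchange–correlation energy as an integral over the coupling strength $\lambda$ that scales the electron–electron interaction from the Kohn–Sham system ($\lambda=0$) to the physical one ($\lambda=1$),

$$ E_{\mathrm{xc}}[\rho] = \int_0^1 W_\lambda[\rho]\,\mathrm{d}\lambda, \qquad W_\lambda[\rho] = \langle \Psi_\lambda[\rho]\,|\,\hat{W}\,|\,\Psi_\lambda[\rho]\rangle - J[\rho], $$

where $\Psi_\lambda[\rho]$ minimises $\langle \hat{T} + \lambda\hat{W} \rangle$ under the constraint of reproducing $\rho$, $\hat{W}$ is the electron–electron repulsion and $J[\rho]$ the Hartree energy. The curves $W_\lambda[\rho]$ are precisely what the Lieb variation principle makes accessible from accurate ab initio wave functions.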
Max ERC Funding
2 017 932 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym AGENSI
Project A Genetic View into Past Sea Ice Variability in the Arctic
Researcher (PI) Stijn DE SCHEPPER
Host Institution (HI) NORCE NORWEGIAN RESEARCH CENTRE AS
Call Details Consolidator Grant (CoG), PE10, ERC-2018-COG
Summary Arctic sea ice decline is the most visible expression of the rapidly transforming Arctic climate. The ensuing local and global implications can be understood by studying past climate transitions, yet few methods are available to examine past Arctic sea ice cover, severely restricting our understanding of sea ice in the climate system. The decline in Arctic sea ice cover is a ‘canary in the coal mine’ for the state of our climate, and if greenhouse gas emissions remain unchecked, summer sea ice loss may pass a critical threshold that could drastically transform the Arctic. Because historical observations are limited, it is crucial to have reliable proxies for assessing natural sea ice variability, its stability, and its sensitivity to climate forcing on different time scales. Current proxies address aspects of sea ice variability, but are limited by a selective fossil record, preservation effects, restricted regional applicability, or their semi-quantitative nature. With such constraints on our knowledge about natural variations and drivers, major uncertainties about the future remain.
I propose to develop and apply a novel sea ice proxy that exploits genetic information stored in marine sediments, sedimentary ancient DNA (sedaDNA). This innovation uses the genetic signature of phytoplankton communities from surface waters and sea ice as it gets stored in sediments. This wealth of information has not been explored before for reconstructing sea ice conditions. Preliminary results from my cross-disciplinary team indicate that our unconventional approach can provide a detailed, qualitative account of past sea ice ecosystems and quantitative estimates of sea ice parameters. I will address fundamental questions about past Arctic sea ice variability on different timescales, information essential to provide a framework upon which to assess the ecological and socio-economic consequences of a changing Arctic. This new proxy is not limited to sea ice research and can transform the field of paleoceanography.
Max ERC Funding
2 615 858 €
Duration
Start date: 2019-08-01, End date: 2024-07-31
Project acronym ANISOTROPIC UNIVERSE
Project The anisotropic universe -- a reality or fluke?
Researcher (PI) Hans Kristian Kamfjord Eriksen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE9, ERC-2010-StG_20091028
Summary "During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few ""blemishes"" have turned up, apparently not consistent with the standard model: While the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies considering nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints towards unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of the reports can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is about to change, as the next generation of far more powerful experiments is coming online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys, and in the future Euclid, a proposed galaxy survey satellite also funded by ESA. By combining the world's best data from both CMB and LSS measurements, in the proposed project I will attempt to settle this question: Is our universe really anisotropic? Or are these recent claims only the results of systematic errors or statistical flukes? If the claims turn out to hold against this tide of new and high-quality data, then cosmology as a whole may need to be re-written.
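To illustrate the kind of statistical-isotropy test involved, here is a minimal sketch (my own illustration using the healpy package; the toy power spectrum and the 7% dipole-modulation amplitude are arbitrary placeholders, not project results):

```python
# Minimal sketch of a hemispherical power-asymmetry test on a simulated CMB map.
# Assumptions: healpy/numpy available; C_ell and the modulation amplitude are toy values.
import numpy as np
import healpy as hp

nside, lmax = 64, 128
ell = np.arange(lmax + 1)
cl = np.zeros(lmax + 1)
cl[2:] = 1.0 / (ell[2:] * (ell[2:] + 1))          # toy scale-invariant spectrum

iso_map = hp.synfast(cl, nside, lmax=lmax)         # statistically isotropic realisation

# Impose a dipole modulation T(n) -> T(n) * (1 + A * d.n) along the z-axis.
A = 0.07
vecs = np.array(hp.pix2vec(nside, np.arange(hp.nside2npix(nside))))
mod_map = iso_map * (1.0 + A * vecs[2])

def hemispherical_power(m, axis=np.array([0.0, 0.0, 1.0])):
    """Return the map variance in the two hemispheres defined by 'axis'."""
    mask_north = vecs.T @ axis > 0
    return m[mask_north].var(), m[~mask_north].var()

print("isotropic map :", hemispherical_power(iso_map))
print("modulated map :", hemispherical_power(mod_map))
```

A real analysis would compare such asymmetry statistics against a large ensemble of isotropic simulations; scaling that machinery to the full CMB and LSS data is the substance of the project.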
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym Bits2Cosmology
Project Time-domain Gibbs sampling: From bits to inflationary gravitational waves
Researcher (PI) Hans Kristian ERIKSEN
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The detection of primordial gravitational waves created during the Big Bang ranks among the greatest potential intellectual achievements in modern science. During the last few decades, the instrumental progress necessary to achieve this has been nothing short of breathtaking, and today we are able to measure the microwave sky with better than one-in-a-million precision. However, from the latest ultra-sensitive experiments such as BICEP2 and Planck, it is clear that instrumental sensitivity alone will not be sufficient to make a robust detection of gravitational waves. Contamination in the form of astrophysical radiation from the Milky Way, for instance thermal dust and synchrotron radiation, obscures the cosmological signal by orders of magnitude. Even more critical, though, are second-order interactions between this radiation and the instrument characterization itself, which lead to a highly non-linear and complicated problem.
I propose a ground-breaking solution to this problem that allows for joint estimation of cosmological parameters, astrophysical components, and instrument specifications. The engine of this method is called Gibbs sampling, which I have already applied extremely successfully to basic CMB component separation. The new and critical step is to apply this method to the raw time-ordered observations recorded directly by the instrument, as opposed to pre-processed frequency maps. While representing a ~100-fold increase in input data volume, this step is unavoidable in order to break through the current foreground-induced systematics floor. I will apply this method to the best currently available and future data sets (WMAP, Planck, SPIDER and LiteBIRD), and thereby derive the world's tightest constraint on the amplitude of inflationary gravitational waves. Additionally, the resulting ancillary science in the form of robust cosmological parameters and astrophysical component maps will represent the state of the art in observational cosmology in years to come.
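To make the term "Gibbs sampling" concrete for non-specialists, here is a toy two-parameter sketch (my own illustration, not the project's pipeline) for data d_i = s + n_i, alternating draws from P(s | sigma^2, d) and P(sigma^2 | s, d); the project applies the same idea jointly to CMB signal, foregrounds and instrument parameters on time-ordered data:

```python
# Toy Gibbs sampler for the model d_i = s + n_i, n_i ~ N(0, sigma^2).
# Alternates conditional draws of the signal level s and the noise variance sigma^2.
import numpy as np

rng = np.random.default_rng(0)
true_s, true_sigma = 2.0, 0.5
d = true_s + true_sigma * rng.standard_normal(1000)
N = d.size

s, sigma2 = 0.0, 1.0                        # arbitrary starting point
chain = []
for _ in range(5000):
    # P(s | sigma^2, d) is Gaussian (flat prior on s)
    s = rng.normal(d.mean(), np.sqrt(sigma2 / N))
    # P(sigma^2 | s, d) is inverse-gamma (Jeffreys prior 1/sigma^2)
    ss = np.sum((d - s) ** 2)
    sigma2 = ss / (2.0 * rng.gamma(N / 2.0))
    chain.append((s, np.sqrt(sigma2)))

chain = np.array(chain)
print("posterior mean of (s, sigma):", chain[1000:].mean(axis=0))
```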
Max ERC Funding
1 999 205 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym BIVAQUM
Project Bivariational Approximations in Quantum Mechanics and Applications to Quantum Chemistry
Researcher (PI) Simen Kvaal
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE4, ERC-2014-STG
Summary The standard variational principles (VPs) are cornerstones of quantum mechanics, and one can hardly overestimate their usefulness as tools for generating approximations to the time-independent and time-dependent Schrödinger equations. The aim of the proposal is to study and apply a generalization of these, the bivariational principles (BIVPs), which arise naturally when one does not assume a priori that the system Hamiltonian is Hermitian. This unconventional approach may have a transformative impact on the development of ab initio methodology, both for electronic structure and dynamics.
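As a pointer for readers outside quantum chemistry (standard material, not quoted from the proposal): whereas the usual Rayleigh–Ritz quotient uses a single trial state, the bivariational expectation value employs independent left and right trial states,

$$ \mathcal{E}[\tilde\Psi, \Psi] = \frac{\langle \tilde\Psi \,|\, \hat H \,|\, \Psi \rangle}{\langle \tilde\Psi \,|\, \Psi \rangle}, $$

and requiring stationarity with respect to independent variations of both $\tilde\Psi$ and $\Psi$ reproduces the Schrödinger equation and its adjoint even when $\hat H$ is not assumed Hermitian. Traditional coupled-cluster theory fits this mould, with $\Psi = e^{\hat T}|\Phi\rangle$ and a differently parameterised left state.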
The first objective is to establish the mathematical foundation for the BIVPs. This opens up a whole new axis of method development for ab initio approaches. For instance, it is a largely ignored fact that the popular traditional coupled cluster (TCC) method can be neatly formulated with the BIVPs, and TCC is both polynomially scaling with the number of electrons and size-consistent. No “variational” method enjoys these properties simultaneously; indeed, this seems to be incompatible with the standard VPs.
Armed with the BIVPs, the project aims to develop new ab initio methods and to better understand existing ones. The second objective is thus a systematic multireference coupled cluster theory (MRCC) based on the BIVPs. This is in itself a novel approach that carries large potential benefits and impact. The third and last objective is an implementation of a new coupled-cluster-type method where the orbitals are bivariational parameters. This gives a size-consistent hierarchy of approximations to multiconfiguration Hartree–Fock.
The PI's broad contact with and background in scientific disciplines such as applied mathematics and nuclear physics in addition to quantum chemistry increases the feasibility of the project.
Max ERC Funding
1 499 572 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym BPT
Project BEYOND PLATE TECTONICS
Researcher (PI) Trond Helge Torsvik
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE10, ERC-2010-AdG_20100224
Summary Plate tectonics characterises the complex and dynamic evolution of the outer shell of the Earth in terms of rigid plates. These tectonic plates overlie and interact with the Earth's mantle, which is slowly convecting owing to energy released by the decay of radioactive nuclides in the Earth's interior. Even though links between mantle convection and plate tectonics are becoming more evident, notably through subsurface tomographic images, advances in mineral physics and improved absolute plate motion reference frames, there is still no generally accepted mechanism that consistently explains plate tectonics and mantle convection in one framework. We will integrate plate tectonics into mantle dynamics and develop a theory that explains plate motions quantitatively and dynamically. This requires consistent and detailed reconstructions of plate motions through time (Objective 1).
A new model of plate kinematics will be linked to the mantle with the aid of a new global reference frame based on moving hotspots and on palaeomagnetic data. The global reference frame will be corrected for true polar wander in order to develop a global plate motion reference frame with respect to the mantle back to Pangea (ca. 320 million years) and possibly Gondwana assembly (ca. 550 million years). The resulting plate reconstructions will constitute the input to subduction models that are meant to test the consistency between the reference frame and subduction histories. The final outcome will be a novel global subduction reference frame, to be used to unravel links between the surface and deep Earth (Objective 2).
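To make the kinematic machinery concrete: reconstructions of this kind are assembled from finite rotations about Euler poles. The following is a minimal, self-contained sketch of rotating a surface point (generic spherical-rotation arithmetic via Rodrigues' formula; the site, pole and angle are invented and this is not the project's reference-frame code):

```python
# Rotate a point on the unit sphere about an Euler pole by a finite angle
# (Rodrigues' rotation formula). Pole and angle are invented for illustration.
import numpy as np

def latlon_to_vec(lat_deg, lon_deg):
    lat, lon = np.radians([lat_deg, lon_deg])
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def vec_to_latlon(v):
    return np.degrees(np.arcsin(v[2])), np.degrees(np.arctan2(v[1], v[0]))

def rotate(point, pole, angle_deg):
    k, a = pole / np.linalg.norm(pole), np.radians(angle_deg)
    return (point * np.cos(a)
            + np.cross(k, point) * np.sin(a)
            + k * np.dot(k, point) * (1.0 - np.cos(a)))

site = latlon_to_vec(60.0, 10.0)                 # hypothetical present-day site
pole = latlon_to_vec(45.0, -30.0)                # hypothetical Euler pole
print(vec_to_latlon(rotate(site, pole, 25.0)))   # reconstructed position
```

A full reference frame chains many such stage rotations per plate and, as described above, corrects them for true polar wander before linking them to the mantle.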
Max ERC Funding
2 499 010 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym C4T
Project Climate change across Cenozoic cooling steps reconstructed with clumped isotope thermometry
Researcher (PI) Anna Nele Meckler
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Starting Grant (StG), PE10, ERC-2014-STG
Summary The Earth's climate system contains a highly complex interplay of numerous components, such as atmospheric greenhouse gases, ice sheets, and ocean circulation. Due to nonlinearities and feedbacks, changes to the system can result in rapid transitions to radically different climate states. In light of rising greenhouse gas levels there is an urgent need to better understand climate at such tipping points. Reconstructions of profound climate changes in the past provide crucial insight into our climate system and help to predict future changes. However, all proxies we use to reconstruct past climate depend on assumptions that are in addition increasingly uncertain back in time. A new kind of temperature proxy, the carbonate ‘clumped isotope’ thermometer, has great potential to overcome these obstacles. The proxy relies on thermodynamic principles, taking advantage of the temperature-dependence of the binding strength between different isotopes of carbon and oxygen, which makes it independent of other variables. Yet, widespread application of this technique in paleoceanography is currently prevented by the required large sample amounts, which are difficult to obtain from ocean sediments. If applied to the minute carbonate shells preserved in the sediments, this proxy would allow robust reconstructions of past temperatures in the surface and deep ocean, as well as global ice volume, far back in time. Here I propose to considerably decrease sample amount requirements of clumped isotope thermometry, building on recent successful modifications of the method and ideas for further analytical improvements. This will enable my group and me to thoroughly ground-truth the proxy for application in paleoceanography and for the first time apply it to aspects of past climate change across major climate transitions in the past, where clumped isotope thermometry can immediately contribute to solving long-standing first-order questions and allow for major progress in the field.
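The thermodynamic principle referred to above is conventionally expressed through the clumped-isotope excess Δ47 of CO2 derived from the carbonate, which in published calibrations follows a relation of the generic form (the coefficients A and B are calibration-specific and are deliberately not specified here):

$$ \Delta_{47} = \frac{A}{T^{2}} + B, $$

so that a measured Δ47 yields the carbonate formation temperature T independently of the isotopic composition of the water, which is what makes the proxy attractive for deep-time reconstructions.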
Max ERC Funding
1 877 209 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym CHROMPHYS
Project Physics of the Solar Chromosphere
Researcher (PI) Mats Per-Olof Carlsson
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE9, ERC-2011-ADG_20110209
Summary CHROMPHYS aims at a breakthrough in our understanding of the solar chromosphere by combining the development of sophisticated radiation-magnetohydrodynamic simulations with observations from the upcoming NASA SMEX mission Interface Region Imaging Spectrograph (IRIS).
The enigmatic chromosphere is the transition between the solar surface and the eruptive outer solar atmosphere. The chromosphere harbours and constrains the mass and energy loading processes that define the heating of the corona, the acceleration and the composition of the solar wind, and the energetics and triggering of solar outbursts (filament eruptions, flares, coronal mass ejections) that govern near-Earth space weather and affect mankind's technological environment.
CHROMPHYS targets the following fundamental physics questions about the chromospheric role in the mass and energy loading of the corona:
- Which types of non-thermal energy dominate in the chromosphere and beyond?
- How does the chromosphere regulate mass and energy supply to the corona and the solar wind?
- How do magnetic flux and matter rise through the chromosphere?
- How does the chromosphere affect the free magnetic energy loading that leads to solar eruptions?
CHROMPHYS proposes to answer these by producing a new, physics-based vista of the chromosphere through a three-fold effort:
- develop the techniques of high-resolution numerical MHD physics to the level needed to realistically predict and analyse small-scale chromospheric structure and dynamics,
- optimise and calibrate diverse observational diagnostics by synthesizing these in detail from the simulations, and
- obtain and analyse data from IRIS using these diagnostics complemented by data from other space missions and the best solar telescopes on the ground.
Max ERC Funding
2 487 600 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym CODE
Project Coincidence detection of proteins and lipids in regulation of cellular membrane dynamics
Researcher (PI) Harald STENMARK
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), LS3, ERC-2017-ADG
Summary Specific recruitment of different proteins to distinct intracellular membranes is fundamental in the biology of eukaryotic cells, but the molecular basis for specificity is incompletely understood. This proposal investigates the hypothesis that coincidence detection of proteins and lipids constitutes a major mechanism for specific recruitment of proteins to intracellular membranes in order to control cellular membrane dynamics. CODE will establish and validate mathematical models for coincidence detection, identify and functionally characterise novel coincidence detectors, and engineer artificial coincidence detectors as novel tools in cell biology and biotechnology.
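One simple way to picture the coincidence-detection hypothesis, offered here only as an illustration and not as the project's actual model, is an AND-gate in which membrane recruitment requires simultaneous weak binding to a protein partner and to a specific lipid; the sketch below uses generic Langmuir (hyperbolic) binding occupancies with invented dissociation constants and concentrations in arbitrary units:

```python
# Illustrative AND-gate model of coincidence detection: recruitment probability is
# the product of two independent Langmuir binding occupancies (protein and lipid).
# All dissociation constants and concentrations are invented, in arbitrary units.
def occupancy(conc, kd):
    """Fraction bound for a simple 1:1 binding equilibrium."""
    return conc / (conc + kd)

def recruitment(protein_conc, lipid_frac, kd_protein=10.0, kd_lipid=0.05):
    return occupancy(protein_conc, kd_protein) * occupancy(lipid_frac, kd_lipid)

# Either cue alone gives weak recruitment; both together give strong, specific binding.
print(recruitment(protein_conc=20.0, lipid_frac=0.001))   # lipid nearly absent
print(recruitment(protein_conc=0.1,  lipid_frac=0.10))    # protein partner scarce
print(recruitment(protein_conc=20.0, lipid_frac=0.10))    # coincidence of both cues
```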
Max ERC Funding
2 500 000 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym COMTESSA
Project Camera Observation and Modelling of 4D Tracer Dispersion in the Atmosphere
Researcher (PI) Andreas Stohl
Host Institution (HI) NORSK INSTITUTT FOR LUFTFORSKNING STIFTELSE
Call Details Advanced Grant (AdG), PE10, ERC-2014-ADG
Summary COMTESSA will push back the limits of our understanding of turbulence and plume dispersion in the atmosphere by bringing together full four-dimensional (space and time) observations of a (nearly) passive tracer (sulfur dioxide, SO2), with advanced data analysis and turbulence and dispersion modelling.
Observations will be made with six cameras sensitive to ultraviolet (UV) radiation and three cameras sensitive to infrared (IR) radiation. The UV cameras will be built specifically for this project, for which high sensitivity and fast sampling are important. The accuracy of UV and IR retrievals will be improved by using a state-of-the-art 3D radiative transfer model.
Controlled puff and plume releases of SO2 will be made from a tower and observed by all cameras, yielding multiple 2D images of SO2 integrated along the line of sight. The simultaneous observations will allow - for the first time - a tomographic reconstruction of the 3D tracer concentration distribution at high spatial (<1 m) and temporal (>10 Hz) resolution. An optical flow code will be used to determine the eddy-resolved velocity vector field of the plume. Special turbulent phenomena (e.g. plume rise) will be studied using existing SO2 sources (e.g. smelters, power plants, volcanic fumaroles).
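As an indication of what the tomographic step involves, here is a generic algebraic-reconstruction (Kaczmarz/ART) sketch on an invented toy system; it is not the project's retrieval code, and the ray-geometry matrix is a random stand-in for real camera ray paths:

```python
# Generic algebraic reconstruction technique (ART/Kaczmarz): recover a concentration
# field x from line-of-sight integrals y = A @ x, as delivered by multiple cameras.
# The geometry matrix A and the toy field are invented for illustration.
import numpy as np

rng = np.random.default_rng(1)
n_voxels, n_rays = 100, 400
A = rng.random((n_rays, n_voxels))          # stand-in for ray-path weights
x_true = rng.random(n_voxels)               # stand-in SO2 concentration field
y = A @ x_true                              # noise-free line integrals

x = np.zeros(n_voxels)
for _ in range(50):                         # Kaczmarz sweeps over all rays
    for i in range(n_rays):
        a = A[i]
        x += (y[i] - a @ x) / (a @ a) * a

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```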
Analysis of the novel campaign observations will deepen our understanding of turbulence and tracer dispersion in the atmosphere. For instance, for the first time we will be able to extensively measure the concentration probability density function (PDF) in a plume not only near the ground but also at higher altitudes; quantify relative and absolute dispersion; estimate the value of the Richardson-Obukhov constant, etc. We will also use the data to evaluate state-of-the-art LES and Lagrangian dispersion models and revise their underlying parameterizations.
COMTESSA’s vision is that the project results will lead to large improvements of tracer transport in all atmospheric models.
Max ERC Funding
2 800 000 €
Duration
Start date: 2015-11-01, End date: 2020-10-31
Project acronym Cosmoglobe
Project Cosmoglobe -- mapping the universe from the Milky Way to the Big Bang
Researcher (PI) Ingunn Kathrine WEHUS
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary In the aftermath of the high-precision Planck and BICEP2 experiments, cosmology has undergone a critical transition. Before 2014, most breakthroughs came as direct results of improved detector technology and increased noise sensitivity. After 2014, the main source of uncertainty will be due to astrophysical foregrounds, typically in the form of dust or synchrotron emission from the Milky Way. Indeed, this holds as true for the study of reionization and the cosmic dawn as it does for the hunt for inflationary gravitational waves. To break through this obscuring veil, it is of utmost importance to optimally exploit every piece of available information, merging the world's best observational data with the world's most advanced theoretical models. A first step toward this ultimate goal was recently published as the Planck 2015 Astrophysical Baseline Model, an effort that I led and conducted.
Here I propose to build Cosmoglobe, a comprehensive model of the radio, microwave and sub-mm sky, covering 100 MHz to 10 THz in both intensity and polarization, extending existing models by three orders of magnitude in frequency and a factor of five in angular resolution. I will leverage a recent algorithmic breakthrough in multi-resolution component separation to jointly analyze some of the world's best data sets, including C-BASS, COMAP, PASIPHAE, Planck, SPIDER, WMAP and many more. This will result in the best cosmological (CMB, SZ, CIB etc.) and astrophysical (thermal and spinning dust, synchrotron and free-free emission etc.) component maps published to date. I will then use this model to derive the world's strongest limits on, and potentially detect, inflationary gravitational waves using SPIDER observations; forecast, optimize and analyze observations from the leading next-generation CMB experiments, including LiteBIRD and S4; and derive the first 3D large-scale structure maps from CO intensity mapping from COMAP, potentially opening up a new window on the cosmic dawn.
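To indicate what "component separation" means in practice, here is a heavily simplified per-pixel sketch; the frequencies, spectral indices, amplitudes and noise level are placeholders, and the real Cosmoglobe analysis samples spectral parameters and calibrations jointly rather than fixing them:

```python
# Simplified per-pixel component separation: solve d(nu) = M a + n for the
# amplitudes a of CMB, synchrotron and dust, given assumed spectral shapes.
# All numbers below are placeholders for illustration only.
import numpy as np

freqs = np.array([30., 44., 70., 100., 143., 217., 353.])   # GHz, placeholder set
M = np.column_stack([np.ones_like(freqs),          # CMB (treated as flat here, schematic)
                     (freqs / 30.0) ** -3.1,       # synchrotron power law (assumed index)
                     (freqs / 353.0) ** 1.6])      # dust as a power law (crude stand-in)

rng = np.random.default_rng(2)
a_true = np.array([70.0, 25.0, 40.0])              # invented component amplitudes
d = M @ a_true + 1.0 * rng.standard_normal(freqs.size)   # mock multi-frequency pixel

a_hat, *_ = np.linalg.lstsq(M, d, rcond=None)      # least-squares amplitude estimate
print("recovered amplitudes:", a_hat)
```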
Max ERC Funding
1 999 382 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym DIME
Project Disequilibrium metamorphism of stressed lithosphere
Researcher (PI) Bjørn Jamtveit
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE10, ERC-2014-ADG
Summary Most changes in mineralogy, density, and rheology of the Earth’s lithosphere take place by metamorphism, whereby rocks evolve through interactions between minerals and fluids. These changes are coupled with a large range of geodynamic processes and they have first order effects on the global geochemical cycles of a large number of elements.
In the presence of fluids, metamorphic reactions are fast compared to tectonically induced changes in pressure and temperature. Hence, during fluid-producing metamorphism, rocks evolve through near-equilibrium states. However, much of the Earth’s lower and middle crust, and a significant fraction of the upper mantle do not contain free fluids. These parts of the lithosphere exist in a metastable state and are mechanically strong. When subject to changing temperature and pressure conditions at plate boundaries or elsewhere, these rocks do not react until exposed to externally derived fluids.
Metamorphism of such rocks consumes fluids, and takes place far from equilibrium through a complex coupling between fluid migration, chemical reactions, and deformation processes. This disequilibrium metamorphism is characterized by fast reaction rates, release of large amounts of energy in the form of heat and work, and a strong coupling to far-field tectonic stress.
Our overarching goal is to provide the first quantitative physics-based model of disequilibrium metamorphism that properly connects fluid-rock interactions at the micro- and nanometer scale to lithosphere-scale stresses. This model will include quantification of the forces required to squeeze fluids out of grain-grain contacts for geologically relevant materials (Objective 1), a new experimentally based model describing how the progress of volatilization reactions depends on tectonic stress (Objective 2), and testing of this model by analyzing the kinetics of a natural serpentinization process through the Oman Ophiolite Drilling Project (Objective 3).
Max ERC Funding
2 900 000 €
Duration
Start date: 2015-09-01, End date: 2021-08-31
Project acronym FEEC-A
Project Finite Element Exterior Calculus and Applications
Researcher (PI) Ragnar Winther
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE1, ERC-2013-ADG
Summary "The finite element method is one of the most successful techniques for designing numerical methods for systems of partial differential equations (PDEs). It is not only a methodology for developing numerical algorithms, but also a mathematical framework in which to explore their behavior. The finite element exterior calculus (FEEC) provides a new structure that produces a deeper understanding of the finite element method and its connections to the partial differential equation being approximated. The goal is to develop discretizations which are compatible with the geometric, topological, and algebraic structures which underlie well-posedness of the partial differential equation. The phrase FEEC was first used in a paper the PI wrote for Acta Numerica in 2006, together with his coworkers, D.N. Arnold and R.S. Falk. The general philosophy of FEEC has led to the design of new algorithms and software developments, also in areas beyond the direct application of the theory. The present project will be devoted to further development of the foundations of FEEC, and to direct or indirect use of FEEC in specific applications. The ambition is to set the scene for a nubmer of new research directions based on FEEC by giving ground-braking contributions to its foundation. The aim is also to use FEEC as a tool, or a guideline, to extend the foundation of numerical PDEs to a variety of problems for which this foundation does not exist. The more application oriented parts of the project includes topics like numerical methods for elasticity, its generalizations to more general models in materials science such as viscoelasticity, poroelasticity, and liquid crystals, and the applications of these models to CO2 storage and deformations of the spinal cord."
Max ERC Funding
2 059 687 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym gRESONANT
Project Resonant Nuclear Gamma Decay and the Heavy-Element Nucleosynthesis
Researcher (PI) Ann-Cecilie Larsen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE2, ERC-2014-STG
Summary THE GRAND CHALLENGE: The “Holy Grail” of nuclear astrophysics is to understand the astrophysical processes responsible for the formation of the elements. A particularly challenging part is the description of the heavy-element nucleosynthesis. The only way to build the majority of these heavy nuclides is via neutron-capture processes. Unaccounted-for nuclear structure effects may drastically change these rates.
MAIN HYPOTHESIS: Nuclear low-energy gamma-decay resonances at high excitation energies will enhance the astrophysical neutron-capture reaction rates.
NOVEL APPROACH: This proposal is, for the first time, addressing the M1 scissors resonance in deformed, neutron-rich nuclei and superheavy elements. A new experimental technique will be developed to determine the electromagnetic nature of the unexpected upbend enhancement. Further, s-process branch points for the Re-Os cosmochronology will be studied for the first time with the Oslo method.
OBJECTIVES:
1) Measure s-process branch point nuclei with the Oslo method
2) Radioactive-beam experiments for neutron-rich nuclei searching for the low-energy upbend and the M1 scissors resonance
3) Develop new experimental technique to identify the upbend’s electromagnetic nature
4) Superheavy-element experiments looking for the M1 scissors resonance
POTENTIAL IMPACT IN THE RESEARCH FIELD: This proposal will trigger a new direction of research, as there are no data on the low-energy gamma resonances for either neutron-rich or superheavy nuclei. Their presence may have profound implications for the astrophysical neutron-capture rates. Developing a new experimental technique to determine the electromagnetic character of the upbend is crucial to distinguish between two competing explanations of this phenomenon. Unknown neutron-capture cross sections will be estimated with much better precision than prior to this project, leading to a major leap forward in the field of nuclear astrophysics.
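The connection between the low-energy gamma-ray strength and the capture rates runs, in statistical-model (Hauser–Feshbach) calculations, through the gamma-ray transmission coefficient; for dipole radiation it is conventionally written (standard definition, not project-specific) as

$$ \mathcal{T}_{X1}(E_\gamma) = 2\pi\, E_\gamma^{3}\, f_{X1}(E_\gamma), $$

where $f_{X1}$ is the E1 or M1 strength function, so that an enhancement of $f$ at low $E_\gamma$ (the upbend or a scissors resonance) directly increases the calculated $(n,\gamma)$ cross sections.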
Max ERC Funding
1 443 472 €
Duration
Start date: 2015-03-01, End date: 2020-02-29
Project acronym HOPE
Project Humans On Planet Earth - Long-term impacts on biosphere dynamics
Researcher (PI) Harry John Betteley BIRKS
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary A critical question in Earth system science is: what was the impact of prehistoric people on the biosphere and climate? There is much information about human impact through clearance, agriculture, erosion, and the modification of water and nutrient budgets. Humans have greatly changed the Earth in the last 8000 years, but did humans modify the major ecological processes (e.g. assembly rules) that shape community assembly and dynamics? Did inter-relationships between processes change in response to human impact? Lyons et al. & Dietl (2016 Nature) suggest that human activities in the last 6000 years had such impacts. Dietl proposes that using past ‘natural experiments’ to predict future changes is “flawed” and “out is the use of uniformitarianism”. As using natural experiments is a common strategy and uniformitarianism is the major working concept in Earth sciences, it is imperative to test whether prehistoric human activity changed major ecological processes determining community development. To test this hypothesis, patterns in pollen-stratigraphical data for the past 11,500 years from over 2000 sites across the globe will be explored consistently using numerical techniques to discern changes in 25 ecosystem properties (richness, evenness, and diversity; turnover; rates of change; taxon co-occurrences, etc.). Patterns in these properties will be compared statistically at sites within biomes, between biomes, within continents, and between continents to test the hypotheses that prehistoric human activities changed the basic ecological processes of community assembly and that their inter-relationships changed through time. These areas provide major contrasts in human prehistory and biomes. HOPE is interdisciplinary: pollen analysis, databases, multivariate analysis, ecology, new statistical methods, numerical simulations, statistical modelling. HOPE’s impact goes beyond human effects on the biosphere and extends to the very core of Earth science’s basic conceptual framework.
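A minimal, hypothetical sketch of the kind of ecosystem properties listed above (richness, diversity, evenness) computed from a single pollen-count sample; the function name, the toy taxa, and the use of plain Python are illustrative assumptions, not the project's actual toolchain.

```python
import math

def diversity_indices(counts):
    """Richness, Shannon diversity, and Pielou evenness from raw pollen counts.

    counts: mapping from pollen taxon name to its count in one stratigraphic sample.
    """
    counts = {taxon: c for taxon, c in counts.items() if c > 0}
    total = sum(counts.values())
    richness = len(counts)                                   # number of taxa present
    shannon = -sum((c / total) * math.log(c / total) for c in counts.values())
    evenness = shannon / math.log(richness) if richness > 1 else 0.0
    return {"richness": richness, "shannon": shannon, "evenness": evenness}

# Toy example with three taxa in one sample
print(diversity_indices({"Pinus": 120, "Betula": 60, "Poaceae": 20}))
```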
Max ERC Funding
2 278 884 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym ICEMASS
Project Global Glacier Mass Continuity
Researcher (PI) Hans Andreas Max Kääb
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE10, ERC-2012-ADG_20120216
Summary For the first time in history, satellite data and the respective archive holdings are sufficient, in terms of their spatial and temporal resolution and their accuracy, to measure volume changes, velocities and changes in these velocities over time for glaciers and ice caps other than the ice sheets on a global scale.
The ICEMASS project will derive and analyse glacier thickness changes using satellite laser and radar altimetry, and satellite-derived and other digital elevation models, and convert these to a global glacier mass budget. Such a data set will enable major steps forward in glacier and Earth science, in particular: constrain the current sea-level contribution from glaciers; complete climate change patterns as reflected in glacier mass changes; quantify the contribution of glacier imbalance to river run-off; allow glacier mass loss to be separated from other components of gravity changes as detected through satellite gravimetry; and allow improved modelling of the isostatic uplift component due to current changes in glacier load.
These results will be connected to global-scale glacier dynamics, for which a global set of repeat optical and radar satellite images will be processed to measure displacements due to glacier flow and their annual to decadal-scale changes. The analysis of these data will enable several major steps forward in glacier and Earth science, in particular: progress the understanding of glacier response to climate and its changes; provide new insights into processes underlying spatio-temporal variability and instability of glacier flow on decadal scales; improve understanding of dynamic thickness change effects; allow global calving fluxes to be estimated; progress understanding of transport in glaciers and their role in landscape development; and help to better assess potentially hazardous glacier lakes.
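A minimal sketch (illustrative only, with an assumed volume-to-mass conversion density of 850 kg m-3) of the geodetic approach implied above: elevation-change rates over a glacier are integrated over its area and converted to a mass budget.

```python
def geodetic_mass_balance(dh_dt, cell_area_m2, density_kg_m3=850.0):
    """Convert gridded elevation-change rates over a glacier to a mass budget.

    dh_dt: iterable of per-cell elevation change rates in m/yr (glacier cells only).
    cell_area_m2: area of one grid cell in m^2.
    density_kg_m3: assumed density for the volume-to-mass conversion.
    Returns the mass change rate in gigatonnes per year.
    """
    volume_change_m3_per_yr = sum(dh_dt) * cell_area_m2
    mass_change_kg_per_yr = volume_change_m3_per_yr * density_kg_m3
    return mass_change_kg_per_yr / 1e12   # kg -> Gt

# Toy example: 1000 cells of 100 m x 100 m, each thinning at 0.5 m/yr
print(geodetic_mass_balance([-0.5] * 1000, cell_area_m2=1e4))
```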
Max ERC Funding
2 395 320 €
Duration
Start date: 2013-03-01, End date: 2019-02-28
Project acronym INNOSTOCH
Project INNOVATIONS IN STOCHASTIC ANALYSIS AND APPLICATIONS with emphasis on STOCHASTIC CONTROL AND INFORMATION
Researcher (PI) Bernt Karsten Øksendal
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary "For almost all kinds of dynamic systems modeling real processes in nature or society, most of the mathematical models we can formulate are - at best - inaccurate, and subject to random fluctuations and other types of ""noise"". Therefore it is important to be able to deal with such noisy models in a mathematically rigorous way. This rigorous theory is stochastic analysis. Theoretical progress in stochastic analysis will lead to new and improved applications in a wide range of fields.
The main purpose of this proposal is to establish a research environment which enhances the creation of new ideas and methods in the research of stochastic analysis and its applications. The emphasis is more on innovation, new models and challenges in the research frontiers, rather than small variations and minor improvements of already established theories and results. We will concentrate on applications in finance and biology, but the theoretical results may as well apply to several other areas.
Utilizing recent results and achievements by PI and a large group of distinguished coworkers, the natural extensions from the present knowledge is to concentrate on the mathematical theory of the interplay between stochastic analysis, stochastic control and information. More precisely, we have ambitions to make fundamental progress in the general theory of stochastic control of random systems and applications in finance and biology, and the explicit relation between the optimal performance and the amount of information available to the controller. Explicit examples of special interest include optimal control under partial or delayed information, and optimal control under inside or advanced information. A success of the present proposal will represent a substantial breakthrough, and in turn bring us a significant step forward in our attempts to understand various aspects of the world better, and it will help us to find optimal, sustainable ways to influence it."
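For concreteness (a generic textbook formulation, not copied from the proposal), the kind of controlled system and performance functional referred to above can be written as
\[
dX_t = b(t, X_t, u_t)\,dt + \sigma(t, X_t, u_t)\,dB_t, \qquad
J(u) = \mathbb{E}\Big[\int_0^T f(t, X_t, u_t)\,dt + g(X_T)\Big],
\]
where the control \(u\) is required to be adapted to a sub-filtration \(\mathcal{E}_t \subseteq \mathcal{F}_t\) representing the information available to the controller; partial, delayed or inside information then corresponds to different choices of \(\mathcal{E}_t\).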
Max ERC Funding
1 864 800 €
Duration
Start date: 2009-09-01, End date: 2014-08-31
Project acronym INSULATRONICS
Project Controlling Electric Signals with Insulating Antiferromagnets and Insulating Ferromagnets
Researcher (PI) Arne Brataas
Host Institution (HI) NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET NTNU
Call Details Advanced Grant (AdG), PE3, ERC-2014-ADG
Summary The proposal aims to facilitate a revolution of information and communication technologies by controlling electric signals with antiferromagnetic insulators and ferromagnetic insulators. We recently discovered that antiferromagnets can be active components in spintronics devices despite their lack of a macroscopic magnetic moment, and even when they are insulating.
Conventional electronics- and spintronics-based logic and memory devices, interconnects, and microwave oscillators are based on (spin-polarized) charge transport, which inherently dissipates power due to ohmic losses. The proposed research seeks to determine the extent to which “Insulatronics” has the potential to control electric and thermal signal generation, transmission, and detection in more power-efficient ways.
Insulatronics is profoundly different because there are no moving charges involved, so the power reduction is significant. We hope to establish the extent to which spin waves and coherent magnons in antiferromagnetic insulators and ferromagnetic insulators can be strongly coupled to electric and thermal currents in adjacent conductors, and to utilize this coupling to control electric signals. The coupling will be facilitated by spin-transfer torques and spin-pumping – a technique we pioneered – as well as spin-orbit torques and their reciprocal process of charge-pumping.
The core of this project focuses on the theoretical and fundamental challenges facing Insulatronics. Beyond the duration of the project, if we are successful, the use of spin signals in insulators with extremely low power dissipation may enable superior low-power technologies such as oscillators, logic devices, interconnects, non-volatile random access memories, and perhaps even quantum information processing.
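As a brief illustration of the coupling mechanism mentioned above (the standard spin-pumping expression, quoted here for context rather than taken from the proposal), a precessing magnetization m(t) in the magnetic insulator pumps a spin current into an adjacent conductor,
\[
\mathbf{I}_s \simeq \frac{\hbar}{4\pi}\, g^{\uparrow\downarrow}\, \mathbf{m} \times \frac{d\mathbf{m}}{dt},
\]
where \(g^{\uparrow\downarrow}\) is the (real part of the) spin-mixing conductance of the interface; the reciprocal effect, spin-transfer torque, transfers angular momentum from a current in the conductor back to the magnet, which is what allows electric signals to be controlled without charge flow inside the insulator.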
Max ERC Funding
2 140 503 €
Duration
Start date: 2015-12-01, End date: 2020-11-30
Project acronym ISLAS
Project Isotopic links to atmospheric water's sources
Researcher (PI) Harald SODEMANN
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Consolidator Grant (CoG), PE10, ERC-2017-COG
Summary The hydrological cycle, with its feedbacks related to water vapour and clouds, is the largest source of uncertainty in weather prediction and climate models. In particular, processes that occur on scales smaller than the model grid lead to errors, which can compensate one another, making them difficult to detect and correct for. Undetectable compensating errors critically limit the understanding of hydrological extremes, the response of the water cycle to a changing climate, and the interpretation of paleoclimate records. Stable water isotopes have a unique potential to serve as the needed constraints, as they provide measures of moisture origin and of the phase change history. We have recently spearheaded a revised view of the atmospheric water cycle, which highlights the importance of connections on a regional scale. This implies that in some areas, all relevant processes can be studied on a regional scale. The Nordic Seas are an ideal case of such a natural laboratory, with distinct evaporation events, shallow transport processes, and swift precipitation formation. Together with recent technological advances in isotope measurements and in-situ sample collection, this will allow us to acquire a new kind of observational data set that will follow the history of water vapour from source to sink. The high-resolution, high-precision isotope data will provide a combined view of established and novel natural isotopic source tracers and set new benchmarks for climate models. A unique palette of sophisticated model tools will allow us to decipher, synthesize and exploit these observations, and to identify compensating errors between water cycle processes in models. In ISLAS, my team and I will thus make unprecedented use of stable isotopes to provide the sought-after constraints for an improved understanding of the hydrological cycle in nature and in climate models, leading towards improved predictions of future climate.
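For context (standard relations in isotope hydrology, not text from the proposal), two of the quantities that make stable water isotopes useful as source tracers are the Rayleigh distillation of vapour in an air parcel and the deuterium excess:
\[
R = R_0\, f^{\,\alpha - 1}, \qquad d = \delta\mathrm{D} - 8\,\delta^{18}\mathrm{O},
\]
where \(R\) is the isotope ratio of the remaining vapour, \(f\) the fraction of vapour left after condensation, \(\alpha\) the temperature-dependent equilibrium fractionation factor, and \(d\) retains a signature of the non-equilibrium conditions (humidity, temperature) at the evaporation source.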
Max ERC Funding
1 999 054 €
Duration
Start date: 2018-08-01, End date: 2023-07-31
Project acronym LOPRE
Project Lossy Preprocessing
Researcher (PI) Saket SAURABH
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary A critical component of computational processing of data sets is the 'preprocessing' or 'compression' step, which is the computation of a succinct, sufficiently accurate representation of the given data. Preprocessing is ubiquitous and a rigorous mathematical understanding of preprocessing algorithms is crucial in order to reason about and understand the limits of preprocessing.
Unfortunately, there is no mathematical framework to analyze and objectively compare two preprocessing routines while simultaneously taking into account all three dimensions:
-- the efficiency of computing the succinct representation,
-- the space required to store this representation, and
-- the accuracy with which the original data is captured in the succinct representation.
"The overarching goal of this proposal is the development of a mathematical framework for the rigorous analysis of preprocessing algorithms."
We will achieve the goal by designing new algorithmic techniques for preprocessing, developing a framework of analysis to make qualitative comparisons between various preprocessing routines based on the criteria above, and by developing lower-bound tools required to understand the limitations of preprocessing for concrete problems.
This project will lift our understanding of algorithmic preprocessing to new heights and lead to a groundbreaking shift in the set of basic research questions attached to the study of preprocessing for specific problems. It will significantly advance the analysis of preprocessing and yield substantial technology transfer between adjacent subfields of computer science such as dynamic algorithms, streaming algorithms, property testing and graph theory.
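A minimal, hypothetical illustration of preprocessing in this sense (a classical, lossless textbook example; the framework proposed here generalizes it by also scoring accuracy): the Buss reduction for Vertex Cover shrinks an instance to a bounded number of edges in polynomial time.

```python
def buss_kernel(edges, k):
    """Classical preprocessing for Vertex Cover: given an edge list and budget k,
    return an equivalent instance with at most budget**2 edges, or report that
    no vertex cover of size <= k exists."""
    edges = {frozenset(e) for e in edges}
    budget = k
    while True:
        degree = {}
        for e in edges:
            for v in e:
                degree[v] = degree.get(v, 0) + 1
        # High-degree rule: a vertex incident to more than `budget` edges
        # must be in every vertex cover of size <= budget.
        forced = next((v for v, d in degree.items() if d > budget), None)
        if forced is None:
            break
        edges = {e for e in edges if forced not in e}
        budget -= 1
        if budget < 0:
            return None, None              # provably a "no" instance
    if len(edges) > budget * budget:       # too many edges remain for any small cover
        return None, None
    return edges, budget                   # reduced (kernelized) instance

# Example: a star with 5 leaves and budget 1 reduces to the empty instance
print(buss_kernel([(0, i) for i in range(1, 6)], k=1))
```

In the three-dimensional terms above, this routine is fast (polynomial time), its output is small (quadratic in the budget), and it is perfectly accurate; lossy preprocessing asks how much smaller or faster one can go when a controlled loss of accuracy is allowed.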
Max ERC Funding
2 000 000 €
Duration
Start date: 2019-05-01, End date: 2024-04-30
Project acronym LUSI LAB
Project Lusi: a unique natural laboratory for multidisciplinary studies of focussed fluid flow in sedimentary basins
Researcher (PI) Adriano Mazzini
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE10, ERC-2012-StG_20111012
Summary On 29 May 2006, several gas and mud eruption sites suddenly appeared along a fault in the NE of Java, Indonesia. Within weeks several villages were submerged by boiling mud. The most prominent eruption site was named Lusi. To date Lusi is still active; it has forced the evacuation of 50,000 people, and an area of more than 7 km2 is covered by mud. The social impact of the eruption and its spectacular dimensions still attract the attention of international media. Since 2006 I have completed four expeditions to Indonesia and initiated quantitative and experimental studies leading to the publication of two papers focussing on the plumbing system and the mechanisms of the Lusi eruption. However, many unanswered questions remain. What lies beneath Lusi? Is Lusi a mud volcano or part of a larger hydrothermal system? What are the mechanisms triggering the eruption? How long will the eruption last?
LUSI LAB is an ambitious project that aims to answer these questions and to perform a multidisciplinary study using Lusi as a unique natural laboratory. Due to its relatively easy accessibility, the geological setting, and the vast scale, the Lusi eruption represents an unprecedented opportunity to study and learn from an ongoing active eruptive system. The results will be crucial for understanding focused fluid flow systems in other sedimentary basins world-wide, and for unravelling issues related to geohazards and palaeoclimate aspects. The project will use multi-sensor sampling devices within the active feeder channel and a remote-controlled raft and flying device to access and sample the crater and the erupted gases. UV-gas camera imaging to measure the rate and composition of the erupted gases will be coupled with a network of seismometers to evaluate the impact that seismicity, local faulting and the neighbouring Arjuno-Welirang volcanic complex have on the long-lasting Lusi activity. This information will provide robust constraints to model the pulsating Lusi behaviour.
Max ERC Funding
1 422 420 €
Duration
Start date: 2013-01-01, End date: 2018-12-31
Project acronym MC2
Project Mixed-phase clouds and climate (MC2) – from process-level understanding to large-scale impacts
Researcher (PI) Trude STORELVMO
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE10, ERC-2017-STG
Summary The importance of mixed-phase clouds (i.e. clouds in which liquid and ice may co-exist) for weather and climate has become increasingly evident in recent years. We now know that a majority of the precipitation reaching Earth’s surface originates from mixed-phase clouds, and the way cloud phase changes under global warming has emerged as a critically important climate feedback. Atmospheric aerosols may also have affected climate via mixed-phase clouds, but the magnitude and even sign of this effect is currently unknown. Satellite observations have recently revealed that cloud phase is misrepresented in global climate models (GCMs), suggesting systematic GCM biases in precipitation formation and cloud-climate feedbacks. Such biases give us reason to doubt GCM projections of the climate response to CO2 increases, or to changing atmospheric aerosol loadings. This proposal seeks to address the above issues, through a multi-angle and multi-tool approach: (i) By conducting field measurements of cloud phase at mid- and high latitudes, we seek to identify the small-scale structure of mixed-phase clouds. (ii) Large-eddy simulations will then be employed to identify the underlying physics responsible for the observed structures, and the field measurements will provide case studies for regional cloud-resolving modelling in order to test and revise state-of-the-art cloud microphysics parameterizations. (iii) GCMs, with revised microphysics parameterizations, will be confronted with cloud phase constraints available from space. (iv) Finally, the same GCMs will be used to re-evaluate the climate impact of mixed-phase clouds in terms of their contribution to climate forcings and feedbacks. Through this synergistic combination of tools for a multi-scale study of mixed-phase clouds, the proposed research has the potential to bring the field of climate science forward, from improved process-level understanding at small scales, to better climate change predictions on the global scale.
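One simple diagnostic commonly used to confront models with satellite cloud-phase constraints of the kind mentioned above (stated here as a common convention, not as the project's specific metric) is the supercooled liquid fraction,
\[
\mathrm{SLF} = \frac{\text{liquid cloud water}}{\text{liquid cloud water} + \text{cloud ice}},
\]
evaluated as a function of temperature (e.g. along isotherms), so that model and satellite estimates of cloud phase can be compared on a common axis.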
Max ERC Funding
1 500 000 €
Duration
Start date: 2018-03-01, End date: 2023-02-28
Project acronym NANOSCOPY
Project High-speed chip-based nanoscopy to discover real-time sub-cellular dynamics
Researcher (PI) Balpreet Singh Ahluwalia
Host Institution (HI) UNIVERSITETET I TROMSOE - NORGES ARKTISKE UNIVERSITET
Call Details Starting Grant (StG), PE7, ERC-2013-StG
Summary Optical nanoscopy has given a glimpse of the impact it may have on medical care in the future. Slow imaging speed and the complexity of the current nanoscope limit its use for living cells. The imaging speed is limited by the bulk optics used in present nanoscopy. In this project, I propose a paradigm shift in the field of advanced microscopy by developing optical nanoscopy based on a photonic integrated circuit. The project will take advantage of nanotechnology to fabricate an advanced waveguide chip, while fast telecom optical devices will provide switching of light to the chip, enhancing the speed of imaging. This unconventional route will change the field of optical microscopy, as a simple chip-based system can be added to a normal microscope. In this project, I will build a waveguide-based structured-illumination microscope (W-SIM) to acquire fast images (25 Hz or better) from a living cell with an optical resolution of 50-100 nm. I will use W-SIM to discover the dynamics (opening and closing) of fenestrations (100 nm) present in the membrane of a living liver sinusoidal scavenger endothelial cell. It is believed in the hepatology community that these fenestrations open and close dynamically, but there is no scientific evidence to support this hypothesis because of the lack of suitable tools. The successful imaging of fenestration kinetics in a live cell during this project will provide new fundamental knowledge and benefit human health through improved diagnoses and drug discovery for the liver. Chip-based nanoscopy is a new research field, inherently making this a high-risk project, but the possible gains are also high. The W-SIM will be the first of its kind, and it may open a new era of simple, integrated nanoscopy. The proposed multidisciplinary project requires a near-unique expertise in the fields of laser physics, integrated optics, advanced microscopy and cell biology that I have acquired at leading research centers on three continents.
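As a rough orientation (a generic structured-illumination estimate; the waveguide-specific assumption is stated explicitly and is not a figure from the proposal): in SIM the observable frequency support extends to the sum of the detection and illumination cut-offs, giving a lateral resolution of roughly
\[
\Delta x \approx \frac{\lambda}{2\,(\mathrm{NA}_{\mathrm{det}} + \mathrm{NA}_{\mathrm{ill}})},
\]
and with waveguide illumination one may take \(\mathrm{NA}_{\mathrm{ill}}\) to be set by the effective refractive index of the guided mode, which can exceed the numerical aperture of the collection objective and thus push the resolution towards the 50-100 nm regime targeted here.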
Max ERC Funding
1 490 976 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym NCGQG
Project Noncommutative geometry and quantum groups
Researcher (PI) Sergiy Neshveyev
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE1, ERC-2012-StG_20111012
Summary "The goal of the project is to make fundamental contributions to the study of quantum groups in the operator algebraic setting. Two main directions it aims to explore are noncommutative differential geometry and boundary theory of quantum random walks.
The idea behind noncommutative geometry is to bring geometric insight to the study of noncommutative algebras and to analyze spaces which are beyond the reach via classical means. It has been particularly successful in the latter, for example, in the study of the spaces of leaves of foliations. Quantum groups supply plenty of examples of noncommutative algebras, but the question how they fit into noncommutative geometry remains complicated. A successful union of these two areas is important for testing ideas of noncommutative geometry and for its development in new directions. One of the main goals of the project is to use the momentum created by our recent work in the area in order to further expand the boundaries of our understanding. Specifically, we are going to study such problems as the local index formula, equivariance of Dirac operators with respect to the dual group action (with an eye towards the Baum-Connes conjecture for discrete quantum groups), construction of Dirac operators on quantum homogeneous spaces, structure of quantized C*-algebras of continuous functions, computation of dual cohomology of compact quantum groups.
The boundary theory of quantum random walks was created around ten years ago. In the recent years there has been a lot of progress on the “measure-theoretic” side of the theory, while the questions largely remain open on the “topological” side. A significant progress in this area can have a great influence on understanding of quantum groups, construction of new examples and development of quantum probability. The main problems we are going to study are boundary convergence of quantum random walks and computation of Martin boundaries."
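For readers outside the field (a standard definition given for context, not proposal text), the basic object of noncommutative differential geometry referred to above is a spectral triple \((A, H, D)\): a unital *-algebra \(A\) represented by bounded operators on a Hilbert space \(H\), together with a self-adjoint operator \(D\) on \(H\) with compact resolvent such that the commutator \([D, a]\) extends to a bounded operator for every \(a \in A\). Constructing such Dirac-type operators \(D\) on quantum groups and their homogeneous spaces is one of the goals listed above.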
Max ERC Funding
1 144 930 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym PaPaAlg
Project Pareto-Optimal Parameterized Algorithms
Researcher (PI) Daniel LOKSHTANOV
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Starting Grant (StG), PE6, ERC-2016-STG
Summary In this project we revise the foundations of parameterized complexity, a modern multi-variate approach to algorithm design. The underlying question of every algorithmic paradigm is "what is the best algorithm?" When the running time of algorithms is measured in terms of only one variable, it is easy to compare which one is the fastest. However, when the running time depends on more than one variable, as is the case for parameterized complexity:
**It is not clear what a "fastest possible algorithm" really means.**
The previous formalizations of what a fastest possible parameterized algorithm means are one-dimensional, contrary to the core philosophy of parameterized complexity. These one-dimensional approaches to a multi-dimensional algorithmic paradigm unavoidably miss the most efficient algorithms, and ultimately fail to solve instances that we could have solved.
We propose the first truly multi-dimensional framework for comparing the running times of parameterized algorithms. Our new definitions are based on the notion of Pareto-optimality from economics. The new approach encompasses all existing paradigms for comparing parameterized algorithms, opens up a whole new world of research directions in parameterized complexity, and reveals new fundamental questions about parameterized problems that were considered well-understood.
In this project we will develop powerful algorithmic and complexity theoretic tools to answer these research questions. The successful completion of this project will take parameterized complexity far beyond the state of the art, make parameterized algorithms more relevant for practical applications, and significantly advance adjacent subfields of theoretical computer science and mathematics.
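A minimal, hypothetical sketch of the Pareto-optimality notion invoked above, applied to toy running-time summaries; encoding a bound as a pair (exponential-growth coefficient, polynomial degree) is an illustrative simplification, not the project's actual definition.

```python
def pareto_front(bounds):
    """Keep the bounds not dominated by any other bound.

    bounds: list of pairs (a, c), read as a running time of roughly 2**(a*k) * n**c.
    A bound dominates another if it is no worse in both coordinates and strictly
    better in at least one.
    """
    front = []
    for i, (a1, c1) in enumerate(bounds):
        dominated = any(
            a2 <= a1 and c2 <= c1 and (a2 < a1 or c2 < c1)
            for j, (a2, c2) in enumerate(bounds) if j != i
        )
        if not dominated:
            front.append((a1, c1))
    return front

# Example: (1.2, 1) is dominated by (1.0, 1); the remaining two are incomparable.
print(pareto_front([(1.0, 1), (0.5, 3), (1.2, 1)]))   # -> [(1.0, 1), (0.5, 3)]
```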
Max ERC Funding
1 499 557 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym PARAPPROX
Project Parameterized Approximation
Researcher (PI) Saket Saurabh
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary "The main goal of this project is to lay the foundations of a ``non-polynomial time theory of approximation"" -- the Parameterized Approximation for NP-hard optimization problems. A combination that will use the salient features of Approximation Algorithms and
Parameterized Complexity. In the former, one relaxes the requirement of finding an optimum solution. In the latter, one relaxes the requirement of finishing in polynomial time by restricting the
combinatorial explosion in the running time to a parameter that for reasonable inputs is much smaller than the input size. This project will explore the following fundamental question:
Approximation Algorithms + Parameterized Complexity=?
New techniques will be developed that will simultaneously utilize the notions of relaxed time complexity and accuracy and thereby make problems for which both these approaches have failed independently, tractable. It is however conceivable that for some problems even this combined approach may not succeed. But in those situations we will glean valuable insight into the reasons for failure. In parallel to algorithmic studies, an intractability theory will be
developed which will provide the theoretical framework to specify the extent to which this approach might work. Thus, on one hand the project will give rise to algorithms that will have impact beyond the boundaries of computer science and on the other hand it will lead to a complexity theory that will go beyond the established notions of intractability. Both these aspects of my project are groundbreaking -- the new theory will transcend our current ideas of
efficient approximation and thereby raise the state of the art to a new level."
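To fix terminology (a standard definition given for context rather than quoted from the proposal): for a minimization problem, a parameterized approximation algorithm with ratio \(\rho\) is an algorithm that, on an instance \(x\) with parameter \(k\), runs in time
\[
f(k)\cdot |x|^{O(1)}
\]
and outputs a solution of cost at most \(\rho \cdot \mathrm{OPT}(x)\), where \(\rho\) may itself depend on \(k\). The project asks for which problems such algorithms exist when neither polynomial-time \(\rho\)-approximation nor exact FPT algorithms are known or possible.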
Max ERC Funding
1 690 000 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym PI3K-III COMPLEX
Project The PI3K-III complex: Function in cell regulation and tumour suppression
Researcher (PI) Harald Alfred Stenmark
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), LS3, ERC-2008-AdG
Summary Phosphoinositides (PIs), phosphorylated derivatives of phosphatidylinositol (PtdIns), control cellular functions through recruitment of cytosolic proteins to specific membranes. Among the kinases involved in PI generation, the PI3K-III complex, which catalyzes conversion of PtdIns into PtdIns 3-phosphate (PI3P), is of great interest for several reasons. Firstly, it is required for three topologically related membrane involution processes - the biogenesis of multivesicular endosomes, autophagy, and cytokinesis. Secondly, through its catalytic product this protein complex mediates anti-apoptotic and antiproliferative signalling. Thirdly, several subunits of the PI3K-III complex are known tumour suppressors, making the PI3K-III complex a possible target for cancer therapy and diagnostics. This proposal aims to undertake a systematic analysis of the PI3K-III complex and its functions, and the following key questions will be addressed: How is the PI3K-III complex recruited to specific membranes? How does it control membrane involution and signal transduction? By which mechanisms do subunits of this protein complex serve as tumour suppressors? The project will be divided into seven subprojects, which include (1) characterization of the PI3K-III complex, (2) detection of the PI3K-III product PI3P in cells and tissues, (3) the function of the PI3K-III complex in downregulation of growth factor receptors, (4) the function of the PI3K-III complex in autophagy, (5) the function of the PI3K-III complex in cytokinesis, (6) the function of the PI3K-III complex in cell signalling, and (7) dissecting the tumour suppressor activities of the PI3K-III complex. The analyses will range from protein biochemistry to development of novel imaging probes, siRNA screens for novel PI3P effectors, functional characterization of PI3K-III subunits and PI3P effectors in cell culture models, and tumour suppressor analyses in novel Drosophila models.
Max ERC Funding
2 272 000 €
Duration
Start date: 2010-01-01, End date: 2014-12-31
Project acronym PREPROCESSING
Project RIGOROUS THEORY OF PREPROCESSING
Researcher (PI) Fedor Fomin
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Advanced Grant (AdG), PE6, ERC-2010-AdG_20100224
Summary The main research goal of this project is the quest for a rigorous mathematical theory explaining the power and failure of heuristics. The incapability of current computational models to explain the success of heuristic algorithms in practical computing has been the subject of wide discussion for more than four decades. Within this project we expect a significant breakthrough in the study of a large family of heuristics: preprocessing (data reduction or kernelization). Preprocessing is a reduction of the problem to a simpler one, and this is the type of algorithm used in almost every application.
As the key to novel and groundbreaking results, the proposed project aims to develop a new theory of polynomial-time compressibility. Understanding the origin of compressibility will serve to build more powerful heuristic algorithms, as well as to explain the behaviour of preprocessing.
The ubiquity of preprocessing makes the theory of compressibility extremely important.
The new theory will be able to transfer the ideas of efficient computation beyond the established borders.
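For context (the standard formal notion underlying the discussion above, not new text from the proposal), a kernelization for a parameterized problem \(\Pi\) is a polynomial-time algorithm that maps an instance \((x, k)\) to an instance \((x', k')\) such that
\[
(x, k) \in \Pi \iff (x', k') \in \Pi \quad \text{and} \quad |x'| + k' \le g(k)
\]
for some computable function \(g\). The size bound \(g\) measures the power of the preprocessing, and lower bounds on \(g\) are the kind of rigorous statements about the limits of preprocessing sought here.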
Max ERC Funding
2 227 051 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym QGP tomography
Project A novel Quark-Gluon Plasma tomography tool: from jet quenching to exploring the extreme medium properties
Researcher (PI) Magdalena DJORDJEVIC
Host Institution (HI) INSTITUT ZA FIZIKU
Call Details Consolidator Grant (CoG), PE2, ERC-2016-COG
Summary Quark-Gluon Plasma (QGP) is a primordial state of matter, which consists of interacting free quarks and gluons. QGP likely existed immediately after the Big-Bang, and this extreme form of matter is today created in Little Bangs, which are ultra-relativistic collisions of heavy nuclei at the LHC and RHIC experiments. Based on deconfinement ideas, a gas-like behaviour of QGP was anticipated. Unexpectedly, predictions of relativistic hydrodynamics - applicable to low momentum hadron data - indicated that QGP behaves as a nearly perfect fluid, thus bringing exciting connections between the hottest (QGP) and the coldest (perfect Fermi gas) matter on Earth. However, predictions of hydrodynamical simulations are often weakly sensitive to changes of the bulk QGP parameters. In particular, even a large increase of viscosity not far from the phase transition does not notably change the low momentum predictions; in addition, the origin of the surprisingly low viscosity remains unclear. To understand the QGP properties, and to challenge the perfect fluid paradigm, we will develop a novel precision tomographic tool based on: i) a state-of-the-art energy-loss model, with no free parameters, of high-momentum parton interactions with the evolving QGP, ii) simulations of QGP evolution, in which the medium parameters will be systematically varied, and the resulting temperature profiles used as inputs for the energy loss model. In a substantially novel approach, this will allow the data on rare high-momentum particles to be used to constrain the properties of the bulk medium. We will use this tool to: i) test our “soft-to-hard” medium hypothesis, i.e. whether the bulk behaves as a nearly perfect fluid near the critical temperature Tc, and as a weakly coupled system at higher temperatures, ii) map the “soft-to-hard” boundary for QGP, iii) understand the origin of the low viscosity near Tc, and iv) test whether QGP is formed in small (p+p or p(d)+A) systems.
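The tomographic idea, that high-momentum probes carry an imprint of the bulk temperature profile, can be illustrated with a deliberately simplified toy calculation (my sketch, not the project's energy-loss model): an ideal Bjorken cooling profile T(tau) = T0 (tau0/tau)^(1/3) is fed into a schematic BDMPS-like path integral for the fractional energy loss, so different assumed bulk parameters give different high-momentum suppression. All parameter values below are illustrative.

```python
# Toy illustration: a Bjorken-type temperature profile used as input to a schematic
# estimate of fractional energy loss, dE/E ~ kappa * int dtau tau * T(tau)^3.

import numpy as np

def bjorken_T(tau, T0, tau0=0.6):
    """Ideal Bjorken cooling, T ~ tau^(-1/3); T0 in GeV, tau in fm/c (illustrative values)."""
    return T0 * (tau0 / tau) ** (1.0 / 3.0)

def fractional_energy_loss(T0, tau0=0.6, tau_f=6.0, kappa=1.5, n=2000):
    """Schematic path integral; kappa bundles coupling/geometry factors (assumed)."""
    tau = np.linspace(tau0, tau_f, n)
    integrand = tau * bjorken_T(tau, T0, tau0) ** 3
    return kappa * 0.5 * np.sum((integrand[1:] + integrand[:-1]) * np.diff(tau))  # trapezoid rule

if __name__ == "__main__":
    for T0 in (0.35, 0.40, 0.45):   # assumed initial temperatures in GeV
        print(f"T0 = {T0:.2f} GeV  ->  Delta E / E ~ {fractional_energy_loss(T0):.3f}")
```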
Max ERC Funding
1 356 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym sCENT
Project Cryptophane-Enhanced Trace Gas Spectroscopy for On-Chip Methane Detection
Researcher (PI) Jana JAGERSKA
Host Institution (HI) UNIVERSITETET I TROMSOE - NORGES ARKTISKE UNIVERSITET
Call Details Starting Grant (StG), PE7, ERC-2017-STG
Summary Sensitivity of on-chip gas sensors is still at least 2-3 orders of magnitude lower than what is needed for applications in atmospheric monitoring and climate research. For optical sensors, this comes as a natural consequence of miniaturization: sensitivity scales with interaction length, which is directly related to instrument size. The aim of this project is to explore a new concept of combined chemical and spectroscopic detection for on-chip sensing of methane, the principal component of natural gas and a potent climate forcer.
The sought-after sensitivity will be achieved by pre-concentrating gas molecules directly on a chip surface using cryptophanes, and subsequently detecting them using slow-light waveguides and mid-infrared laser absorption spectroscopy. Cryptophanes are macromolecular structures that can bind and thus pre-concentrate different small molecules, including methane. Spectroscopic detection of methane in a cryptophane host is an absolute novelty, and, if successful, it will not only contribute to unprecedented sensitivity enhancement, but will also address fundamental questions about the dynamics of small molecules upon encapsulation. The actual gas sensing will be realized using evanescent field interaction in photonic crystal waveguides, which exhibit both large evanescent field confinement and long effective interaction pathlengths due to the slow-light effect. The waveguide design alone is expected to improve the per-length sensitivity up to 10 times, while another 10 to 100-fold sensitivity enhancement is expected from the pre-concentration.
The targeted detection limit of 10 ppb will revolutionize current methods of atmospheric monitoring, enabling large-scale networks of integrated sensors for better quantification of global methane emissions. Beyond that, this method can be extended to the detection of other gases, e.g. CO2 and different volatile organic compounds with equally relevant applications in the medical domain.
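As a back-of-the-envelope illustration of how the two enhancement factors described above combine multiplicatively in a Beer-Lambert picture (my sketch with assumed numbers, not design figures for the project):

```python
# Back-of-the-envelope combination of the quoted enhancement factors (illustrative only):
# in a Beer-Lambert picture, A = alpha * C * L_eff, slow light stretches the effective
# path length and cryptophane pre-concentration raises the local methane concentration,
# so the two gains multiply.

baseline_limit_ppb = 10_000.0   # assumed detection limit of a conventional on-chip sensor (10 ppm)
slow_light_gain = 10.0          # per-length sensitivity gain expected from the waveguide design
preconcentration = 100.0        # upper end of the quoted 10 to 100-fold pre-concentration

combined_gain = slow_light_gain * preconcentration
enhanced_limit_ppb = baseline_limit_ppb / combined_gain

print(f"combined enhancement: {combined_gain:.0f}x")
print(f"projected detection limit: {enhanced_limit_ppb:.0f} ppb (cf. the 10 ppb target)")
```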
Max ERC Funding
1 499 749 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym SNOWISO
Project Signals from the Surface Snow: Post-Depositional Processes Controlling the Ice Core Isotopic Fingerprint
Researcher (PI) Hans Christian Steen-Larsen
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Starting Grant (StG), PE10, ERC-2017-STG
Summary For the past 50 years, our use of ice core records as climate archives has relied on the fundamental assumption that the isotopic composition of precipitation deposited on the ice sheet surface determines the ice core water isotopic composition. Since the isotopic composition in precipitation is assumed to be governed by the state of the climate, ice core isotope records have become one of the most important proxies for reconstructing the past climate.
New simultaneous measurements of snow and water vapor isotopes have shown that the surface snow exchanges with the atmospheric water vapor isotope signal, altering the deposited precipitation isotope signal. This severely questions the standard paradigm for interpreting the ice core proxy record and gives rise to the hypothesis that the isotope record from an ice core is determined by a combination of the atmospheric water vapor isotope signal and the precipitation isotope signal.
The SNOWISO project will verify this new hypothesis by combining laboratory and field experiments with in-situ observations of snow and water vapor isotopes in Greenland and Antarctica. This will enable me to quantify and parameterize the snow-air isotope exchange and post-depositional processes. I will implement these results in an isotope-enabled Regional Climate Model with a snowpack module and benchmark the model against in-situ observations. Using the coupled snow-atmosphere isotope model, I will establish the isotopic shift due to post-depositional processes under different climate conditions. This will facilitate the use of the full suite of water isotopes to infer past changes in the climate system, specifically changes in ocean sea surface temperature and relative humidity.
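A minimal box-model sketch of the snow-vapour exchange idea (my illustration, not the project's parameterization): the surface-snow isotope value relaxes toward equilibrium with the overlying vapour, so the deposited precipitation signal is gradually overprinted. The exchange rate and fractionation values below are assumed.

```python
# Box model (assumed parameters): d(delta_snow)/dt = -k * (delta_snow - (delta_vapour + eps_eq))

k = 0.05                 # assumed exchange rate, 1/day
eps_eq = 10.0            # assumed effective equilibrium fractionation, permil
delta_vapour = -40.0     # assumed ambient vapour delta value, permil
delta_snow = -25.0       # delta value of freshly deposited precipitation, permil

dt, days = 0.1, 60.0
delta_initial = delta_snow
for _ in range(int(days / dt)):
    # Explicit Euler step of the relaxation equation above.
    delta_snow += -k * (delta_snow - (delta_vapour + eps_eq)) * dt

print(f"surface snow drifted from {delta_initial:.1f} to {delta_snow:.2f} permil "
      f"over {days:.0f} days (equilibrium at {delta_vapour + eps_eq:.1f} permil)")
```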
By establishing how the water isotope signal is recorded in the snow, the SNOWISO project will build the foundation for future integration of isotope-enabled General Circulation Models with ice core records; this opens a new frontier in climate reconstruction.
Max ERC Funding
1 497 260 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym SolarALMA
Project ALMA – The key to the Sun’s coronal heating problem.
Researcher (PI) Sven Wedemeyer
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), PE9, ERC-2015-CoG
Summary How are the outer layers of the Sun heated to temperatures in excess of a million kelvin? A large number of heating mechanisms have been proposed to explain this so-called coronal heating problem, one of the fundamental questions in contemporary solar physics. It is clear that the required energy is transported from the solar interior through the chromosphere into the outer layers but it remains open by which physical mechanisms and how the provided energy is eventually dissipated. The key to solving the chromospheric/coronal heating problem lies in accurate observations at high spatial, temporal and spectral resolution, facilitating the identification of the mechanisms responsible for the transport and dissipation of energy. This has so far been impeded by the small number of accessible diagnostics and the challenges with their interpretation. The interferometric Atacama Large Millimeter/submillimeter Array (ALMA) now offers impressive capabilities. Due to the properties of the solar radiation at millimeter wavelengths, ALMA serves as a linear thermometer, mapping narrow layers at different heights. It can measure the thermal structure and dynamics of the solar chromosphere and thus sources and sinks of atmospheric heating. Radio recombination and molecular lines (e.g., CO) potentially provide complementary kinetic and thermal diagnostics, while the polarisation of the continuum intensity and the Zeeman effect can be exploited for valuable chromospheric magnetic field measurements.
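The "linear thermometer" statement can be made concrete with the Rayleigh-Jeans relation, an excellent approximation at ALMA's millimetre wavelengths: the observed specific intensity maps linearly to a brightness temperature, T_b = c^2 I_nu / (2 k_B nu^2). A small sketch (frequencies and the test temperature are illustrative):

```python
# Rayleigh-Jeans relation at millimetre wavelengths: intensity is linear in temperature,
# which is why ALMA maps of the solar chromosphere can be read as temperature maps.

c = 2.998e8          # speed of light, m/s
k_B = 1.380649e-23   # Boltzmann constant, J/K

def brightness_temperature(I_nu, nu):
    """T_b in kelvin from specific intensity I_nu (W m^-2 Hz^-1 sr^-1) at frequency nu (Hz)."""
    return c**2 * I_nu / (2.0 * k_B * nu**2)

def rj_intensity(T, nu):
    """Inverse relation: Rayleigh-Jeans specific intensity of a source at temperature T."""
    return 2.0 * k_B * T * nu**2 / c**2

if __name__ == "__main__":
    for nu in (1.0e11, 2.3e11):            # roughly ALMA Band 3 and Band 6 frequencies
        I = rj_intensity(7000.0, nu)       # a chromosphere-like temperature of 7000 K (assumed)
        print(f"nu = {nu/1e9:5.0f} GHz: I_nu = {I:.3e} W m^-2 Hz^-1 sr^-1, "
              f"recovered T_b = {brightness_temperature(I, nu):.0f} K")
```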
I will develop the necessary diagnostic tools and use them for solar observations with ALMA. The preparation, optimisation and interpretation of these observations will be supported by state-of-the-art numerical simulations. A key objective is the identification of the dominant physical processes and their contributions to the transport and dissipation of energy. The results will be a major step towards solving the coronal heating problem with general implications for stellar activity.
Max ERC Funding
1 995 964 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym STERCP
Project Synchronisation to enhance reliability of climate predictions
Researcher (PI) Noel Sebastian Keenlyside
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Consolidator Grant (CoG), PE10, ERC-2014-CoG
Summary Climate prediction is the next frontier in climate research. Prediction of climate on timescales from a season to a decade has shown progress, but beyond the ocean skill remains low. And while the historical evolution of climate at global scales can be reasonably simulated, agreement at a regional level is limited and large uncertainties exist in future climate change. These large uncertainties pose a major challenge to those providing climate services and to informing policy makers.
This proposal aims to investigate the potential of an innovative technique to reduce model systematic error, and hence to improve climate prediction skill and reduce uncertainties in future climate projections. The current practice to account for model systematic error, as for example adopted by the Intergovernmental Panel on Climate Change, is to perform simulations with ensembles of different models. This leads to more reliable predictions, and to a better representation of climate. Instead of running models independently, we propose to connect the different models in such a manner that they synchronise and their errors compensate, thus leading to a model superior to any of the individual models – a super model.
The concept stems from theoretical nonlinear dynamics and relies on advanced machine learning algorithms. Its application to climate modelling has so far been rudimentary. Nevertheless, our initial results show it holds great promise for improving climate prediction. To achieve even greater gains, we will extend the approach to allow greater connectivity among multiple complex climate models to create a true super climate model. We will assess the approach’s potential to enhance seasonal-to-decadal prediction, focusing on the Tropical Pacific and North Atlantic, and to reduce uncertainties in climate projections. Importantly, this work will improve our understanding of climate, as well as how systematic model errors impact prediction skill and contribute to climate change uncertainties.
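The synchronisation idea can be illustrated with a toy experiment in the spirit of the supermodelling literature (my sketch, not the project's implementation): two "imperfect" Lorenz-63 systems with different parameters are nudged towards each other, and the coupled pair stays far closer together than the uncoupled pair. Parameter values and coupling strength are assumed.

```python
# Toy illustration of model synchronisation: two Lorenz-63 systems with different
# parameters, connected by simple mutual nudging terms.

import numpy as np

def lorenz_rhs(state, sigma, rho, beta):
    x, y, z = state
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

def max_gap(coupling, t_end=20.0, dt=0.001):
    """Integrate two 'imperfect' Lorenz models with mutual nudging of strength `coupling`."""
    p1 = dict(sigma=10.0, rho=28.0, beta=8.0 / 3.0)   # imperfect model 1 (assumed parameters)
    p2 = dict(sigma=10.0, rho=32.0, beta=8.0 / 3.0)   # imperfect model 2 (assumed parameters)
    s1 = np.array([1.0, 1.0, 1.0])
    s2 = np.array([1.1, 0.9, 1.2])
    gap = 0.0
    for _ in range(int(t_end / dt)):
        d1 = lorenz_rhs(s1, **p1) + coupling * (s2 - s1)   # nudge model 1 towards model 2
        d2 = lorenz_rhs(s2, **p2) + coupling * (s1 - s2)   # nudge model 2 towards model 1
        s1, s2 = s1 + dt * d1, s2 + dt * d2                # explicit Euler step
        gap = max(gap, float(np.linalg.norm(s1 - s2)))
    return gap

if __name__ == "__main__":
    print(f"largest trajectory gap, uncoupled: {max_gap(0.0):7.2f}")
    print(f"largest trajectory gap, coupled:   {max_gap(10.0):7.2f}")
```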
Max ERC Funding
1 999 389 €
Duration
Start date: 2015-09-01, End date: 2021-08-31
Project acronym STUCCOFIELDS
Project Structure and scaling in computational field theories
Researcher (PI) Snorre Harald Christiansen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE1, ERC-2011-StG_20101014
Summary "The numerical simulations that are used in science and industry require ever more sophisticated mathematics. For the partial differential equations that are used to model physical processes, qualitative properties such as conserved quantities and monotonicity are crucial for well-posedness. Mimicking them in the discretizations seems equally important to get reliable results.
This project will contribute to the interplay of geometry and numerical analysis by bridging the gap between Lie group based techniques and finite elements. The role of Lie algebra valued differential forms will be highlighted. One aim is to develop construction techniques for complexes of finite element spaces incorporating special functions adapted to singular perturbations. Another is to marry finite elements with holonomy based discretizations used in mathematical physics, such as the Lattice Gauge Theory of particle physics and the Regge calculus of general relativity. Stability and convergence of algorithms will be related to differential geometric properties, and the interface between numerical analysis and quantum field theory will be explored. The techniques will be applied to the simulation of mechanics of complex materials and light-matter interactions."
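For orientation (standard background material, not results of the project), the compatible finite element complexes alluded to above mimic the de Rham sequence, and the connection to lattice gauge theory enters through holonomies attached to mesh edges; in the lowest-order case the Whitney 1-form attached to an edge of a simplex plays the role of the discrete connection coefficient:

```latex
% Continuous de Rham sequence mimicked by compatible finite element spaces:
\begin{equation*}
  H^1(\Omega) \xrightarrow{\ \operatorname{grad}\ }
  H(\operatorname{curl},\Omega) \xrightarrow{\ \operatorname{curl}\ }
  H(\operatorname{div},\Omega) \xrightarrow{\ \operatorname{div}\ }
  L^2(\Omega).
\end{equation*}
% Lowest-order Whitney 1-form on the edge [ij] of a simplex (barycentric coordinates
% \lambda_i), and, schematically (path ordering omitted), the edge holonomy used in
% lattice gauge theory:
\begin{equation*}
  w_{ij} = \lambda_i \,\mathrm{d}\lambda_j - \lambda_j \,\mathrm{d}\lambda_i,
  \qquad
  U_e = \exp\!\Big( \mathrm{i} \int_e A \Big).
\end{equation*}
```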
Max ERC Funding
1 100 000 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym SURFSPEC
Project Theoretical multiphoton spectroscopy for understanding surfaces and interfaces
Researcher (PI) Kenneth Ruud
Host Institution (HI) UNIVERSITETET I TROMSOE - NORGES ARKTISKE UNIVERSITET
Call Details Starting Grant (StG), PE4, ERC-2011-StG_20101014
Summary The project will develop new methods for calculating nonlinear spectroscopic properties, in both the electronic and the vibrational domain. The methods will be used to study molecular interactions at interfaces, allowing for a direct comparison of experimental observations with theoretical calculations. In order to explore different ways of modeling surface and interface interactions, we will develop three different ab initio methods for calculating these nonlinear molecular properties: 1) Multiscale methods, in which the interface region is partitioned into three different layers. The part involving interface-adsorbed molecules will be described by quantum-chemical methods, the closest surrounding part of the system where specific interactions are important will be described by classical, polarizable force fields, and the long-range electrostatic interactions will be described by a polarizable continuum. 2) Periodic-boundary conditions: We will extend a response theory framework recently developed in our group to describe periodic systems using Gaussian basis sets. This will be achieved by deriving the necessary formulas and by interfacing our response framework with existing periodic-boundary codes. 3) Time-domain methods: Starting from the equation of motion for the reduced single-electron density matrix, we will propagate the electron density and the classical nuclei in time in order to model time-resolved vibrational spectroscopies.
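A minimal sketch of the time-domain idea in item 3 (my toy example, not the project's implementation): propagating the density matrix of a two-level system under the Liouville-von Neumann equation, d(rho)/dt = -i[H(t), rho]/hbar, with a weak oscillating field. Units are chosen so that hbar = 1 and all parameters are assumed.

```python
# Toy time-domain propagation of a two-level density matrix (hbar = 1),
#   d(rho)/dt = -i [H(t), rho],
# with a weak resonant field coupling the two levels.

import numpy as np

omega0 = 1.0      # level splitting (assumed)
mu = 0.05         # coupling strength / transition dipole (assumed)
omega_L = 1.0     # driving frequency, chosen resonant
dt, n_steps = 0.001, 40000

sz = np.array([[1.0, 0.0], [0.0, -1.0]], dtype=complex)
sx = np.array([[0.0, 1.0], [1.0, 0.0]], dtype=complex)

rho = np.array([[0.0, 0.0], [0.0, 1.0]], dtype=complex)   # start in the lower (ground) state

def hamiltonian(t):
    return 0.5 * omega0 * sz + mu * np.cos(omega_L * t) * sx

def rhs(r, tau):
    H = hamiltonian(tau)
    return -1j * (H @ r - r @ H)

for step in range(n_steps):
    t = step * dt
    # Classical 4th-order Runge-Kutta step.
    k1 = rhs(rho, t)
    k2 = rhs(rho + 0.5 * dt * k1, t + 0.5 * dt)
    k3 = rhs(rho + 0.5 * dt * k2, t + 0.5 * dt)
    k4 = rhs(rho + dt * k3, t + dt)
    rho = rho + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

print(f"excited-state population after t = {n_steps * dt:.0f}: {rho[0, 0].real:.3f}")
print(f"trace (should remain ~1): {np.trace(rho).real:.6f}")
```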
The novelty of the project is in its focus on nonlinear molecular properties, both electronic and vibrational, and the development of computational models for surfaces and interfaces that may help rationalize experimental observations of interface phenomena and molecular adsorption at interfaces. In the application of the methods developed, particular attention will be given to nonlinear electronic and vibrational spectroscopies that selectively probe surfaces and interfaces in a non-invasive manner, such as SFG.
Max ERC Funding
1 498 500 €
Duration
Start date: 2011-09-01, End date: 2016-08-31
Project acronym TAIAC
Project Breaking the paradigm: A new approach to understanding and controlling combustion instabilities
Researcher (PI) Nicholas Worth
Host Institution (HI) NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET NTNU
Call Details Starting Grant (StG), PE8, ERC-2015-STG
Summary It is well known that current and future low-emission combustion concepts for gas turbines are prone to thermoacoustic instabilities. These give rise to large pressure fluctuations that can drastically reduce the operable range and threaten the structural integrity of stationary gas turbines and aero engines. In the last six years, the development of laboratory-scale annular combustors and of high-performance computing based on Large Eddy Simulations (LES) has made it possible to reproduce thermoacoustic oscillations in annular combustion chambers, giving us unprecedented access to information about their nature.
Until now, it has been assumed that a complete understanding of thermoacoustic instabilities could be developed by studying the response of single axisymmetric flames. Consequently, stability issues crop up far into engine development programmes, or in service, because we lack the knowledge to predict their occurrence at the design stage. However, the ability to experimentally study thermoacoustic instabilities in laboratory-scale annular combustors using modern experimental methods has set the stage for a breakthrough in our scientific understanding capable of yielding truly predictive tools.
This proposal aims to break the existing paradigm of studying isolated flames and provide a step change in our scientific understanding by studying thermoacoustic instabilities in annular chambers where the full multiphysics of the problem are present. The technical goals of the proposal are: to develop a novel annular facility with engine-relevant boundary conditions; to use this to radically increase our understanding of the underlying physics and flame response, paving the way for the next generation of predictive methods; and to exploit this understanding to improve system stability through intelligent design. Through these goals the proposal will provide an essential bridge between academic and industrial research and strengthen European thermoacoustic expertise.
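A standard way to quantify the flame-acoustic coupling discussed above is the Rayleigh criterion: a mode is driven when the unsteady heat release is sufficiently in phase with the pressure fluctuation, i.e. when the time-averaged Rayleigh index R = (1/T) int p'(t) q'(t) dt is positive. A small sketch with synthetic signals (frequency and phases are illustrative, not project data):

```python
# Rayleigh index for synthetic pressure and heat-release fluctuations:
# R > 0 indicates acoustic driving, R < 0 indicates damping.

import numpy as np

f = 500.0                                  # assumed mode frequency, Hz
t = np.linspace(0.0, 0.1, 20001)           # 100 ms of signal, uniformly sampled
p_fluct = np.sin(2.0 * np.pi * f * t)      # normalised pressure fluctuation

for phase_deg in (0.0, 60.0, 120.0, 180.0):
    q_fluct = np.sin(2.0 * np.pi * f * t + np.radians(phase_deg))   # heat-release fluctuation
    R = np.mean(p_fluct * q_fluct)         # time-averaged p'q' product (Rayleigh index)
    label = "driving" if R > 0 else "damping"
    print(f"p'-q' phase {phase_deg:5.1f} deg: Rayleigh index = {R:+.3f}  ({label})")
```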
Max ERC Funding
1 929 103 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym TGF-MEPPA
Project Terrestrial Gamma Flashes – the Most Energetic Photon Phenomenon in our Atmosphere
Researcher (PI) Nikolai Østgaard
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Advanced Grant (AdG), PE10, ERC-2012-ADG_20120216
Summary "Only 20 years after the discovery of Cosmic Gamma-ray Bursts from the universe another completely unknown phenomenon involving gamma-rays was discovered by coincidence the BATSE instrument on the Compton Gamma-Ray Observatory. Short-lived (~1 ms) and very energetic photon emissions (>1 MeV and later: >40 MeV) were found to originate from the Earth’s atmosphere and were named Terrestrial Gamma Flashes (TGFs). These flashes are the most energetic natural photon phenomenon that is known to exist on Earth, in which also anti-matter is produced. Based on the few datasets available to date we believe that TGFs are related to electric discharges in thunderstorm systems and that electrons accelerated to relativistic energies are involved to produce bremsstrahlung of such high energies. However, it is not known how frequent TGFs are, the altitude range and the spatial extent of their source region, to what kind of thunderstorms and lightning they are related or the implications of relativistic electrons and positrons ejected into space. There is no consensus on how TGFs are produced. All these questions need to be answered before we understand how important they are and how they may affect the Earth’s electrical circuit and atmosphere.
The goal of the TGF-MEPPA project is to attack these questions by combining modelling of electron acceleration in thunderstorm electric fields, X- and gamma-ray production and propagation, and lightning development with unprecedented measurements of TGFs from three different altitudes: 350 km, 30 km and 20 km, to obtain the most comprehensive and detailed dataset needed to make significant advances in TGF research. I will also perform electric discharge experiments in the laboratory. The goal is to establish a consistent model for TGF production and to answer the question ‘How common are TGFs?’ in order to determine their implications for the Earth’s electrical circuit, atmosphere and outer space."
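One widely used ingredient in TGF modelling is the relativistic runaway electron avalanche: above a threshold field, the number of relativistic seed electrons grows roughly exponentially with distance, N(z) = N0 exp(z / lambda), where lambda is the avalanche e-folding length, and the bremsstrahlung photon yield scales with the electron number. The sketch below uses assumed values for the seed population and e-folding length purely for illustration; it is not the project's model.

```python
# Toy relativistic-runaway-avalanche growth (illustrative, assumed parameters):
# N(z) = N0 * exp(z / lam) over a high-field region of a thunderstorm.

import numpy as np

N0 = 100.0            # assumed number of relativistic seed electrons (e.g. cosmic-ray secondaries)
lam = 80.0            # assumed avalanche e-folding length in metres for the chosen field
path = np.linspace(0.0, 800.0, 9)   # metres of high-field region traversed

for z in path:
    N = N0 * np.exp(z / lam)
    print(f"z = {z:5.0f} m :  N ~ {N:12.3e} relativistic electrons")

# Ten e-folding lengths already give a multiplication of exp(10) ~ 2.2e4, which is why
# modest changes in the assumed field geometry strongly change the predicted TGF brightness.
```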
Max ERC Funding
2 492 811 €
Duration
Start date: 2013-03-01, End date: 2018-02-28
Project acronym Waterscales
Project Mathematical and computational foundations for modeling cerebral fluid flow.
Researcher (PI) Marie Elisabeth ROGNES
Host Institution (HI) SIMULA RESEARCH LABORATORY AS
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary Your brain has its own waterscape: whether you are reading or sleeping, fluid flows through the brain tissue and clears waste in the process. These physiological processes are crucial for the well-being of the brain. In spite of their importance, we understand them but little. Mathematics and numerics could play a crucial role in gaining new insight. Indeed, medical doctors express an urgent need for multiscale modeling of water transport through the brain, to overcome limitations in traditional techniques. Surprisingly little attention has been paid to the numerics of the brain's waterscape, however, and fundamental knowledge is missing.
In response, the Waterscales ambition is to establish the mathematical and computational foundations for predictively modeling fluid flow and solute transport through the brain across scales -- from the cellular to the organ level. The project aims to bridge multiscale fluid mechanics and cellular electrophysiology to pioneer new families of mathematical models that couple macroscale, mesoscale and microscale flow with glial cell dynamics. For these models, we will design numerical discretizations that preserve key properties and that allow for whole organ simulations. To evaluate predictability, we will develop a new computational platform for model adaptivity and calibration. The project is multidisciplinary combining mathematics, mechanics, scientific computing, and physiology.
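As a deliberately simple illustration of the macroscale end of such models (my sketch, not the project's formulation), here is a 1D explicit finite-difference solve of solute diffusion with a uniform linear clearance term in a slab of tissue; all coefficients are assumed.

```python
# 1D explicit finite-difference sketch of solute transport in a tissue slab:
#   dc/dt = D * d2c/dx2 - r * c,
# with an initial bolus of solute in the middle of the domain and zero-flux boundaries.

import numpy as np

L = 1.0e-2        # slab thickness, m (assumed)
D = 1.0e-10       # effective diffusivity, m^2/s (assumed)
r = 1.0e-4        # clearance rate, 1/s (assumed)
nx = 201
dx = L / (nx - 1)
dt = 0.4 * dx**2 / D          # respects the explicit stability limit dt <= dx^2 / (2 D)
t_end = 2.0 * 3600.0          # two hours of simulated time

c = np.zeros(nx)
c[nx // 2 - 5: nx // 2 + 5] = 1.0     # initial bolus (arbitrary units)

n_steps = int(t_end / dt)
for _ in range(n_steps):
    padded = np.empty(nx + 2)
    padded[1:-1] = c
    padded[0], padded[-1] = c[1], c[-2]          # mirror ghost cells: zero-flux boundaries
    lap = (padded[2:] - 2.0 * c + padded[:-2]) / dx**2
    c = c + dt * (D * lap - r * c)

print(f"steps: {n_steps}, solute remaining (relative to the initial bolus): {c.sum() / 10.0:.3f}")
```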
If successful, this project enables the first in silico studies of the brain's waterscape across scales. The new models would open up a new research field within computational neuroscience with ample opportunities for further mathematical and more applied study. The processes at hand are associated with neurodegenerative diseases e.g. dementia and with brain swelling caused by e.g. stroke. The Waterscales project will provide the field with a sorely needed, new avenue of investigation to understand these conditions, with tremendous long-term impact.
Max ERC Funding
1 500 000 €
Duration
Start date: 2017-04-01, End date: 2022-03-31