Project acronym ABACUS
Project Ab-initio adiabatic-connection curves for density-functional analysis and construction
Researcher (PI) Trygve Ulf Helgaker
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods and the inexpensive but not systematically refinable Kohn-Sham method of density-functional theory (DFT). The accuracy of Kohn-Sham calculations is determined by the quality of the exchange-correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange-correlation functional is unknown—instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed: the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, guided directly by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab-initio adiabatic-connection curves, studying the exchange-correlation functional for a fixed density as the strength of the electronic interactions is scaled from zero to one. Pilot calculations have indicated the feasibility of this approach in simple cases—here, a comprehensive set of adiabatic-connection curves will be generated and utilized for calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn-Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available—for example, for understanding and modelling the behaviour of the exchange-correlation functional in electromagnetic fields.
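For background, the adiabatic connection and the Lieb variation principle referred to above are commonly written in the following standard forms (generic notation, not specific to ABACUS):

```latex
% Adiabatic connection: the exchange-correlation energy as an integral over
% the interaction strength \lambda at fixed density \rho.
\[
  E_{\mathrm{xc}}[\rho] = \int_{0}^{1} \mathcal{W}_{\lambda}[\rho]\,\mathrm{d}\lambda,
  \qquad
  \mathcal{W}_{\lambda}[\rho] = \langle \Psi_{\lambda}[\rho] | \hat{W} | \Psi_{\lambda}[\rho] \rangle - J[\rho].
\]

% Lieb variation principle: the universal density functional at coupling
% strength \lambda obtained by maximizing over external potentials v.
\[
  F_{\lambda}[\rho] = \sup_{v} \Bigl( E_{\lambda}[v] - \int \rho(\mathbf{r})\, v(\mathbf{r})\,\mathrm{d}\mathbf{r} \Bigr).
\]
```

Here $\Psi_{\lambda}[\rho]$ is the minimizing wave function at interaction strength $\lambda$ constrained to yield the density $\rho$, $\hat{W}$ is the electron-electron repulsion operator, $J[\rho]$ is the Hartree (Coulomb) energy, and $E_{\lambda}[v]$ is the ground-state energy in the external potential $v$.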
Max ERC Funding
2 017 932 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym AEROBIC
Project Assessing the Effects of Rising O2 on Biogeochemical Cycles: Integrated Laboratory Experiments and Numerical Simulations
Researcher (PI) Itay Halevy
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE10, ERC-2013-StG
Summary The rise of atmospheric O2 ~2,500 million years ago is one of the most profound transitions in Earth's history. Yet, despite its central role in shaping Earth's surface environment, the cause for the rise of O2 remains poorly understood. Tight coupling between the O2 cycle and the biogeochemical cycles of redox-active elements, such as C, Fe and S, implies radical changes in these cycles before, during and after the rise of O2. These changes, too, are incompletely understood, but have left valuable information encoded in the geological record. This information has been qualitatively interpreted, leaving many aspects of the rise of O2, including its causes and constraints on ocean chemistry before and after it, topics of ongoing research and debate. Here, I outline a research program to address this fundamental question in geochemical Earth systems evolution. The inherently interdisciplinary program uniquely integrates laboratory experiments, numerical models, geological observations, and geochemical analyses. Laboratory experiments and geological observations will constrain unknown parameters of the early biogeochemical cycles, and, in combination with field studies, will validate and refine the use of paleoenvironmental proxies. The insight gained will be used to develop detailed models of the coupled biogeochemical cycles, which will themselves be used to quantitatively understand the events surrounding the rise of O2, and to illuminate the dynamics of elemental cycles in the early oceans.
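As a purely generic illustration of what a coupled biogeochemical box model looks like (assumed toy parameters, not the models to be developed in this project), a single-reservoir O2 mass balance can be integrated as follows:

```python
# Toy one-box mass balance for atmospheric O2: dM/dt = F_source - k*M,
# integrated with forward Euler. All numbers are assumed for illustration.

def integrate_o2_box(m0, f_source, k, dt, n_steps):
    """Integrate dM/dt = f_source - k*M.

    m0        -- initial O2 reservoir size (mol)
    f_source  -- net source flux, e.g. organic-carbon burial (mol/yr)
    k         -- first-order sink constant, e.g. oxidative weathering (1/yr)
    dt        -- time step (yr)
    n_steps   -- number of steps
    """
    m = m0
    history = [m]
    for _ in range(n_steps):
        m += (f_source - k * m) * dt
        history.append(m)
    return history

# Assumed round numbers; the reservoir relaxes toward the steady state f_source/k.
traj = integrate_o2_box(m0=0.0, f_source=3.0e12, k=1.0e-6, dt=1.0e4, n_steps=500)
print(f"Final reservoir: {traj[-1]:.3e} mol (steady state: {3.0e12 / 1.0e-6:.3e} mol)")
```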
This program is expected to yield novel, quantitative insight into these important events in Earth history and to have a major impact on our understanding of early ocean chemistry and the rise of O2. An ERC Starting Grant will enable me to use the excellent experimental and computational facilities at my disposal, to access the outstanding human resources at the Weizmann Institute of Science, and to address one of the major open questions in modern geochemistry.
Max ERC Funding
1 472 690 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym AGENSI
Project A Genetic View into Past Sea Ice Variability in the Arctic
Researcher (PI) Stijn DE SCHEPPER
Host Institution (HI) NORCE NORWEGIAN RESEARCH CENTRE AS
Call Details Consolidator Grant (CoG), PE10, ERC-2018-COG
Summary Arctic sea ice decline is the most visible expression of the rapidly transforming Arctic climate. The ensuing local and global implications can be understood by studying past climate transitions, yet few methods are available to examine past Arctic sea ice cover, severely restricting our understanding of sea ice in the climate system. The decline in Arctic sea ice cover is a ‘canary in the coalmine’ for the state of our climate, and if greenhouse gas emissions remain unchecked, summer sea ice loss may pass a critical threshold that could drastically transform the Arctic. Because historical observations are limited, it is crucial to have reliable proxies for assessing natural sea ice variability, its stability and sensitivity to climate forcing on different time scales. Current proxies address aspects of sea ice variability, but are limited by a selective fossil record, preservation effects, restricted regional applicability, or their semi-quantitative nature. With such constraints on our knowledge about natural variations and drivers, major uncertainties about the future remain.
I propose to develop and apply a novel sea ice proxy that exploits genetic information stored in marine sediments, sedimentary ancient DNA (sedaDNA). This innovation uses the genetic signature of phytoplankton communities from surface waters and sea ice as it gets stored in sediments. This wealth of information has not been explored before for reconstructing sea ice conditions. Preliminary results from my cross-disciplinary team indicate that our unconventional approach can provide a detailed, qualitative account of past sea ice ecosystems and quantitative estimates of sea ice parameters. I will address fundamental questions about past Arctic sea ice variability on different timescales, information essential to provide a framework upon which to assess the ecological and socio-economic consequences of a changing Arctic. This new proxy is not limited to sea ice research and can transform the field of paleoceanography.
Max ERC Funding
2 615 858 €
Duration
Start date: 2019-08-01, End date: 2024-07-31
Project acronym ATOMICAR
Project ATOMic Insight Cavity Array Reactor
Researcher (PI) Peter Christian Kjærgaard VESBORG
Host Institution (HI) DANMARKS TEKNISKE UNIVERSITET
Call Details Starting Grant (StG), PE4, ERC-2017-STG
Summary The goal of ATOMICAR is to achieve the ultimate sensitivity limit in heterogeneous catalysis:
Quantitative measurement of chemical turnover on a single catalytic nanoparticle.
Most heterogeneous catalysis occurs on metal nanoparticles in the size range of 3-10 nm. Model studies have established that there is often a strong coupling between nanoparticle size & shape and catalytic activity. The strong structure-activity coupling renders it probable that “super-active” nanoparticles exist. However, since there is no way to measure the catalytic activity of fewer than ca. 1 million nanoparticles at a time, any super-activity will always be hidden by “ensemble smearing”, since one million nanoparticles of exactly identical size and shape cannot be made. The state of the art in catalysis benchmarking is microfabricated flow reactors with mass-spectrometric detection, but the sensitivity of this approach cannot be incrementally improved by six orders of magnitude. This calls for a new measurement paradigm in which the activity of a single nanoparticle can be benchmarked – the ultimate limit for catalytic measurement.
A tiny batch reactor is the solution, but there are three key problems: How to seal it; how to track catalytic turnover inside it; and how to see the nanoparticle inside it? Graphene solves all three problems: a microfabricated cavity with a thin SixNy bottom window, a single catalytic nanoparticle inside, and a graphene seal forms a gas-tight batch reactor, since graphene has zero gas permeability. Catalysis is then tracked as an internal pressure change via the stress & deflection of the graphene seal. Crucially, the electron-transparency of graphene and SixNy enables subsequent transmission electron microscope access with atomic resolution, so that active nanoparticles can be studied in full detail.
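To illustrate why a sealed micro-cavity brings single-particle turnover within reach of a pressure measurement, the sketch below applies the ideal gas law to an assumed cavity geometry and an assumed turnover rate; all numbers are hypothetical and not taken from the proposal.

```python
import math

# Back-of-envelope estimate (assumed numbers): pressure rise in a sealed
# micro-cavity when a single nanoparticle produces gas molecules, from the
# ideal gas law  delta_p = delta_N * k_B * T / V.

k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # temperature, K (assumed)

# Assumed cavity geometry: a cylinder ~5 um in diameter and ~1 um deep.
radius = 2.5e-6             # m
depth = 1.0e-6              # m
V = math.pi * radius**2 * depth   # cavity volume, m^3 (~2e-17 m^3)

# Assumed single nanoparticle with ~1000 active surface sites, each turning
# over once per second, run for one hour; assumes net production of gas.
sites = 1_000
turnover_per_site_per_s = 1.0
duration_s = 3600.0
delta_N = sites * turnover_per_site_per_s * duration_s   # molecules produced

delta_p = delta_N * k_B * T / V    # pressure change, Pa
print(f"cavity volume   = {V:.2e} m^3")
print(f"molecules made  = {delta_N:.2e}")
print(f"pressure change = {delta_p:.0f} Pa")
```

With these assumed values the pressure change is of order a few hundred pascals, i.e. well above typical membrane-deflection detection limits, which is the point of using such a small sealed volume.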
ATOMICAR will re-define the experimental limits of catalyst benchmarking and lift the field of basic catalysis research into the single-nanoparticle age.
Max ERC Funding
1 496 000 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym BeadsOnString
Project Beads on String Genomics: Experimental Toolbox for Unmasking Genetic / Epigenetic Variation in Genomic DNA and Chromatin
Researcher (PI) Yuval Ebenstein
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Starting Grant (StG), PE4, ERC-2013-StG
Summary Next generation sequencing (NGS) is revolutionizing all fields of biological research, but it fails to extract the full range of information associated with genetic material and is limited in its ability to resolve variations between genomes. The high degree of genome variation exhibited both on the population level and between genetically “identical” cells (even in the same organ) makes genetic and epigenetic analysis at the single-cell and single-genome level a necessity.
Chromosomes may be conceptually represented as a linear, one-dimensional barcode. However, in contrast to a traditional binary barcode approach that considers only two possible values (1 & 0), I will use colour and molecular structure to expand the variety of information represented in the barcode. Like colourful beads threaded on a string, where each bead represents a distinct type of observable, I will label each type of genomic information with a different chemical moiety, thus expanding the repertoire of information that can be simultaneously measured. A major effort in this proposal is invested in the development of unique chemistries to enable this labelling.
I specifically address three types of genomic variation: variations in genomic layout (including DNA repeats, structural and copy number variations), variations in the patterns of chemical DNA modifications (such as methylation of cytosine bases) and variations in the chromatin composition (including nucleosome and transcription factor distributions). I will use physical extension of long DNA molecules on surfaces and in nanofluidic channels to reveal this information visually in the form of a linear, fluorescent “barcode” that is read out by advanced imaging techniques. Similarly, DNA molecules will be threaded through a nanopore, where the sequential position of “bulky” molecular groups attached to the DNA may be inferred from temporal modulation of an ionic current measured across the pore.
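A minimal sketch of the nanopore read-out idea, under the simplifying assumption of a constant translocation speed (the function and numbers below are illustrative, not the project's analysis pipeline):

```python
# If a DNA molecule translocates a nanopore at a roughly constant speed, the
# times at which "bulky" labels transiently block the ionic current map
# linearly onto positions along the molecule.

def blockade_times_to_positions(event_times_s, t_entry_s, speed_bases_per_s):
    """Convert label-blockade event times to approximate positions (in bases).

    event_times_s      -- times (s) at which current dips were detected
    t_entry_s          -- time (s) at which the molecule entered the pore
    speed_bases_per_s  -- assumed constant translocation speed (bases/s)
    """
    return [(t - t_entry_s) * speed_bases_per_s for t in event_times_s]

# Example with made-up numbers: three label events on one molecule.
positions = blockade_times_to_positions(
    event_times_s=[0.012, 0.045, 0.080],
    t_entry_s=0.010,
    speed_bases_per_s=1.0e6,   # assumed; real speeds vary along the molecule
)
print(positions)  # approximate label positions, in bases from the leading end
```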
Max ERC Funding
1 627 600 €
Duration
Start date: 2013-10-01, End date: 2018-09-30
Project acronym BIVAQUM
Project Bivariational Approximations in Quantum Mechanics and Applications to Quantum Chemistry
Researcher (PI) Simen Kvaal
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE4, ERC-2014-STG
Summary The standard variational principles (VPs) are cornerstones of quantum mechanics, and one can hardly overestimate their usefulness as tools for generating approximations to the time-independent and time-dependent Schrödinger equations. The aim of the proposal is to study and apply a generalization of these, the bivariational principles (BIVPs), which arise naturally when one does not assume a priori that the system Hamiltonian is Hermitian. This unconventional approach may have transformative impact on the development of ab initio methodology, both for electronic structure and dynamics.
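As background, a standard statement of the bivariational idea (generic notation, not quoted from the proposal) treats the bra and the ket as independent variational parameters:

```latex
% Bivariational expectation-value functional: bra and ket are varied
% independently (no Hermiticity assumed for H).
\[
  \mathcal{E}[\tilde{\Psi},\Psi]
    = \frac{\langle \tilde{\Psi} \,|\, \hat{H} \,|\, \Psi \rangle}
           {\langle \tilde{\Psi} \,|\, \Psi \rangle}.
\]

% Stationarity with respect to independent variations of the bra and the ket
% recovers the right and left eigenvalue problems:
\[
  \hat{H}\,|\Psi\rangle = E\,|\Psi\rangle,
  \qquad
  \langle \tilde{\Psi}|\,\hat{H} = E\,\langle \tilde{\Psi}| .
\]
```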
The first objective is to establish the mathematical foundation for the BIVPs. This opens up a whole new axis of method development for ab initio approaches. For instance, it is a largely ignored fact that the popular traditional coupled cluster (TCC) method can be neatly formulated with the BIVPs, and TCC is both polynomially scaling with the number of electrons and size-consistent. No “variational” method enjoys these properties simultaneously; indeed, this seems to be incompatible with the standard VPs.
Armed with the BIVPs, the project aims to develop new and understand existing ab initio methods. The second objective is thus a systematic multireference coupled cluster theory (MRCC) based on the BIVPs. This is in itself a novel approach that carries large potential benefits and impact. The third and last objective is an implementation of a new coupled-cluster type method where the orbitals are bivariational parameters. This gives a size-consistent hierarchy of approximations to multiconfiguration Hartree-Fock.
The PI's broad contact with and background in scientific disciplines such as applied mathematics and nuclear physics in addition to quantum chemistry increases the feasibility of the project.
Max ERC Funding
1 499 572 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym BPT
Project BEYOND PLATE TECTONICS
Researcher (PI) Trond Helge Torsvik
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE10, ERC-2010-AdG_20100224
Summary Plate tectonics characterises the complex and dynamic evolution of the outer shell of the Earth in terms of rigid plates. These tectonic plates overlie and interact with the Earth's mantle, which is slowly convecting owing to energy released by the decay of radioactive nuclides in the Earth's interior. Even though links between mantle convection and plate tectonics are becoming more evident, notably through subsurface tomographic images, advances in mineral physics and improved absolute plate motion reference frames, there is still no generally accepted mechanism that consistently explains plate tectonics and mantle convection in one framework. We will integrate plate tectonics into mantle dynamics and develop a theory that explains plate motions quantitatively and dynamically. This requires consistent and detailed reconstructions of plate motions through time (Objective 1).
A new model of plate kinematics will be linked to the mantle with the aid of a new global reference frame based on moving hotspots and on palaeomagnetic data. The global reference frame will be corrected for true polar wander in order to develop a global plate motion reference frame with respect to the mantle back to Pangea (ca. 320 million years) and possibly Gondwana assembly (ca. 550 million years). The resulting plate reconstructions will constitute the input to subduction models that are meant to test the consistency between the reference frame and subduction histories. The final outcome will be a novel global subduction reference frame, to be used to unravel links between the surface and deep Earth (Objective 2).
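For illustration, plate reconstructions of this kind are built from finite rotations about Euler poles; the short sketch below (assumed pole, angle and site, not project data) applies one such rotation with Rodrigues' rotation formula:

```python
import numpy as np

def latlon_to_xyz(lat_deg, lon_deg):
    """Unit vector on the sphere from latitude/longitude in degrees."""
    lat, lon = np.radians([lat_deg, lon_deg])
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def xyz_to_latlon(v):
    """Latitude/longitude in degrees from a unit vector."""
    return np.degrees(np.arcsin(v[2])), np.degrees(np.arctan2(v[1], v[0]))

def rotate(point_xyz, pole_xyz, angle_deg):
    """Rotate a unit vector about an Euler pole (Rodrigues' rotation formula)."""
    k = pole_xyz / np.linalg.norm(pole_xyz)
    a = np.radians(angle_deg)
    return (point_xyz * np.cos(a)
            + np.cross(k, point_xyz) * np.sin(a)
            + k * np.dot(k, point_xyz) * (1.0 - np.cos(a)))

# Made-up numbers: reconstruct a site at (40N, 10E) with an assumed Euler pole
# at (55N, 75W) and a 20-degree finite rotation.
site = latlon_to_xyz(40.0, 10.0)
pole = latlon_to_xyz(55.0, -75.0)
print(xyz_to_latlon(rotate(site, pole, 20.0)))
```

A full reconstruction chains many such rotations (plate relative to plate, and plate relative to the chosen mantle reference frame) for each geological age.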
Max ERC Funding
2 499 010 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym C4T
Project Climate change across Cenozoic cooling steps reconstructed with clumped isotope thermometry
Researcher (PI) Anna Nele Meckler
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Starting Grant (StG), PE10, ERC-2014-STG
Summary The Earth's climate system contains a highly complex interplay of numerous components, such as atmospheric greenhouse gases, ice sheets, and ocean circulation. Due to nonlinearities and feedbacks, changes to the system can result in rapid transitions to radically different climate states. In light of rising greenhouse gas levels there is an urgent need to better understand climate at such tipping points. Reconstructions of profound climate changes in the past provide crucial insight into our climate system and help to predict future changes. However, all proxies we use to reconstruct past climate depend on assumptions that, moreover, become increasingly uncertain further back in time. A new kind of temperature proxy, the carbonate ‘clumped isotope’ thermometer, has great potential to overcome these obstacles. The proxy relies on thermodynamic principles, taking advantage of the temperature dependence of the binding strength between different isotopes of carbon and oxygen, which makes it independent of other variables. Yet, widespread application of this technique in paleoceanography is currently prevented by the large sample amounts required, which are difficult to obtain from ocean sediments. If applied to the minute carbonate shells preserved in the sediments, this proxy would allow robust reconstructions of past temperatures in the surface and deep ocean, as well as global ice volume, far back in time. Here I propose to considerably decrease the sample amount requirements of clumped isotope thermometry, building on recent successful modifications of the method and ideas for further analytical improvements. This will enable my group and me to thoroughly ground-truth the proxy for application in paleoceanography and, for the first time, apply it to past climate change across major climate transitions, where clumped isotope thermometry can immediately contribute to solving long-standing first-order questions and allow for major progress in the field.
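For context, clumped-isotope ('Δ47') calibrations are commonly expressed in a simple empirical form relating the measured clumping excess to the absolute formation temperature (the form below is a generic calibration shape, not a result of this project):

```latex
% Typical empirical clumped-isotope calibration: \Delta_{47} (the measured
% excess of 13C-18O "clumped" bonds in the carbonate-derived CO2, in permil)
% decreases with the square of the absolute formation temperature T;
% a and b are empirically fitted constants.
\[
  \Delta_{47} \;\approx\; \frac{a}{T^{2}} + b .
\]
```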
Max ERC Funding
1 877 209 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym CAPRI
Project Clouds and Precipitation Response to Anthropogenic Changes in the Natural Environment
Researcher (PI) Ilan Koren
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Starting Grant (StG), PE10, ERC-2012-StG_20111012
Summary Clouds and precipitation play a crucial role in the Earth's energy balance, global atmospheric circulation and the water cycle. Despite their importance, clouds still pose the largest uncertainty in climate research.
I propose a new approach for studying anthropogenic effects on cloud fields and rain, tackling the challenge from both scientific ends: reductionism and a systems approach. We will develop a novel research approach using observations and models interactively that will allow us to “peel apart” detailed physical processes. In parallel we will develop a systems view of cloud fields looking for Emergent Behavior rising out of the complexity, as the end result of all of the coupled processes. A better understanding of key processes in a detailed (reductionist) manner will enable us to formulate the important basic rules that control the field and to look for emergence of the overall effects.
We will merge ideas and methods from four different disciplines: remote sensing and radiative transfer, cloud physics, pattern recognition and computer vision and ideas developed in systems approach. All of this will be done against the backdrop of natural variability of meteorological systems.
The outcomes of this work will include fundamental new understanding of the coupled surface-aerosol-cloud-precipitation system. More importantly, this work will emphasize the consequences of human actions on the environment, and how we change our climate and hydrological cycle as we input pollutants and transform the Earth’s surface. This work will open new horizons in cloud research by developing novel methods and employing the bulk knowledge of pattern recognition, complexity, networking and self-organization to cloud and climate studies. We are proposing a long-term, open-ended program of study that will have scientific and societal relevance as long as human-caused influences continue, evolve and change.
Max ERC Funding
1 428 169 €
Duration
Start date: 2012-09-01, End date: 2017-08-31
Project acronym CartiLube
Project Lubricating Cartilage: exploring the relation between lubrication and gene-regulation to alleviate osteoarthritis
Researcher (PI) Jacob KLEIN
Host Institution (HI) WEIZMANN INSTITUTE OF SCIENCE
Call Details Advanced Grant (AdG), PE4, ERC-2016-ADG
Summary Can we exploit insights from the remarkably lubricated surfaces of articular cartilage, to create lubricants that may alleviate osteoarthritis (OA), the most widespread joint disease, affecting millions? These, succinctly, are the challenges of the present proposal. They are driven by our recent finding that lubrication of destabilised joints leads to changes in gene-regulation of the cartilage-embedded chondrocytes to protect against development of the disease. OA alleviation is known to arise through orthopedically suppressing shear-stresses on the cartilage, and a central premise of this project is that, by reducing friction at the articulating cartilage through suitable lubrication, we may achieve the same beneficial effect on the disease. The objectives of this project are to better understand the origins of cartilage boundary lubrication through examination of friction-reduction by its main molecular components, and exploit that understanding to create lubricants that, on intra-articular injection, will lubricate cartilage sufficiently well to achieve alleviation of OA via gene regulation. The project will examine, via both nanotribometric and macroscopic measurements, how the main molecular species implicated in cartilage lubrication, lipids, hyaluronan and lubricin, and their combinations, act together to form optimally lubricating boundary layers on model surfaces as well as on excised cartilage. Based on this, we shall develop suitable materials to lubricate cartilage in joints, using mouse models. Lubricants will further be optimized with respect to their retention in the joint and cartilage targeting, both in model studies and in vivo. The effect of the lubricants in regulating gene expression, in reducing pain and cartilage degradation, and in promoting stem-cell adhesion to the cartilage will be studied in a mouse model in which OA has been induced. Our results will have implications for treatment of a common, debilitating disease.
Max ERC Funding
2 499 944 €
Duration
Start date: 2017-09-01, End date: 2022-08-31