Project acronym ABACUS
Project Ab-initio adiabatic-connection curves for density-functional analysis and construction
Researcher (PI) Trygve Ulf Helgaker
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods, and the inexpensive but not systematically refinable Kohn–Sham method of density-functional theory (DFT). The accuracy of Kohn–Sham calculations is determined by the quality of the exchange–correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange–correlation functional is unknown; instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed: the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, guided directly by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab-initio adiabatic-connection curves, studying the exchange–correlation functional for a fixed density as the electronic interactions are scaled from zero to one. Pilot calculations have indicated the feasibility of this approach in simple cases; here, a comprehensive set of adiabatic-connection curves will be generated and used for the calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn–Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available, for example for understanding and modelling the behaviour of the exchange–correlation functional in electromagnetic fields.
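The adiabatic connection the abstract refers to is conventionally written as a coupling-constant integral. The abstract itself gives no formulas; the following is the standard DFT formulation, where the coupling constant λ scales the electron-electron interaction from the non-interacting Kohn–Sham system (λ = 0) to the fully interacting system (λ = 1) at fixed density:

```latex
% E_xc is recovered by integrating the adiabatic-connection integrand
% W_lambda over the coupling constant at fixed density rho.
E_{\mathrm{xc}}[\rho] \;=\; \int_{0}^{1} \mathcal{W}_{\lambda}[\rho]\,\mathrm{d}\lambda,
\qquad
\mathcal{W}_{\lambda}[\rho] \;=\; \langle \Psi_{\lambda} \,|\, \hat{W} \,|\, \Psi_{\lambda} \rangle \;-\; J[\rho],
```

where $\hat{W}$ is the electron-electron repulsion operator, $\Psi_{\lambda}$ is the wave function of the partially interacting system with density $\rho$, and $J[\rho]$ is the Hartree energy; an adiabatic-connection curve is a plot of $\mathcal{W}_{\lambda}$ versus $\lambda$.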
Max ERC Funding
2 017 932 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym AGENSI
Project A Genetic View into Past Sea Ice Variability in the Arctic
Researcher (PI) Stijn DE SCHEPPER
Host Institution (HI) NORCE NORWEGIAN RESEARCH CENTRE AS
Call Details Consolidator Grant (CoG), PE10, ERC-2018-COG
Summary Arctic sea ice decline is the clearest expression of the rapidly transforming Arctic climate. The ensuing local and global implications can be understood by studying past climate transitions, yet few methods are available to examine past Arctic sea ice cover, severely restricting our understanding of sea ice in the climate system. The decline in Arctic sea ice cover is a ‘canary in the coalmine’ for the state of our climate, and if greenhouse gas emissions remain unchecked, summer sea ice loss may pass a critical threshold that could drastically transform the Arctic. Because historical observations are limited, it is crucial to have reliable proxies for assessing natural sea ice variability, its stability, and its sensitivity to climate forcing on different time scales. Current proxies address aspects of sea ice variability, but are limited by a selective fossil record, preservation effects, restricted regional applicability, or being only semi-quantitative. With such constraints on our knowledge of natural variations and drivers, major uncertainties about the future remain.
I propose to develop and apply a novel sea ice proxy that exploits genetic information stored in marine sediments, sedimentary ancient DNA (sedaDNA). This innovation uses the genetic signature of phytoplankton communities from surface waters and sea ice as it gets stored in sediments. This wealth of information has not been explored before for reconstructing sea ice conditions. Preliminary results from my cross-disciplinary team indicate that our unconventional approach can provide a detailed, qualitative account of past sea ice ecosystems and quantitative estimates of sea ice parameters. I will address fundamental questions about past Arctic sea ice variability on different timescales, information essential to provide a framework upon which to assess the ecological and socio-economic consequences of a changing Arctic. This new proxy is not limited to sea ice research and can transform the field of paleoceanography.
Max ERC Funding
2 615 858 €
Duration
Start date: 2019-08-01, End date: 2024-07-31
Project acronym APES
Project Accuracy and precision for molecular solids
Researcher (PI) Jiri KLIMES
Host Institution (HI) UNIVERZITA KARLOVA
Call Details Starting Grant (StG), PE4, ERC-2017-STG
Summary The description of high pressure phases or polymorphism of molecular solids represents a significant scientific challenge both for experiment and theory. Theoretical methods that are currently used struggle to describe the tiny energy differences between different phases. It is the aim of this project to develop a scheme that would allow accurate and reliable predictions of the binding energies of molecular solids and of the energy differences between different phases.
To reach the required accuracy, we will combine the coupled-cluster approach, widely used for reference-quality calculations on molecules, with the random phase approximation (RPA) within periodic boundary conditions. As I have recently shown, RPA-based approaches are already among the most accurate and practically usable methods for the description of extended systems. However, reliability is not only a question of accuracy. Reliable data also need to be precise, that is, converged with respect to the numerical parameters so that they are reproducible by other researchers.
Reproducibility is already a growing concern in the field. It is likely to become a considerable issue for highly accurate methods, as the calculated energies depend more strongly on simulation parameters such as the basis-set size. Two main approaches will be explored to ensure precision. First, we will develop a so-called asymptotic correction scheme to speed up the convergence of the correlation energies with the basis-set size. Second, we will directly compare the lattice energies from periodic and finite-cluster-based calculations. Both should yield identical answers, but if and how this agreement can be reached for general systems is currently far from understood for methods such as coupled cluster. Reliable data will allow us to answer some of the open questions regarding the stability of polymorphs and high-pressure phases, such as the possible existence of high-pressure ionic phases of water and ammonia.
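To illustrate the basis-set-convergence problem, correlation energies are often extrapolated to the complete-basis-set (CBS) limit with a standard two-point 1/X³ formula. The sketch below shows this textbook scheme; it is an illustration of the convergence issue, not necessarily the asymptotic correction the project will develop:

```python
def cbs_extrapolate(e_x, e_y, x, y):
    """Two-point 1/X^3 extrapolation of correlation energies.
    Assumes E_X = E_CBS + A / X^3, where X is the cardinal number of the
    basis set (e.g. X=3 for triple-zeta, Y=4 for quadruple-zeta)."""
    return (x**3 * e_x - y**3 * e_y) / (x**3 - y**3)

# If the 1/X^3 model holds exactly, the CBS limit is recovered:
e_cbs, a = -1.0, 0.5                  # hypothetical limit and amplitude
e3 = e_cbs + a / 3**3                 # triple-zeta correlation energy
e4 = e_cbs + a / 4**3                 # quadruple-zeta correlation energy
print(cbs_extrapolate(e3, e4, 3, 4))  # recovers -1.0 up to rounding
```

The stronger parameter dependence mentioned in the abstract is visible here: the extrapolated value is a difference of two large terms, so small errors in either finite-basis energy are amplified in the result.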
Max ERC Funding
924 375 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym ATOMICAR
Project ATOMic Insight Cavity Array Reactor
Researcher (PI) Peter Christian Kjærgaard VESBORG
Host Institution (HI) DANMARKS TEKNISKE UNIVERSITET
Call Details Starting Grant (StG), PE4, ERC-2017-STG
Summary The goal of ATOMICAR is to achieve the ultimate sensitivity limit in heterogeneous catalysis:
Quantitative measurement of chemical turnover on a single catalytic nanoparticle.
Most heterogeneous catalysis occurs on metal nanoparticles in the size range of 3 nm - 10 nm. Model studies have established that there is often a strong coupling between nanoparticle size and shape and catalytic activity. The strong structure-activity coupling renders it probable that “super-active” nanoparticles exist. However, since there is no way to measure the catalytic activity of fewer than ca. 1 million nanoparticles at a time, any super-activity will always be hidden by “ensemble smearing”, since one million nanoparticles of exactly identical size and shape cannot be made. The state of the art in catalysis benchmarking is microfabricated flow reactors with mass-spectrometric detection, but the sensitivity of this approach cannot be incrementally improved by six orders of magnitude. This calls for a new measurement paradigm in which the activity of a single nanoparticle can be benchmarked: the ultimate limit for catalytic measurement.
A tiny batch reactor is the solution, but there are three key problems: how to seal it, how to track catalytic turnover inside it, and how to see the nanoparticle inside it. Graphene solves all three: a microfabricated cavity with a thin SixNy bottom window, a single catalytic nanoparticle inside, and a graphene seal forms a gas-tight batch reactor, since graphene has zero gas permeability. Catalysis is then tracked as an internal pressure change via the stress and deflection of the graphene seal. Crucially, the electron transparency of graphene and SixNy enables subsequent transmission-electron-microscope access with atomic resolution, so that active nanoparticles can be studied in full detail.
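The feasibility of pressure-based detection can be sanity-checked with the ideal gas law. The cavity volume and temperature below are illustrative assumptions, not values from the proposal:

```python
KB = 1.380649e-23  # Boltzmann constant, J/K

def delta_pressure(delta_molecules, volume_m3, temp_k):
    """Ideal-gas pressure change in a sealed cavity when the net number
    of gas molecules changes by delta_molecules: dp = dN * kB * T / V."""
    return delta_molecules * KB * temp_k / volume_m3

# In a hypothetical 1 um^3 cavity at 300 K, one net molecule changes the
# pressure by roughly 4 mPa, so a single nanoparticle producing ~10^3 net
# molecules would give a pressure signal of a few pascals.
dp_per_molecule = delta_pressure(1, 1e-18, 300.0)
```

The small denominator is the point of the design: shrinking the reactor volume by many orders of magnitude turns single-molecule-scale turnover into a measurable pressure change.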
ATOMICAR will re-define the experimental limits of catalyst benchmarking and lift the field of basic catalysis research into the single-nanoparticle age.
Max ERC Funding
1 496 000 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym BEHAVFRICTIONS
Project Behavioral Implications of Information-Processing Frictions
Researcher (PI) Jakub STEINER
Host Institution (HI) NARODOHOSPODARSKY USTAV AKADEMIE VED CESKE REPUBLIKY VEREJNA VYZKUMNA INSTITUCE
Call Details Consolidator Grant (CoG), SH1, ERC-2017-COG
Summary BEHAVFRICTIONS will use novel models focussing on information-processing frictions to explain choice patterns described in behavioral economics and psychology. The proposed research will provide microfoundations that are essential for (i) identification of stable preferences, (ii) counterfactual predictions, and (iii) normative conclusions.
(i) Agents who face information-processing costs must trade the precision of choice against information costs. Their behavior thus reflects both their stable preferences and the context-dependent procedures that manage their errors stemming from imperfect information processing. In the absence of micro-founded models, the two drivers of the behavior are difficult to disentangle for outside observers. In some pillars of the proposal, the agents follow choice rules that closely resemble logit rules used in structural estimation. This will allow me to reinterpret the structural estimation fits to choice data and to make a distinction between the stable preferences and frictions.
(ii) Such a distinction is important in counterfactual policy analysis because the second-best decision procedures that manage the errors in choice are affected by the analysed policy. Incorporation of the information-processing frictions into existing empirical methods will improve our ability to predict effects of the policies.
(iii) My preliminary results suggest that when an agent is prone to committing errors, biases (such as overconfidence, confirmatory bias, or perception biases known from prospect theory) arise under second-best strategies. By providing the link between the agent's environment and the second-best distribution of the perception errors, my models will distinguish environments in which these biases shield the agents from the most costly mistakes from environments in which the biases turn into maladaptations. The distinction will inform the normative debate on debiasing.
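The logit choice rules mentioned under (i) take the standard softmax form. The sketch below is the textbook rule, with a precision parameter standing in for the information-processing friction; it illustrates the general idea, not the project's specific models:

```python
import math

def logit_choice_probs(utilities, precision):
    """Logit (softmax) choice rule: P(i) proportional to exp(precision * u_i).
    precision -> 0 gives uniform random choice (maximal friction);
    precision -> infinity gives error-free utility maximization."""
    m = max(utilities)  # subtract the max for numerical stability
    weights = [math.exp(precision * (u - m)) for u in utilities]
    total = sum(weights)
    return [w / total for w in weights]

# With zero precision, choice is uniform regardless of utilities:
print(logit_choice_probs([1.0, 2.0], 0.0))  # -> [0.5, 0.5]
```

The identification problem described in (i) is visible here: observed choice frequencies pin down only the products precision × utility differences, so stable preferences and friction cannot be separated without a model of where the precision comes from.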
Max ERC Funding
1 321 488 €
Duration
Start date: 2018-06-01, End date: 2023-05-31
Project acronym BIVAQUM
Project Bivariational Approximations in Quantum Mechanics and Applications to Quantum Chemistry
Researcher (PI) Simen Kvaal
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE4, ERC-2014-STG
Summary The standard variational principles (VPs) are cornerstones of quantum mechanics, and one can hardly overestimate their usefulness as tools for generating approximations to the time-independent and time-dependent Schrödinger equations. The aim of the proposal is to study and apply a generalization of these, the bivariational principles (BIVPs), which arise naturally when one does not assume a priori that the system Hamiltonian is Hermitian. This unconventional approach may have a transformative impact on the development of ab initio methodology, both for electronic structure and dynamics.
The first objective is to establish the mathematical foundation for the BIVPs. This opens up a whole new axis of method development for ab initio approaches. For instance, it is a largely ignored fact that the popular traditional coupled-cluster (TCC) method can be neatly formulated with the BIVPs, and TCC is both polynomially scaling with the number of electrons and size-consistent. No “variational” method enjoys these properties simultaneously; indeed, this seems to be incompatible with the standard VPs.
Armed with the BIVPs, the project aims to develop new, and understand existing, ab initio methods. The second objective is thus a systematic multireference coupled-cluster theory (MRCC) based on the BIVPs. This is in itself a novel approach that carries large potential benefits and impact. The third and last objective is an implementation of a new coupled-cluster-type method where the orbitals are bivariational parameters. This gives a size-consistent hierarchy of approximations to multiconfiguration Hartree–Fock.
The PI's broad contact with and background in scientific disciplines such as applied mathematics and nuclear physics, in addition to quantum chemistry, increases the feasibility of the project.
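The bivariational principle can be stated compactly. In the standard formulation (the abstract gives no formulas), the bra and ket are varied independently, with no Hermiticity assumed for the Hamiltonian:

```latex
% Bivariational expectation value: \tilde{\psi} (bra) and \psi (ket)
% are independent variational parameters.
\mathcal{E}[\tilde{\psi}, \psi] \;=\;
\frac{\langle \tilde{\psi} \,|\, \hat{H} \,|\, \psi \rangle}
     {\langle \tilde{\psi} \,|\, \psi \rangle},
\qquad
\delta_{\tilde{\psi}} \mathcal{E} = 0, \quad \delta_{\psi} \mathcal{E} = 0 .
```

Stationarity with respect to $\tilde{\psi}$ yields the Schrödinger equation for $\psi$, and stationarity with respect to $\psi$ yields the adjoint equation for $\tilde{\psi}$; when $\hat{H}$ is Hermitian and $\tilde{\psi} = \psi$, the standard variational principle is recovered.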
Max ERC Funding
1 499 572 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym BPT
Project BEYOND PLATE TECTONICS
Researcher (PI) Trond Helge Torsvik
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE10, ERC-2010-AdG_20100224
Summary Plate tectonics characterises the complex and dynamic evolution of the outer shell of the Earth in terms of rigid plates. These tectonic plates overlie and interact with the Earth's mantle, which is slowly convecting owing to energy released by the decay of radioactive nuclides in the Earth's interior. Even though links between mantle convection and plate tectonics are becoming more evident, notably through subsurface tomographic images, advances in mineral physics and improved absolute plate motion reference frames, there is still no generally accepted mechanism that consistently explains plate tectonics and mantle convection in one framework. We will integrate plate tectonics into mantle dynamics and develop a theory that explains plate motions quantitatively and dynamically. This requires consistent and detailed reconstructions of plate motions through time (Objective 1).
A new model of plate kinematics will be linked to the mantle with the aid of a new global reference frame based on moving hotspots and on palaeomagnetic data. The global reference frame will be corrected for true polar wander in order to develop a global plate motion reference frame with respect to the mantle back to Pangea (ca. 320 million years) and possibly Gondwana assembly (ca. 550 million years). The resulting plate reconstructions will constitute the input to subduction models that are meant to test the consistency between the reference frame and subduction histories. The final outcome will be a novel global subduction reference frame, to be used to unravel links between the surface and deep Earth (Objective 2).
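Plate reconstructions of the kind described are built from finite rotations about Euler poles. A minimal sketch of the underlying spherical kinematics, using Rodrigues' rotation formula with illustrative coordinates (this is standard plate-kinematics machinery, not the project's reference-frame model):

```python
import math

def rotate_point(lat, lon, pole_lat, pole_lon, angle_deg):
    """Finite rotation of a surface point (lat, lon) about an Euler pole
    by angle_deg, via Rodrigues' formula. All angles in degrees."""
    def to_xyz(la, lo):
        la, lo = math.radians(la), math.radians(lo)
        return (math.cos(la) * math.cos(lo),
                math.cos(la) * math.sin(lo),
                math.sin(la))
    px, py, pz = to_xyz(lat, lon)            # point as a unit vector
    kx, ky, kz = to_xyz(pole_lat, pole_lon)  # rotation axis (Euler pole)
    w = math.radians(angle_deg)
    c, s = math.cos(w), math.sin(w)
    dot = kx * px + ky * py + kz * pz
    cx = ky * pz - kz * py                   # cross product k x p
    cy = kz * px - kx * pz
    cz = kx * py - ky * px
    rx = px * c + cx * s + kx * dot * (1 - c)
    ry = py * c + cy * s + ky * dot * (1 - c)
    rz = pz * c + cz * s + kz * dot * (1 - c)
    return math.degrees(math.asin(rz)), math.degrees(math.atan2(ry, rx))

# A 90-degree rotation about the north pole shifts longitude by 90 degrees,
# so (0 N, 0 E) maps to approximately (0 N, 90 E).
new_lat, new_lon = rotate_point(0.0, 0.0, 90.0, 0.0, 90.0)
```

A reconstruction to a given age composes such rotations along a plate-circuit path; correcting for true polar wander, as the project proposes, amounts to applying an additional whole-mantle rotation on top of the relative-motion chain.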
Max ERC Funding
2 499 010 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym BRuSH
Project Oral bacteria as determinants for respiratory health
Researcher (PI) Randi BERTELSEN
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Starting Grant (StG), LS7, ERC-2018-STG
Summary The oral cavity is the gateway to the lower respiratory tract, and oral bacteria are likely to play a role in lung health. This may be the case for pathogens as well as commensal bacteria and the balance between species. The oral bacterial community of patients with periodontitis is dominated by gram-negative bacteria and shows higher lipopolysaccharide (LPS) activity than healthy microbiota. Furthermore, bacteria with especially potent pro-inflammatory LPS have been shown to be more common in the lungs of asthmatic than of healthy individuals. The working hypothesis of BRuSH is that microbiome communities dominated by LPS-producing bacteria that induce a particularly strong pro-inflammatory immune response in the host will have a negative effect on respiratory health. I will test this hypothesis in two longitudinally designed population-based lung health studies. I aim to identify whether specific bacterial compositions and types of LPS-producing bacteria in oral and dust samples predict lung function and respiratory health over time, and whether the different types of LPS-producing bacteria affect LPS in saliva and dust. BRuSH will apply functional genome annotation that can assign biological significance to raw bacterial DNA sequences. With this bioinformatics tool I will cluster microbiome data into various LPS producers: bacteria with LPS with strong inflammatory effects and others with weak or antagonistic effects. The epidemiological studies will be supported by mouse models of asthma and cell assays of human bronchial epithelial cells, exposing mice and bronchial cells to chemically synthesized lipid A (the component that drives the LPS-induced immune responses) of various potency. The goal of BRuSH is to prove a causal relationship between the oral microbiome and lung health, and to gain knowledge that will enable us to make oral health a feasible target for intervention programs aimed at optimizing lung health and preventing respiratory disease.
Max ERC Funding
1 499 938 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym C-MORPH
Project Noninvasive cell specific morphometry in neuroinflammation and degeneration
Researcher (PI) Henrik LUNDELL
Host Institution (HI) REGION HOVEDSTADEN
Call Details Starting Grant (StG), LS7, ERC-2018-STG
Summary Brain structure determines function. Disentangling regional microstructural properties and understanding how these properties constitute brain function is a central goal of neuroimaging of the human brain and a key prerequisite for a mechanistic understanding of brain diseases and their treatment. Using magnetic resonance (MR) imaging, previous research has established links between regional brain microstructure and inter-individual variation in brain function, but this line of research has been limited by the non-specificity of MR-derived markers. This hampers the application of MR imaging as a tool to identify specific fingerprints of the underlying disease process.
Exploiting state-of-the-art ultra-high-field MR imaging techniques, I have recently developed two independent spectroscopic MR methods that have the potential to tackle this challenge: Powder-averaged diffusion-weighted spectroscopy (PADWS) can provide an unbiased marker for cell-specific structural degeneration, and Spectrally tuned gradient trajectories (STGT) can isolate cell shape and size. In this project, I will harness these innovations for MR-based precision medicine. I will advance PADWS and STGT methodology on state-of-the-art MR hardware and harvest the synergy of these methods to realize Cell-specific in-vivo MORPHOMETRY (C-MORPH) of the intact human brain. I will establish novel MR read-outs and analyses to derive cell-type-specific tissue properties in the healthy and diseased brain and validate them with the help of a strong translational experimental framework, including histological validation. Once validated, the experimental methods and analyses will be simplified and adapted to provide clinically applicable tools. This will push the frontiers of MR-based personalized medicine, guiding therapeutic decisions by providing sensitive probes of cell-specific microstructural changes caused by inflammation, neurodegeneration or treatment response.
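The powder-averaging idea behind PADWS can be made concrete with a standard textbook result, which is not specific to the project's method: for identical stick-like compartments (a common model for neurites) with parallel diffusivity $D_\parallel$, averaging the diffusion-weighted signal over all orientations gives the spherical-mean signal

```latex
\frac{\bar{S}(b)}{S_0} \;=\; \frac{\sqrt{\pi}}{2\sqrt{b D_\parallel}}\,
\operatorname{erf}\!\left(\sqrt{b D_\parallel}\right)
```

where $b$ is the diffusion weighting. The orientationally averaged signal thus depends only on the scalar parameter $D_\parallel$ and not on the fibre orientation distribution, which is what makes powder-averaged diffusion of cell-type-specific metabolites a candidate probe of cell-specific microstructure.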
Max ERC Funding
1 498 811 €
Duration
Start date: 2018-12-01, End date: 2023-11-30
Project acronym C4T
Project Climate change across Cenozoic cooling steps reconstructed with clumped isotope thermometry
Researcher (PI) Anna Nele Meckler
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Starting Grant (StG), PE10, ERC-2014-STG
Summary The Earth's climate system contains a highly complex interplay of numerous components, such as atmospheric greenhouse gases, ice sheets, and ocean circulation. Due to nonlinearities and feedbacks, changes to the system can result in rapid transitions to radically different climate states. In light of rising greenhouse gas levels there is an urgent need to better understand climate at such tipping points. Reconstructions of profound climate changes in the past provide crucial insight into our climate system and help to predict future changes. However, all proxies we use to reconstruct past climate depend on assumptions that, moreover, become increasingly uncertain further back in time. A new kind of temperature proxy, the carbonate ‘clumped isotope’ thermometer, has great potential to overcome these obstacles. The proxy relies on thermodynamic principles, taking advantage of the temperature dependence of the binding strength between different isotopes of carbon and oxygen, which makes it independent of other variables. Yet, widespread application of this technique in paleoceanography is currently prevented by the required large sample amounts, which are difficult to obtain from ocean sediments. If applied to the minute carbonate shells preserved in the sediments, this proxy would allow robust reconstructions of past temperatures in the surface and deep ocean, as well as global ice volume, far back in time. Here I propose to considerably decrease the sample amount required for clumped isotope thermometry, building on recent successful modifications of the method and ideas for further analytical improvements. This will enable my group and me to thoroughly ground-truth the proxy for application in paleoceanography and for the first time apply it to aspects of past climate change across major climate transitions in the past, where clumped isotope thermometry can immediately contribute to solving long-standing first-order questions and allow for major progress in the field.
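The thermodynamic principle behind the clumped-isotope thermometer can be stated compactly. The measured quantity $\Delta_{47}$, the excess abundance of mass-47 CO$_2$ (containing a ${}^{13}\mathrm{C}$–${}^{18}\mathrm{O}$ "clumped" bond) over a stochastic isotope distribution, varies with the carbonate formation temperature $T$ (in kelvin) approximately as

```latex
\Delta_{47} \;=\; \frac{A}{T^{2}} \;+\; B
```

where $A$ and $B$ are empirical calibration constants whose values differ between published calibrations, so none are quoted here. Because this relation reflects bond-ordering thermodynamics alone, the thermometer requires no assumption about the isotopic composition of the water from which the carbonate grew — the independence from other variables that the summary refers to.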
Max ERC Funding
1 877 209 €
Duration
Start date: 2015-08-01, End date: 2020-07-31