Project acronym ABACUS
Project Ab-initio adiabatic-connection curves for density-functional analysis and construction
Researcher (PI) Trygve Ulf Helgaker
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods and the inexpensive but not systematically refinable Kohn–Sham method of density-functional theory (DFT). The accuracy of Kohn–Sham calculations is determined by the quality of the exchange–correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange–correlation functional is unknown; instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed: the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, directly guided by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab-initio adiabatic-connection curves, which follow the exchange–correlation functional for a fixed density as the electronic interactions are turned on, from the noninteracting system to the fully interacting one. Pilot calculations have indicated the feasibility of this approach in simple cases; here, a comprehensive set of adiabatic-connection curves will be generated and used for the calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn–Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available, for example for understanding and modelling the behaviour of the exchange–correlation functional in electromagnetic fields.
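For reference, the two central objects named above can be written compactly in standard DFT notation (these are conventional textbook formulas, not equations taken from the proposal): the Lieb variation principle defines the coupling-strength-scaled universal functional by maximisation over potentials, and the adiabatic connection recovers the exchange–correlation energy by integrating over the coupling strength at fixed density.

```latex
% Lieb variation principle for the lambda-scaled universal functional
F_\lambda[\rho] = \sup_{v}\Bigl( E_\lambda[v]
    - \int v(\mathbf{r})\,\rho(\mathbf{r})\,\mathrm{d}\mathbf{r} \Bigr)

% Adiabatic connection at fixed density: integrate from the
% noninteracting (lambda = 0) to the physical (lambda = 1) system
E_{\mathrm{xc}}[\rho] = \int_0^1 W_\lambda[\rho]\,\mathrm{d}\lambda ,
\qquad
W_\lambda[\rho] = \langle \Psi_\lambda[\rho] |\, \hat{W} \,| \Psi_\lambda[\rho] \rangle - J[\rho]
```

Here \Psi_\lambda[\rho] is the wave function minimising \langle \hat{T} + \lambda\hat{W} \rangle subject to yielding the density \rho, \hat{W} is the electron–electron repulsion, and J[\rho] is the Hartree energy; W_\lambda[\rho] as a function of \lambda is precisely the adiabatic-connection curve the project proposes to compute ab initio.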
Max ERC Funding
2 017 932 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym ANISOTROPIC UNIVERSE
Project The anisotropic universe -- a reality or fluke?
Researcher (PI) Hans Kristian Kamfjord Eriksen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE9, ERC-2010-StG_20091028
Summary "During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few ""blemishes"" have turned up, apparently not consistent with the standard model: While the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies considering nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints towards unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of the reports can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is currently about to change, as the next generation of new and far more powerful experiments are coming online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys, and in the future Euclid, a proposed galaxy survey satellite also funded by ESA. By combining the world s best data from both CMB and LSS measurements, I will in the proposed project attempt to settle this question: Is our universe really anisotropic? Or are these recent claims only the results of systematic errors or statistical flukes? If the claims turn out to hold against this tide of new and high-quality data, then cosmology as a whole may need to be re-written."
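To make the statistical question concrete, the sketch below is a generic Python illustration (not code or data from the project) of the simplest kind of isotropy test: a Rayleigh-style resultant-vector statistic for a preferred direction in a sample of unit vectors, with a Monte Carlo p-value under the isotropic null. The real analyses rest on far richer CMB and LSS statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

def resultant_length(dirs):
    """Length of the resultant of N unit vectors; a large value signals
    a preferred direction, a small one is consistent with isotropy."""
    return np.linalg.norm(dirs.sum(axis=0))

def random_directions(n, rng):
    """Draw n unit vectors uniformly on the sphere (isotropic null)."""
    v = rng.normal(size=(n, 3))
    return v / np.linalg.norm(v, axis=1, keepdims=True)

# Toy "observed" sample: isotropic directions plus a weak dipole pull
n = 500
obs = random_directions(n, rng) + 0.1 * np.array([0.0, 0.0, 1.0])
obs /= np.linalg.norm(obs, axis=1, keepdims=True)

# Monte Carlo p-value of the observed statistic under the isotropic null
r_obs = resultant_length(obs)
null = np.array([resultant_length(random_directions(n, rng))
                 for _ in range(2000)])
p = np.mean(null >= r_obs)
print(f"R_obs = {r_obs:.1f}, p-value under isotropy = {p:.3f}")
```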
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym AUTO-CD
Project COELIAC DISEASE: UNDERSTANDING HOW A FOREIGN PROTEIN DRIVES AUTOANTIBODY FORMATION
Researcher (PI) Ludvig Magne Sollid
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), LS6, ERC-2010-AdG_20100317
Summary The goal of this project is to understand the mechanism by which highly disease-specific autoantibodies are generated in response to exposure to a foreign antigen. IgA autoantibodies reactive with the enzyme transglutaminase 2 (TG2) are typical of coeliac disease (CD). These antibodies are present only in subjects who carry HLA-DQ2 or HLA-DQ8, and their production depends on dietary gluten exposure. This suggests that CD4+ gluten-reactive T cells, which are found in CD patients and which recognise gluten peptides deamidated by TG2 in the context of DQ2 or DQ8, are implicated in the generation of these autoantibodies. Many small-intestinal IgA+ plasma cells express membrane Ig, hence allowing isolation of antigen-specific cells. Whereas control subjects lack anti-TG2 IgA+ plasma cells, on average 10% of the plasma cells of CD patients are specific for TG2. We have sorted single TG2-reactive IgA+ plasma cells, cloned their VH and VL genes, and expressed recombinant mAbs. So far we have expressed 26 TG2-specific mAbs. There is a strong bias for VH5-51 usage, and surprisingly the antibodies are only modestly mutated. TG2 acts on specific glutamine residues and can either crosslink these to other proteins (transamidation) or hydrolyse the glutamine to a glutamate (deamidation). None of the 18 mAbs tested affected either transamidation or deamidation, leading us to hypothesise that retained crosslinking ability of TG2 when bound to the membrane Ig of B cells is an integral part of the anti-TG2 response. Four models of how activation of TG2-specific B cells is facilitated by TG2 crosslinking and the help of gluten-reactive CD4 T cells are proposed. These four models will be extensively tested, including in vivo assays with a newly generated transgenic anti-TG2 immunoglobulin knock-in mouse model.
Max ERC Funding
2 291 045 €
Duration
Start date: 2011-05-01, End date: 2017-04-30
Project acronym BPT
Project BEYOND PLATE TECTONICS
Researcher (PI) Trond Helge Torsvik
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE10, ERC-2010-AdG_20100224
Summary Plate tectonics characterises the complex and dynamic evolution of the outer shell of the Earth in terms of rigid plates. These tectonic plates overlie and interact with the Earth's mantle, which is slowly convecting owing to energy released by the decay of radioactive nuclides in the Earth's interior. Even though links between mantle convection and plate tectonics are becoming more evident, notably through subsurface tomographic images, advances in mineral physics and improved absolute plate motion reference frames, there is still no generally accepted mechanism that consistently explains plate tectonics and mantle convection in one framework. We will integrate plate tectonics into mantle dynamics and develop a theory that explains plate motions quantitatively and dynamically. This requires consistent and detailed reconstructions of plate motions through time (Objective 1).
A new model of plate kinematics will be linked to the mantle with the aid of a new global reference frame based on moving hotspots and on palaeomagnetic data. The global reference frame will be corrected for true polar wander in order to develop a global plate-motion reference frame with respect to the mantle back to the assembly of Pangea (ca. 320 million years ago) and possibly of Gondwana (ca. 550 million years ago). The resulting plate reconstructions will constitute the input to subduction models designed to test the consistency between the reference frame and subduction histories. The final outcome will be a novel global subduction reference frame, to be used to unravel links between the surface and the deep Earth (Objective 2).
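The kinematic core of such reconstructions is the composition of finite rotations about Euler poles. As a minimal sketch (function names and numbers below are illustrative, not taken from the project), Rodrigues' rotation formula restores a present-day site to a past position given an Euler pole and a rotation angle:

```python
import numpy as np

def latlon_to_xyz(lat_deg, lon_deg):
    """Convert geographic coordinates to a unit vector."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    return np.array([np.cos(lat) * np.cos(lon),
                     np.cos(lat) * np.sin(lon),
                     np.sin(lat)])

def xyz_to_latlon(v):
    """Convert a unit vector back to (latitude, longitude) in degrees."""
    return np.degrees(np.arcsin(v[2])), np.degrees(np.arctan2(v[1], v[0]))

def rotate(point, pole, angle_deg):
    """Rotate a unit vector about an Euler pole by angle_deg
    (Rodrigues' rotation formula)."""
    k = pole / np.linalg.norm(pole)
    a = np.radians(angle_deg)
    return (point * np.cos(a)
            + np.cross(k, point) * np.sin(a)
            + k * np.dot(k, point) * (1.0 - np.cos(a)))

# Hypothetical finite rotation: Euler pole at 30N, 40W, angle 22 degrees
# (made-up numbers), applied to a site at 60N, 10E
site = latlon_to_xyz(60.0, 10.0)
pole = latlon_to_xyz(30.0, -40.0)
print(xyz_to_latlon(rotate(site, pole, 22.0)))
```

A full reconstruction chains many such rotations (plate-to-plate and plate-to-mantle) for each geological stage; the reference-frame work in Objective 1 amounts to constraining those rotations through time.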
Max ERC Funding
2 499 010 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym ENSEMBLE
Project Neural mechanisms for memory retrieval
Researcher (PI) May-Britt Moser
Host Institution (HI) NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET NTNU
Call Details Advanced Grant (AdG), LS5, ERC-2010-AdG_20100317
Summary Memory is one of the most extraordinary phenomena in biology. The mammalian brain stores billions of bits of information, but the most remarkable property of memory is perhaps not its capacity but the speed at which the correct information can be retrieved from a pool of thousands or millions of competing alternatives. Despite more than a hundred years of systematic study of the phenomenon, scientists are still largely ignorant about the mechanisms that enable mammalian brains to outperform even the best search engines. One of the greatest challenges has been the dynamic nature of memory. Whereas memories can be retrieved over time periods as short as milliseconds, the underlying coding principles are normally inferred from activity time-averaged across many minutes. In the present proposal, I shall introduce a new "teleportation procedure" developed in my lab to monitor the representation of past and present environments in large ensembles of rat hippocampal neurons at ethologically valid time scales. By monitoring the evolution of hippocampal ensemble representations at millisecond resolution during retrieval of a non-local experience, I shall ask
(i) what is the minimum temporal unit of a hippocampal representation,
(ii) how is one representational unit replaced by the next in a sequence,
(iii) what external signals control switches between alternative representations,
(iv) how are representations synchronized across anatomical space, and
(v) when do adult-like retrieval mechanisms appear during ontogenesis of the nervous system and to what extent can their early absence be linked to infantile amnesia.
The proposed research programme is expected to identify some of the key principles for dynamic representation and retrieval of episodic memory in the mammalian hippocampus.
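A standard tool for reading out which representation a hippocampal ensemble expresses on short time scales is memoryless Bayesian decoding of population spike counts, the usual Poisson decoder from the place-cell literature. The sketch below is a generic illustration of that technique, not code from the project; the place-field rates, window length, and spike counts are made up.

```python
import numpy as np

def decode_position(counts, rates, dt):
    """Memoryless Poisson decoder: posterior over positions given spike
    counts in a window of length dt, assuming independent Poisson cells
    and a flat prior.

    counts : (n_cells,) spike counts in the window
    rates  : (n_cells, n_positions) place-field firing rates in Hz
    """
    eps = 1e-12                                  # guard against log(0)
    log_post = counts @ np.log(rates + eps) - dt * rates.sum(axis=0)
    log_post -= log_post.max()                   # numerical stability
    post = np.exp(log_post)
    return post / post.sum()

# Toy usage: 3 cells with Gaussian place fields on a linear track
positions = np.linspace(0.0, 1.0, 100)
centres = np.array([0.2, 0.5, 0.8])
rates = 20.0 * np.exp(-((positions - centres[:, None]) ** 2)
                      / (2 * 0.05 ** 2))
counts = np.array([0, 4, 1])     # spikes observed in a 20 ms window
posterior = decode_position(counts, rates, dt=0.02)
print("decoded position:", positions[posterior.argmax()])
```

Sliding such a decoder over 10-30 ms windows is what makes it possible to ask, as above, when one representational unit is replaced by the next during retrieval of a non-local experience.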
Max ERC Funding
2 499 074 €
Duration
Start date: 2011-11-01, End date: 2017-10-31
Project acronym PREPROCESSING
Project RIGOROUS THEORY OF PREPROCESSING
Researcher (PI) Fedor Fomin
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Advanced Grant (AdG), PE6, ERC-2010-AdG_20100224
Summary The main research goal of this project is the quest for a rigorous mathematical theory explaining the power and failure of heuristics. The inability of current computational models to explain the success of heuristic algorithms in practical computing has been the subject of wide discussion for more than four decades. Within this project we expect a significant breakthrough in the study of a large family of heuristics: preprocessing (data reduction or kernelization). Preprocessing reduces a problem instance to a simpler one, and algorithms of this type are used in almost every application.
As the key to novel and groundbreaking results, the proposed project aims to develop a new theory of polynomial-time compressibility. Understanding the origin of compressibility will serve to build more powerful heuristic algorithms, as well as to explain the behaviour of preprocessing.
The ubiquity of preprocessing makes the theory of compressibility extremely important.
The new theory will carry the ideas of efficient computation beyond their established borders.
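A textbook instance of kernelization with a provable size guarantee is Buss's reduction for k-Vertex Cover. The sketch below is illustrative Python, not part of the project: it applies the two classic rules exhaustively and rejects any instance whose residue exceeds the k^2-edge bound.

```python
from collections import defaultdict

def vc_kernelize(edges, k):
    """Buss kernelization for k-Vertex Cover.

    Rule 1: a vertex of degree > k must be in any cover of size <= k,
            so take it, delete its edges, and decrement k.
    Rule 2: isolated vertices never help; representing the graph by its
            edge set drops them implicitly.
    After exhaustive reduction, a YES-instance has at most k*k edges,
    so any larger residue is rejected outright.

    Returns (reduced_edges, remaining_budget, forced_vertices),
    or None if the instance is a provable NO-instance.
    """
    edges = {frozenset(e) for e in edges}
    forced = set()
    changed = True
    while changed and k >= 0:
        changed = False
        deg = defaultdict(int)
        for e in edges:
            for v in e:
                deg[v] += 1
        for v, d in deg.items():
            if d > k:                              # Rule 1
                forced.add(v)
                edges = {e for e in edges if v not in e}
                k -= 1
                changed = True
                break
    if k < 0 or len(edges) > k * k:                # Buss bound
        return None
    return edges, k, forced

# Toy usage: a star with 5 leaves plus one extra edge, budget k = 2.
# The hub has degree > 2, so it is forced into the cover.
edges = [(0, i) for i in range(1, 6)] + [(1, 2)]
print(vc_kernelize(edges, 2))
```

The point of a rigorous theory of preprocessing is exactly to explain when guarantees of this kind (a kernel whose size depends only on the parameter k, not on the input size) do and do not exist.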
Max ERC Funding
2 227 051 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym STOCHPOP
Project Stochastic Population Biology in a Fluctuating Environment
Researcher (PI) Bernt-Erik Sæther
Host Institution (HI) NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET NTNU
Call Details Advanced Grant (AdG), LS8, ERC-2010-AdG_20100317
Summary The aim of this project is to produce a new synthesis integrating ecological and evolutionary processes. This synthetic approach is based on the fundamental premise that the effects of environmental stochasticity are essential for the understanding of biological processes at every time scale, because all natural populations are exposed to a fluctuating environment. Building on my recent advances in the development of stochastic population models, I will address three questions in this proposal. First, I will examine whether the ecological effects of a fluctuating environment can be predicted from a basic set of characters that distribute species along a slow-fast continuum of life-history variation. Second, using my own long-term study systems, I will partition selection on fitness-related traits into different hierarchical components, all of which must be estimated to predict the rate of evolutionary change in quantitative characters. Third, using a comparative approach, I will examine to what extent the strength of fluctuating selection caused by environmental change is predictable from basic life-history characteristics. I expect that any patterns emerging from the evaluation of these hypotheses will represent a major breakthrough in evolutionary biology, because they will enable identification of general principles and processes that affect the rate of change of populations on both ecological and evolutionary time scales, and hence provide tools for developing quantitative predictions of the expected rate of Darwinian evolution in fluctuating environments. I further anticipate that the knowledge advanced will have significant implications for research in population biology and for its future application in applied conservation biology.
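The premise that environmental fluctuations matter at every time scale has a classical quantitative core: for density-independent growth in a random environment, the long-run growth rate is approximately the mean growth rate minus half the environmental variance, so a population whose mean growth is positive can nevertheless decline. The simulation below is a minimal generic sketch of this effect with made-up parameters, not a model from the project.

```python
import numpy as np

rng = np.random.default_rng(1)

# Density-independent growth in a fluctuating environment:
#   log N_{t+1} = log N_t + r + sigma_e * eps_t,  eps_t ~ N(0, 1).
# Classical result: the long-run growth rate is s = r - sigma_e**2 / 2,
# so positive mean growth r can coexist with long-run decline.
r, sigma_e, T = 0.02, 0.25, 10_000
log_n = np.cumsum(r + sigma_e * rng.normal(size=T))  # log N_t, N_0 = 1

s_hat = log_n[-1] / T                 # realised long-run growth rate
s_theory = r - sigma_e**2 / 2
print(f"simulated s = {s_hat:+.4f}, theoretical s = {s_theory:+.4f}")
```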
Max ERC Funding
2 000 000 €
Duration
Start date: 2011-05-01, End date: 2016-04-30