Project acronym ABACUS
Project Ab-initio adiabatic-connection curves for density-functional analysis and construction
Researcher (PI) Trygve Ulf Helgaker
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods and the inexpensive but not systematically refinable Kohn-Sham method of density-functional theory (DFT). The accuracy of Kohn-Sham calculations is determined by the quality of the exchange-correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange-correlation functional is unknown; instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed: the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, directly guided by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab initio adiabatic-connection curves, studying the exchange-correlation functional for a fixed density as the strength of the electronic interactions is increased from zero to one. Pilot calculations have indicated the feasibility of this approach in simple cases; here, a comprehensive set of adiabatic-connection curves will be generated and utilized for the calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn-Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available, for example for understanding and modelling the behaviour of the exchange-correlation functional in electromagnetic fields.
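Schematically, in standard DFT notation, the two ingredients invoked above read

\[
F_\lambda[\rho] \;=\; \sup_{v}\Big( E_\lambda[v] - \int \rho(\mathbf{r})\, v(\mathbf{r})\, \mathrm{d}\mathbf{r} \Big),
\qquad
E_{\mathrm{xc}}[\rho] \;=\; \int_0^1 \mathcal{W}_\lambda[\rho]\, \mathrm{d}\lambda ,
\]

where $E_\lambda[v]$ is the ground-state energy in the external potential $v$ with the electron-electron interaction scaled by $\lambda$, and the integrand $\mathcal{W}_\lambda[\rho] = \langle \Psi_\lambda[\rho] | \hat{W} | \Psi_\lambda[\rho] \rangle - J[\rho]$ (with $J[\rho]$ the Hartree energy) follows from the Hellmann-Feynman theorem. In this notation, the proposed ab initio adiabatic-connection curves trace $\mathcal{W}_\lambda[\rho]$ as a function of $\lambda$ for a fixed, accurately computed density.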
Max ERC Funding
2 017 932 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym ANISOTROPIC UNIVERSE
Project The anisotropic universe -- a reality or fluke?
Researcher (PI) Hans Kristian Kamfjord Eriksen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE9, ERC-2010-StG_20091028
Summary "During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few ""blemishes"" have turned up, apparently not consistent with the standard model: While the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies considering nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints towards unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of the reports can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is currently about to change, as the next generation of new and far more powerful experiments are coming online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys, and in the future Euclid, a proposed galaxy survey satellite also funded by ESA. By combining the world s best data from both CMB and LSS measurements, I will in the proposed project attempt to settle this question: Is our universe really anisotropic? Or are these recent claims only the results of systematic errors or statistical flukes? If the claims turn out to hold against this tide of new and high-quality data, then cosmology as a whole may need to be re-written."
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym AUTO-CD
Project COELIAC DISEASE: UNDERSTANDING HOW A FOREIGN PROTEIN DRIVES AUTOANTIBODY FORMATION
Researcher (PI) Ludvig Magne Sollid
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), LS6, ERC-2010-AdG_20100317
Summary The goal of this project is to understand the mechanism by which highly disease-specific autoantibodies are generated in response to exposure to a foreign antigen. IgA autoantibodies reactive with the enzyme transglutaminase 2 (TG2) are typical of coeliac disease (CD). These antibodies are only present in subjects who are HLA-DQ2 or -DQ8 positive, and their production is dependent on dietary gluten exposure. This suggests that CD4+ gluten-reactive T cells, which are found in CD patients and which recognise gluten peptides deamidated by TG2 in the context of DQ2 or DQ8, are implicated in the generation of these autoantibodies. Many small intestinal IgA+ plasma cells express membrane Ig, hence allowing isolation of antigen-specific cells. Whereas control subjects lack anti-TG2 IgA+ plasma cells, on average 10% of the plasma cells of CD patients are specific for TG2. We have sorted single TG2-reactive IgA+ plasma cells, cloned their VH and VL genes and expressed recombinant mAbs. So far we have expressed 26 TG2-specific mAbs. There is a strong bias for VH5-51 usage, and surprisingly the antibodies are only modestly mutated. TG2 acts on specific glutamine residues and can either crosslink these to other proteins (transamidation) or hydrolyse the glutamine to a glutamate (deamidation). None of the 18 mAbs tested affected either transamidation or deamidation, leading us to hypothesise that retained crosslinking ability of TG2 when bound to membrane Ig of B cells is an integral part of the anti-TG2 response. Four models of how activation of TG2-specific B cells is facilitated by TG2 crosslinking and the help of gluten-reactive CD4 T cells are proposed. These four models will be extensively tested, including in vivo assays with a newly generated transgenic anti-TG2 immunoglobulin knock-in mouse model.
Max ERC Funding
2 291 045 €
Duration
Start date: 2011-05-01, End date: 2017-04-30
Project acronym Bits2Cosmology
Project Time-domain Gibbs sampling: From bits to inflationary gravitational waves
Researcher (PI) Hans Kristian ERIKSEN
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The detection of primordial gravitational waves created during the Big Bang ranks among the greatest potential intellectual achievements in modern science. During the last few decades, the instrumental progress necessary to achieve this has been nothing short of breathtaking, and today we are able to measure the microwave sky with better than one-in-a-million precision. However, from the latest ultra-sensitive experiments such as BICEP2 and Planck, it is clear that instrumental sensitivity alone will not be sufficient to make a robust detection of gravitational waves. Contamination in the form of astrophysical radiation from the Milky Way, for instance thermal dust and synchrotron radiation, obscures the cosmological signal by orders of magnitude. Even more critical, though, are second-order interactions between this radiation and the instrument characterization itself, which lead to a highly non-linear and complicated problem.
I propose a ground-breaking solution to this problem that allows for joint estimation of cosmological parameters, astrophysical components, and instrument specifications. The engine of this method is called Gibbs sampling, which I have already applied extremely successfully to basic CMB component separation. The new and critical step is to apply this method to the raw time-ordered data recorded directly by the instrument, as opposed to pre-processed frequency maps. While representing a ~100-fold increase in input data volume, this step is unavoidable in order to break through the current foreground-induced systematics floor. I will apply this method to the best currently available and future data sets (WMAP, Planck, SPIDER and LiteBIRD), and thereby derive the world's tightest constraint on the amplitude of inflationary gravitational waves. Additionally, the resulting ancillary science in the form of robust cosmological parameters and astrophysical component maps will represent the state of the art in observational cosmology for years to come.
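The engine named above, Gibbs sampling, alternates draws from conditional distributions until the chain converges to the joint posterior. Below is a minimal toy sketch in Python of this idea for a scalar signal-plus-noise model; the model, prior and variable names are illustrative assumptions, not the project's actual pipeline.

  import numpy as np

  rng = np.random.default_rng(0)

  # Toy data model: d = s + n, with signal s ~ N(0, S) (S unknown)
  # and white noise n ~ N(0, sigma2) per sample.
  nsamp, sigma2, S_true = 10000, 1.0, 4.0
  d = rng.normal(0.0, np.sqrt(S_true), nsamp) + rng.normal(0.0, np.sqrt(sigma2), nsamp)

  S = 1.0  # initial guess for the signal variance
  for _ in range(1000):
      # Step 1: draw s | d, S -- Wiener-filter mean plus a fluctuation term.
      var = 1.0 / (1.0 / S + 1.0 / sigma2)
      s = var * d / sigma2 + np.sqrt(var) * rng.normal(size=nsamp)
      # Step 2: draw S | s -- inverse-gamma conditional under a Jeffreys prior,
      # sampled as (sum of squares) / chi-squared draw.
      S = np.sum(s**2) / rng.chisquare(nsamp)

  print(f"final draw of signal variance S: {S:.2f} (truth: {S_true})")

In the full problem the scalar signal is replaced by time-ordered data with correlated noise, multiple astrophysical components and instrument parameters, but each Gibbs step retains this conditional-draw structure.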
Max ERC Funding
1 999 205 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym BIVAQUM
Project Bivariational Approximations in Quantum Mechanics and Applications to Quantum Chemistry
Researcher (PI) Simen Kvaal
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE4, ERC-2014-STG
Summary The standard variational principles (VPs) are cornerstones of quantum mechanics, and one can hardly overestimate their usefulness as tools for generating approximations to the time-independent and time-dependent Schrödinger equations. The aim of the proposal is to study and apply a generalization of these, the bivariational principles (BIVPs), which arise naturally when one does not assume a priori that the system Hamiltonian is Hermitian. This unconventional approach may have a transformative impact on the development of ab initio methodology, both for electronic structure and dynamics.
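In its simplest form, the bivariational idea replaces the usual Rayleigh quotient by a functional of two independent states, roughly

\[
\mathcal{E}[\tilde{\Psi}, \Psi] \;=\; \frac{\langle \tilde{\Psi} | \hat{H} | \Psi \rangle}{\langle \tilde{\Psi} | \Psi \rangle},
\]

made stationary with respect to independent variations of the bra $\tilde{\Psi}$ and the ket $\Psi$; at a stationary point, $\mathcal{E}$ is an eigenvalue of $\hat{H}$, with $\Psi$ and $\tilde{\Psi}$ the corresponding right and left eigenvectors, and Hermiticity of $\hat{H}$ is never invoked.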
The first objective is to establish the mathematical foundation for the BIVPs. This opens up a whole new axis of method development for ab initio approaches. For instance, it is a largely ignored fact that the popular traditional coupled-cluster (TCC) method can be neatly formulated with the BIVPs, and TCC is both polynomially scaling with the number of electrons and size-consistent. No “variational” method enjoys these properties simultaneously; indeed, this seems to be incompatible with the standard VPs.
Armed with the BIVPs, the project aims to develop new ab initio methods and to better understand existing ones. The second objective is thus a systematic multireference coupled-cluster theory (MRCC) based on the BIVPs. This is in itself a novel approach that carries large potential benefits and impact. The third and last objective is an implementation of a new coupled-cluster-type method where the orbitals are bivariational parameters. This gives a size-consistent hierarchy of approximations to multiconfiguration Hartree-Fock.
The PI's broad background in, and contact with, scientific disciplines such as applied mathematics and nuclear physics, in addition to quantum chemistry, increases the feasibility of the project.
Max ERC Funding
1 499 572 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym CHROMPHYS
Project Physics of the Solar Chromosphere
Researcher (PI) Mats Per-Olof Carlsson
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE9, ERC-2011-ADG_20110209
Summary CHROMPHYS aims at a breakthrough in our understanding of the solar chromosphere by combining the development of sophisticated radiation-magnetohydrodynamic simulations with observations from the upcoming NASA SMEX mission Interface Region Imaging Spectrograph (IRIS).
The enigmatic chromosphere is the transition between the solar surface and the eruptive outer solar atmosphere. The chromosphere harbours and constrains the mass and energy loading processes that define the heating of the corona, the acceleration and the composition of the solar wind, and the energetics and triggering of solar outbursts (filament eruptions, flares, coronal mass ejections) that govern near-Earth space weather and affect mankind's technological environment.
CHROMPHYS targets the following fundamental physics questions about the chromospheric role in the mass and energy loading of the corona:
- Which types of non-thermal energy dominate in the chromosphere and beyond?
- How does the chromosphere regulate mass and energy supply to the corona and the solar wind?
- How do magnetic flux and matter rise through the chromosphere?
- How does the chromosphere affect the free magnetic energy loading that leads to solar eruptions?
CHROMPHYS proposes to answer these questions by producing a new, physics-based vista of the chromosphere through a threefold effort:
- develop the techniques of high-resolution numerical MHD physics to the level needed to realistically predict and analyse small-scale chromospheric structure and dynamics,
- optimise and calibrate diverse observational diagnostics by synthesizing these in detail from the simulations, and
- obtain and analyse data from IRIS using these diagnostics complemented by data from other space missions and the best solar telescopes on the ground.
Max ERC Funding
2 487 600 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym Cosmoglobe
Project Cosmoglobe -- mapping the universe from the Milky Way to the Big Bang
Researcher (PI) Ingunn Kathrine WEHUS
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary In the aftermath of the high-precision Planck and BICEP2 experiments, cosmology has undergone a critical transition. Before 2014, most breakthroughs came as direct results of improved detector technology and increased noise sensitivity. After 2014, the main source of uncertainty will be astrophysical foregrounds, typically in the form of dust or synchrotron emission from the Milky Way. Indeed, this holds as true for the study of reionization and the cosmic dawn as it does for the hunt for inflationary gravitational waves. To break through this obscuring veil, it is of utmost importance to optimally exploit every piece of available information, merging the world's best observational data with the world's most advanced theoretical models. A first step toward this ultimate goal was recently published as the Planck 2015 Astrophysical Baseline Model, an effort that I led and conducted.
Here I propose to build Cosmoglobe, a comprehensive model of the radio, microwave and sub-mm sky, covering 100 MHz to 10 THz in both intensity and polarization, extending existing models by three orders of magnitude in frequency and a factor of five in angular resolution. I will leverage a recent algorithmic breakthrough in multi-resolution component separation to jointly analyze some of the world's best data sets, including C-BASS, COMAP, PASIPHAE, Planck, SPIDER, WMAP and many more. This will result in the best cosmological (CMB, SZ, CIB etc.) and astrophysical (thermal and spinning dust, synchrotron and free-free emission etc.) component maps published to date. I will then use this model to derive the world's strongest limits on, and potentially detect, inflationary gravitational waves using SPIDER observations; forecast, optimize and analyze observations from the leading next-generation CMB experiments, including LiteBIRD and S4; and derive the first 3D large-scale structure maps from CO intensity mapping from COMAP, potentially opening up a new window on the cosmic dawn.
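Schematically, the kind of parametric sky model underlying such a global component separation can be written (notation illustrative)

\[
d_\nu(p) \;=\; \sum_{c} a_c(p)\, f_c(\nu; \beta_c(p)) \;+\; n_\nu(p),
\]

where $d_\nu(p)$ is the observed sky at frequency $\nu$ and pixel $p$, $a_c$ and $\beta_c$ are the amplitude and spectral parameters of component $c$ (CMB, thermal and spinning dust, synchrotron, free-free, etc.), $f_c$ is its spectral energy distribution, and $n_\nu$ is instrumental noise; the analysis amounts to fitting all $a_c$ and $\beta_c$ jointly across frequencies and data sets.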
Max ERC Funding
1 999 382 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym SolarALMA
Project ALMA – The key to the Sun’s coronal heating problem.
Researcher (PI) Sven Wedemeyer
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), PE9, ERC-2015-CoG
Summary How are the outer layers of the Sun heated to temperatures in excess of a million kelvin? A large number of heating mechanisms have been proposed to explain this so-called coronal heating problem, one of the fundamental questions in contemporary solar physics. It is clear that the required energy is transported from the solar interior through the chromosphere into the outer layers, but it remains open by which physical mechanisms this occurs and how the provided energy is eventually dissipated. The key to solving the chromospheric/coronal heating problem lies in accurate observations at high spatial, temporal and spectral resolution, facilitating the identification of the mechanisms responsible for the transport and dissipation of energy. This has so far been impeded by the small number of accessible diagnostics and the challenges with their interpretation. The interferometric Atacama Large Millimeter/submillimeter Array (ALMA) now offers impressive capabilities. Due to the properties of the solar radiation at millimeter wavelengths, ALMA serves as a linear thermometer, mapping narrow layers at different heights. It can measure the thermal structure and dynamics of the solar chromosphere and thus sources and sinks of atmospheric heating. Radio recombination and molecular lines (e.g., CO) potentially provide complementary kinetic and thermal diagnostics, while the polarisation of the continuum intensity and the Zeeman effect can be exploited for valuable chromospheric magnetic field measurements.
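The "linear thermometer" property follows from the Rayleigh-Jeans limit, valid at millimeter wavelengths for chromospheric temperatures ($h\nu \ll k_B T$):

\[
B_\nu(T) \;\approx\; \frac{2\nu^2 k_B T}{c^2}
\quad\Longrightarrow\quad
T_b \;=\; \frac{c^2}{2 k_B \nu^2}\, I_\nu ,
\]

so the observed brightness temperature $T_b$ is directly proportional to the measured intensity $I_\nu$, and for optically thick millimeter continuum it approximates the local gas temperature at the height where the emission forms.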
I will develop the necessary diagnostic tools and use them for solar observations with ALMA. The preparation, optimisation and interpretation of these observations will be supported by state-of-the-art numerical simulations. A key objective is the identification of the dominant physical processes and their contributions to the transport and dissipation of energy. The results will be a major step towards solving the coronal heating problem with general implications for stellar activity.
Max ERC Funding
1 995 964 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym SURFSPEC
Project Theoretical multiphoton spectroscopy for understanding surfaces and interfaces
Researcher (PI) Kenneth Ruud
Host Institution (HI) UNIVERSITETET I TROMSOE - NORGES ARKTISKE UNIVERSITET
Call Details Starting Grant (StG), PE4, ERC-2011-StG_20101014
Summary The project will develop new methods for calculating nonlinear spectroscopic properties, in both the electronic and the vibrational domain. The methods will be used to study molecular interactions at interfaces, allowing for a direct comparison of experimental observations with theoretical calculations. In order to explore different ways of modeling surface and interface interactions, we will develop three different ab initio methods for calculating these nonlinear molecular properties: 1) Multiscale methods, in which the interface region is partitioned into three different layers. The part involving interface-adsorbed molecules will be described by quantum-chemical methods, the closest surrounding part of the system, where specific interactions are important, will be described by classical, polarizable force fields, and the long-range electrostatic interactions will be described by a polarizable continuum. 2) Periodic-boundary conditions: We will extend a response-theory framework recently developed in our group to describe periodic systems using Gaussian basis sets. This will be achieved by deriving the necessary formulas and interfacing our response framework to existing periodic-boundary codes. 3) Time-domain methods: Starting from the equation of motion for the reduced single-electron density matrix, we will propagate the electron density and the classical nuclei in time in order to model time-resolved vibrational spectroscopies.
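Schematically, and in an orthonormal basis, the equation of motion referred to in point 3 is the Liouville-von Neumann equation for the one-electron density matrix $\mathbf{D}(t)$,

\[
i\hbar\,\frac{\partial \mathbf{D}(t)}{\partial t} \;=\; \big[\mathbf{F}(\mathbf{D}(t)) + \mathbf{V}_{\mathrm{ext}}(t),\; \mathbf{D}(t)\big],
\]

where $\mathbf{F}$ is the density-dependent Fock (or Kohn-Sham) matrix and $\mathbf{V}_{\mathrm{ext}}(t)$ a time-dependent external field; propagating $\mathbf{D}(t)$ in time and analysing the induced moments gives access to spectra beyond the linear-response regime.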
The novelty of the project lies in its focus on nonlinear molecular properties, both electronic and vibrational, and in the development of computational models for surfaces and interfaces that may help rationalize experimental observations of interface phenomena and molecular adsorption at interfaces. In applying the methods developed, particular attention will be given to nonlinear electronic and vibrational spectroscopies that selectively probe surfaces and interfaces in a non-invasive manner, such as sum-frequency generation (SFG).
Max ERC Funding
1 498 500 €
Duration
Start date: 2011-09-01, End date: 2016-08-31