Project acronym 1D-Engine
Project 1D-electrons coupled to dissipation: a novel approach for understanding and engineering superconducting materials and devices
Researcher (PI) Adrian KANTIAN
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), PE3, ERC-2017-STG
Summary Correlated electrons are at the forefront of condensed matter theory. Interacting quasi-1D electrons have seen vast progress in analytical and numerical theory, and thus in fundamental understanding and quantitative prediction. Yet, in the 1D limit fluctuations preclude important technological use, particularly of superconductors. In contrast, high-Tc superconductors in 2D/3D are not precluded by fluctuations, but lack a fundamental theory, making prediction and engineering of their properties, a major goal in physics, very difficult. This project aims to combine the advantages of both areas by making major progress in the theory of quasi-1D electrons coupled to an electron bath, in part building on recent breakthroughs (with the PI's extensive involvement) in simulating 1D and 2D electrons with parallelized density matrix renormalization group (pDMRG) numerics. Such theory will fundamentally advance the study of open electron systems, and show how to use 1D materials as elements of new superconducting (SC) devices and materials: 1) It will enable a new state of matter, 1D electrons with true SC order. Fluctuations from the electronic liquid, such as graphene, could also enable nanoscale wires to appear SC at high temperatures. 2) A new approach for the deliberate engineering of a high-Tc superconductor. In 1D, how electrons pair by repulsive interactions is understood and can be predicted. Stabilization by a reservoir - formed by a parallel array of many such 1D systems - offers a superconductor for which all factors setting Tc are known and can be optimized. 3) Many existing superconductors with repulsive electron pairing, all presently not understood, can be cast as 1D electrons coupled to a bath. Developing chain-DMFT theory based on pDMRG will allow these materials' SC properties to be simulated and understood for the first time. 4) The insights gained will be translated to 2D superconductors to study how they could be enhanced by contact with electronic liquids.
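For orientation, a minimal toy example of the kind of interacting 1D lattice electrons addressed here is a short spinless-fermion t-V chain, small enough to solve by exact diagonalization; the chain length and couplings below are illustrative only, and the project itself targets far larger systems with pDMRG rather than this brute-force approach.

# Toy model of interacting 1D fermions: ground state of a spinless t-V chain
# by exact diagonalization. L_sites, t, V are illustrative; pDMRG is needed for
# the system sizes relevant to the project.
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import eigsh

L_sites, t, V = 10, 1.0, 2.0        # sites, hopping, nearest-neighbour repulsion
dim = 2 ** L_sites
H = lil_matrix((dim, dim))

for s in range(dim):
    occ = [(s >> i) & 1 for i in range(L_sites)]
    # interaction: V * n_i * n_{i+1}
    H[s, s] += V * sum(occ[i] * occ[i + 1] for i in range(L_sites - 1))
    # hopping: -t (c^+_i c_{i+1} + h.c.); adjacent sites carry no fermionic sign
    for i in range(L_sites - 1):
        if occ[i] != occ[i + 1]:
            s2 = s ^ (1 << i) ^ (1 << (i + 1))
            H[s, s2] += -t

E0 = eigsh(H.tocsr(), k=1, which='SA', return_eigenvectors=False)[0]
print(f"ground-state energy of the {L_sites}-site toy chain: {E0:.4f}")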
Max ERC Funding
1 491 013 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym 3DWATERWAVES
Project Mathematical aspects of three-dimensional water waves with vorticity
Researcher (PI) Erik Torsten Wahlén
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary The goal of this project is to develop a mathematical theory for steady three-dimensional water waves with vorticity. The mathematical model consists of the incompressible Euler equations with a free surface, and vorticity is important for modelling the interaction of surface waves with non-uniform currents. In the two-dimensional case, there has been a lot of progress on water waves with vorticity in the last decade. This progress has mainly been based on the stream function formulation, in which the problem is reformulated as a nonlinear elliptic free boundary problem. An analogue of this formulation is not available in three dimensions, and the theory has therefore so far been restricted to irrotational flow. In this project we seek to go beyond this restriction using two different approaches. In the first approach we will adapt methods which have been used to construct three-dimensional ideal flows with vorticity in domains with a fixed boundary to the free boundary context (for example Beltrami flows). In the second approach we will develop methods which are new even in the case of a fixed boundary, by performing a detailed study of the structure of the equations close to a given shear flow using ideas from infinite-dimensional bifurcation theory. This involves handling infinitely many resonances.
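For context, the two-dimensional stream-function formulation referred to above can be written, in one common normalization (vorticity function \gamma, gravity g, relative mass flux m, Bernoulli constant Q), as the nonlinear elliptic free boundary problem

\[
\begin{aligned}
\Delta\psi &= -\gamma(\psi) && \text{in } -d < y < \eta(x),\\
\psi &= 0 && \text{on } y = \eta(x),\\
\psi &= -m && \text{on } y = -d,\\
|\nabla\psi|^{2} + 2g\,(y + d) &= Q && \text{on } y = \eta(x),
\end{aligned}
\]

and it is the absence of a three-dimensional analogue of this scalar formulation that has so far restricted the theory to irrotational flow.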
Max ERC Funding
1 203 627 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym AFRODITE
Project Advanced Fluid Research On Drag reduction In Turbulence Experiments
Researcher (PI) Jens Henrik Mikael Fransson
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE8, ERC-2010-StG_20091028
Summary A hot topic in today's debate on global warming is drag reduction in aeronautics. The most beneficial concept for drag reduction is to keep the flow over the major portion of the airfoil laminar. Estimates show that the potential drag reduction can be as much as 15%, which would give a significant reduction of NOx and CO emissions in the atmosphere, considering that the number of aircraft take-offs in the EU alone is over 19 million per year. An important element for successful flow control, which can lead to a reduced aerodynamic drag, is enhanced physical understanding of the transition-to-turbulence process.
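As a rough illustration of why a laminar boundary layer pays off, the classical flat-plate skin-friction correlations can be compared directly; the Reynolds number below is an arbitrary example, not a value taken from the project.

# Laminar vs turbulent flat-plate skin friction (classical textbook correlations).
# The Reynolds number is purely illustrative.
Re = 5.0e6                           # chord Reynolds number (example)
cf_lam = 1.328 / Re ** 0.5           # Blasius, laminar boundary layer
cf_turb = 0.074 / Re ** 0.2          # one-fifth-power law, turbulent boundary layer
print(f"laminar   Cf = {cf_lam:.5f}")
print(f"turbulent Cf = {cf_turb:.5f}")
print(f"turbulent-to-laminar friction drag ratio ~ {cf_turb / cf_lam:.1f}")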
In previous wind-tunnel measurements we have shown that roughness elements can be used to sensibly delay transition to turbulence. The result is revolutionary, since the common belief has been that surface roughness causes earlier transition and in turn increases the drag, and it is a proof of concept of the passive control method per se. The beauty of a passive control technique is that no external energy has to be added to the flow system in order to perform the control; instead, one uses the energy already present in the flow.
In this project proposal, AFRODITE, we will take this passive control method to the next level by making it twofold: more persistent and more robust. Transition prevention is the goal rather than transition delay, and the method will be extended to simultaneously control separation, which is another unwanted flow phenomenon, especially during aircraft take-offs. AFRODITE will be a catalyst for innovative research, which will lead to a cleaner sky.
Max ERC Funding
1 418 399 €
Duration
Start date: 2010-11-01, End date: 2015-10-31
Project acronym ALMA
Project Attosecond Control of Light and Matter
Researcher (PI) Anne L'Huillier
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE2, ERC-2008-AdG
Summary Attosecond light pulses are generated when an intense laser interacts with a gas target. These pulses are not only short, enabling the study of electronic processes at their natural time scale, but also coherent. The vision of this proposal is to extend temporal coherent control concepts to a completely new regime of time and energy, combining (i) ultrashort pulses, (ii) broadband excitation, (iii) high photon energy, allowing scientists to reach not only valence but also inner shells in atoms and molecules, and, when needed, (iv) high spatial resolution. We want to explore how elementary electronic processes in atoms, molecules and more complex systems can be controlled by using well-designed sequences of attosecond pulses. The proposed research project is organized into four parts:
1. Attosecond control of light, leading to controlled sequences of attosecond pulses. We will develop techniques to generate sequences of attosecond pulses with a variable number of pulses and controlled carrier-envelope-phase variation between consecutive pulses.
2. Attosecond control of electronic processes in atoms and molecules. We will investigate the dynamics and coherence of phenomena induced by attosecond excitation of electron wave packets in various systems, and we will explore how they can be controlled by a controlled sequence of ultrashort pulses.
3. Intense attosecond sources to reach the nonlinear regime. We will optimize attosecond light sources in a systematic way, including amplification of the radiation by injecting it into a free-electron laser. This will open up the possibility to develop nonlinear measurement and control schemes.
4. Attosecond control in more complex systems, including high spatial resolution. We will develop ultrafast microscopy techniques in order to obtain meaningful temporal information in surface and solid-state physics. Two directions will be explored: digital in-line microscopic holography and photoemission electron microscopy.
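For illustration, the attosecond pulse trains referred to above arise because the gas emits phase-locked odd harmonics of the driving laser, which add coherently into one burst per half laser cycle; the sketch below assumes an 800 nm driver and an arbitrary range of plateau harmonics.

# Phase-locked odd harmonics of an 800 nm driver summing to an attosecond
# pulse train (one burst every half laser period). Harmonic range is illustrative.
import numpy as np

lam, c = 800e-9, 2.998e8
T = lam / c                              # laser period, ~2.67 fs
w0 = 2 * np.pi / T
t = np.linspace(-1.5 * T, 1.5 * T, 20000)

E = sum(np.cos(q * w0 * t) for q in range(11, 32, 2))   # odd harmonics 11..31
I = E ** 2

# locate the bursts: local maxima above 90% of the global maximum
idx = [i for i in range(1, len(t) - 1)
       if I[i] > 0.9 * I.max() and I[i] >= I[i - 1] and I[i] >= I[i + 1]]
spacing = np.mean(np.diff(t[idx]))
print(f"laser period  : {T * 1e15:.2f} fs")
print(f"burst spacing : {spacing * 1e15:.2f} fs (= T/2)")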
Max ERC Funding
2 250 000 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym ALPAM
Project Atomic-Level Physics of Advanced Materials
Researcher (PI) Börje Johansson
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE5, ERC-2008-AdG
Summary Most technological materials have been developed by very expensive and cumbersome trial-and-error methods. On the other hand, computer-based theoretical design of advanced materials is an area where rapid and extensive developments are taking place. Within my group, new theoretical tools have now been established which are extremely well suited to the study of complex materials. In this approach, basic quantum mechanical theories are used to describe fundamental properties of alloys and compounds. The utilization of such calculations to investigate possible optimizations of certain key properties represents a major departure from the traditional design philosophy. The purpose of my project is to build up new competence in the field of computer-aided simulations of advanced materials. The main goal will be to achieve a deep understanding of the behaviour of complex metallic systems under equilibrium and non-equilibrium conditions at the atomic level by studying their electronic, magnetic and atomic structure using the most modern and advanced computational methods. This will enable us to establish a set of materials parameters and composition-structure-property relations that are needed for materials optimization.
The research will be focused on fundamental technological properties related to defects in advanced metallic alloys (high-performance steels, superalloys, and refractory, energy-related and geochemical materials) and alloy phases (solid solutions, intermetallic compounds), which will be studied by means of parameter-free atomistic simulations combined with continuum modelling. As a first example, we will study the Fe-Cr system, which is of great interest to industry as well as in connection with nuclear waste. The Fe-Cr-Ni system will form another large group of materials under the aegis of this project. Special emphasis will also be placed on those Fe-alloys which exist under extreme conditions and are possible candidates for the Earth's core.
Max ERC Funding
2 000 000 €
Duration
Start date: 2009-03-01, End date: 2014-02-28
Project acronym AMIMOS
Project Agile MIMO Systems for Communications, Biomedicine, and Defense
Researcher (PI) Bjorn Ottersten
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE7, ERC-2008-AdG
Summary This proposal targets the emerging frontier research field of multiple-input multiple-output (MIMO) systems along with several innovative and somewhat unconventional applications of such systems. The use of arrays of transmitters and receivers will have a profound impact on future medical imaging/therapy systems, radar systems, and radio communication networks. Multiple transmitters provide a tremendous versatility and allow waveforms to be adapted temporally and spatially to environmental conditions. This is useful for individually tailored illumination of human tissue in biomedical imaging or ultrasound therapy. In radar systems, multiple transmit beams can be formed simultaneously via separate waveform designs allowing accurate target classification. In a wireless communication system, multiple communication signals can be directed to one or more users at the same time on the same frequency carrier. In addition, multiple receivers can be used in the above applications to provide increased detection performance, interference rejection, and improved estimation accuracy. The joint modelling, analysis, and design of these multidimensional transmit and receive schemes form the core of this research proposal. Ultimately, our research aims at developing the fundamental tools that will allow the design of wireless communication systems with an order-of-magnitude higher capacity at a lower cost than today; of ultrasound therapy systems maximizing delivered power while reducing treatment duration and unwanted illumination; and of distributed aperture multi-beam radars allowing more effective target location, identification, and classification. Europe has several successful industries that are active in biomedical imaging/therapy, radar systems, and wireless communications. The future success of these sectors critically depends on the ability to innovate and integrate new technology.
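A minimal sketch of the multi-user transmit idea described above is zero-forcing precoding, a standard textbook scheme in which a known channel matrix is (pseudo-)inverted so that each user receives only its own stream; the antenna and user counts and the random channel below are illustrative.

# Zero-forcing precoding: send independent symbols to several single-antenna
# users on the same frequency carrier. All sizes and the channel are examples.
import numpy as np

rng = np.random.default_rng(0)
M, K = 8, 3                                  # transmit antennas, users
H = (rng.standard_normal((K, M)) + 1j * rng.standard_normal((K, M))) / np.sqrt(2)

W = H.conj().T @ np.linalg.inv(H @ H.conj().T)   # right pseudo-inverse of H
W /= np.linalg.norm(W)                           # total transmit-power normalization

bits = rng.integers(0, 2, size=(K, 2))
s = ((2 * bits[:, 0] - 1) + 1j * (2 * bits[:, 1] - 1)) / np.sqrt(2)   # QPSK symbols
y = H @ (W @ s)                                  # noiseless received signals

print("effective channel H @ W (scaled identity):")
print(np.round(H @ W, 3))
print("received / intended symbol ratio per user:", np.round(y / s, 3))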
Max ERC Funding
1 872 720 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym ANSR
Project Ab initio approach to nuclear structure and reactions (++)
Researcher (PI) Christian Erik Forssén
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Starting Grant (StG), PE2, ERC-2009-StG
Summary Today, much interest in several fields of physics is devoted to the study of small, open quantum systems, whose properties are profoundly affected by the environment; i.e., the continuum of decay channels. In nuclear physics, these problems were originally studied in the context of nuclear reactions but their importance has been reestablished with the advent of radioactive-beam physics and the resulting interest in exotic nuclei. In particular, strong theory initiatives in this area of research will be instrumental for the success of the experimental program at the Facility for Antiproton and Ion Research (FAIR) in Germany. In addition, many of the aspects of open quantum systems are also being explored in the rapidly evolving research on ultracold atomic gases, quantum dots, and other nanodevices. A first-principles description of open quantum systems presents a substantial theoretical and computational challenge. However, the current availability of enormous computing power has allowed theorists to make spectacular progress on problems that were previously thought intractable. The importance of computational methods to study quantum many-body systems is stressed in this proposal. Our approach is based on the ab initio no-core shell model (NCSM), which is a well-established theoretical framework aimed originally at an exact description of nuclear structure starting from realistic inter-nucleon forces. A successful completion of this project requires extensions of the NCSM mathematical framework and the development of highly advanced computer codes. The '++' in the project title indicates the interdisciplinary aspects of the present research proposal and the ambition to make a significant impact on connected fields of many-body physics.
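The computational core of configuration-interaction approaches such as the NCSM is the iterative extraction of a few low-lying eigenpairs of an extremely large, sparse Hamiltonian matrix; the schematic sketch below uses a random sparse symmetric matrix as a stand-in for a real nuclear Hamiltonian and purely illustrative dimensions.

# Schematic core step of NCSM-type codes: Lanczos-style extraction of the
# lowest eigenpairs of a large sparse symmetric matrix. The matrix here is a
# random stand-in; realistic NCSM dimensions reach 10^9 and beyond.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import eigsh

rng = np.random.default_rng(1)
n = 20000                                      # toy basis dimension
A = sp.random(n, n, density=5e-4, random_state=rng, format='csr')
H = 0.5 * (A + A.T) - 5.0 * sp.identity(n)     # symmetric, shifted toy "Hamiltonian"

levels = eigsh(H, k=3, which='SA', return_eigenvectors=False)
print("three lowest toy 'levels':", np.round(np.sort(levels), 4))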
Max ERC Funding
1 304 800 €
Duration
Start date: 2009-12-01, End date: 2014-11-30
Project acronym APPROXNP
Project Approximation of NP-hard optimization problems
Researcher (PI) Johan Håstad
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE6, ERC-2008-AdG
Summary The proposed project aims to create a center of excellence devoted to understanding the approximability of NP-hard optimization problems. In particular, for central problems like vertex cover, coloring of graphs, and various constraint satisfaction problems we want to study upper and lower bounds on how well they can be approximated in polynomial time. Many existing strong results are based on what is known as the Unique Games Conjecture (UGC), and a significant part of the project will be devoted to studying this conjecture. We expect that a major step needed in this process is to further develop the understanding of Boolean functions on the Boolean hypercube. We anticipate that the tools needed for this will come in the form of harmonic analysis, which in its turn will rely on the corresponding results in the analysis of functions over the domain of real numbers.
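As a concrete instance of the kind of guarantee studied here, the classical factor-2 approximation for vertex cover (take both endpoints of every edge not yet covered, i.e. of a maximal matching) fits in a few lines; the example graph is arbitrary.

# Classical 2-approximation for minimum vertex cover: repeatedly pick both
# endpoints of an uncovered edge (equivalently, of a maximal matching).
def vertex_cover_2approx(edges):
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update((u, v))
    return cover

edges = [(0, 1), (0, 2), (1, 2), (2, 3), (3, 4), (4, 5)]   # toy graph
cover = vertex_cover_2approx(edges)
assert all(u in cover or v in cover for u, v in edges)
print("cover:", sorted(cover), "size:", len(cover))
# here the optimum has size 3, and the output is guaranteed to be at most twice optimal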
Max ERC Funding
2 376 000 €
Duration
Start date: 2009-01-01, End date: 2014-12-31
Project acronym ASD
Project Atomistic Spin-Dynamics; Methodology and Applications
Researcher (PI) Olof Ragnar Eriksson
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Advanced Grant (AdG), PE3, ERC-2009-AdG
Summary Our aim is to provide a theoretical framework for studies of dynamical aspects of magnetic materials and magnetisation reversal, with potential applications in magnetic data storage and magnetic memory devices. The project focuses on developing and using an atomistic spin dynamics simulation method. Our goal is to identify novel materials and device geometries with improved performance. The scientific questions which will be addressed concern the understanding of the fundamental temporal limit of magnetisation switching and reversal, and the mechanisms which govern this limit. The methodological developments concern the ability to calculate, from first-principles theory, the interatomic exchange parameters of materials in general, and of correlated electron materials in particular, via the use of dynamical mean-field theory. The theoretical development also involves an atomistic spin dynamics simulation method which, once it has been established, will be released as a public software package. The proposed theoretical research will be intimately connected to world-leading experimental efforts, especially in Europe, where a leading activity in experimental studies of magnetisation dynamics has been established. The ambition of this project is to become world-leading in the theory of simulating spin-dynamics phenomena, and to promote education and training of young researchers. To achieve our goals we will build up an open and lively environment, where the advances in the theoretical knowledge of spin-dynamics phenomena will be used to address important questions in information technology. In this environment the next generation of research leaders will be fostered and trained, thus ensuring that the society of tomorrow is equipped with the scientific competence to tackle the challenges of our future.
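A minimal sketch of the kind of atomistic spin-dynamics integration the project builds on: damped Landau-Lifshitz-Gilbert dynamics for a short classical Heisenberg chain. Exchange, damping, field and time step below are illustrative numbers, not the first-principles parameters the project would compute.

# Landau-Lifshitz-Gilbert dynamics for a small classical Heisenberg spin chain.
# J, alpha, B and dt are illustrative; in the project, exchange parameters would
# come from first-principles (e.g. DMFT-based) calculations.
import numpy as np

N, J, alpha = 10, 1.0, 0.1                 # spins, exchange, Gilbert damping
B = np.array([0.0, 0.0, 0.2])              # external field along z
dt, steps = 0.01, 5000

rng = np.random.default_rng(2)
S = rng.standard_normal((N, 3))
S /= np.linalg.norm(S, axis=1, keepdims=True)      # random unit spins

def effective_field(S):
    """H_i = J (S_{i-1} + S_{i+1}) + B on an open chain."""
    H = np.zeros_like(S)
    H[1:] += J * S[:-1]
    H[:-1] += J * S[1:]
    return H + B

def llg_rhs(S):
    H = effective_field(S)
    SxH = np.cross(S, H)
    return -(SxH + alpha * np.cross(S, SxH)) / (1 + alpha ** 2)

for _ in range(steps):                     # Heun predictor-corrector step
    k1 = llg_rhs(S)
    k2 = llg_rhs(S + dt * k1)
    S += 0.5 * dt * (k1 + k2)
    S /= np.linalg.norm(S, axis=1, keepdims=True)  # keep |S_i| = 1

print("average magnetization after relaxation:", np.round(S.mean(axis=0), 3))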
Max ERC Funding
2 130 000 €
Duration
Start date: 2010-01-01, End date: 2014-12-31
Project acronym ASTRODYN
Project Astrophysical Dynamos
Researcher (PI) Axel Brandenburg
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE9, ERC-2008-AdG
Summary Magnetic fields in stars, planets, accretion discs, and galaxies are believed to be the result of a dynamo process converting kinetic energy into magnetic energy. This work focuses on the solar dynamo, but dynamos in other astrophysical systems will also be addressed. In particular, direct high-resolution three-dimensional simulations are used to understand particular aspects of the solar dynamo and ultimately to simulate the solar dynamo as a whole. Phenomenological approaches will be avoided in favor of obtaining rigorous results. A major problem is catastrophic quenching, i.e. the decline of dynamo effects in inverse proportion to the magnetic Reynolds number, which is huge. Tremendous advances have been made in the last few years since the cause of catastrophic quenching in dynamos has been understood in terms of magnetic helicity evolution. The numerical tools are now in place to allow for magnetic helicity fluxes via coronal mass ejections, thus alleviating catastrophic quenching. This work employs simulations in spherical shells, augmented by Cartesian simulations in special cases. The roles of the near-surface shear layer, the tachocline, as well as pumping in the bulk of the convection zone are to be clarified. The Pencil Code will be used for most applications. The code is third order in time and sixth order in space and is used for solving the hydromagnetic equations. It is a public domain code developed by roughly 20 scientists worldwide and maintained under a central versioning system at Nordita. Automatic nightly tests of currently 30 applications ensure the integrity of the code. It is used for a wide range of applications and may include the effects of radiation, self-gravity, dust, chemistry, variable ionization, and cosmic rays, in addition to those of magnetohydrodynamics. The code with its infrastructure offers a good opportunity for individuals within a broad group of people to develop new tools that may automatically be useful to others.
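For illustration, the class of schemes mentioned (sixth-order accurate in space, third-order in time) can be sketched on a periodic 1D advection problem; this is a generic example of such a scheme, not code from the Pencil Code itself, and the grid and CFL number are arbitrary.

# Generic scheme of the stated class: 6th-order central differences in space,
# 3rd-order (SSP) Runge-Kutta in time, applied to u_t + c u_x = 0 on a periodic
# grid. Resolution, c and the CFL number are illustrative.
import numpy as np

nx, c = 128, 1.0
x = np.linspace(0.0, 1.0, nx, endpoint=False)
dx = x[1] - x[0]
dt = 0.4 * dx / abs(c)
u0 = np.exp(-100 * (x - 0.5) ** 2)         # Gaussian pulse
u = u0.copy()

def dudx(u):
    """Sixth-order central first derivative on a periodic grid."""
    return (-np.roll(u, 3) + 9 * np.roll(u, 2) - 45 * np.roll(u, 1)
            + 45 * np.roll(u, -1) - 9 * np.roll(u, -2) + np.roll(u, -3)) / (60 * dx)

def rhs(u):
    return -c * dudx(u)

for _ in range(int(round(1.0 / dt))):      # advect for one domain traversal
    u1 = u + dt * rhs(u)                                 # SSP RK3, stage 1
    u2 = 0.75 * u + 0.25 * (u1 + dt * rhs(u1))           # stage 2
    u = u / 3.0 + 2.0 / 3.0 * (u2 + dt * rhs(u2))        # stage 3

print(f"max error after one traversal: {np.max(np.abs(u - u0)):.2e}")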
Max ERC Funding
2 220 000 €
Duration
Start date: 2009-02-01, End date: 2014-01-31
Project acronym ASTROGEOBIOSPHERE
Project An astronomical perspective on Earth's geological record and evolution of life
Researcher (PI) Birger Schmitz
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE10, ERC-2011-ADG_20110209
Summary "This project will develop the use of relict, extraterrestrial minerals in Archean to Cenozoic slowly formed sediments as tracers of events in the solar system and cosmos, and to decipher the possible relation between such events and evolution of life and environmental change on Earth. There has been consensus that it would not be possible to reconstruct variations in the flux of different types of meteorites to Earth through the ages. Meteorite falls are rare and meteorites weather and decay rapidly on the Earth surface. However, the last years we have developed the first realistic approach to circumvent these problems. Almost all meteorite types contain a small fraction of spinel minerals that survives weathering and can be recovered from large samples of condensed sediments of any age. Inside the spinels we can locate by synchrotron-light X-ray tomography 1-30 micron sized inclusions of most of the other minerals that made up the original meteorite. With cutting-edge frontier microanalyses such as Ne-21 (solar wind, galactic rays), oxygen isotopes (meteorite group and type) and cosmic ray tracks (supernova densities) we will be able to unravel from the geological record fundamental new information about the solar system at specific times through the past 3.8 Gyr. Variations in flux and types of meteorites may reflect solar-system and galaxy gravity disturbances as well as the sequence of disruptions of the parent bodies for meteorite types known and not yet known. Cosmic-ray tracks in spinels may identify the galactic year (230 Myr) in the geological record. For the first time it will be possible to systematically relate major global biotic and tectonic events, changes in sea-level, climate and asteroid and comet impacts to what happened in the larger astronomical realm. In essence, the project is a robust approach to establish a pioneer ""astrostratigraphy"" for Earth's geological record, complementing existing bio-, chemo-, and magnetostratigraphies."
Max ERC Funding
1 950 000 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym ATMOGAIN
Project Atmospheric Gas-Aerosol Interface: From Fundamental Theory to Global Effects
Researcher (PI) Ilona Anniina Riipinen
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Starting Grant (StG), PE10, ERC-2011-StG_20101014
Summary Atmospheric aerosol particles are a major player in the Earth system: they impact the climate by scattering and absorbing solar radiation, as well as by regulating the properties of clouds. On regional scales, aerosol particles are among the main pollutants deteriorating air quality. Capturing the impact of aerosols is one of the main challenges in understanding the driving forces behind changing climate and air quality.
Atmospheric aerosol numbers are governed by the ultrafine (< 100 nm in diameter) particles. Most of these particles have been formed from atmospheric vapours, and their fate and impacts are governed by the mass transport processes between the gas and particulate phases. These transport processes are currently poorly understood. Correct representation of the aerosol growth/shrinkage by condensation/evaporation of atmospheric vapours is thus a prerequisite for capturing the evolution and impacts of aerosols.
I propose to start a research group that will address the major current unknowns in atmospheric ultrafine particle growth and evaporation. First, we will develop a unified theoretical framework to describe the mass accommodation processes at aerosol surfaces, aiming to resolve the current ambiguity with respect to the uptake of atmospheric vapours by aerosols. Second, we will study the condensational properties of selected organic compounds and their mixtures. Organic compounds are known to contribute significantly to atmospheric aerosol growth, but the properties that govern their condensation, such as saturation vapour pressures and activities, are largely unknown. Third, we aim to resolve the gas and particulate phase processes that govern the growth of realistic atmospheric aerosol. Fourth, we will parameterize ultrafine aerosol growth, implement the parameterizations to chemical transport models, and quantify the impact of these condensation and evaporation processes on global and regional aerosol budgets.
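A minimal sketch of the condensational-growth bookkeeping behind these objectives is the transition-regime vapour flux to a single ultrafine particle, i.e. the continuum (Maxwell) flux multiplied by a Fuchs-Sutugin correction; the vapour and particle properties below are illustrative, the mass accommodation coefficient is set to one (precisely the kind of quantity the first objective targets), and the Kelvin effect is neglected.

# Transition-regime condensational growth of a single particle: continuum flux
# times a Fuchs-Sutugin correction (mass accommodation coefficient = 1, Kelvin
# effect neglected). All vapour/particle properties are illustrative examples.
import numpy as np

R, T = 8.314, 293.0            # gas constant (J mol-1 K-1), temperature (K)
M = 0.15                       # vapour molar mass (kg mol-1), example organic
D = 6.0e-6                     # vapour diffusivity in air (m2 s-1), example
rho_p = 1400.0                 # particle density (kg m-3), example
delta_p = 1.0e-6               # ambient minus surface vapour pressure (Pa), example

def growth_rate(r):
    """dr/dt in m/s for a particle of radius r."""
    c_bar = np.sqrt(8 * R * T / (np.pi * M))             # mean thermal speed of vapour
    Kn = (3 * D / c_bar) / r                             # Knudsen number
    beta = (1 + Kn) / (1 + 1.71 * Kn + 1.33 * Kn ** 2)   # Fuchs-Sutugin, alpha = 1
    return D * M * delta_p * beta / (rho_p * R * T * r)

for d_nm in (3, 10, 50, 100):
    r = 0.5 * d_nm * 1e-9
    print(f"d = {d_nm:3d} nm : growth rate ~ {growth_rate(r) * 3.6e12:6.2f} nm/h")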
Max ERC Funding
1 498 099 €
Duration
Start date: 2011-09-01, End date: 2016-08-31
Project acronym AXION
Project Axions: From Heaven to Earth
Researcher (PI) Frank Wilczek
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Advanced Grant (AdG), PE2, ERC-2016-ADG
Summary Axions are hypothetical particles whose existence would solve two major problems: the strong P, T problem (a major blemish on the standard model); and the dark matter problem. It is a most important goal to either observe or rule out the existence of a cosmic axion background. It appears that decisive observations may be possible, but only after orchestrating insight from specialities ranging from quantum field theory and astrophysical modeling to ultra-low noise quantum measurement theory. Detailed predictions for the magnitude and structure of the cosmic axion background depend on cosmological and astrophysical modeling, which can be constrained by theoretical insight and numerical simulation. In parallel, we must optimize strategies for extracting accessible signals from that very weakly interacting source.
While the existence of axions as fundamental particles remains hypothetical, the equations governing how axions interact with electromagnetic fields also govern (with different parameters) how certain materials interact with electromagnetic fields. Thus those materials embody “emergent” axions. The equations have remarkable properties, which one can test in these materials, and possibly put to practical use.
Closely related to axions, mathematically, are anyons. Anyons are particle-like excitations that elude the familiar classification into bosons and fermions. Theoretical and numerical studies indicate that they are common emergent features of highly entangled states of matter in two dimensions. Recent work suggests the existence of states of matter, both natural and engineered, in which anyon dynamics is both important and experimentally accessible. Since the equations for anyons and axions are remarkably similar, and both have common, deep roots in symmetry and topology, it will be fruitful to consider them together.
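The field equations referred to above are those of axion electrodynamics: adding the coupling \(\mathcal{L}_{a\gamma\gamma} = g_{a\gamma\gamma}\, a\, \mathbf{E}\cdot\mathbf{B}\) to Maxwell theory modifies, in one common convention (Heaviside-Lorentz units), the inhomogeneous equations to

\[
\nabla\cdot\mathbf{E} = \rho - g_{a\gamma\gamma}\,\nabla a\cdot\mathbf{B},
\qquad
\nabla\times\mathbf{B} - \partial_t\mathbf{E} = \mathbf{J} + g_{a\gamma\gamma}\left(\dot{a}\,\mathbf{B} + \nabla a\times\mathbf{E}\right),
\]

while the homogeneous pair is unchanged; the same structure, with different, material-dependent parameters, describes the "emergent" axion response of the materials mentioned above.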
Max ERC Funding
2 324 391 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BOPNIE
Project Boundary value problems for nonlinear integrable equations
Researcher (PI) Jonatan Carl Anders Lenells
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary The purpose of this project is to develop new methods for solving boundary value problems (BVPs) for nonlinear integrable partial differential equations (PDEs). Integrable PDEs can be analyzed by means of the Inverse Scattering Transform, whose introduction was one of the most important developments in the theory of nonlinear PDEs in the 20th century. Until the 1990s the inverse scattering methodology was pursued almost entirely for pure initial-value problems. However, in many laboratory and field situations, the solution is generated by what corresponds to the imposition of boundary conditions rather than initial conditions. Thus, an understanding of BVPs is crucial.
In an exciting sequence of events taking place in the last two decades, new tools have become available to deal with BVPs for integrable PDEs. Although some important issues have already been resolved, several major problems remain open.
The aim of this project is to solve a number of these open problems and to find solutions of BVPs which were heretofore not solvable. More precisely, the proposal has eight objectives:
1. Develop methods for solving problems with time-periodic boundary conditions.
2. Answer some long-standing open questions raised by a series of wave-tank experiments 35 years ago.
3. Develop a new approach for the study of space-periodic solutions.
4. Develop new approaches for the analysis of BVPs for equations with 3 x 3-matrix Lax pairs.
5. Derive new asymptotic formulas by using a nonlinear version of the steepest descent method.
6. Construct disk and disk/black-hole solutions of the stationary axisymmetric Einstein equations.
7. Solve a BVP in Einstein's theory of relativity describing two colliding gravitational waves.
8. Extend the above methods to BVPs in higher dimensions.
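For reference, the Lax-pair structure underlying objectives 4 and 5 can be summarized as follows: an integrable PDE is encoded as the compatibility (zero-curvature) condition of an auxiliary linear system,

\[
\Psi_x = U(x,t,\lambda)\,\Psi, \qquad \Psi_t = V(x,t,\lambda)\,\Psi
\quad\Longrightarrow\quad
U_t - V_x + [U,V] = 0,
\]

where U and V are matrix-valued functions of a spectral parameter \lambda (2 x 2 for the classical examples, 3 x 3 for the equations targeted in objective 4), and the large-time asymptotics of objective 5 are extracted from the associated Riemann-Hilbert problems by the nonlinear steepest descent method.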
Max ERC Funding
2 000 000 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym BUCOPHSYS
Project Bottom-up hybrid control and planning synthesis with application to multi-robot multi-human coordination
Researcher (PI) DIMOS Dimarogonas
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE7, ERC-2014-STG
Summary Current control applications necessitate the treatment of systems with multiple interconnected components, rather than the traditional single component paradigm that has been studied extensively. The individual subsystems may need to fulfil different and possibly conflicting specifications in a real-time manner. At the same time, they may need to fulfill coupled constraints that are defined as relations between their states. Towards this end, the need for methods for decentralized control at the continuous level and planning at the task level becomes apparent. We aim here towards unification of these two complementary approaches. Existing solutions rely on a top down centralized approach. We instead consider here a decentralized, bottom-up solution to the problem. The approach relies on three layers of interaction. In the first layer, agents aim at coordinating in order to fulfil their coupled constraints with limited communication exchange of their state information and design of appropriate feedback controllers; in the second layer, agents coordinate in order to mutually satisfy their discrete tasks through exchange of the corresponding plans in the form of automata; in the third and most challenging layer, the communication exchange for coordination now includes both continuous state and discrete plan/abstraction information. The results will be demonstrated in a scenario involving multiple (possibly human) users and multiple robots.
The unification will yield a completely decentralized system, in which the bottom-up approach to defining tasks, the consideration of coupled constraints, and their combination towards distributed hybrid control and planning in a coordinated fashion call for new ways of thinking and new approaches to analysis, and make the proposal a groundbreaking approach, beyond the state of the art, to the fields of control and computer science.
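A minimal sketch of the first, continuous-feedback layer described above is the standard consensus protocol, in which each agent updates its state using only relative information exchanged with its neighbours; the communication graph, the initial states and the integration step below are illustrative.

# Decentralized consensus: each agent moves toward its neighbours using only
# locally exchanged state information. Graph, states and step size are examples.
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0), (1, 3)]   # example graph on 5 agents
n = 5
L = np.zeros((n, n))                       # graph Laplacian
for i, j in edges:
    L[i, i] += 1; L[j, j] += 1
    L[i, j] -= 1; L[j, i] -= 1

x0 = np.array([4.0, -1.0, 0.5, 2.0, -3.0]) # initial scalar states (example)
x = x0.copy()
dt = 0.05
for _ in range(400):                       # forward-Euler integration of dx/dt = -L x
    x -= dt * (L @ x)

print("final states         :", np.round(x, 3))
print("average initial state:", x0.mean())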
Max ERC Funding
1 498 729 €
Duration
Start date: 2015-03-01, End date: 2020-02-29
Project acronym CC-MEM
Project Coordination and Composability: The Keys to Efficient Memory System Design
Researcher (PI) David BLACK-SCHAFFER
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), PE6, ERC-2016-STG
Summary Computer systems today are power limited. As a result, efficiency gains can be translated into performance. Over the past decade we have been so effective at making computation more efficient that we are now at the point where we spend as much energy moving data (from memory to cache to processor) as we do computing the results. And this trend is only becoming worse as we demand more bandwidth for more powerful processors. To improve performance we need to revisit the way we design memory systems from an energy-first perspective, both at the hardware level and by coordinating data movement between hardware and software.
CC-MEM will address memory system efficiency by redesigning low-level hardware and high-level hardware/software integration for energy efficiency. The key novelty is in developing a framework for creating efficient memory systems. This framework will enable researchers and designers to compose solutions to different memory system problems (through a shared exchange of metadata) and coordinate them towards high-level system efficiency goals (through a shared policy framework). Central to this framework is a bilateral exchange of metadata and policy between hardware and software components. This novel communication will open new challenges and opportunities for fine-grained optimizations, system-level efficiency metrics, and more effective divisions of responsibility between hardware and software components.
CC-MEM will change how researchers and designers approach memory system design from today’s ad hoc development of local solutions to one wherein disparate components can be integrated (composed) and driven (coordinated) by system-level metrics. As a result, we will be able to more intelligently manage data, leading to dramatically lower memory system energy and increased performance, and open new possibilities for hardware and software optimizations.
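The claim that moving data now costs as much energy as computing on it can be made concrete with a back-of-envelope estimate. The per-operation energies below are assumed, illustrative order-of-magnitude values of the kind quoted in the architecture literature, not figures from the proposal.

```python
# Back-of-envelope sketch with assumed, order-of-magnitude energy costs; the point
# is the ratio of data-movement energy to compute energy, not the exact numbers.
PJ = 1e-12
energy_joules = {
    "dp_flop":         20 * PJ,    # one double-precision floating-point operation
    "l1_access_64b":   10 * PJ,    # 64-bit read from L1 cache
    "llc_access_64b":  100 * PJ,   # 64-bit read from last-level cache
    "dram_access_64b": 2000 * PJ,  # 64-bit read from off-chip DRAM
}

# Example: a streaming kernel doing one FLOP per 64-bit operand fetched from DRAM.
flops = 1e9
operands_from_dram = 1e9
e_compute = flops * energy_joules["dp_flop"]
e_movement = operands_from_dram * energy_joules["dram_access_64b"]
print(f"compute:  {e_compute:.2f} J")
print(f"movement: {e_movement:.2f} J  (~{e_movement / e_compute:.0f}x the compute energy)")
```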
Max ERC Funding
1 610 000 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym CC-TOP
Project Cryosphere-Carbon on Top of the Earth (CC-Top): Decreasing Uncertainties of Thawing Permafrost and Collapsing Methane Hydrates in the Arctic
Researcher (PI) Örjan GUSTAFSSON
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Advanced Grant (AdG), PE10, ERC-2015-AdG
Summary The enormous quantities of frozen carbon in the Arctic, held in shallow soils and sediments, act as “capacitors” of the global carbon system. Thawing permafrost (PF) and collapsing methane hydrates are top candidates to cause a net transfer of carbon from land/ocean to the atmosphere this century, yet uncertainties abound.
Our program targets the East Siberian Arctic Ocean (ESAO), the world’s largest shelf sea, as it holds 80% of coastal PF, 80% of subsea PF and 75% of shallow hydrates. Our initial findings (e.g., Science, 2010; Nature, 2012; PNAS, 2013; Nature Geoscience, 2013, 2014) are challenging earlier notions by showing complexities in terrestrial PF-Carbon remobilization and extensive venting of methane from subsea PF/hydrates. The objective of the CC-Top Program is to transform descriptive and data-lean pictures into quantitative understanding of the CC system, to pin down the present and predict future releases from these “Sleeping Giants” of the global carbon system.
The CC-Top program combines unique Arctic field capacities with powerful molecular-isotopic characterization of PF-carbon/methane to break through on:
The “awakening” of terrestrial PF-C pools: CC-Top will employ the great pan-Arctic rivers as natural integrators and, by probing their δ13C/Δ14C and molecular fingerprints, apportion the release fluxes of the different PF-C pools.
The ESAO subsea cryosphere/methane: CC-Top will use recent spatially extensive observations, deep sediment cores and gap-filling expeditions to (i) estimate the distribution of subsea PF and hydrates; (ii) establish the thermal state (thawing rate) of subsea PF-C; (iii) apportion the sources of released methane between subsea PF, shallow hydrates and seepage from the deep petroleum megapool, using source-diagnostic triple-isotope fingerprinting (a minimal sketch of such a mixing calculation follows after this summary).
Arctic Ocean slope hydrates: CC-Top will investigate sites (discovered by us 2008-2014) of collapsed hydrates venting methane, to characterize geospatial distribution and causes of destabilization.
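As a toy illustration of isotope-based source apportionment, the fractions of three methane sources can be solved from two isotopic signatures plus mass balance. All endmember and observed values below are hypothetical placeholders, not data from the programme.

```python
# Toy three-endmember isotopic mixing model (all numbers hypothetical): solve
# f_subseaPF + f_hydrate + f_petroleum = 1 together with mass balance of two
# isotope signatures (e.g. d13C and dD of methane).
import numpy as np

# assumed endmember signatures [d13C (permil), dD (permil)] - placeholders only
endmembers = np.array([
    [-65.0, -230.0],   # subsea permafrost methane
    [-55.0, -190.0],   # shallow hydrate methane
    [-45.0, -160.0],   # deep petroleum (thermogenic) methane
])
observed = np.array([-58.0, -204.0])   # hypothetical observed mixture

A = np.vstack([np.ones(3), endmembers.T])   # rows: mass balance, d13C, dD
b = np.concatenate([[1.0], observed])
fractions = np.linalg.solve(A, b)
for name, f in zip(["subsea PF", "hydrate", "petroleum"], fractions):
    print(f"{name}: {f:.2f}")
```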
Max ERC Funding
2 499 756 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym collectiveQCD
Project Collectivity in small, strongly interacting systems
Researcher (PI) Korinna ZAPP
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE2, ERC-2018-STG
Summary In collisions of heavy nuclei at collider energies, for instance at the Large Hadron Collider (LHC) at CERN, the energy density is so high that an equilibrated Quark-Gluon Plasma (QGP), an exotic state of matter consisting of deconfined quarks and gluons, is formed. In proton-proton (p+p) collisions, on the other hand, the density of produced particles is low. The traditional view on such reactions is that final state particles are free and do not rescatter. This picture is challenged by recent LHC data, which found features in p+p collisions that are indicative of collective behaviour and/or the formation of a hot and dense system. These findings have been taken as signs of QGP formation in p+p reactions. Such an interpretation is complicated by the fact that jets, which are the manifestation of very energetic quarks and gluons, are quenched in heavy ion collisions, but appear to be unmodified in p+p reactions. This is puzzling because collectivity and jet quenching are caused by the same processes. So far there is no consensus about the interpretation of these results, which is also due to a lack of suitable tools.
It is the objective of this proposal to address the question whether there are collective effects in p+p collisions. To this end two models capable of describing all relevant aspects of p+p and heavy ion collisions will be developed. They will be obtained by extending a successful description of p+p to heavy ion reactions and vice versa.
The answer to these questions will either clarify the long-standing problem of how collectivity emerges from fundamental interactions, or it will necessitate qualitative changes to our interpretation of collective phenomena in p+p and/or heavy ion collisions.
The PI is in a unique position to accomplish this goal, as she has spent her entire career working on different aspects of p+p and heavy ion collisions. The group in Lund is the ideal host, as it is very active in developing alternative interpretations of the data.
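Collective behaviour in such collisions is usually quantified through azimuthal anisotropy (flow) coefficients. The sketch below only illustrates the observable itself: it samples particles with a built-in second-order anisotropy and recovers v2 from the event average; it is not a model of p+p or heavy-ion dynamics.

```python
# Toy illustration of the elliptic-flow observable v2 = <cos 2(phi - Psi)>:
# sample azimuthal angles from dN/dphi ~ 1 + 2 v2 cos(2(phi - Psi)) by rejection
# sampling, then recover v2 from the event average.
import numpy as np

rng = np.random.default_rng(1)
v2_true, psi = 0.10, 0.3     # anisotropy strength and event-plane angle (toy values)

def sample_phi(n):
    phis = []
    while len(phis) < n:
        phi = rng.uniform(0.0, 2.0 * np.pi)
        density = 1.0 + 2.0 * v2_true * np.cos(2.0 * (phi - psi))
        if rng.uniform(0.0, 1.0 + 2.0 * v2_true) < density:
            phis.append(phi)
    return np.array(phis)

phi = sample_phi(200_000)
v2_measured = np.mean(np.cos(2.0 * (phi - psi)))
print(f"input v2 = {v2_true}, recovered v2 = {v2_measured:.3f}")
```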
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym COMPASS
Project Colloids with complex interactions: from model atoms to colloidal recognition and bio-inspired self assembly
Researcher (PI) Peter Schurtenberger
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE3, ERC-2013-ADG
Summary Self-assembly is the key construction principle that nature uses so successfully to fabricate its molecular machinery and highly elaborate structures. In this project we will follow nature’s strategies and make a concerted experimental and theoretical effort to study, understand and control self-assembly for a new generation of colloidal building blocks. Starting point will be recent advances in colloid synthesis strategies that have led to a spectacular array of colloids of different shapes, compositions, patterns and functionalities. These allow us to investigate the influence of anisotropy in shape and interactions on aggregation and self-assembly in colloidal suspensions and mixtures. Using responsive particles we will implement colloidal lock-and-key mechanisms and then assemble a library of “colloidal molecules” with well-defined and externally tunable binding sites using microfluidics-based and externally controlled fabrication and sorting principles. We will use them to explore the equilibrium phase behavior of particle systems interacting through a finite number of binding sites. In parallel, we will exploit them and investigate colloid self-assembly into well-defined nanostructures. Here we aim at achieving much more refined control than currently possible by implementing a protein-inspired approach to controlled self-assembly. We combine molecule-like colloidal building blocks that possess directional interactions and externally triggerable specific recognition sites with directed self-assembly where external fields not only facilitate assembly, but also allow fabricating novel structures. We will use the tunable combination of different contributions to the interaction potential between the colloidal building blocks and the ability to create chirality in the assembly to establish the requirements for the controlled formation of tubular shells and thus create a colloid-based minimal model of synthetic virus capsid proteins.
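What a tunable combination of contributions to the interaction potential can look like in practice is sketched below: an isotropic screened-Coulomb repulsion plus a short-range attraction whose strengths act as external tuning knobs. The functional forms and parameters are generic textbook choices, not the project's models.

```python
# Schematic tunable colloidal pair potential (generic forms, toy parameters), in
# units of kT and particle diameters: Yukawa (screened-Coulomb) repulsion plus a
# narrow short-range attraction (e.g. a depletion or patch "bond").
import numpy as np

def pair_potential(r, eps_rep=2.0, kappa=8.0, eps_att=4.0, r_att=1.05, width=0.03):
    """Total U(r)/kT at centre-centre distance r (in particle diameters)."""
    repulsion = eps_rep * np.exp(-kappa * (r - 1.0)) / r
    attraction = -eps_att * np.exp(-((r - r_att) / width) ** 2)
    return repulsion + attraction

for r in np.linspace(1.0, 1.5, 11):
    print(f"r = {r:.2f} sigma   U/kT = {pair_potential(r):+.2f}")
```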
Max ERC Funding
2 498 040 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym COMPENZYMEEVOLUTION
Project Harnessing Proto-Enzymes for Novel Catalytic Functions
Researcher (PI) Shina Caroline Lynn Kamerlin
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), PE4, ERC-2012-StG_20111012
Summary Enzymes are Nature’s catalysts, reducing the timescales of the chemical reactions that drive life from millions of years to seconds. There is also great scope for enzymes as biocatalysts outside the cell, from therapeutic and synthetic applications, to bioremediation and even for the generation of novel biofuels. Recent years have seen several impressive breakthroughs in the design of artificial enzymes, particularly through experimental studies that iteratively introduce random mutations to refine existing systems until a property of interest is observed (directed evolution), as well as examples of de novo enzyme design using combined in silico / in vitro approaches. However, the tremendous catalytic proficiencies of naturally occurring enzymes are, as yet, unmatched by any man-made system, in no small part due to the vastness of the sequence space that needs navigating and the almost surgical precision by which enzymatic catalysis is regulated. The proposed work aims to combine state-of-the-art computational approaches capable of consistently reproducing the catalytic activities of both wild-type and mutant enzymes with novel screening approaches for predicting mutation hotspots, in order to redesign selected showcase systems. Specifically, we aim to (1) map catalytic promiscuity in the alkaline phosphatase superfamily, using the existing multifunctionality of these enzymes as a training set for the introduction of novel functionality, and (2) computationally design enantioselective enzymes, a problem which is of particular importance to the pharmaceutical industry due to the role of chirality in drug efficacy. The resulting theoretical constructs will be subjected to rigorous testing by our collaborators, providing a feedback loop for further design effort and methodology development. In this way, we plan to push existing theoretical tools to the limit in order to bridge the gap that exists between the catalytic proficiencies of biological and man-made catalysts.
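The opening claim, that enzymes shorten reaction timescales from millions of years to seconds, maps onto a change in activation free energy through transition-state theory. The arithmetic below is a standard textbook estimate, not a result of the project.

```python
# Transition-state-theory estimate: how much must the activation free energy drop
# to turn a ~1-million-year uncatalysed half-life into ~1 second? (Textbook arithmetic.)
import math

R = 8.314                              # gas constant, J / (mol K)
T = 298.15                             # K
t_uncat = 1e6 * 365.25 * 24 * 3600     # ~1 million years, in seconds
t_cat = 1.0                            # ~1 second

rate_enhancement = t_uncat / t_cat                 # k_cat / k_uncat
ddG = R * T * math.log(rate_enhancement)           # from k ~ exp(-dG/RT)
print(f"rate enhancement ~ {rate_enhancement:.1e}")
print(f"required lowering of the activation barrier ~ {ddG / 1000:.0f} kJ/mol "
      f"({ddG / 4184:.0f} kcal/mol)")
```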
Max ERC Funding
1 497 667 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym ComplexSwimmers
Project Biocompatible and Interactive Artificial Micro- and Nanoswimmers and Their Applications
Researcher (PI) Giovanni Volpe
Host Institution (HI) GOETEBORGS UNIVERSITET
Call Details Starting Grant (StG), PE4, ERC-2015-STG
Summary Microswimmers, i.e., biological and artificial microscopic objects capable of self-propulsion, have been attracting a growing interest from the biological and physical communities. From the fundamental side, their study can shed light on the far-from-equilibrium physics underlying the adaptive and collective behavior of biological entities such as chemotactic bacteria and eukaryotic cells. From the more applied side, they provide tantalizing options to perform tasks not easily achievable with other available techniques, such as the targeted localization, pick-up and delivery of microscopic and nanoscopic cargoes, e.g., in drug delivery, bioremediation and chemical sensing.
However, there are still several open challenges that need to be tackled in order to achieve the full scientific and technological potential of microswimmers in real-life settings. The main challenges are: (1) to identify a biocompatible propulsion mechanism and energy supply capable of lasting for the whole particle life-cycle; (2) to understand their behavior in complex and crowded environments; (3) to learn how to engineer emergent behaviors; and (4) to scale down their dimensions towards the nanoscale.
This project aims at tackling these challenges by developing biocompatible microswimmers capable of elaborate behaviors, by engineering their performance when interacting with other particles and with a complex environment, and by developing working nanoswimmers.
To achieve these goals, we have laid out a roadmap that will lead us to push the frontiers of the current understanding of active matter both at the mesoscopic and at the nanoscopic scale, and will permit us to develop some technologically disruptive techniques, namely, targeted delivery of cargoes within complex environments, which is of interest for drug delivery and bioremediation, and efficient sorting of chiral nanoparticles, which is of interest for biomedical and pharmaceutical applications.
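A standard minimal model of such microswimmers (not named in the summary) is the active Brownian particle: self-propulsion at constant speed along an orientation that diffuses rotationally, on top of ordinary translational diffusion. The toy 2D integration below uses illustrative parameters and compares the simulated mean-squared displacement with the known analytic result for this model.

```python
# Minimal 2D active Brownian particles (a standard toy model of microswimmers).
# Parameters are illustrative; the analytic MSD for this model is
# MSD(t) = 4 Dt t + (2 v^2 / Dr^2) (Dr t - 1 + exp(-Dr t)).
import numpy as np

rng = np.random.default_rng(2)
N, v, Dt, Dr, dt, steps = 2000, 3.0, 0.1, 1.0, 1e-3, 10_000

x = np.zeros((N, 2))                          # positions
theta = rng.uniform(0.0, 2.0 * np.pi, N)      # swimming directions
for _ in range(steps):
    e = np.stack([np.cos(theta), np.sin(theta)], axis=1)
    x += v * e * dt + np.sqrt(2.0 * Dt * dt) * rng.normal(size=(N, 2))
    theta += np.sqrt(2.0 * Dr * dt) * rng.normal(size=N)   # rotational diffusion

t = steps * dt
msd_sim = np.mean(np.sum(x**2, axis=1))
msd_theory = 4*Dt*t + (2*v**2 / Dr**2) * (Dr*t - 1 + np.exp(-Dr*t))
print(f"simulated MSD = {msd_sim:.1f}, analytic ABP MSD = {msd_theory:.1f}")
```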
Max ERC Funding
1 497 500 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym COOPNET
Project Cooperative Situational Awareness for Wireless Networks
Researcher (PI) Henk Wymeersch
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Starting Grant (StG), PE7, ERC-2010-StG_20091028
Summary Devices in wireless networks are no longer used only for communicating binary information, but also for navigation and to sense their surroundings. We are currently approaching fundamental limitations in terms of communication throughput, position information availability and accuracy, and decision making based on sensory data. The goal of this proposal is to understand how the cooperative nature of future wireless networks can be leveraged to perform timekeeping, positioning, communication, and decision making, so as to obtain orders of magnitude performance improvements compared to current architectures.
Our research will have implications in many fields and will comprise fundamental theoretical contributions as well as a cooperative wireless testbed. The fundamental contributions will lead to a deep understanding of cooperative wireless networks and will enable new pervasive applications which currently cannot be supported. The testbed will be used to validate the research, and will serve as a kernel for other researchers worldwide to advance knowledge on cooperative networks. Our work will build on and consolidate knowledge currently dispersed in different scientific disciplines and communities (such as communication theory, sensor networks, distributed estimation and detection, environmental monitoring, control theory, positioning and timekeeping, distributed optimization). It will give a new thrust to research within those communities and forge relations between them.
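In its simplest, non-cooperative form, positioning from range measurements is a small nonlinear least-squares problem over distances to nodes with known positions. The Gauss-Newton sketch below is a generic illustration of that building block, not the proposal's cooperative algorithms.

```python
# Toy multilateration: estimate one node's position from noisy range measurements
# to anchors with known positions, using a few Gauss-Newton iterations.
import numpy as np

rng = np.random.default_rng(3)
anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
true_pos = np.array([3.0, 7.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1) + 0.1 * rng.normal(size=4)

x = np.array([5.0, 5.0])                       # initial guess
for _ in range(10):
    diff = x - anchors                         # shape (4, 2)
    dist = np.linalg.norm(diff, axis=1)
    residual = dist - ranges
    J = diff / dist[:, None]                   # Jacobian of the distances w.r.t. x
    step, *_ = np.linalg.lstsq(J, -residual, rcond=None)
    x += step

print(f"true position {true_pos}, estimate {np.round(x, 2)}")
```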
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym CurvedSusy
Project Dynamics of Supersymmetry in Curved Space
Researcher (PI) Guido Festuccia
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), PE2, ERC-2014-STG
Summary Quantum field theory provides a theoretical framework to explain quantitatively natural phenomena as diverse as the fluctuations in the cosmic microwave background, superconductivity, and elementary particle interactions in colliders. Even if we use quantum field theories in different settings, their structure and dynamics are still largely mysterious. Weakly coupled systems can be studied perturbatively; however, many natural phenomena are characterized by strong self-interactions (e.g. high-T superconductors, nuclear forces) and their analysis requires going beyond perturbation theory. Supersymmetric field theories are very interesting in this respect because they can be studied exactly even at strong coupling, and their dynamics displays phenomena like confinement or the breaking of chiral symmetries that occur in nature and are very difficult to study analytically.
Recently it was realized that many interesting insights on the dynamics of supersymmetric field theories can be obtained by placing these theories in curved space preserving supersymmetry. These advances have opened new research avenues but also left many important questions unanswered. The aim of our research programme will be to clarify the dynamics of supersymmetric field theories in curved space and use this knowledge to establish new exact results for strongly coupled supersymmetric gauge theories. The novelty of our approach resides in the systematic use of the interplay between the physical properties of a supersymmetric theory and the geometrical properties of the space-time it lives in. The analytical results we will obtain, while derived for very symmetric theories, can be used as a guide in understanding the dynamics of many physical systems. Besides providing new tools to address the dynamics of quantum field theory at strong coupling this line of investigation could lead to new connections between Physics and Mathematics.
Max ERC Funding
1 145 879 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym CUSTOMER
Project Customizable Embedded Real-Time Systems: Challenges and Key Techniques
Researcher (PI) Yi WANG
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Advanced Grant (AdG), PE6, ERC-2018-ADG
Summary Today, many industrial products are defined by software and therefore customizable: their functionalities implemented by software can be modified and extended by dynamic software updates on demand. This trend towards customizable products is rapidly expanding into all domains of IT, including Embedded Real-Time Systems (ERTS) deployed in Cyber-Physical Systems such as cars, medical devices etc. However, the current state-of-practice in safety-critical systems allows hardly any modifications once they are put in operation. The lack of techniques to preserve crucial safety conditions for customizable systems severely restricts the benefits of advances in software-defined systems engineering.
CUSTOMER aims to provide the missing paradigm and technology for building and updating ERTS after deployment – subject to stringent timing constraints, dynamic workloads, and limited resources on complex platforms. CUSTOMER explores research areas crossing two fields, Real-Time Computing and Formal Verification, to develop the key techniques enabling (1) dynamic updates of ERTS in the field, (2) incremental updates over the product's lifetime and (3) safe updates by verification to avoid updates that may compromise system safety.
CUSTOMER will develop a unified model-based framework supported with tools for the design, modelling, verification, deployment and update of ERTS, aiming at advancing the research fields by establishing the missing scientific foundation for multiprocessor real-time computing and providing the next generation of design tools with significantly enhanced capability and scalability, increased by orders of magnitude compared with state-of-the-art tools such as UPPAAL.
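One concrete instance of the timing constraints involved is classical response-time analysis for fixed-priority scheduling on a single core; an update is only safe if such an analysis still succeeds afterwards. The sketch below is textbook material included for illustration; the project itself targets the much harder multiprocessor and dynamic-update setting.

```python
# Textbook response-time analysis for fixed-priority preemptive scheduling on one core:
# R_i = C_i + sum over higher-priority tasks j of ceil(R_i / T_j) * C_j,
# iterated to a fixed point and compared against the deadline (= period here).
import math

tasks = [   # (C = worst-case execution time, T = period = deadline), highest priority first
    (1, 4),
    (2, 6),
    (3, 12),
]

def response_time(i, tasks):
    C_i, T_i = tasks[i]
    R = C_i
    while True:
        R_next = C_i + sum(math.ceil(R / T_j) * C_j for C_j, T_j in tasks[:i])
        if R_next > T_i:
            return None                  # deadline miss: not schedulable as given
        if R_next == R:
            return R
        R = R_next

for i, (C, T) in enumerate(tasks):
    print(f"task {i}: C={C}, T={T}, worst-case response time = {response_time(i, tasks)}")
```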
Max ERC Funding
2 499 894 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym D-SynMA
Project Distributed Synthesis: from Single to Multiple Agents
Researcher (PI) Nir PITERMAN
Host Institution (HI) GOETEBORGS UNIVERSITET
Call Details Consolidator Grant (CoG), PE6, ERC-2017-COG
Summary Computing is changing from living on our desktops and in dedicated devices to being everywhere. In phones, sensors, appliances, and robots – computers (from now on, devices) are everywhere and affect all aspects of our lives. The techniques to make them safe and reliable are being investigated and are starting to emerge and consolidate. However, these techniques enable devices to work in isolation or merely co-exist. We currently do not have techniques that enable the development of real autonomous collaboration between devices. Such techniques will revolutionize all usage of devices and, as a consequence, our lives. Manufacturing, supply chains, transportation, infrastructures, and earth and space exploration would all be transformed by techniques that enable the development of collaborating devices.
When considering isolated (and co-existing) devices, reactive synthesis – the automatic production of plans from high-level specifications – is emerging as a viable tool for the development of robots and reactive software. This is especially important in the context of safety-critical systems, where assurances are required and systems need to have guarantees on performance. The techniques that are developed today to support robust, assured, reliable, and adaptive devices rely on a major change in focus of reactive synthesis. The revolution of correct-by-construction systems from specifications is occurring and is being pushed forward.
However, to take this approach forward so that it also works for real collaboration between devices, theoretical frameworks that enable distributed synthesis are required. Such foundations will enable the correct-by-construction revolution to unleash its potential and allow a multiplicative increase of utility by cooperative computation.
d-SynMA will take distributed synthesis to this new frontier by considering novel interaction and communication concepts that would create an adaptable framework of correct-by-construction application of collaborating devices.
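At its core, reactive synthesis for safety objectives amounts to solving a two-player game on a graph: compute the states from which the controller can forever avoid the bad states, then read a strategy off that winning region. The sketch below solves a tiny explicit safety game; it is a single-agent didactic example, not the distributed framework the project proposes.

```python
# Tiny safety-game solver: controller states choose the successor, environment states
# are adversarial, and the controller wins by never visiting a "bad" state. We compute
# the environment's attractor to the bad states; its complement is the controller's
# winning region, from which a correct-by-construction strategy is extracted.
succ = {                      # explicit toy game graph
    "c0": ["e0", "e1"],
    "c1": ["e1"],
    "e0": ["c0", "bad"],
    "e1": ["c1"],
    "bad": ["bad"],
}
controller_states = {"c0", "c1"}
bad = {"bad"}

attr = set(bad)
changed = True
while changed:                # least fixed point of the environment's attractor
    changed = False
    for s, nxt in succ.items():
        if s in attr:
            continue
        if s in controller_states:
            forced = all(t in attr for t in nxt)    # controller cannot avoid attr
        else:
            forced = any(t in attr for t in nxt)    # environment can move into attr
        if forced:
            attr.add(s)
            changed = True

winning = set(succ) - attr
strategy = {s: next(t for t in succ[s] if t not in attr)
            for s in controller_states & winning}
print("controller wins from:", sorted(winning))
print("synthesized strategy:", strategy)
```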
Max ERC Funding
1 871 272 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym DALDECS
Project Development and Application of Laser Diagnostic Techniques for Combustion Studies
Researcher (PI) Lars Eric Marcus Aldén
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE8, ERC-2009-AdG
Summary This project is directed towards the development of new laser diagnostic techniques and a deepened physical understanding of more established techniques, aiming at new insights into phenomena related to combustion processes. These non-intrusive techniques, with high resolution in space and time, will be used for measurements of key parameters, species concentrations and temperatures. The techniques to be used are: Non-linear optical techniques, mainly polarization spectroscopy (PS). PS will mainly be developed for sensitive detection, with high spatial resolution, of "new" species in the IR region, e.g. individual hydrocarbons, toxic species as well as alkali metal compounds. Multiplex measurements of these species and temperature will be developed, as well as 2D visualization. Quantitative measurements with high precision and accuracy: laser-induced fluorescence and Rayleigh/Raman scattering will be developed for quantitative measurements of species concentrations and 2D temperatures. A new technique will also be developed for single-ended experiments based on picosecond LIDAR. Advanced imaging techniques: new high-speed (10-100 kHz) visualization techniques as well as 3D and even 4D visualization will be developed. In order to properly visualize dense sprays we will develop ballistic imaging as well as a new technique based on structured illumination of the area of interest for suppression of the multiple scattering that normally causes blurring. All the techniques developed above will be used for key studies of various combustion phenomena: turbulent combustion and multiphase conversion processes, e.g. spray combustion and gasification/pyrolysis of solid biofuels. The techniques will also be applied to the development and physical understanding of how combustion could be influenced by plasma/electrical assistance. Finally, the techniques will be prepared for applications in industrial combustion apparatus, e.g. furnaces, gas turbines and IC engines.
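One of the quantitative methods mentioned, Rayleigh-scattering thermometry, rests on simple arithmetic: at constant pressure the signal is proportional to the gas number density and hence inversely proportional to temperature. The sketch below applies that ratio against a reference measurement; the numbers are placeholders and the effective scattering cross-section is assumed unchanged.

```python
# Toy Rayleigh thermometry: at constant pressure S ~ n ~ P/(kB T), so
# T = T_ref * S_ref / S if the effective scattering cross-section is unchanged.
# Placeholder numbers; real measurements need cross-section and background corrections.
T_ref = 295.0                      # K, temperature of the reference (ambient) measurement
S_ref = 1000.0                     # detected Rayleigh signal in the reference gas (counts)
S_flame = [620.0, 310.0, 180.0]    # hypothetical signals at points in a flame

for S in S_flame:
    T = T_ref * S_ref / S
    print(f"signal {S:6.0f} counts  ->  T ~ {T:6.0f} K")
```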
Max ERC Funding
2 466 000 €
Duration
Start date: 2010-02-01, End date: 2015-01-31
Project acronym DarkComb
Project Dark-Soliton Engineering in Microresonator Frequency Combs
Researcher (PI) Victor TORRES COMPANY
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Consolidator Grant (CoG), PE7, ERC-2017-COG
Summary The continuing increase in Internet data traffic is pushing the capacity of single-mode fiber to its fundamental limits. Space division multiplexing (SDM) offers the only remaining physical degree of freedom – the space dimension in the transmission channel – to substantially increase the capacity in lightwave communication systems.
The microresonator comb is an emerging technology platform that enables the generation of an optical frequency comb in a micrometer-scale cavity. Its compact size and compatibility with established semiconductor fabrication techniques promises to revolutionize the fields of frequency synthesis and metrology, and create new mass-market applications.
I envision significant scaling advantages in future fiber-optic communications by merging SDM with microresonator frequency combs. One major obstacle to overcome here is the poor conversion efficiency that can be fundamentally obtained using the most stable and broadest combs generated in microresonators today. I propose to look into the generation of dark, as opposed to bright, temporal solitons in linearly coupled microresonators. The goal is to achieve reliable microresonator combs with exceptionally high power conversion efficiency, resulting in optimal characteristics for SDM applications. The scientific and technological possibilities of this achievement promise significant impact beyond the realm of fiber-optic communications.
My broad international experience, unique background in fiber communications, photonic waveguides and ultrafast photonics, the preliminary results of my group and the available infrastructure at my university place me in an outstanding position to pioneer this new direction of research.
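The standard mean-field description of such combs (not named in the summary) is a Lugiato-Lefever-type equation for the intracavity field. The split-step sketch below integrates a normalized version of it purely to show the structure of the model; sign conventions and parameters are illustrative and are not tuned to the dark-soliton regime the project targets.

```python
# Minimal split-step integrator for a normalized Lugiato-Lefever-type equation
# (illustrative parameters only, not tuned to produce dark solitons).
import numpy as np

N = 256                                          # azimuthal grid points in the resonator
m = np.fft.fftfreq(N, d=2*np.pi/N) * 2*np.pi     # integer azimuthal mode numbers
alpha, beta, F, dt, steps = 2.0, 1.0, 1.5, 1e-3, 20_000   # detuning, dispersion, pump

rng = np.random.default_rng(4)
psi = 0.1 * (rng.normal(size=N) + 1j * rng.normal(size=N))     # weak noise seed
lin = np.exp((-(1 + 1j*alpha) + 1j*(beta/2)*m**2) * dt)        # loss, detuning, dispersion

for k in range(steps):
    psi = np.fft.ifft(lin * np.fft.fft(psi))               # linear part (Fourier space)
    psi = psi * np.exp(1j * np.abs(psi)**2 * dt) + F * dt   # Kerr nonlinearity + pump
    if k % 5000 == 0:
        print(f"step {k:6d}: mean intracavity power = {np.mean(np.abs(psi)**2):.3f}")
```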
Max ERC Funding
2 259 523 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym DARKJETS
Project Discovery strategies for Dark Matter and new phenomena in hadronic signatures with the ATLAS detector at the Large Hadron Collider
Researcher (PI) Caterina Doglioni
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE2, ERC-2015-STG
Summary The Standard Model of Particle Physics describes the fundamental components of ordinary matter and their interactions. Despite its success in predicting many experimental results, the Standard Model fails to account for a number of interesting phenomena. One phenomenon of particular interest is the large excess of unobservable (Dark) matter in the Universe. This excess cannot be explained by Standard Model particles. A compelling hypothesis is that Dark Matter is comprised of particles that can be produced in the proton-proton collisions from the Large Hadron Collider (LHC) at CERN.
Within this project, I will build a team of researchers at Lund University dedicated to searches for signals of the presence of Dark Matter particles. The discovery strategies employed seek the decays of particles that either mediate the interactions between Dark and Standard Model particles or are produced in association with Dark Matter. These new particles manifest in detectors as two, three, or four collimated jets of particles (hadronic jets).
The LHC will resume delivery of proton-proton collisions to the ATLAS detector in 2015. Searches for new, rare, low mass particles such as Dark Matter mediators have so far been hindered by constraints on the rates of data that can be stored. These constraints will be overcome through the implementation of a novel real-time data analysis technique and a new search signature, both introduced to ATLAS by this project. The coincidence of this project with the upcoming LHC runs and the software and hardware improvements within the ATLAS detector is a unique opportunity to increase the sensitivity to hadronically decaying new particles by a large margin with respect to any previous searches. The results of these searches will be interpreted within a comprehensive and coherent set of theoretical benchmarks, highlighting the strengths of collider experiments in the global quest for Dark Matter.
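The standard variable in such searches (not spelled out above) is the invariant mass of the jet system, in which a new mediator would show up as a localized excess over a smooth background. The sketch below simply computes that mass from two jets' kinematics, with made-up numbers.

```python
# Dijet invariant mass from jet kinematics (pt, eta, phi, m), the variable in which
# a resonance decaying to two jets would appear as a bump. Toy jet values.
import math

def four_momentum(pt, eta, phi, m):
    px, py = pt * math.cos(phi), pt * math.sin(phi)
    pz = pt * math.sinh(eta)
    E = math.sqrt(px**2 + py**2 + pz**2 + m**2)
    return E, px, py, pz

def invariant_mass(jets):
    E, px, py, pz = (sum(c) for c in zip(*(four_momentum(*j) for j in jets)))
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

jet1 = (450.0,  0.6, 0.1, 20.0)   # hypothetical jet: pt [GeV], eta, phi, mass [GeV]
jet2 = (430.0, -0.4, 3.2, 25.0)
print(f"m_jj = {invariant_mass([jet1, jet2]):.1f} GeV")
```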
Max ERC Funding
1 268 076 €
Duration
Start date: 2016-02-01, End date: 2021-01-31
Project acronym DEVOCEAN
Project Impact of diatom evolution on the oceans
Researcher (PI) Daniel CONLEY
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE10, ERC-2018-ADG
Summary Motivated by a series of recent discoveries, DEVOCEAN will provide the first comprehensive evaluation of the emergence of diatoms and their impact on the global biogeochemical cycle of silica, carbon and other nutrients that regulate ocean productivity and ultimately climate. I propose that the proliferation of phytoplankton that occurred after the Permian-Triassic extinction, in particular the diatoms, fundamentally influenced oceanic environments through the enhancement of carbon export to depth as part of the biological pump. Although molecular clocks suggest that diatoms evolved over 200 Ma ago, this result has been largely ignored because of the lack of diatoms in the geologic fossil record, with most studies therefore focusing on diversification during the Cenozoic, when abundant diatom fossils are found. Much of the older fossil evidence has likely been destroyed by dissolution during diagenesis, subducted or is concealed deep within the Earth under many layers of rock. DEVOCEAN will provide evidence on diatom evolution and speciation in the geological record by examining formations representing locations in which diatoms are likely to have accumulated in ocean sediments. We will generate robust estimates of the timing and magnitude of dissolved Si drawdown following the origin of diatoms using the isotopic silicon composition of fossil sponge spicules and radiolarians. The project will also provide fundamental new insights into the timing of dissolved Si drawdown and other key events, which reorganized the distribution of carbon and nutrients in seawater, changing energy flows and productivity in the biological communities of the ancient oceans.
Max ERC Funding
2 500 000 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym DisDyn
Project Distributed and Dynamic Graph Algorithms and Complexity
Researcher (PI) Danupon NA NONGKAI
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE6, ERC-2016-STG
Summary This project aims to (i) resolve challenging graph problems in distributed and dynamic settings, with a focus on connectivity problems (such as computing edge connectivity and distances), and (ii) along the way, develop a systematic approach to attacking problems in these settings by thoroughly exploring the relevant algorithmic and complexity-theoretic landscapes. Tasks include:
- building a hierarchy of intermediate computational models so that designing algorithms and proving lower bounds can be done in several intermediate steps,
- explaining the limits of algorithms by proving conditional lower bounds based on old and new reasonable conjectures, and
- connecting techniques in the two settings to generate new insights that are unlikely to emerge from the isolated viewpoint of a single field.
The project will take advantage of, and contribute to, developments in many young fields in theoretical computer science, such as fine-grained complexity and sublinear algorithms. Resolving any one of the connectivity problems will already be a groundbreaking result. However, given the approach, it is likely that one breakthrough will lead to many others.
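For orientation only, the sketch below shows the simplest special case of dynamic connectivity: maintaining components under edge insertions alone with a union-find structure. This is standard textbook material, not the project's algorithms; the fully dynamic and distributed variants targeted above (edge connectivity, distances) admit no such simple solution. All names in the sketch are illustrative.

# Minimal Python sketch: incremental (insert-only) dynamic connectivity
# via union-find with path halving and union by rank.

class UnionFind:
    def __init__(self, n):
        self.parent = list(range(n))
        self.rank = [0] * n

    def find(self, x):
        # Follow parent pointers to the root, halving the path as we go.
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]
            x = self.parent[x]
        return x

    def union(self, x, y):
        # Merge the components of x and y; return False if already connected.
        rx, ry = self.find(x), self.find(y)
        if rx == ry:
            return False
        if self.rank[rx] < self.rank[ry]:
            rx, ry = ry, rx
        self.parent[ry] = rx
        if self.rank[rx] == self.rank[ry]:
            self.rank[rx] += 1
        return True

uf = UnionFind(5)
uf.union(0, 1)
uf.union(3, 4)
print(uf.find(0) == uf.find(1))  # True: 0 and 1 are connected
print(uf.find(1) == uf.find(3))  # False: different components

Edge deletions, distance maintenance, and distributed execution are precisely where this simple picture breaks down, which is the regime the project addresses.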
Max ERC Funding
1 500 000 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym DM
Project Dirac Materials
Researcher (PI) Alexander Balatsky
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE3, ERC-2012-ADG_20120216
Summary "The elegant Dirac equation, describing the linear dispersion (energy/momentum) relation of electrons at relativistic speeds, has profound consequences such as the prediction of antiparticles, reflection less tunneling (Klein paradox) and others. Recent discovery of graphene and topological insulators (TI) highlights the scientific importance and technological promise of materials with “relativistic Dirac dispersion"" of electrons for functional materials and device applications with novel functionalities. One might use term ‘Dirac materials’ to encompass a subset of (materials) systems in which the low energy phase space for fermion excitations is reduced compared to conventional band structure predictions (i.e. point or lines of nodes vs. full Fermi Surface).
Dirac materials are characterized by universal low energy properties due to presence of the nodal excitations. It is this reduction of phase space due to additional symmetries that can be turned on and off that opens a new door to functionality of Dirac materials.
We propose to use the sensitivity of nodes in the electron spectrum of Dirac materials to induce controlled modifications of the Dirac points/lines via band structure engineering in artificial structures and via inelastic scattering processes with controlled doping. Proposed research will expand our theoretical understanding and guide design of materials and engineered geometries that allow tunable energy profiles of Dirac carriers."
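For reference, the low-energy Hamiltonian and dispersion near a Dirac node take the standard massless-Dirac form (textbook material, not specific to this proposal), with v_F the Fermi velocity and sigma the (pseudo)spin Pauli matrices:

H = \hbar v_F \, \boldsymbol{\sigma} \cdot \mathbf{k}, \qquad E_{\pm}(\mathbf{k}) = \pm\, \hbar v_F \, |\mathbf{k}|

The vanishing density of states at the node is what reduces the low-energy phase space relative to a conventional Fermi surface, in the sense used above.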
Max ERC Funding
1 700 000 €
Duration
Start date: 2013-04-01, End date: 2018-03-31
Project acronym e-NeuroPharma
Project Electronic Neuropharmacology
Researcher (PI) Rolf Magnus BERGGREN
Host Institution (HI) LINKOPINGS UNIVERSITET
Call Details Advanced Grant (AdG), PE5, ERC-2018-ADG
Summary As the population ages, neurodegenerative diseases (ND) will have a devastating impact on individuals and society. Despite enormous research efforts there is still no cure for these diseases, only care! The origin of ND is hugely complex, spanning from the molecular level to systemic processes, and causes malfunctioning of signalling in the central nervous system (CNS). This signalling includes the coupled processing of biochemical and electrical signals; however, current approaches to symptomatic and disease-modifying treatment are based on biochemical approaches alone.
Organic bioelectronics has arisen as a promising technology providing signal translation, as sensors and modulators, across the biology-technology interface; in particular, it has proven unique in neuronal applications. There is great opportunity in organic bioelectronics since it can complement biochemical pharmacology to enable a twinned electric-biochemical therapy for ND and neurological disorders. However, this technology is traditionally manufactured on stand-alone substrates. Even though organic bioelectronics has been manufactured on flexible and soft carriers in the past, current technology consumes space and volume in ways that, when applied to the CNS, rule out close proximity and amalgamation between the bioelectronic technology and CNS components – features that are needed in order to reach high therapeutic efficacy.
e-NeuroPharma includes the development of innovative organic bioelectronics that can be manufactured in vivo within the brain. The overall aim is to evaluate and develop electrodes, delivery devices and sensors that enable a twinned biochemical-electric therapy approach to combat ND and other neurological disorders. e-NeuroPharma will focus on the development of materials that can cross the blood-brain barrier, that self-organize and self-polymerize along CNS components, and that record and regulate electrical, electrochemical and physical parameters relevant to ND and other neurological disorders.
Max ERC Funding
3 237 335 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym ECOHERB
Project Drivers and impacts of invertebrate herbivores across forest ecosystems globally.
Researcher (PI) Daniel Metcalfe
Host Institution (HI) LUNDS UNIVERSITET
Call Details Consolidator Grant (CoG), PE10, ERC-2015-CoG
Summary Forests slow global climate change by absorbing atmospheric carbon dioxide, but this ecosystem service is limited by soil nutrients. Herbivores potentially alter soil nutrients in a range of ways, but these effects have mostly been recorded only for large mammals. By comparison, the impacts of the abundant invertebrates in forests have largely been ignored and are not included in the current models used to generate the climate predictions so vital for designing governmental policies.
The proposed project will use a pioneering new interdisciplinary approach to provide the most complete picture yet available of the rates, underlying drivers and ultimate impacts of key nutrient inputs from invertebrate herbivores across forest ecosystems worldwide. Specifically, we will:
(1) Establish a network of herbivory monitoring stations across all major forest types, and across key environmental gradients (temperature, rainfall, ecosystem development).
(2) Perform laboratory experiments to examine the effects of herbivore excreta on soil processes under different temperature and moisture conditions.
(3) Integrate this information into a cutting-edge ecosystem model, to generate more accurate predictions of forest carbon sequestration under future climate change.
The network established will form the foundation of a unique long-term global monitoring effort, which we intend to continue long beyond the current funding period. This work represents a powerful blend of several disciplines, harnessing an array of cutting-edge tools to provide fundamentally novel insights into an area of direct and urgent importance for society.
Max ERC Funding
1 750 000 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym ECOSOCPOL
Project Social and Political Economics: Theory and Evidence
Researcher (PI) Torsten Persson
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Advanced Grant (AdG), SH1, ERC-2015-AdG
Summary In this project, I will study how individual and social motives interact to drive individual decisions, a question that has fallen between the cracks of different social-science approaches. I will use a common theoretical framework to approach an important, but badly understood, general question: do social motives reinforce or weaken the effect of changes in individual motives? By adapting this common framework to different applications, I will confront its predictions empirically with several large data sets containing individual-level information. The planned applications include four subprojects in the social, political, and economic spheres: (i) decisions in China on the ethnicity of children in interethnic marriages and matching into such marriages, (ii) decisions on tax evasion in the U.K. and Sweden, (iii) decisions to give political campaign contributions in the U.S., and (iv) decisions about fertility in Sweden. I may also spell out the common lessons from the results on the interaction between individual and social motives in a monograph intended for a broader audience.
Max ERC Funding
1 104 812 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym ELECTRONOPERA
Project Electron dynamics to the Attosecond time scale and Angstrom length scale on low dimensional structures in Operation
Researcher (PI) Anders Mikkelsen
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE3, ERC-2010-StG_20091028
Summary We will develop and use imaging techniques for direct probing of electron dynamics in low-dimensional structures with orders-of-magnitude improvements in time and spatial resolution. We will perform our measurements not only on static structures, but on complex structures under operating conditions. Finally, as our equipment can also probe structural properties from microns down to single-atom defects, we can directly correlate our observations of electron dynamics with knowledge of the geometrical structure. We hope to directly answer central questions in nanophysics about how complex geometric structure on several length scales induces new and surprising electron dynamics, and thus properties, in nanoscale objects.
The low-dimensional semiconductor and metal (nano)structures studied will be chosen to have unique novel properties with potential applications in IT, life science and renewable energy.
To radically increase our diagnostic capabilities we will combine PhotoEmission Electron Microscopy and attosecond XUV/IR laser technology to directly image surface electron dynamics with attosecond time resolution and nanometer lateral resolution. Exploring a completely new realm in terms of timescale with nm resolution, we will start with rather simple structures such as Au nanoparticles and arrays of nanoholes in ultrathin metal films, and gradually increase complexity.
As the first group in the world, we have shown that atomically resolved structural and electrical measurements by Scanning Tunneling Microscopy are possible on complex 1D semiconductor heterostructures. Importantly, our new method allows for direct studies of nanowires in devices.
We can now measure atomic-scale surface chemistry and surface electronic/geometric structure directly on operating nanoscale devices. This is important from a technology point of view and is an excellent playground for understanding the fundamental interplay between electronic and structural properties.
Max ERC Funding
1 419 120 €
Duration
Start date: 2010-10-01, End date: 2015-09-30
Project acronym ERIKLINDAHLERC2007
Project Multiscale and Distributed Computing Algorithms for Biomolecular Simulation and Efficient Free Energy Calculations
Researcher (PI) Erik Lindahl
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE4, ERC-2007-StG
Summary The long-term goal of our research is to advance the state of the art in molecular simulation algorithms by 4-5 orders of magnitude, particularly in the context of the GROMACS software we are developing. This is an immense challenge, but with huge potential rewards: it will be an amazing virtual microscope for basic chemistry, polymer and materials science research; it could help us understand the molecular basis of diseases such as Creutzfeldt-Jakob, and it would enable rational design rather than random screening for future drugs. To realize it, we will focus on four critical topics:
• ALGORITHMS FOR SIMULATION ON GRAPHICS AND OTHER STREAMING PROCESSORS: Graphics cards and the test Intel 80-core chip are not only the most powerful processors available, but this type of streaming architecture will power many supercomputers in 3-5 years, and it is thus critical that we design new “streamable” MD algorithms.
• MULTISCALE MODELING: We will develop virtual-site-based methods to bridge atomic and mesoscopic dynamics, QM/MM, and mixed explicit/implicit solvent models with water layers around macromolecules.
• MULTI-LEVEL PARALLEL & DISTRIBUTED SIMULATION: Distributed computing provides virtually infinite computer power, but has been limited to small systems. We will address this by combining SMP parallelization and Markov State Models that partition phase space into transition/local dynamics to enable distributed simulation of arbitrary systems.
• EFFICIENT FREE ENERGY CALCULATIONS: We will design algorithms for multi-conformational parallel sampling, implement Bennett Acceptance Ratios in GROMACS, add correction terms for PME lattice sums, and combine standard force fields with polarization/multipoles, e.g. AMOEBA.
We have a very strong track record of converting methodological advances into applications, and the results will have an impact on a wide range of fields, from biomolecules and polymer science through materials simulations and nanotechnology.
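As a minimal illustration of the Bennett Acceptance Ratio (BAR) idea named in the last topic, the numpy sketch below solves Bennett's self-consistent equation on synthetic Gaussian work data; it is an illustrative stand-alone sketch, not the GROMACS implementation the project proposes, and the function name and parameters are my own. All energies are in units of kT.

import numpy as np

def bar_delta_f(w_forward, w_reverse, lo=-50.0, hi=50.0, tol=1e-10):
    """BAR estimate of the free energy difference (all quantities in kT).

    w_forward: work values measured for the 0 -> 1 switch
    w_reverse: work values measured for the 1 -> 0 switch
    """
    w_f = np.asarray(w_forward, dtype=float)
    w_r = np.asarray(w_reverse, dtype=float)
    m = np.log(len(w_f) / len(w_r))

    def imbalance(df):
        # Bennett's equation: forward and reverse Fermi-function sums balance
        # at the optimal free energy difference df.
        fwd = np.sum(1.0 / (1.0 + np.exp(m + w_f - df)))
        rev = np.sum(1.0 / (1.0 + np.exp(-m + w_r + df)))
        return fwd - rev

    # imbalance(df) increases monotonically with df, so bisection converges.
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if imbalance(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Synthetic test obeying the Crooks relation with dF = 2 kT, dissipation 1 kT:
rng = np.random.default_rng(0)
w_f = rng.normal(loc=3.0, scale=np.sqrt(2.0), size=20000)   # <W_F> = dF + diss
w_r = rng.normal(loc=-1.0, scale=np.sqrt(2.0), size=20000)  # <W_R> = -dF + diss
print(bar_delta_f(w_f, w_r))  # should be close to 2.0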
Max ERC Funding
992 413 €
Duration
Start date: 2008-09-01, End date: 2013-08-31
Project acronym ESUX
Project Electron Spectroscopy using Ultra Brilliant X-rays - a program for the advancement of state-of-the-art instrumentation and science
Researcher (PI) Nils Ove Tor Mårtensson
Host Institution (HI) Uppsala University
Call Details Advanced Grant (AdG), PE4, ERC-2012-ADG_20120216
Summary The progress of materials science depends critically on access to advanced characterization methods. During the last decades there has been an almost revolutionary development of modern X-ray based tools. We are today at a turning point in the development of synchrotron-radiation-based electron spectroscopy, where the program of the PI has maintained a leading position for 20 years. New techniques are evolving in parallel with the development of a new generation of ultra-brilliant synchrotron radiation (SR) facilities. Electron spectroscopy is one of the most important techniques for the advancement of materials science and one of the cornerstones of the research at SR facilities. It is of highest priority to introduce new types of instruments to push the spectroscopy into new domains of time, spatial, energy and angular resolution.
We have recently accomplished a breakthrough in this field with a new type of electron analyzer, the ArTOF instrument, capable of pushing the energy resolution down to the micro-eV range with a simultaneous increase in transmission of almost three orders of magnitude compared to earlier instruments. In addition, the emission angles of all electrons are determined with high precision and within a wide cone. This allows us to obtain three-dimensional electronic structure information in real time. The present ERC proposal defines an ambitious program to fully exploit these new possibilities in urgent fields of research: in situ time-resolved electronic band structure of organic crystals for electronic applications, time-resolved studies of the 3D band structure of solids and new 2D materials (graphene, topological insulators), electronic structure and dynamics of materials for solar cell applications, and other important research fields. The research program will also adapt the new technique to take advantage of new opportunities opened by emerging ultra-brilliant X-ray sources.
Max ERC Funding
2 486 128 €
Duration
Start date: 2013-02-01, End date: 2018-01-31
Project acronym FLEXBOT
Project Flexible object manipulation based on statistical learning and topological representations
Researcher (PI) Danica Kragic Jensfelt
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE6, ERC-2011-StG_20101014
Summary A vision for the future is autonomous and semi-autonomous systems that perform complex tasks safely and robustly in interaction with humans and the environment. The actions of such a system need to be carefully planned and executed, taking into account the available sensory feedback and knowledge about the environment. Many of the existing approaches view motion planning as a geometrical problem and do not take uncertainty into account. Our goal is to study how different types of representations and algorithms from machine learning and classical mathematics can be used to solve some of the open problems in action recognition and action generation.
FLEXBOT will explore how topological representations can be used for an integrated approach towards i) vision-based understanding of complex human hand motion, ii) mapping and control of robotic hands, and iii) integrating the topological representations with models for high-level task encoding and planning.
Our research opens up new and important areas, scientifically and technologically. Scientifically, we push for a new way of thinking in an area that has traditionally grown out of the mechanical modeling of bodies. Technologically, we will provide methods suitable for the evaluation of new designs of robotic and prosthetic hands. Further development of machine learning and computer vision methods will allow for scene understanding that goes beyond the assumption of worlds of rigid bodies, including complex objects such as hands.
Max ERC Funding
1 398 720 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym FSA
Project Fluid Spectrum Access
Researcher (PI) Alexandre Proutiere
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE7, ERC-2012-StG_20111012
Summary Spectrum is a key and scarce resource in wireless communication networks, and it remains tightly controlled by regulation authorities. Most of the frequency bands are exclusively allocated to a single system licensed to use it everywhere and for long periods of time. This rigid spectrum management model inevitably leads to significant inefficiencies in spectrum use. The explosion of demand for broadband wireless services also calls for more flexible models where much larger spectrum parts could be dynamically shared among users in a fluid manner. In such models, Dynamic Spectrum Access (DSA) techniques will play a major role. These techniques make it possible for radio devices to become frequency-agile, i.e. able to rapidly and dynamically access bands of a wide spectrum part.
The success and spread of dynamic spectrum access strongly rely on the ability for many frequency-agile devices (or systems) to coexist peacefully and efficiently. With multiple interacting devices, the research agenda shifts from spectrum access problems to spectrum sharing problems, which raises original and challenging questions. There may be limited or no communication between the different devices or systems sharing spectrum. We further expect systems to be heterogeneous in their transmission capabilities, but also in the type of service they support. In that context, the design of spectrum access strategies resulting in an efficient and fair spectrum resource use constitutes a challenging puzzle. The broad objective of the proposed research is to develop original analytical and simulation tools to tackle dynamic spectrum sharing issues. The project leverages and marries techniques from distributed optimization and machine learning to design decentralized, efficient, and fair spectrum sharing algorithms. We believe that such algorithms are critical for the birth and rapid expansion of DSA technologies and hence for the development of future wireless broadband systems.
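Purely as an illustration of the kind of learning-based spectrum access alluded to above (a standard single-device UCB1 bandit sketch, not the decentralized multi-device algorithms the project targets; the function name and availability figures are hypothetical), a frequency-agile radio can learn which channel is most often free simply by trying:

import math
import random

def ucb1_channel_selection(availability, horizon=5000, seed=1):
    """Single-device channel selection with the UCB1 rule.

    availability: per-channel probability that a transmission succeeds
    (channel idle). Unknown to the device; it learns by trial.
    """
    random.seed(seed)
    k = len(availability)
    counts = [0] * k        # how often each channel was tried
    successes = [0.0] * k   # how often each trial succeeded
    total_success = 0

    for t in range(1, horizon + 1):
        if t <= k:
            ch = t - 1  # first, try every channel once
        else:
            # Pick the channel with the best optimistic (mean + bonus) estimate.
            ch = max(range(k), key=lambda c: successes[c] / counts[c]
                     + math.sqrt(2.0 * math.log(t) / counts[c]))
        reward = 1 if random.random() < availability[ch] else 0
        counts[ch] += 1
        successes[ch] += reward
        total_success += reward
    return counts, total_success / horizon

counts, rate = ucb1_channel_selection([0.2, 0.5, 0.9])
print(counts)  # most trials concentrate on the best channel (index 2)
print(rate)    # empirical success rate approaches 0.9

With many interacting devices, the interesting (and much harder) questions of fairness, coordination and limited communication arise, which is exactly where the project's research agenda lies.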
Max ERC Funding
1 197 040 €
Duration
Start date: 2012-11-01, End date: 2017-10-31
Project acronym FUN POLYSTORE
Project FUNctionalized POLYmer electrolytes for energy STORagE
Researcher (PI) Daniel BRANDELL
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Consolidator Grant (CoG), PE5, ERC-2017-COG
Summary Besides the need for large-scale implementation of renewable energy sources, there is an equivalent need for new energy storage solutions. This is not least true for the transport sector, where electric vehicles are expanding rapidly. The rich flora of battery chemistries – today crowned by the Li-ion battery – is likewise expected to expand in upcoming years. Novel types of batteries, “post-lithium ion”, will challenge the Li-ion chemistries by advantages in cost, sustainability, elemental abundance or energy density. This requires significant improvements of the materials, not least regarding the electrolyte. The conventional liquid battery electrolytes pose a problem already for the mature Li-ion chemistries due to safety and cost, but are particularly destructive for future battery types such as Li-metal, organic electrodes, Li-S, Li-O2, Na- or Mg-batteries, where rapid degradation and loss of material are associated with incompatibilities with the electrolytes. In this context, solid state polymer electrolytes (SPEs) could provide a considerable improvement.
The field of solid polymer electrolytes (SPEs) is dominated by polyethers, particularly poly(ethylene oxide) (PEO). This application concerns moving out of the established PEO paradigm and exploring alternative polymer hosts for SPEs, primarily polycarbonates and polyesters. These ‘alternative’ polymers are comparatively easy to work with synthetically, and their functionalization is straightforward. The work aims at exploring functionalized alternative polymer hosts for mechanically robust block-copolymer systems, for alternative cation chemistries (Na, Mg, etc.), for extremely high and low electrochemical potentials, and for unstable and easily dissolved electrode materials (sulfur, organic). Moreover, since the ion transport processes in these host materials are fundamentally different from those in polyethers, there is a need to investigate the conduction mechanisms using simulations.
Max ERC Funding
1 950 732 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym FUNMAT
Project Self-Organized Nanostructuring in Functional Thin Film Materials
Researcher (PI) Lars Hultman
Host Institution (HI) LINKOPINGS UNIVERSITET
Call Details Advanced Grant (AdG), PE5, ERC-2008-AdG
Summary I aim to achieve a fundamental understanding of the atomistic kinetic pathways responsible for nanostructure formation and to explore the concept of self-organization by thermodynamic segregation in functional ceramics. The model systems are advanced ceramic thin films, which will be studied under two defining cases: 1) deposition of supersaturated solid solutions or nanocomposites by magnetron sputtering (epitaxy) and arc evaporation, and 2) post-deposition annealing (ageing) of as-synthesized material. Thin-film ceramics are terra incognita for compositions in the miscibility gap. The field is exciting since both surface and in-depth decomposition can take place in the alloys. The methodology is based on combined growth experiments, characterization, and ab initio calculations to identify and describe systems with a large miscibility gap. A hot topic is to elucidate the bonding nature of the cubic-SiNx interfacial phase, discovered by us in TiN/Si3N4, with impact for superhard nanocomposites. I have also pioneered studies of self-organization by spinodal decomposition in TiAlN alloy films (age hardening). Here, the details of metastable c-AlN nm-domain formation are unknown, and the systems HfAlN and ZrAlN are predicted to be even more promising. Other model systems are III-nitrides (band gap engineering), semiconductor/insulator oxides (interface conductivity) and carbides (tribology). The proposed research is exploratory and has the potential to explain outstanding phenomena (Gibbs-Thomson effect, strain, and spinodal decomposition) as well as to discover new phases, for which my group has a track record, backed up by state-of-the-art in situ techniques. One can envision a new class of super-hard all-crystalline ceramic nanocomposites with relevance for a large number of research areas where elevated temperature is of concern, with impact on areas as diverse as microelectronics and cutting tools as well as mechanical and optical components.
Max ERC Funding
2 292 000 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym GAPWAVE ICS
Project Waveguide-type semiconductor integrated circuits (ICs) in gaps between conducting surfaces with texture – architecture, electromagnetic modeling and micromachining
Researcher (PI) Per-Simon Kildal
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Advanced Grant (AdG), PE7, ERC-2012-ADG_20120216
Summary In order to explore and exploit the frequency range from 30 GHz up to THz, new types of transmission lines and semiconductor architectures are needed. Conventional microwave technologies that are commonly used below 30 GHz become either too lossy or too expensive to manufacture, and technologies used in the optical regime are not usable either. The intermediate frequency band is therefore often referred to as the THz gap, indicating the lack of commercializable technologies there.
Professor Kildal has invented a fundamentally new regime of transmission line, referred to as gap waveguides. The basis is newly discovered local waves appearing in the gap between two conducting surfaces, controlled by a texture in one or both of the surfaces. The gap waveguide has been verified below 20 GHz, but it will be more advantageous in the THz gap. The texture will for THz applications be of submillimeter or micrometer scale, realizable by micromachining or etching. Also, there is no need for a dielectric substrate, and there is no need for conductive contact between the two surfaces. Therefore, such gap waveguides and circuits for the THz gap can be manufactured with low cost.
The vision is that the topology of this new regime of gap waveguides will facilitate integration of semiconductor devices, and may lay the foundation for new architectures of transistors and other integrated circuits, being located inside the gap encapsulated by the conductive surfaces themselves. In order to reach this vision new and efficient numerical electromagnetic methods and modeling tools need to be developed, taking advantage of the particular gap waveguide geometry, and being able to connect to or replace the charge transport models for the transistors in the doped semiconductors themselves.
The gap waveguide technology can have a tremendous impact on the exploration of higher frequencies in radio astronomy, communications, and imaging for medical as well as security applications.
Max ERC Funding
1 659 302 €
Duration
Start date: 2013-05-01, End date: 2017-04-30
Project acronym GLOBALVISION
Project Global Optimization Methods in Computer Vision, Pattern Recognition and Medical Imaging
Researcher (PI) Fredrik Kahl
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE5, ERC-2007-StG
Summary Computer vision concerns itself with understanding the real world through the analysis of images. Typical problems are object recognition, medical image segmentation, geometric reconstruction problems and the navigation of autonomous vehicles. Such problems often lead to complicated optimization problems with a mixture of discrete and continuous variables, or even infinite-dimensional variables in terms of curves and surfaces. Today, the state of the art in solving these problems generally relies on heuristic methods that generate only local optima of varying quality. During the last few years, work by the applicant, co-workers, and others has opened new possibilities, and this research project builds on those advances. In this project we will focus on developing new global optimization methods for computing high-quality solutions for a broad class of problems. A guiding principle will be to relax the original, complicated problem to an approximate, simpler one to which globally optimal solutions can more easily be computed. Technically, this relaxed problem is often convex. A crucial point in this approach is to estimate the quality of the exact solution of the approximate problem compared to the (unknown) global optimum of the original problem. Preliminary results have been well received by the research community and we now wish to extend this work to more difficult and more general problem settings, resulting in a thorough re-examination of algorithms used widely in different and trans-disciplinary fields. This project is to be considered a basic research project with relevance to industry. The expected outcome is new knowledge spread to a wide community through scientific papers published in international journals and at conferences, as well as publicly available software.
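As a toy illustration of this relax-and-bound principle (a scipy sketch on a made-up weighted vertex-cover instance, not the project's methods), the code below relaxes binary variables to the interval [0, 1], solves the resulting linear program, rounds the fractional solution, and uses the relaxation value as a certificate of how far the rounded solution can be from the unknown integer optimum:

import numpy as np
from scipy.optimize import linprog

# Toy weighted vertex cover: minimize sum(w_v * x_v) with x_u + x_v >= 1
# on every edge. Relax x in {0,1} to x in [0,1]; the LP optimum is a
# certified lower bound on the integer optimum, and 0.5-rounding is
# guaranteed to cost at most twice the LP value.
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
weights = np.array([2.0, 1.0, 3.0, 1.5])
n = len(weights)

A_ub = np.zeros((len(edges), n))
for row, (u, v) in enumerate(edges):
    A_ub[row, u] = -1.0   # -x_u - x_v <= -1  encodes  x_u + x_v >= 1
    A_ub[row, v] = -1.0
b_ub = -np.ones(len(edges))

res = linprog(weights, A_ub=A_ub, b_ub=b_ub,
              bounds=[(0.0, 1.0)] * n, method="highs")
cover = res.x >= 0.5              # rounding step: always a feasible cover
lower_bound = res.fun             # relaxation value <= true optimum
rounded_cost = weights[cover].sum()
print(lower_bound, rounded_cost)  # rounded_cost <= 2 * lower_bound

The project pursues the same certify-quality-via-relaxation idea for much harder vision problems, where the relaxations are typically convex but far richer than this linear program.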
Max ERC Funding
1 440 000 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym GRINDOOR
Project Green Nanotechnology for the Indoor Environment
Researcher (PI) Claes-Göran Sture Granqvist
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Advanced Grant (AdG), PE5, ERC-2010-AdG_20100224
Summary The GRINDOOR project aims at developing and implementing new materials that enable huge energy savings in buildings and improve the quality of the indoor environment. About 40% of primary energy, and 70% of electricity, is used in buildings, and therefore the outcome of this project can have an impact on long-term energy demand in the EU and the world. It is a highly focused study of new nanomaterials based on some transition metal oxides, which are used for four interrelated applications related to indoor lighting and indoor air: (i) electrochromic coatings are integrated in devices and used in “smart windows” to regulate the inflow of visible light and solar energy in order to minimize air conditioning and create indoor comfort, (ii) thermochromic nanoparticulate coatings are used on windows to provide large temperature-dependent control of the inflow of infrared solar radiation (in stand-alone cases as well as in conjunction with electrochromics), (iii) oxide-based gas sensors are used to measure indoor air quality, especially with regard to formaldehyde, and (iv) photocatalytic coatings are used for indoor air cleaning. The investigated materials have much in common, and a joint and focused study, such as the one proposed here, will generate important new knowledge that can be transferred between the various sub-projects. The new oxide materials are prepared by advanced reactive gas deposition (using unique equipment) and by high-pressure reactive dc magnetron sputtering. The materials are characterized and investigated by a wide range of state-of-the-art techniques.
Max ERC Funding
2 328 726 €
Duration
Start date: 2011-06-01, End date: 2016-05-31
Project acronym HALOGEN
Project Understanding Halogen Bonding in Solution: Investigation of Yet Unexplored Interactions with Applications in Medicinal Chemistry
Researcher (PI) Mate Erdelyi
Host Institution (HI) GOETEBORGS UNIVERSITET
Call Details Starting Grant (StG), PE4, ERC-2010-StG_20091028
Summary Halogen bonding is an electron density donation-based weak interaction that has so far almost exclusively been investigated in computational and crystallographic studies. It shows high similarities to hydrogen bonding; however, its applicability for molecular recognition processes long remained unappreciated and has not been thoroughly explored.
The main goals of this project are (1) to take the major leap from solid-state/computational to solution investigations of halogen bonding by developing novel NMR methods, (2) to use these methods to perform the first ever systematic physicochemical study of halogen bonding in solution, and (3) to apply the gained knowledge in structural biology through elucidation of the anaesthetic binding site of native proteins. This in turn is of direct clinical relevance by providing a long-sought understanding of the disease malignant hyperthermia.
Model compounds will be prepared using solution-phase and solid-supported organic synthesis; NMR methods will be developed for physicochemical studies of molecular recognition processes and applied in structural biology through the study of the interaction of anaesthetics with proteins involved in cellular calcium regulation.
Using a peptidomimetic model system and an outstandingly sensitive NMR technique, I will systematically study the impact of halogen bond donor and acceptor sites, and of electronic and solvent effects, on the strength of the interaction. The proposed method will quantify the relative stability of a strategically designed, cooperatively folding model system.
A second NMR technique will utilize paramagnetic effects and permit simultaneous characterization of bond strength and geometry of weak intermolecular complexes in solution. The technique will first be validated on small, organic model compounds and subsequently be transferred to weak, protein-ligand interactions. It will be exploited to gain an atomic level understanding of anaesthesia.
Max ERC Funding
1 495 630 €
Duration
Start date: 2010-09-01, End date: 2015-08-31
Project acronym HIDDeN
Project HIDDeN - Exploring the Hidden Dusty Nuclei of Galaxies
Researcher (PI) Eva Susanne AALTO
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Advanced Grant (AdG), PE9, ERC-2017-ADG
Summary Luminous infrared galaxies (LIRGs) emit most of their bolometric luminosity in the far-infrared. They are mainly powered by extreme bursts of star formation and/or Active Galactic Nuclei (AGNs; accreting supermassive black holes (SMBHs)) in their centres. LIRGs are the closest examples of rapid evolution in galaxies and a detailed study of LIRGs is critical for our understanding of the cosmic evolution of galaxies and SMBHs. Centres of some LIRGs are deeply obscured and unreachable at optical, IR and even X-ray wavelengths. These hidden nuclei therefore represent a largely unexplored phase of the growth of central regions with their SMBHs. Large growth spurts are suspected to occur when the SMBHs are deeply embedded. Obscured AGNs thus can provide new constraints on the AGN duty cycle, give the full range of environments and astrophysical processes that drive the growth of SMBHs, and help to complete the picture of connections between the host galaxy and SMBH. Many dust embedded AGNs are still to be discovered as studies suggest that a significant fraction of SMBHs may be obscured in the local and more distant Universe.
In the HIDDeN project we use mm and submm observational methods to reach behind the curtain of dust in the most embedded centres of LIRGs, allowing us to undertake ground-breaking studies of heretofore hidden rapid evolutionary phases of nearby galaxy nuclei. HIDDeN takes advantage of emerging opportunities to address the nature of near-field, and redshift z=1-2, obscured AGNs/starbursts and their associated molecular inflows and outflows in the context of their evolution and the starburst-AGN connection. In particular we use the ALMA and NOEMA telescopes, supported by JVLA, LOFAR, HST and future JWST observations, to address four interconnected goals: A. Probing the Dusty Interiors of Compact Obscured Nuclei (CONs); B. The cold winds of change - Molecular Outflows from LIRGs and AGNs; C. The Co-Evolution of Starbursts and AGNs; and D. Are there hidden CONs at z=1-2?
Max ERC Funding
2 496 319 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym HIGH-GEAR
Project High-valent protein-coordinated catalytic metal sites: Geometric and Electronic ARchitecture
Researcher (PI) Martin Ivar HÖGBOM
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Consolidator Grant (CoG), PE4, ERC-2016-COG
Summary It is estimated that almost half of all enzymes utilize metal cofactors for their function, for example the respiratory complexes and the oxygen-evolving photosystem II, the most fundamental requirements for aerobic life as we know it. If we could mimic nature’s use of metals for harvesting sunlight, energy conversion and chemical synthesis it would eliminate the need for fossil fuels and greatly increase the possibilities of chemical industry while reducing the environmental impact. Achieving this type of chemistry is an outstanding testament to evolution and understanding it is a glaring challenge to mankind.
These types of reactions are based on very challenging redox chemistry (involving one or several electrons). The key catalytic species are generally high-valent metal clusters with a varying ligand environment, provided by the protein and other bound molecules, that directly controls the reactivity of the inorganic core. To be able to understand and mimic this chemistry it is of central importance to know the geometric and electronic structures of the metal core as well as the entire ligand environment for these usually short-lived and very reactive intermediates. It has, for a number of reasons, proven extremely challenging to obtain these for protein-coordinated catalysts.
The central goal of this project is to determine true and accurate geometric and electronic structures of high-valent di-nuclear Fe/Fe and Mn/Fe metal sites coordinated in protein matrices known to direct these for varied and important chemistry. By combining new X-ray diffraction based techniques with advanced spectroscopy we aim to define how the protein controls the entatic state as well as reactivity and mechanism for some of the most potent catalysts in nature. The results will serve as a basis for design of oxygen-activating catalysts with novel properties.
Max ERC Funding
1 968 375 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym highECS
Project Reining in the upper bound on Earth’s Climate Sensitivities
Researcher (PI) Thorsten MAURITSEN
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Consolidator Grant (CoG), PE10, ERC-2017-COG
Summary One of the greatest recent advances in climate science is that it is now beyond reasonable doubt that human activity is warming the Earth. The next natural question is by how much the Earth will warm for a given emission – a quantity that will be essential to regulating global warming. Yet, the likely range of 1.5-4.5 K for equilibrium climate sensitivity (ECS) for a doubling of the atmospheric CO2 concentration has not been reduced for decades. In particular the risk of ECS being high is concerning, but also represents a scientifically intriguing challenge.
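As background, ECS is usually defined through the standard linear forcing-feedback energy balance sketched below; the relation and the commonly quoted value of the CO2-doubling forcing are textbook material added here for orientation, not taken from the proposal itself.

```latex
% Standard zero-dimensional energy balance (background, not the project's model):
% N = top-of-atmosphere radiative imbalance, F = radiative forcing,
% \lambda = net feedback parameter, \Delta T = global-mean surface warming.
\begin{align}
  N &= F - \lambda\,\Delta T, \\[2pt]
  \mathrm{ECS} &= \frac{F_{2\times\mathrm{CO_2}}}{\lambda},
  \qquad F_{2\times\mathrm{CO_2}} \approx 3.7\ \mathrm{W\,m^{-2}}
  \quad \text{(at equilibrium, } N = 0\text{)}.
\end{align}
```

A high ECS corresponds to a small net feedback parameter λ; bounding how positive the cloud feedback can be bounds λ from below, which is one way to see how constraints on cloud feedbacks translate into an upper bound on ECS.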
In this project I will conduct unconventional and innovative research designed to limit the upper bound of ECS: I will confront leading hypotheses of extreme cloud feedbacks – the primary potential source of a high ECS – with observations from the full instrumental and satellite records, and proxies from warm and cold past climates. I will investigate how ocean and atmospheric circulations impact cloud feedbacks, and seek the limits for how much past greenhouse warming could have been masked by aerosol cooling.
The highECS project builds on my developments of climate modeling, diagnostics and statistical methods, the strengths of the host institution, and developments in national and international projects. The effort is timely in that the World Climate Research Programme (WCRP) has identified uncertainty in ECS as one of the grand challenges of climate science, while the capacity to observe ongoing climate change and key cloud processes, to extract new proxy evidence of past change, and to perform large computations is greater than ever before.
If successful in my objective of reining in the upper bound on climate sensitivity, this will be a major breakthrough in a nearly 40-year scientific deadlock and reduce the risk of catastrophic climate change – if not, it will indicate that extreme policy measures may be needed to curb future global warming. Either way, the economic value of knowing is tremendous.
Max ERC Funding
1 998 654 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym HISTORICALDATABASE
Project The Swedish historical database project
Researcher (PI) Per Einar Pettersson Lidbom
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Consolidator Grant (CoG), SH1, ERC-2013-CoG
Summary The Swedish historical database project will put together and make publicly available highly disaggregated data on a roughly yearly basis for about 2,500 Swedish administrative districts over the period 1749-1952. The finished data set will consist of comprehensive and detailed information on economic activity, political characteristics, vital statistics, occupational structure, education, social and agricultural statistics and infrastructure investments (e.g., railway construction). The comprehensiveness and complete coverage of historical data at the local administrative level is what makes this project unique from an international perspective. Since Sweden has the longest continuous and reliable data series on population and vital statistics in the world, starting as early as 1749, it is possible to construct a comprehensive panel data set over all 2,500 Swedish local administrative units covering a 200-year period. Consequently, the total number of observations for each variable can be as large as 0.5 million (N=2500×T=200). With this type of rich and disaggregated historical data it becomes possible to get a better understanding of economic growth, structural transformation and economic development. Also, within-country variation allows for more convincing empirical identification strategies such as instrumental variables, regression discontinuities or difference-in-differences estimation. As a case in point, I have demonstrated the potential usefulness of the Swedish historical data by addressing the question of whether redistribution of resources towards the poor differs between types of democracy after democratization. The identification strategy is based on a regression-discontinuity design where the type of democracy is partly a function of population size. This paper is currently under a second-round “revise and resubmit” at Econometrica. After collecting the new data, we intend to study a number of questions related to economic development and growth.
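As an illustration of the kind of regression-discontinuity estimate described above, the sketch below runs a local-linear RD around a hypothetical population cutoff that partly determines the type of local democracy; the cutoff, variable names and simulated data are assumptions for illustration only and do not reproduce the actual study.

```python
# Minimal local-linear regression-discontinuity sketch around a hypothetical
# population cutoff. All names, the cutoff and the data are illustrative.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2500
cutoff = 1500                                    # hypothetical population threshold
df = pd.DataFrame({"population": rng.integers(200, 4000, size=n)})
df["direct_democracy"] = (df["population"] < cutoff).astype(int)
df["running"] = df["population"] - cutoff        # centred running variable

# Simulated outcome: spending on the poor, with a jump of 0.8 at the cutoff.
df["poor_relief"] = (5 + 0.001 * df["running"] + 0.8 * df["direct_democracy"]
                     + rng.normal(0, 1, size=n))

# Restrict to a bandwidth around the cutoff, allow separate slopes on each side.
bw = 500
local = df[df["running"].abs() <= bw]
model = smf.ols("poor_relief ~ direct_democracy * running", data=local).fit()
print(model.params["direct_democracy"])          # RD estimate of the policy jump
```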
Max ERC Funding
1 200 000 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym HYDROCARB
Project Towards a new understanding of carbon processing in freshwaters: methane emission hot spots and carbon burial
Researcher (PI) Sebastian Sobek
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), PE10, ERC-2013-StG
Summary In spite of their small areal extent, inland waters play a vital role in the carbon cycle of the continents, as they emit significant amounts of the greenhouse gases (GHG) carbon dioxide (CO2) and methane (CH4) to the atmosphere, and simultaneously bury more organic carbon (OC) in their sediments than the entire ocean. Particularly in tropical hydropower reservoirs, GHG emissions can be large, mainly owing to high CH4 emission. Moreover, the number of tropical hydropower reservoirs will continue to increase dramatically, due to an urgent need for economic growth and a vast unused hydropower potential in many tropical countries. However, the current understanding of the magnitude of GHG emission, and of the processes regulating it, is insufficient. Here I propose a research program on tropical reservoirs in Brazil that takes advantage of recent developments in both concepts and methodologies to provide unique evaluations of GHG emission and OC burial in tropical reservoirs. In particular, I will test the following hypotheses: 1) Current estimates of reservoir CH4 emission are at least one order of magnitude too low, since they have completely missed the recently discovered existence of gas bubble emission hot spots; 2) The burial of land-derived OC in reservoir sediments offsets a significant share of the GHG emissions; and 3) The sustained, long-term CH4 emission from reservoirs is to a large degree fuelled by primary production of new OC within the reservoir, and may therefore be reduced by management of nutrient supply. The new understanding and the cross-disciplinary methodological approach will constitute a major advance to aquatic science in general, and have strong impacts on the understanding of other aquatic systems at other latitudes as well. In addition, the results will be merged into an existing reservoir GHG risk assessment tool to improve planning, design, management and judgment of hydropower reservoirs.
Max ERC Funding
1 798 227 €
Duration
Start date: 2013-09-01, End date: 2019-08-31
Project acronym INSYSBIO
Project Industrial Systems Biology of Yeast and A. oryzae
Researcher (PI) Jens Nielsen
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Advanced Grant (AdG), PE8, ERC-2009-AdG
Summary Metabolic engineering is the development of new cell factories and the improvement of existing ones, and it is the enabling science that allows for sustainable production of fuels and chemicals through biotechnology. With the development in genomics and functional genomics, it has become interesting to evaluate how advanced high-throughput experimental techniques (transcriptome, proteome, metabolome and fluxome) can be applied for improving the process of metabolic engineering. These techniques have mainly found applications in life sciences and studies of human health, and it is necessary to develop novel bioinformatics techniques and modelling concepts before they can provide physiological information that can be used to guide metabolic engineering strategies. In particular, it is challenging to use these techniques to advance mathematical modelling of the operation of complex metabolic networks. The availability of robust mathematical models will allow a wider use of models to drive metabolic engineering, in analogy with other fields of engineering where mathematical modelling is central in the design phase. In this project the advancement of novel concepts, models and technologies for enhancing metabolic engineering will be done in connection with the development of novel cell factories for high-level production of different classes of products. The products considered will involve both commodity-type chemicals like 3-hydroxypropionic acid and malic acid, which can be used for sustainable production of polymers, as well as an industrial enzyme and pharmaceutical proteins like human insulin.
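One standard constraint-based formulation for the mathematical modelling of metabolic networks is flux-balance analysis, sketched below on an invented two-metabolite toy network; the network, bounds and objective are illustrative assumptions, not the project's models or organisms.

```python
# Minimal flux-balance analysis (FBA) sketch on an invented toy network:
# maximize a "biomass" flux subject to steady-state mass balance S v = 0
# and flux bounds. Purely illustrative.
import numpy as np
from scipy.optimize import linprog

# Columns: v1 uptake->A, v2 A->B, v3 B->biomass, v4 A->byproduct
S = np.array([
    [1, -1,  0, -1],   # metabolite A balance
    [0,  1, -1,  0],   # metabolite B balance
])
bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]   # uptake capped at 10

c = np.array([0, 0, -1, 0])           # linprog minimizes, so negate biomass flux
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")

print("optimal fluxes:", res.x)       # expected ~[10, 10, 10, 0]
print("max biomass flux:", -res.fun)  # expected ~10
```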
Max ERC Funding
2 499 590 €
Duration
Start date: 2010-01-01, End date: 2014-12-31
Project acronym INTEGRAL
Project Integrable Systems in Gauge and String Theory
Researcher (PI) Konstantin Zarembo
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Advanced Grant (AdG), PE2, ERC-2013-ADG
Summary The project is aimed at uncovering new links between integrable systems, string theory and quantum field theory. The goal is to study non-perturbative phenomena in strongly coupled field theories, and to understand the relationship between gauge fields and strings at a deeper level.
Max ERC Funding
1 693 692 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym INTGEN
Project Intergenerational correlations of schooling, income and health: an investigation of the underlying mechanisms
Researcher (PI) Carl Mikael Lindahl
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), SH1, ERC-2009-StG
Summary The objective of this project is to use rich Swedish registry data to learn about the mechanisms behind intergenerational correlations. Typically, considerable effort has been spent on estimating correlations between outcome variables, such as education and income, for parents and children. However, the estimated correlations are driven by the causal effect of the parental variable of interest as well as by unobservable factors, such as other family-background variables, and by a part that is due to genetic transmission between parent and child. Disentangling these parts is very difficult, and only recently have researchers made serious attempts to do so. However, findings vary widely across methods and this literature is still in its infancy. Among the questions we ask are: How much of the association between outcome variables for the child and a parent is due to a causal effect from the parental variable, and how much is transmitted through unobservable family factors and genetic transmission? What are the intergenerational transmission channels for life expectancy and health? What is the importance of gene-environment interaction? Has the importance of genes, environment and their interactions for the intergenerational associations changed during the growth of the Scandinavian welfare state? How many generations does it take for an ancestor's placement in the income distribution to no longer matter for life success? These questions are directly relevant for policy, and relate to classical social science issues such as inequality of opportunity and level of living in general. The innovativeness of this project is based on using the uniqueness of Swedish registry data (ideal for answering these questions), with which one can match biological and adoptive parents, children and siblings, and hence identify whether children are reared by their biological or adoptive parents, for the population of Swedes.
Max ERC Funding
631 600 €
Duration
Start date: 2010-09-01, End date: 2015-08-31
Project acronym LEARN
Project Limitations, Estimation, Adaptivity, Reinforcement and Networks in System Identification
Researcher (PI) Lennart Ljung
Host Institution (HI) LINKOPINGS UNIVERSITET
Call Details Advanced Grant (AdG), PE7, ERC-2010-AdG_20100224
Summary The objective of this proposal is to provide design tools and algorithms for model management in robust, adaptive and autonomous engineering systems. The increasing demands on reliable models for systems of ever greater complexity have pointed to several insufficiencies in today's techniques for model construction. The proposal addresses key areas where new ideas are required. Modeling is a central issue in many scientific fields. System Identification is the term used in the Automatic Control community for the area of building mathematical models of dynamical systems from observed input and output signals, but several other research communities work with the same problem under different names, such as (data-driven) learning.
We have identified five specific themes where progress is both acutely needed and feasible:
1. Encounters with Convex Programming Techniques: How to capitalize on the remarkable recent progress in convex and semidefinite programming to obtain efficient, robust and reliable algorithmic solutions.
2. Fundamental Limitations: To develop and elucidate the limits of model accuracy, regardless of the modeling method. This can be seen as a theory rooted in the Cramér-Rao inequality (restated below), in the spirit of the invariance results and lower bounds that characterize, e.g., Information Theory.
3. Experiment Design and Reinforcement Techniques: Study how well-tailored and “cheap” experiments can extract essential information about isolated model properties. Also study how such methods may relate to general reinforcement techniques.
4. Potentials of Non-parametric Models: How to incorporate and adjust techniques from adjacent research communities, e.g. concerning manifold learning and Gaussian Processes in machine learning.
5. Managing Structural Constraints: To develop structure preserving identification methods for networked and decentralized systems.
We have ideas how to approach each of these themes, and initial attempts are promising.
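For reference, the classical Cramér-Rao bound invoked in theme 2 can be stated as follows (standard form for an unbiased estimator within a parametric model; included here only as orientation):

```latex
% Cramér-Rao bound for an unbiased estimator \hat{\theta} of \theta,
% based on data y with likelihood p(y;\theta).
\begin{equation}
  \operatorname{Cov}\!\big(\hat{\theta}\big) \;\succeq\; I(\theta)^{-1},
  \qquad
  I(\theta) \;=\; \mathbb{E}\!\left[
     \frac{\partial \log p(y;\theta)}{\partial \theta}\,
     \frac{\partial \log p(y;\theta)}{\partial \theta^{\mathsf T}}
  \right].
\end{equation}
```

Since no unbiased estimator within the assumed model class can beat this covariance bound, it provides a natural anchor for a theory of the fundamental limits of model accuracy.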
Max ERC Funding
2 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym M&M´S
Project New Paradigms for MEMS & NEMS Integration
Researcher (PI) Frank Niklaus
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE7, ERC-2011-StG_20101014
Summary Micro- and nanoelectromechanical system (MEMS and NEMS) components are vital for many industrial and consumer products such as airbag systems in cars and motion controls in mobile phones, and many of these MEMS- and NEMS-enabled applications have a large impact on European industry and society. However, the potential of MEMS and NEMS is being critically hampered by their dependence on integrated circuit (IC) manufacturing technologies. Most micro- and nano-manufacturing methods have been developed by the IC industry and are characterized by highly standardized manufacturing processes that are adapted for extremely large production volumes of more than 10,000 wafers per month. In contrast, the vast majority of MEMS and NEMS applications only demand production volumes of less than 100 wafers per month in combination with different, non-standardized manufacturing and integration processes for each product. If a much wider variety of diverse and even low-volume MEMS and NEMS products is to be exploited, the semiconductor manufacturing paradigm has to be broken. In this project, we therefore will focus on frontier research on new paradigms for flexible and cost-efficient manufacturing and integration of MEMS and NEMS within three related research areas:
(1) Wafer-Level Heterogeneous Integration for MEMS and NEMS, where we explore new and improved wafer-level heterogeneous integration technologies for MEMS and NEMS devices;
(2) Integration of Materials into MEMS Using High-Speed Wire Bonding Tools, where we explore new ways of integrating various types of wire materials into MEMS devices;
(3) Free-Form 3D Printing of Mono-Crystalline Silicon Micro- and Nanostructures, where we explore entirely novel ways of implementing mono-crystalline silicon MEMS and NEMS structures that can be arbitrarily shaped.
Max ERC Funding
1 495 982 €
Duration
Start date: 2011-11-01, End date: 2017-10-31
Project acronym MACROCLIMATE
Project Quantitative dynamic macroeconomic analysis of global climate change and inequality
Researcher (PI) Per Krusell
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Advanced Grant (AdG), SH1, ERC-2008-AdG
Summary The proposal is to form a Research Center for Quantitative Macroeconomic Research (RCQMR) at the Institute for International Economic Studies at Stockholm University. The aim of the RCQMR is to conduct research within the general area of macroeconomics and inequality. However, most of the focus during the buildup period will be on a broad project on the world economy and climate change. The aim is to build a dynamic quantitative macroeconomic model of the world economy with a climate system as an integral part. The novelty, relative to existing economy-climate models, is the modeling methodology: it will use modern macroeconomic analysis---in particular the numerical tools developed to study economies with a cross-section of consumers/agents---in order to substantially enrich and generalize the description of the world economy.
Max ERC Funding
2 100 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym MAGNETIC-SPEED-LIMIT
Project Understanding the speed limits of magnetism
Researcher (PI) Stefano BONETTI
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Starting Grant (StG), PE3, ERC-2016-STG
Summary While the origin of magnetic order in condensed matter lies in the exchange and spin-orbit interactions, with time scales in the subpicosecond range, it has long been believed that magnetism could only be manipulated at nanosecond rates, exploiting dipolar interactions with external magnetic fields. However, in the past decade researchers have been able to observe ultrafast magnetic dynamics at its intrinsic time scales without the need for magnetic fields, thus revolutionising the view on the speed limits of magnetism. Despite many achievements in ultrafast magnetism, the understanding of the fundamental physics that allows for the ultrafast dissipation of angular momentum is still only partial, hampered by the lack of experimental techniques suited to fully explore these phenomena. However, the recent appearance of two new types of coherent radiation, single-cycle THz pulses and x-rays generated at free electron lasers (FELs), has provided researchers access to a whole new set of capabilities to tackle this challenge. This proposal suggests using these techniques to achieve an encompassing view of ultrafast magnetic dynamics in metallic ferromagnets, via the following three research objectives: (a) to reveal ultrafast dynamics driven by strong THz radiation in several magnetic systems using table-top femtosecond lasers; (b) to unravel the contribution of lattice dynamics to ultrafast demagnetization in different magnetic materials using the x-rays produced at FELs; and (c) to directly image ultrafast spin currents by creating femtosecond movies with nanometre resolution. The proposed experiments are challenging and explore uncharted territory, but if successful, they will advance the understanding of the speed limits of magnetism, at the time scales of the exchange and spin-orbit interactions. They will also open up future investigations of ultrafast magnetic phenomena in materials with large electronic correlations or spin-orbit coupling.
Max ERC Funding
1 967 755 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym MAMBA
Project Molecular mechanism of amyloid β aggregation
Researcher (PI) Sara Elisabet Snogerup Linse
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE4, ERC-2013-ADG
Summary Generation of toxic oligomers during aggregation of amyloid beta peptide (Abeta42) into amyloid fibrils is a central event in Alzheimer disease. Understanding the aggregation process is therefore one important step towards therapy and diagnosis of the disease. We propose a physical chemistry approach with the goal of finding the molecular mechanisms behind the process in terms of the underlying microscopic steps and the molecular driving forces governing each step. We will use methodology developed recently in our laboratory yielding unprecedented reproducibility in the kinetic data. The methodology relies on optimization of every step from production and purification to isolation of highly pure monomeric peptide, and inertness and minimized area of all surfaces. We will use cell viability studies to detect toxic oligomeric species, and selective radio-labeling experiments to pinpoint the origin of those species. In order to obtain insight into the molecular determinants and the relative role of different kinds of intermolecular interactions for each microscopic step, we will study the concentration dependent aggregation kinetics as a function of extrinsic and intrinsic parameters. Extrinsic parameters include temperature, salt, pH, biological membranes, other proteins, and low and high Mw inhibitors. Intrinsic parameters include point mutations and sequence extension/truncation. We will perform detailed kinetic studies for each inhibitor to learn which step in the process is inhibited coupled to cell toxicity assays to learn whether the generation of toxic oligomers is limited. We will use spectroscopic techniques, dynamic light scattering, cryogenic transmission electron microscopy and mass spectrometry coupled to HD exchange to learn about structural transitions as a function of process progression under different conditions to favor different microscopic steps. The results may lead to improved diagnostics and therapeutics of Alzheimer disease.
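For orientation, one widely used kinetic framework for such aggregation data combines primary nucleation, fibril elongation and monomer-dependent secondary nucleation in a pair of moment equations; the sketch below integrates these equations for invented rate constants. The framework is named here as the standard one in the field rather than quoted from the proposal, and the parameter values are illustrative assumptions, not fitted Abeta42 constants.

```python
# Sketch of a nucleation-elongation-secondary-nucleation kinetic model for
# amyloid aggregation. Rate constants and concentrations are invented for
# illustration only.
import numpy as np
from scipy.integrate import solve_ivp

k_n, k_plus, k_2 = 1e-4, 1e5, 1e4   # primary nucleation, elongation, secondary nucleation
n_c, n_2 = 2, 2                      # reaction orders in monomer
m_tot = 3e-6                         # total peptide concentration (M)

def rhs(t, y):
    P, M = y                                   # fibril number and fibril mass concentrations
    m = max(m_tot - M, 0.0)                    # free monomer remaining
    dP = k_n * m**n_c + k_2 * m**n_2 * M       # new fibrils from primary + secondary nucleation
    dM = 2 * k_plus * m * P                    # fibril mass grows by elongation at both ends
    return [dP, dM]

t_end = 10 * 3600                    # 10 hours
sol = solve_ivp(rhs, (0.0, t_end), [0.0, 0.0],
                t_eval=np.linspace(0.0, t_end, 300),
                method="LSODA", rtol=1e-8, atol=1e-20)
aggregated_fraction = sol.y[1] / m_tot        # the sigmoidal curve a ThT assay would report
print(aggregated_fraction[-1])                # approaches 1 as monomer is depleted
```

In such an analysis, an inhibitor that selectively suppresses one microscopic step shows up as a characteristic change in the shape and concentration dependence of these curves.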
Max ERC Funding
2 499 920 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym MATHFOR
Project Formalization of Constructive Mathematics
Researcher (PI) Thierry Coquand
Host Institution (HI) GOETEBORGS UNIVERSITET
Call Details Advanced Grant (AdG), PE6, ERC-2009-AdG
Summary The general theme is to explore the connections between reasoning and computations in mathematics. There are two main research directions. The first research direction is a reformulation of Hilbert's program, using ideas from formal, or point-free, topology. We have shown, with multiple examples, that this allows a partial realization of this program in commutative algebra, and a new way to formulate constructive mathematics. The second research direction explores the computational content of proofs using type theory and the Curry-Howard correspondence between proofs and programs. Type theory allows us to represent constructive mathematics in a formal way, and provides key insight for the design of proof systems helping in the analysis of the logical structure of mathematical proofs. The interest of this program is well illustrated by the recent work of G. Gonthier on the formalization of the four-colour theorem.
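As a minimal illustration of the Curry-Howard reading of proofs as programs (written here in Lean 4 syntax, chosen only as an example formal system and not implying it is the system used in the project): a proposition is a type, and a proof is a program inhabiting that type.

```lean
-- Proposition as type, proof as program (Lean 4 syntax; illustrative only).
theorem and_swap (A B : Prop) : A ∧ B → B ∧ A :=
  fun h => ⟨h.right, h.left⟩

-- The same term, read computationally, is just a function that swaps a pair.
def swap {α β : Type} : α × β → β × α :=
  fun p => (p.2, p.1)
```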
Max ERC Funding
1 912 288 €
Duration
Start date: 2010-04-01, End date: 2015-03-31
Project acronym MECCA
Project Meeting Challenges in Computer Architecture
Researcher (PI) Per Orvar Stenström
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Computer technology has doubled computational performance every 24 months, over the past several decades. This performance growth rate has been an enabler for the dramatic innovation in information technology that now embraces our society. Before 2004, application developers could exploit this performance growth rate with no effort. However, since 2004 power consumption of computer chips exceeded the allowable limits and from that point and onwards, parallel computer architectures became the norm. Currently, parallelism is completely exposed to application developers and managing it is difficult and time-consuming. This has a serious impact on software productivity that may stall progress in information technology.
Technology forecasts predict that by 2020 there will be hundreds of processors on a computer chip. Apart from managing parallelism, keeping power consumption within allowable limits will remain a key roadblock for maintaining historical performance growth rates. Power efficiency must increase by an order of magnitude in the next ten years to not limit the growth rate. Finally, computer chips are also key components in embedded controllers, where stringent timing responses are mandatory. Delivering predictable and tight response times using parallel architectures is a challenging and unsolved problem.
MECCA takes a novel, interdisciplinary and unconventional approach to address three important challenges facing computer architecture, the three Ps: Parallelism, Power, and Predictability, in a unified framework. Unlike earlier, predominantly disciplinary approaches, MECCA bridges layers in computing systems from the programming language/model, to the compiler, to the run-time/OS, down to the architecture layer. This opens up the exchange of information across layers to manage parallelism and architectural resources in a way that is transparent to application developers, in order to meet the challenging performance, power, and predictability requirements of future computers.
Max ERC Funding
2 379 822 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym MEDIA AND POLICY
Project The impact of mass media on public policy
Researcher (PI) David Strömberg
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Starting Grant (StG), SH1, ERC-2007-StG
Summary This project will study political economics issues, that is, how public policies are influenced by political considerations. The emphasis is on the mass media's role in shaping government policies. A smaller part will also analyze how different political institutions and economic outcomes influence policy and the impact of extreme weather events. The project will mainly be empirical, using statistical methods with a focus on identifying causal effects rather than correlations. The study of media effects will analyze the political impact of having a press that actively covers politics. This is an important issue, largely unanswered because the presence of an active press is endogenous to factors such as corruption and voter information. We will address this question in the special case of media coverage of US Congressional elections. To identify the effect of news, we will use the fact that the amount of coverage is driven to a large extent by the coincidental match between media markets and congressional districts. We intend to analyze the effect of active press coverage on (i) voter information, (ii) politicians' actions, and (iii) federal funds per capita. The project will also investigate how political institutions and economic outcomes influence the health impacts (such as mortality among the elderly and infants) of weather extremes. Historical weather data at a very detailed geographical level will be combined with socio-economic data in panel (longitudinal) form. This is joint work with meteorologists who will construct historical weather data on fine grids across the globe. The part dealing with structural political economics aims to develop a framework for investigating the effects of institutions on economic policy. In existing work, there is a disconnect between theoretical modelling and empirical applications. The aim is to close this gap.
Max ERC Funding
799 945 €
Duration
Start date: 2008-09-01, End date: 2014-08-31
Project acronym MEDIACHINA
Project Social Media and Traditional Media in China: Political and Economic Effects
Researcher (PI) Carl David STRÖMBERG
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Advanced Grant (AdG), SH1, ERC-2016-ADG
Summary How are political accountability and firm performance in an autocracy affected by the media? This project will analyse how economic and political outcomes in China are affected by social and traditional media. It will also use media content to measure factors that are otherwise difficult to observe, such as political networks and the trade-off between political and economic goals in Chinese firms. An explosion of social media use in China has produced an information shock to society and its leaders, and it also supplies a data shock to researchers, magnified by the digitization of traditional media content and coupled with new methods for analysing this type of data originating in the big data and machine-learning literatures. As a result, a large set of previously unanswerable questions are now open for research.
In Qin, Strömberg and Wu (2016) we document this information shock, using a data set of over 13 billion social media posts from Sina Weibo (the Chinese equivalent of Twitter). We show that millions of posts concern sensitive topics such as organized protests and explicit accusations of corruption against top leaders. Traditional media are silent on these issues. We argue that the likely reason for the lighter censoring of social media is that the central government finds the information useful for monitoring officials, firms, and citizen unrest.
In this project, I will analyze the effect of this information shock on protests and strikes, the sales of counterfeit and substandard medicines, the promotion of local leaders, and coverage of censored events in traditional media. Together with a set of collaborators, I will study the effects of social media using the staggered introduction of Sina Weibo across geographic regions. I will also study the content, entry and exit of general-interest newspapers that are controlled by different politicians, in order to investigate the trade-off between political and economic goals and the role of political connections.
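The staggered regional rollout mentioned above maps naturally onto a two-way fixed-effects (difference-in-differences style) design. The sketch below is illustrative only: the column names (region, year, post_weibo, protests) and the toy data are hypothetical placeholders, not the project's variables.

import pandas as pd
import statsmodels.formula.api as smf

# Toy panel: "post_weibo" switches on once Sina Weibo becomes available in a region.
df = pd.DataFrame({
    "region":     ["A", "A", "B", "B", "C", "C"],
    "year":       [2009, 2011, 2009, 2011, 2009, 2011],
    "post_weibo": [0, 1, 0, 1, 0, 0],
    "protests":   [2, 5, 1, 4, 2, 2],
})

# Region and year fixed effects absorb time-invariant differences and common shocks;
# the coefficient on post_weibo is the staggered-adoption estimate.
model = smf.ols("protests ~ post_weibo + C(region) + C(year)", data=df).fit()
print(model.params["post_weibo"])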
Max ERC Funding
1 716 970 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym MEGASIM
Project Million-core Molecular Simulation
Researcher (PI) Berk Hess
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE4, ERC-2010-StG_20091028
Summary Molecular simulation has become a standard tool for studying the function of biomolecules such as proteins, nucleic acids and lipids. Due to increasing computer power and decreasing length scales in engineering, molecular simulation is also increasingly used in microfluidics and in the study of, for instance, small water droplets. All these applications would benefit strongly from simulations that are several orders of magnitude longer than the current state of the art. Although Moore's law currently still holds, the performance of processor cores no longer doubles every 18 months; instead, the number of cores increases. Therefore, to improve performance and to scale to a million cores, each core should do less work. With classical single-program multiple-data parallelism, the communication time will quickly become a bottleneck. To advance the molecular simulation field and efficiently use upcoming million-core computers, a switch to multiple-program multiple-data (MPMD) parallelism is required. Domain decomposition should be applied over the nodes, whereas within a node MPMD parallelism should be used. This requires workloads to be divided and dispatched efficiently to different threads. To hide the communication times, calculation should be overlapped with communication. Because simulation time steps will soon take on the order of 100 microseconds, global communication will become a bottleneck. However, global communication is required for, among other things, full electrostatics algorithms. Thus new algorithms need to be derived to ensure parallel scaling. Only with such efforts will we be able to fully utilize the potential of upcoming hardware to solve current and future scientific problems.
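To make the idea of overlapping calculation with communication concrete, here is a minimal sketch (not MEGASIM code) of a non-blocking halo exchange between neighbouring domains, with interior work done while the messages are in flight; the data layout and array names are purely illustrative.

import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()
left, right = (rank - 1) % size, (rank + 1) % size

local = np.random.rand(10000)                      # data owned by this rank's domain
send_left, send_right = local[:1].copy(), local[-1:].copy()
recv_left, recv_right = np.empty(1), np.empty(1)

# Start the non-blocking halo exchange with both neighbouring domains.
reqs = [comm.Isend(send_left,  dest=left,  tag=0),
        comm.Isend(send_right, dest=right, tag=1),
        comm.Irecv(recv_left,  source=left,  tag=1),
        comm.Irecv(recv_right, source=right, tag=0)]

interior = local[1:-1].sum()                       # interior work that needs no halo data
MPI.Request.Waitall(reqs)                          # ideally already complete by now
boundary = recv_left[0] + recv_right[0]            # finish the parts that needed the halo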
Max ERC Funding
899 448 €
Duration
Start date: 2011-05-01, End date: 2017-04-30
Project acronym METLAKE
Project Predicting future methane fluxes from Northern lakes
Researcher (PI) DAVID TORBJORN EMANUEL BASTVIKEN
Host Institution (HI) LINKOPINGS UNIVERSITET
Call Details Consolidator Grant (CoG), PE10, ERC-2016-COG
Summary The new global temperature goal calls for reliable quantification of present and future greenhouse gas (GHG) emissions, including climate feedbacks. Non-CO2 GHGs, with methane (CH4) being the most important, represent a large but highly uncertain component of the global GHG budget. Lakes are among the largest natural sources of CH4, but our understanding of lake CH4 fluxes is rudimentary. Lake emissions are not yet routinely monitored, and coherent, spatially representative, long-term datasets are rare, which hampers accurate flux estimates and predictions.
METLAKE aims to improve our ability to quantify and predict lake CH4 emissions. Major goals include: (1) the development of robust, validated predictive models suitable for use at the lake-rich northern latitudes where large climate changes are anticipated in the near future, (2) the testing of the idea that appropriate consideration of spatiotemporal scaling can greatly facilitate the generation of accurate yet simple predictive models, (3) to reveal and quantify detailed flux regulation patterns, including spatiotemporal interactions and response times to environmental change, and (4) to pioneer the novel use of sensor networks and near-ground remote sensing with a new hyperspectral CH4 camera suitable for large-scale, high-resolution CH4 measurements.
Extensive field work based on optimized state-of-the-art approaches will generate multi-scale and multi-system data, supplemented by experiments, and evaluated by data analyses and modelling approaches targeting effects of scaling on model performance.
Altogether, METLAKE will advance our understanding of one of the largest natural CH4 sources and provide systematic tools to predict future lake emissions. Such quantification of feedbacks on natural GHG emissions is required to move beyond the state of the art regarding global GHG budgets and to estimate the mitigation efforts needed to reach global climate goals.
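As a rough indication of what an "accurate yet simple" predictive model can look like in code, the sketch below regresses observed fluxes on two candidate drivers; the numbers and the choice of predictors (water temperature and log lake area) are invented for illustration and are not METLAKE data or the project's chosen model.

import numpy as np

# Hypothetical lake observations: surface water temperature (deg C), lake area (km2),
# and the CH4 flux to be predicted (mg CH4 m-2 d-1). All values are invented.
temp = np.array([4.0, 8.0, 12.0, 16.0, 20.0])
area = np.array([0.1, 0.5, 1.0, 2.0, 5.0])
flux = np.array([2.0, 5.0, 9.0, 15.0, 24.0])

# Ordinary least squares with an intercept, temperature, and log(area) as predictors.
X = np.column_stack([np.ones_like(temp), temp, np.log(area)])
coef, *_ = np.linalg.lstsq(X, flux, rcond=None)
predicted = X @ coef
print(coef, predicted)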
Max ERC Funding
2 000 000 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym MICROTOMACROANDBACK
Project Micro Heterogeneity and Macroeconomic Policy
Researcher (PI) Kurt Elliott MITMAN
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Starting Grant (StG), SH1, ERC-2017-STG
Summary This project will develop macroeconomic models with household heterogeneity and partially demand-determined output to study how the economy is affected by monetary and fiscal policy, and to investigate the importance of housing in macroeconomic fluctuations and the transmission and efficacy of policy. The objective is to provide a modelling framework that simultaneously is consistent with both empirical micro evidence on household consumption and savings behaviour, and macro evidence on the response of the aggregate economy to a variety of economic shocks. The ultimate goal is to help make these models the new standard for the study of fluctuations and policy evaluation in macro.
Inequality and incomplete markets will be a central theme. The first objective of the project is to establish the importance of incomplete financial markets for the response of the economy to changes in monetary policy. The framework will then be used to evaluate the size of the fiscal multiplier and quantitatively evaluate the stimulative effect of extensions to unemployment benefits.
The second theme of the project will focus on housing and mortgage debt as an amplification and propagation mechanism. The recent Great Recession, preceded by an unparalleled boom and bust in house prices, has brought to light the importance of housing for the economy. The project will develop a rich benchmark model of housing and the aggregate economy for policy evaluation. First, the project will investigate how heterogeneity in housing and debt affects the transmission of monetary policy. Next, a cross-country analysis will be performed to quantify the importance of different mortgage market arrangements for the response of the economy to shocks. Finally, I will introduce imperfect information about the driving forces of the economy to study booms and busts in the housing market and real economic activity, with the goal of evaluating macroprudential policies geared at the housing market.
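The computational core of such heterogeneous-household models is a consumption-savings problem under idiosyncratic income risk and a borrowing constraint. The sketch below solves one by value function iteration; it uses generic textbook ingredients with placeholder parameter values, not the project's model or calibration.

import numpy as np

beta, r, sigma = 0.96, 0.02, 2.0                  # discount factor, interest rate, risk aversion
a_grid = np.linspace(0.0, 20.0, 200)              # assets; borrowing constraint at zero
y_grid = np.array([0.5, 1.5])                     # two idiosyncratic income states
P = np.array([[0.9, 0.1], [0.1, 0.9]])            # Markov transition matrix for income

def u(c):
    return c ** (1.0 - sigma) / (1.0 - sigma)     # CRRA utility

V = np.zeros((len(y_grid), len(a_grid)))          # value by (income state, assets)
for _ in range(1000):
    EV = P @ V                                    # expected continuation value given today's income
    V_new = np.empty_like(V)
    for iy, y in enumerate(y_grid):
        # Consumption implied by each (assets today, assets tomorrow) pair; infeasible choices get -inf.
        c = (1.0 + r) * a_grid[:, None] + y - a_grid[None, :]
        util = np.where(c > 0.0, u(np.maximum(c, 1e-12)), -np.inf)
        V_new[iy] = (util + beta * EV[iy][None, :]).max(axis=1)
    if np.max(np.abs(V_new - V)) < 1e-6:
        V = V_new
        break
    V = V_new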
Max ERC Funding
1 299 165 €
Duration
Start date: 2017-11-01, End date: 2022-10-31
Project acronym MMFCS
Project Multiscale Models for Catalytic-Reaction-Coupled Transport Phenomena in Fuel Cells
Researcher (PI) Bengt Sundén
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE8, ERC-2008-AdG
Summary In proton exchange membrane fuel cells (PEMFCs) and solid oxide fuel cells (SOFCs) there are various transport processes strongly affected by catalytic chemical/electrochemical reactions in nano- and/or micro-structured, multi-functional porous electrodes. Due to the complexity of fuel cells, fundamental understanding of physical phenomena continues to be required for the coupled chemical and transport processes with two-phase flow/water management in PEMFCs, and internal reforming reactions/thermal management in SOFCs. The project deals with the coupling of micro-scale reactions (such as the electrochemical reactions and catalytic reactions) with various transport phenomena to provide a comprehensive understanding of fuel cell dynamics. The methodology for the project is a combination of model development and integration, simulation/analysis and validation. For microscopically complex porous layers and active sites, submodels will be developed by considering the detailed elementary kinetic rates based on the intermediate chemical species and their reactions occurring on the surface of the involved materials. As inputs, the data obtained from the microscopic submodels will be implemented in the macroscopic CFD codes, previously developed for various applications, to examine local parameters in the porous electrodes and components. Both macro- and microscopic models will be validated against experimental and/or literature data during the course of the project. The project will make progress beyond the state of the art in modelling and analysis of advanced fuel cells, such as ultra-low Pt loading (<0.1 mg Pt/cm2) and high-temperature (120-200 °C) PEMFCs, and intermediate-temperature (600-800 °C) planar SOFCs.
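For context only, a generic example of the kind of microscopic electrode-kinetics submodel that can feed a macroscopic CFD source term is a Butler-Volmer rate expression; the parameter values below are placeholders, not the project's fitted kinetics.

import numpy as np

F, R = 96485.0, 8.314                 # Faraday constant (C/mol) and gas constant (J/mol/K)

def butler_volmer(eta, j0=1e-3, alpha_a=0.5, alpha_c=0.5, T=353.0):
    # Local current density (A/m2) as a function of the overpotential eta (V).
    return j0 * (np.exp(alpha_a * F * eta / (R * T)) - np.exp(-alpha_c * F * eta / (R * T)))

# In a CFD code a rate like this would be evaluated cell by cell and inserted as the
# electrochemical source term in the charge and species conservation equations.
print(butler_volmer(np.linspace(0.0, 0.1, 5)))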
Max ERC Funding
1 320 000 €
Duration
Start date: 2009-06-01, End date: 2014-05-31
Project acronym MODULISPACES
Project Topology of moduli spaces of Riemann surfaces
Researcher (PI) Dan PETERSEN
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Starting Grant (StG), PE1, ERC-2017-STG
Summary The proposal describes two main projects. Both of them concern cohomology of moduli spaces of Riemann surfaces, but the aims are rather different.
The first is a natural continuation of my work on tautological rings, which I intend to work on with Qizheng Yin and Mehdi Tavakol. In this project, we will introduce a new perspective on tautological rings, which is that the tautological cohomology of moduli spaces of pointed Riemann surfaces can be described in terms of tautological cohomology of the moduli space M_g, but with twisted coefficients. In the cases we have been able to compute so far, the tautological cohomology with twisted coefficients is always much simpler to understand, even though it “contains the same information”. In particular, we hope to be able to find a systematic way of analyzing the consequences of the recent conjecture that Pixton’s relations are all relations between tautological classes; until now, most concrete consequences of Pixton’s conjecture have been found via extensive computer calculations, which are feasible only when the genus and the number of markings are small.
The second project has a somewhat different flavor, involving operads and periods of moduli spaces, and builds upon recent work of myself with Johan Alm, who I will continue to collaborate with. This work is strongly informed by Brown’s breakthrough results relating mixed motives over Spec(Z) and multiple zeta values to the periods of moduli spaces of genus zero Riemann surfaces. In brief, Brown introduced a partial compactification of the moduli space M_{0,n} of n-pointed genus zero Riemann surfaces; we have shown that the spaces M_{0,n} and these partial compactifications are connected by a form of dihedral Koszul duality. It seems likely that this Koszul duality should have further ramifications in the study of multiple zeta values and periods of these spaces; optimistically, this could lead to new irrationality results for multiple zeta values.
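For reference, the multiple zeta values mentioned above are the standard nested sums (general background, not specific to the proposal):

\zeta(s_1, \dots, s_k) = \sum_{n_1 > n_2 > \dots > n_k \ge 1} \frac{1}{n_1^{s_1} n_2^{s_2} \cdots n_k^{s_k}}, \qquad s_1 \ge 2,\ s_2, \dots, s_k \ge 1,

which converge for integer arguments with s_1 at least 2.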
Max ERC Funding
1 091 249 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym MOFcat
Project Fundamental and Applied Science on Molecular Redox-Catalysts of Energy Relevance in Metal-Organic Frameworks
Researcher (PI) Sascha Ott
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Consolidator Grant (CoG), PE5, ERC-2015-CoG
Summary Organometallic redox-catalysts of energy relevance, i.e. water and hydrogen oxidation, and proton and carbon dioxide reduction catalysts, will be incorporated into metal-organic frameworks (MOFs). Immobilization and spatial organization of the molecular catalysts will stabilize their molecular integrity and ensure longevity and recyclability of the resulting MOFcats. The organized environment provided by the MOF will enable the control of conformational flexibility, diffusion, charge transport, and higher coordination sphere effects that play crucial roles in enzymes, but cannot be addressed in homogeneous solution and are thus largely unexplored. The effect that the MOF environment has on catalysis will be directly probed electrochemically in MOFcats that are immobilized or grown on electrode surfaces. In combination with spectroscopic techniques in spectroelectrochemical cells, intermediates in the catalytic cycles will be detected and characterized. Kinetic information on the individual steps in the catalytic cycles will be obtained in MOFs that contain both a molecular photosensitizer (PS) and a molecular catalyst (PS-MOFcats). The envisaged systems will allow light-induced electron transfer processes to generate reduced or oxidized catalyst states, the reactivity of which will be studied with high time resolution by transient UV/Vis and IR spectroscopy. The acquired fundamental mechanistic knowledge is far beyond the current state of the art in MOF chemistry and catalysis, and will be used to prepare MOFcat-based electrodes that function at the highest possible rates and lowest overpotentials. PS-MOFcats will be grown on flat semiconductor surfaces, and explored as a novel concept for photoanode and -cathode designs for dye-sensitized solar fuel devices (DSSFDs). The design is particularly appealing as it accommodates high PS concentrations for efficient light-harvesting, while providing potent catalysts close to the solvent interface.
Max ERC Funding
1 968 750 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym MolStrucDyn
Project Ultrafast Molecular Structural Dynamics
Researcher (PI) Sebastian Westenhoff
Host Institution (HI) GOETEBORGS UNIVERSITET
Call Details Consolidator Grant (CoG), PE4, ERC-2016-COG
Summary Chemical reactions in solution are strongly influenced by femtosecond solvent-solute dynamics. Likewise, proteins provide specific environments to control the outcome of substrate reactions. The molecular understanding of these effects is currently poorly developed.
I propose to fill this knowledge gap by ‘filming’ elementary chemical reactions in solution and in proteins. I will pioneer new time-resolved scattering and diffraction experiments using X-ray Free Electron Lasers (XFELs).
Using femtosecond time-resolved X-ray scattering, I plan to decipher the structural dynamics of bond breaking and bond formation in iodine-containing compounds in solution. I will pioneer time-resolved fluctuation correlation X-ray scattering to recover full electron density maps of the reaction trajectories at atomic resolution. I will visualize the as yet unknown structures of reaction intermediates and the solvent response.
Furthermore, I propose to investigate the molecular photoresponse of phytochrome photoconversion with femtosecond time-resolved serial microcrystallography. Phytochromes are ubiquitous photosensory proteins in plants and are essential to all vegetation on earth. I will resolve how the chromophore and the protein react collectively to photoexcitation and how this leads to conformational changes.
Combined, this interdisciplinary project will yield a microscopic understanding of how the surroundings of reactants guide the outcome of elementary (bio)chemical reactions.
This program builds on my strengths in structural biology of phytochromes (Takala et al., Nature, 2014), time-resolved X-ray scattering (Westenhoff et al., Nature Methods 2010), and femtosecond spectroscopy (21 papers in PRL, JACS, Nature Methods 2006-2012 & 2016).
The new XFEL-based methods will have wide-ranging applications in chemistry and biology. My work will open new horizons in physical chemistry and structural biology.
Max ERC Funding
2 000 000 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym MorePheno
Project Collider Phenomenology and Event Generators
Researcher (PI) Håkan Torbjörn Sjöstrand
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE2, ERC-2014-ADG
Summary Collider physics is about exploring the smallest constituents of matter, and unravelling the basic laws of the Universe. Unfortunately there can be a huge gap between a one-line formula of a fundamental theory and the experimental reality it implies. Phenomenology is intended to fill that gap, e.g. to explore the consequences of a theory such that it can be directly compared with data.
Nowhere is the gap more striking than for QCD, the theory of strong interactions, which dominates in most high-energy collisions, like at the LHC (Large Hadron Collider) at CERN. And yet, when such collisions produce hundreds of outgoing particles, calculational complexity is insurmountable. Instead ingenious but approximate QCD-inspired models have to be invented.
Such models are especially powerful if they can be cast in the form of computer code, and combined to provide a complete description of the collision process. An event generator is such a code, where random numbers are used to emulate the quantum mechanical uncertainty that leads to no two collision events being quite identical.
The Principal Investigator is the main author of PYTHIA, the most widely used event generator of the last 30 years and vital for physics studies at the LHC. It is in a state of continuous extension: new concepts are invented, new models developed, new code written, to provide an increasingly accurate understanding of collider physics. But precise LHC data have put a demand for far more precise descriptions, and have also shown that some models need to be rethought from the ground up.
This project, at its core, is about conducting more frontline research with direct implications for event generators, embedded in a broader phenomenology context. In addition to the PI, the members of the theoretical high energy physics group in Lund and of the PYTHIA collaboration will participate in this project, as well as graduate students and postdocs.
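To make the role of random numbers concrete, here is a deliberately toy sketch of event generation; it illustrates only the concept and bears no relation to PYTHIA's physics models. Multiplicities and transverse momenta are drawn from invented distributions, so no two generated events are identical.

import numpy as np

rng = np.random.default_rng(seed=1)

def generate_event(mean_multiplicity=20.0, mean_pt=0.7):
    # Draw the number of produced particles, then a falling transverse-momentum spectrum
    # and uniform azimuthal angles for each of them.
    n = rng.poisson(mean_multiplicity)
    pt = rng.exponential(mean_pt, size=n)
    phi = rng.uniform(0.0, 2.0 * np.pi, size=n)
    return np.column_stack([pt * np.cos(phi), pt * np.sin(phi)])

events = [generate_event() for _ in range(1000)]
print("mean multiplicity:", np.mean([len(e) for e in events]))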
Max ERC Funding
1 990 895 €
Duration
Start date: 2015-11-01, End date: 2020-10-31
Project acronym MSTAR
Project Massive Star Formation through the Universe
Researcher (PI) Jonathan TAN
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Advanced Grant (AdG), PE9, ERC-2017-ADG
Summary Massive stars are important throughout astrophysics, yet there remain many open questions about how they form. These include: What is the accretion mechanism of massive star formation? What sets the initial mass function of stars, especially at the highest masses? What is the relation of massive star formation to star cluster formation? How do massive star and star cluster formation vary with galactic environment? What was the nature of the first stars to form in the universe and could these have been the seeds for supermassive black holes? With recent advances in both theoretical/computational techniques and observational facilities, the time is now ripe for progress on answering these questions.
Here we propose an ambitious research program that combines the latest theoretical studies of massive star and star cluster formation, including analytic, semi-analytic and full numerical simulations, with state-of-the-art observational programs, including several large surveys. We will: 1) Develop new theoretical models for how individual massive stars form from gas cores, focusing on diagnostics and including study of how the process depends on galactic environment; 2) Test these protostar models against observations, especially with ALMA, SOFIA, JVLA, HST and in the near future with JWST and eventually TMT & E-ELT; 3) Develop theoretical models for star cluster formation, including both magneto-hydrodynamics of the gas and N-body modeling of the young stellar population, with the focus on how massive stars form and evolve in these systems; 4) Test these protocluster models against observational data of young and still-forming star clusters, especially with ALMA, HST, Chandra, JWST and ground-based near-IR facilities; 5) Explore new theoretical models of how the first stars formed, with potential implications for the origins of supermassive black holes - one of the key unsolved problems in astrophysics.
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym MULTIMATE
Project A Research Platform Addressing Outstanding Research Challenges for Nanoscale Design and Engineering of Multifunctional Material
Researcher (PI) Johanna Rosen
Host Institution (HI) LINKOPINGS UNIVERSITET
Call Details Starting Grant (StG), PE8, ERC-2010-StG_20091028
Summary "Nanoscale engineering is a fascinating research field spawning extraordinary materials which revolutionize microelectronics, medicine,energy production, etc. Still, there is a need for new materials and synthesis methods to offer unprecedented properties for use in future applications.
In this research project, I will conduct fundamental science investigations focused towards the development of novel materials with tailor-made properties, achieved by precise control of the materials' structure and composition. The objectives are to: 1) Perform novel synthesis of graphene. 2) Explore nanoscale engineering of "graphene-based" materials, based on more than one atomic element. 3) Tailor uniquely combined metallic/ceramic/magnetic materials properties in so-called MAX phases. 4) Provide proof of concept for thin film architectures in advanced applications that require specific mechanical, tribological, electronic, and magnetic properties.
This initiative involves advanced materials design by a new and unique synthesis method based on cathodic arc. Research breakthroughs are envisioned: functionalized graphene-based and fullerene-like compounds are expected to have a major impact on tribology and electronic applications. The MAX phases are expected to be new candidates for applications within low-friction contacts, electronics, as well as spintronics. In particular, single crystal devices are predicted through tuning of tunnel magnetoresistance (TMR) and anisotropic conductivity (from insulating to n- and p-type).
I can lead this innovative and interdisciplinary project, with a unique background combining relevant research areas: arc process development, plasma processing, materials synthesis and engineering, characterization, along with theory and modelling.
Max ERC Funding
1 484 700 €
Duration
Start date: 2010-09-01, End date: 2015-08-31
Project acronym MUSTANG
Project Magnonics Using Spin Torque, spin caloritronics, And Nanoplasmonic engineerinG
Researcher (PI) Johan Åkerman
Host Institution (HI) GOETEBORGS UNIVERSITET
Call Details Starting Grant (StG), PE3, ERC-2012-StG_20111012
Summary My overall aim is to develop a Magnonic technology platform where Spintronic, Spin-Caloritronic and Nano-plasmonic devices and structures combine to create ground-breaking functionality from novel interactions between charge, spin, heat and light. With traditional Magnonic studies typically geared towards the low GHz range, and nanoplasmonic phenomena primarily focusing on visible light, my proposed platform will also attempt to bridge the so-called “THz gap” and create ultra-broadband and rapidly tuneable spin wave (SW) based signal generators, manipulators, detectors, and even spectrometers, in the 10–200 GHz frequency range. I will reach this goal by transferring my documented nano-contact spin torque oscillator (NC-STO) expertise into the magnonics world of both metal and insulator based SW propagation, adding the recently discovered spin Hall (SHE) and inverse spin Hall effect (ISHE) SW manipulation/detection, and combining it with my recently acquired know-how in nanoplasmonics.
My specific aims are:
1. SW generation and manipulation using metal and YIG based NC-STOs
2. SW-light/heat interaction using nanoplasmonic structures and Spin-Caloritronics
3. ISHE/SHE detection and control of propagating SWs in metals and YIG
Max ERC Funding
1 500 000 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym NanoBioNext
Project Nanoscale Biomeasurements of Nerve Cells and Vesicles: Molecular Substructure and the Nature of Exocytosis
Researcher (PI) Andrew EWING
Host Institution (HI) GOETEBORGS UNIVERSITET
Call Details Advanced Grant (AdG), PE4, ERC-2017-ADG
Summary I propose to develop and apply state-of-the-art analytical methods to investigate cell membrane and vesicle substructure to elucidate the chemistry of the closing regulatory phase of individual exocytosis events. The general goal of this proposal is to develop a new brand of analytical nanoelectrochemistry (nanogap and nanopore electrochemical cytometry), combined with chemical nanoscopy imaging methods with STED and nanoscale mass spectrometry imaging. I propose to apply this to the questions of the nature of exocytosis and the chemistry that initiates the process of a short-term memory. We have recently discovered that most neurotransmitter release is partial, via an open-and-closed vesicle release process, and this allows new mechanisms of plasticity and synaptic strength to be hypothesized. I propose to (i) test if partial release is a ubiquitous phenomenon, (ii) develop new nanoscale analytical methods to measure exocytotic release from pancreatic beta cells and a neuron in Drosophila, and to elucidate the substructure of nanometer vesicles, (iii) use these analytical methods in model cells and neurons to test the hypothesis that lipid membrane changes are involved in the initiation of the chemical events leading to short-term memory, and (iv) test the effects of drugs and zinc on plasticity of vesicles and exocytosis. This work combines new method development with a revolutionary application of chemical analysis to test the hypothesis that lipids play a previously unanticipated role in synaptic plasticity and the chemical structures involved in the initiation of short-term memory. As a long-term impact, this will provide sensitive analytical tools to understand how changes in these chemical species might be affected in relation to diseases involving short-term memory loss.
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-08-01, End date: 2023-07-31
Project acronym NANOCELLIMAGE
Project Ultrasmall Chemical Imaging of Cells and Vesicular Release
Researcher (PI) Andrew Ewing
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary The long-term goal of this research is to establish the chain of molecular events associated with (1) neurotransmitter release at the single cell and subcellular level and (2) cell differentiation and reprogramming. These are incredibly important goals for which there are few analytical chemistry methods that are available and useful. The immediate goal therefore includes development of three chemical methodologies at the cutting edge of analytical chemistry: 1) the development of arrays of nanometer electrodes that can be used to spatially measure the release of easily oxidized substances across the cell surface; 2) to improve the combination of MALDI and cluster SIMS ion sources on an orthogonal QStar instrument to enable protein and glycoprotein analysis at the single whole cell level, lipid domain analysis at the subcellular level, and importantly, depth profiling; and 3) the application of information discovered at single cells and of the methods developed in goals 1 and 2 to an in vitro model of cell-to-cell communication and regeneration. I intend to build on my expertise in both electrochemistry and SIMS imaging to develop these approaches. The work described here constitutes two new directions of research in my group as well as new analytical chemistry, and, if successful, will lead to researchers being able to gather incredibly important new data about cell-to-cell communication and cell differentiation and reprogramming as well as to a better understanding of the role of lipids in exocytosis and endocytosis.
Max ERC Funding
2 491 881 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym NanoPokers
Project Deciphering cell heterogeneity in tumors using arrays of nanowires to controllably poke single cells in longitudinal studies
Researcher (PI) Christelle Nathalie Prinz
Host Institution (HI) LUNDS UNIVERSITET
Call Details Consolidator Grant (CoG), PE5, ERC-2015-CoG
Summary Cancer is responsible for 20% of all deaths in Europe. Current cancer research is based on cell ensemble measurements or on snapshot studies of individual cells. However, cancer is a systemic disease, involving many cells that interact and evolve over time in a complex manner, which cell ensemble studies and snapshot studies cannot grasp. It is therefore crucial to investigate cancer at the single cell level and in longitudinal studies (over time). Despite recent developments in micro- and nanotechnologies combined with live-cell imaging, there is today no method available that meets the crucial need for global, real-time monitoring of individual cell responses to stimuli or perturbations.
This project addresses this crucial need by combining super resolution live-cell imaging and the development of sensors, as well as injection devices based on vertical nanowire arrays. The devices will penetrate multiple single cells in a fully controlled manner, with minimal invasiveness.
The objectives of the project are:
1) To develop nanowire-based tools in order to gain controlled and reliable access to the cell interior with minimal invasiveness.
2) To develop mRNA sensing and biomolecule injection capabilities based on nanowires.
3) To perform longitudinal single cell studies in tumours, including monitoring gene expression in real time, under controlled cell perturbation.
By enabling global, long term monitoring of individual tumour cells submitted to controlled stimuli, the project will open up new horizons in Biology and in Medical Research. It will enable ground-breaking discoveries in understanding the complexity of molecular events underlying the disease. This cross-disciplinary project will lead to paradigm-shifting research, which will enable the development of optimal treatment strategies. This will be applicable, not only for cancer, but also for a broad range of diseases, such as diabetes and neurodegenerative diseases.
Max ERC Funding
2 621 251 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym NAPOLI
Project Nanoporous Asymmetric Poly(Ionic Liquid) Membrane
Researcher (PI) Jiayin Yuan
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Starting Grant (StG), PE5, ERC-2014-STG
Summary Nanoporous polymer membranes (NPMs) play a crucial, irreplaceable role in fundamental research and industrial usage, including separation, filtration, water treatment and environmental sustainability. The vast majority of advances concentrate on neutral or weakly charged polymers, such as the ongoing interest in self-assembled block copolymer NPMs. There is an urgent need to process polyelectrolytes into NPMs that critically combine a high charge density with nanoporous morphology. Additionally, engineering structural asymmetry/gradient simultaneously in the membrane is equally beneficial, as it would improve membrane performance by building up compartmentalized functionalities. For example, a gradient in pore size provides high pressure resistance coupled with improved selectivity. Nevertheless, developing such highly charged, nanoporous and gradient membranes has remained a challenge, owing to the water solubility and ionic nature of conventional polyelectrolytes, which are poorly processable into a nanoporous state via common routes.
Recently, my group reported the first easy-to-perform production of nanoporous polyelectrolyte membranes. Building on this important but rather preliminary advance, I propose to develop the next generation of NPMs, nanoporous asymmetric poly(ionic liquid) membranes (NAPOLIs). The aim is to produce NAPOLIs bearing diverse gradients, understand the unique transport behavior, improve the membrane stability/sustainability/applicability, and finally apply them in the active fields of energy and environment. Both the currently established route and the newly proposed ones will be employed for the membrane fabrication.
This proposal is inherently interdisciplinary, as it must combine polymer chemistry/engineering, physical chemistry, membrane/materials science, and nanoscience for its success. This research will fundamentally advance nanoporous membrane design for a wide scope of applications and reveal unique physical processes in an asymmetric context.
Max ERC Funding
1 500 000 €
Duration
Start date: 2015-03-01, End date: 2021-01-31
Project acronym NAQUOP
Project Nanodevices for Quantum Optics
Researcher (PI) Valery Zwiller
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE3, ERC-2012-StG_20111012
Summary We propose developing a nanodevice toolbox for single photon quantum optics. A scalable scheme to generate indistinguishable single photons, an interface to couple single photon polarization to a single electron spin and high efficiency single photon detectors represent the core of the scientific problems to be addressed in this project.
We set the following research objectives: 1- Understand to what extent quantum dots can be made indistinguishable. 2- Interface coherently single photons to single electron spins via strain engineering in quantum dots. 3- Gain a better understanding of the limits to time resolution and detection efficiency of ultrafast superconducting single photon detectors.
The proposed research effort will yield novel experiments: the realization of scalable indistinguishable quantum dot sources by frequency locking single quantum dots to atomic transitions, the demonstration of new selection rules in semiconductor nanostructures to couple photon polarization to the electron spin only, the development of ultrafast and high efficiency single photon and single plasmon detectors and their implementation in two photon interference and quantum plasmonics experiments.
To carry out the work, multidisciplinary efforts where nanofabrication, quantum optics, semiconductor and superconductor physics will be merged to demonstrate the scalability of quantum dots for quantum information processing, providing crucial new knowledge in single photon optics at the nanoscale. The impact of the project will be important and far reaching as it will address fundamental questions related to the scalability of quantum indistinguishability of remote nanostructures.
Max ERC Funding
1 500 000 €
Duration
Start date: 2013-04-01, End date: 2018-03-31
Project acronym NEWIRES
Project Next Generation Semiconductor Nanowires
Researcher (PI) Kimberly Thelander
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE5, ERC-2013-StG
Summary Semiconductor nanowires composed of III-V materials have enormous potential to add new functionality to electronics and optical applications. However, integration of these promising structures into applications is severely limited by the current near-universal reliance on gold nanoparticles as seeds for nanowire fabrication. Although highly controlled fabrication is achieved, this metal is entirely incompatible with the Si-based electronics industry. It also presents limitations for the extension of nanowire research towards novel materials not existing in bulk. To date, exploration of alternatives has been limited to selective-area and self-seeded processes, both of which have major limitations in terms of size and morphology control, potential to combine materials, and crystal structure tuning. There is also very little understanding of precisely why gold has proven so successful for nanowire growth, and which alternatives may yield comparable or better results. The aim of this project will be to explore alternative nanoparticle seed materials to go beyond the use of gold in III-V nanowire fabrication. This will be achieved using a unique and recently developed capability for aerosol-phase fabrication of highly controlled nanoparticles directly integrated with conventional nanowire fabrication equipment. The primary goal will be to deepen the understanding of the nanowire fabrication process, and the specific advantages (and limitations) of gold as a seed material, in order to develop and optimize alternatives. The use of a wide variety of seed particle materials in nanowire fabrication will greatly broaden the variety of novel structures that can be fabricated. The results will also transform the nanowire fabrication research field, in order to develop important connections between nanowire research and the semiconductor industry, and to greatly improve the viability of nanowire integration into future devices.
Max ERC Funding
1 496 246 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym NINA
Project Nitride-based nanostructured novel thermoelectric thin-film materials
Researcher (PI) Per Daniel Eklund
Host Institution (HI) LINKOPINGS UNIVERSITET
Call Details Starting Grant (StG), PE5, ERC-2013-StG
Summary My recent discovery of the anomalously high thermoelectric power factor of ScN thin films demonstrates that unexpected thermoelectric materials can be found among the early transition-metal and rare-earth nitrides. Corroborated by first-principles calculations, we have well-founded hypotheses that these properties stem from nitrogen vacancies, dopants, and alloying, which introduce controllable sharp features with a large slope at the Fermi level, causing a drastically increased Seebeck coefficient. In-depth fundamental studies are needed to enable property tuning and materials design in these systems, to timely exploit my discovery and break new ground.
The project concerns fundamental, primarily experimental, studies on scandium nitride-based and related single-phase and nanostructured films. The overall goal is to understand the complex correlations between electronic, thermal and thermoelectric properties and structural features such as layering, orientation, epitaxy, dopants and lattice defects. Ab initio calculations of band structures, mixing thermodynamics, and properties are integrated with the experimental activities. Novel mechanisms are proposed for drastic reduction of the thermal conductivity with retained high power factor. This will be realized by intentionally introduced secondary phases and artificial nanolaminates; the layering causing discontinuities in the phonon distribution and thus reducing thermal conductivity.
My expertise in thin-film processing and advanced materials characterization places me in a unique position to pursue this novel high-gain approach to thermoelectrics, and an ERC starting grant will be essential in achieving critical mass and consolidating an internationally leading research platform. The scientific impact and vision is in pioneering an understanding of a novel class of thermoelectric materials with potential for thermoelectric devices for widespread use in environmentally friendly energy applications.
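For reference, the quantities invoked above are connected by the standard thermoelectric definitions (textbook relations, not specific to this project), with Seebeck coefficient S, electrical conductivity \sigma, and a thermal conductivity \kappa that has electronic and lattice (phonon) contributions:
\mathrm{PF} = S^{2}\sigma, \qquad ZT = \frac{S^{2}\sigma}{\kappa_{\mathrm{el}} + \kappa_{\mathrm{ph}}}\,T
This is why nanolaminates and intentionally introduced secondary phases that scatter phonons (suppressing \kappa_{\mathrm{ph}}) while leaving the power factor S^{2}\sigma largely intact raise ZT.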
Max ERC Funding
1 499 976 €
Duration
Start date: 2013-10-01, End date: 2018-09-30
Project acronym NOCO2
Project Novel combustion principle with inherent capture of CO2 using combined manganese oxides that release oxygen
Researcher (PI) Jan Anders Lyngfelt
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Advanced Grant (AdG), PE8, ERC-2011-ADG_20110209
Summary Conventional CO2 capture processes have significant cost and energy penalties associated with gas separation. Chemical-looping combustion (CLC), an entirely new combustion principle, avoids this difficulty by inherent CO2 capture, using metal oxides for oxygen transfer from air to fuel. The process has been demonstrated on a small scale with gaseous fuels. However, with the oxygen-carrier materials used so far it would be difficult to reach high fuel conversion with solid fuels. A new type of combined oxides based on manganese, however, has the ability not only to react with gaseous fuel, but also to release gaseous oxygen, which would fundamentally change the concept.
The programme would provide 1) new oxygen-carrier materials with unique properties that would make this low-cost/high-efficiency option of CO2 capture possible, 2) cold-flow model investigation of suitable reactor system configurations and components, 3) a demonstration of this new combustion technology at the pilot plant level, 4) a model of the process comprising a full understanding, including kinetics, equilibria, hydrodynamics of fluidized reactors, mass and heat balances.
The basis of this programme is the discovery of a number of oxygen-releasing combined manganese oxides, having properties that can make CLC with solid fuels a breakthrough process for CO2 capture. The purpose of the programme is to perform a comprehensive study of these materials, to demonstrate that they work in real systems, to achieve a full understanding of how they work in interaction with solid fuels in fluidized beds and to assess how this process would work in the full scale.
Climate negotiations and agreements could be significantly facilitated by this low-cost option for CO2 capture which, in principle, should be applicable to 25% of the global CO2 emissions, i.e. coal-fired power plants. It would also provide a future means of removing CO2 from the atmosphere at low cost by burning biofuel and capturing the CO2.
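Schematically, the oxygen-carrier loop described above can be written as two coupled reactors (a generic sketch with an arbitrary carrier Me_xO_y and hydrocarbon fuel C_nH_{2m}; the stoichiometry is illustrative, and the Mn-based combined oxides discussed here additionally release gaseous O2, often referred to as chemical-looping with oxygen uncoupling, CLOU):
\text{Fuel reactor:}\; (2n+m)\,\mathrm{Me}_x\mathrm{O}_y + \mathrm{C}_n\mathrm{H}_{2m} \rightarrow (2n+m)\,\mathrm{Me}_x\mathrm{O}_{y-1} + n\,\mathrm{CO}_2 + m\,\mathrm{H}_2\mathrm{O}
\text{Air reactor:}\; \mathrm{Me}_x\mathrm{O}_{y-1} + \tfrac{1}{2}\,\mathrm{O}_2 \rightarrow \mathrm{Me}_x\mathrm{O}_y
\text{Oxygen release (CLOU):}\; \mathrm{Me}_x\mathrm{O}_y \rightleftharpoons \mathrm{Me}_x\mathrm{O}_{y-1} + \tfrac{1}{2}\,\mathrm{O}_2
The net result is ordinary combustion, but the fuel-reactor exhaust consists essentially of CO2 and H2O, so the CO2 is never diluted by the nitrogen of the combustion air.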
Max ERC Funding
2 500 000 €
Duration
Start date: 2012-03-01, End date: 2017-02-28
Project acronym ODDSUPER
Project New mechanisms and materials for odd-frequency superconductivity
Researcher (PI) Annica BLACK-SCHAFFER
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), PE3, ERC-2017-STG
Summary Odd-frequency superconductivity is a unique superconducting state that is odd in time or, equivalently, frequency, which is opposite to the ordinary behavior of superconductivity. It has been realized to be the absolute key to understanding the surprising physics of superconductor-ferromagnet (SF) structures and has also enabled the whole emerging field of superconducting spintronics. This project will discover and explore entirely new mechanisms and materials for odd-frequency superconductivity, to both generate a much deeper understanding of superconductivity and open up entirely new functionalities. Importantly, it will generalize and apply my initial discoveries of two new odd-frequency mechanisms, present in bulk multiband superconductors and in hybrid structures between topological insulators and conventional superconductors, respectively. In both cases odd-frequency superconductivity is generated without any need for ferromagnets or interfaces, completely different from the situation in SF structures. The result will be a significant expansion of the concept and importance of odd-frequency superconductivity to a very wide class of materials, ranging from multiband, bilayer, and nanoscale superconductors to topological superconductors. The project will also establish the connection between topology and odd-frequency pairing, which needs to be addressed in order to understand topological superconductors, as well as incorporate new materials and functionality into traditional SF structures. To achieve these goals the project will develop a novel methodological framework for large-scale and fully quantum mechanical studies with atomic level resolution, solving self-consistently for the superconducting state and incorporating quantum transport calculations.
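For orientation, the statement that a pair amplitude can be odd in time rests on the general antisymmetry of the Cooper pair correlation function under exchange of the two electrons; this is the standard symmetry classification rather than anything specific to the mechanisms proposed here. Writing the parity of the pair amplitude under exchange of spins (S), spatial coordinates (P), orbital/band indices (O) and relative time or frequency (T) as \pm 1, the constraint is
S \cdot P \cdot O \cdot T = -1
so a spin-triplet (S = +1), even-parity (P = +1), single-band (O = +1) amplitude must be odd in frequency (T = -1), while in a multiband superconductor a pairing that is odd under band exchange (O = -1) can be odd in frequency even for conventional spin-singlet, even-parity symmetry.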
Max ERC Funding
1 121 660 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym OPUS
Project Optical Ultra-Sensor
Researcher (PI) Markus Pollnau
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE7, ERC-2013-ADG
Summary This project aims at pushing the limits of optical sensing on a microchip by orders of magnitude, thereby allowing for ultra-high sensitivity in optical detection and enabling first-time-ever demonstrations of several optical sensing principles on a microchip. My idea is based upon our distributed-feedback lasers in rare-earth-ion-doped aluminum oxide waveguides on a silicon chip with ultra-narrow linewidths of 1 kHz, corresponding to Q-factors exceeding 10^11, intra-cavity laser intensities of several watts over a waveguide cross-section of 2 micrometer, and light interaction lengths reaching 20 km. Optical read-out of the laser frequency and linewidth is achieved by frequency down-conversion via detection of the GHz beat signal of two such lasers positioned in the same waveguide or in parallel waveguides on the same microchip.
The sensitivity of optical detection is related to the laser linewidth, interaction length, and transverse mode overlap with the measurand; its potential of optically exciting ions or molecules and its optical trapping force are related to the laser intensity. By applying novel concepts, we will decrease the laser linewidth to 1 Hz (Q-factor > 10^14), thereby also significantly increasing the intra-cavity intensity and light interaction length, simplify the read-out by reducing the line-width separation between two lasers to the MHz regime, and increase the mode interaction with the environment by either increasing its evanescent field or perpendicularly intersecting a nanofluidic channel with the optical waveguide, thereby allowing for unprecedented sensitivity of optical detection on a microchip. We will exploit this dual-wavelength distributed-feedback laser sensor for the first-ever demonstrations of intra-laser-cavity (ILC) optical trapping and detection of nano-sized biological objects in an optofluidic chip, ILC trace-gas detection on a microchip, ILC Raman spectrometry on a microchip, and ILC spectroscopy of single rare-earth ions.
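As a rough consistency check on the quoted figures (assuming an emission wavelength in the 1.5-2 micrometer range typical of rare-earth-doped gain media, i.e. an optical carrier frequency of order \nu \approx 2\times10^{14} Hz; the exact wavelength is not stated above), the quality factor follows from the linewidth as
Q = \frac{\nu}{\Delta\nu} \approx \frac{2\times10^{14}\,\mathrm{Hz}}{1\,\mathrm{kHz}} \approx 2\times10^{11}, \qquad \frac{2\times10^{14}\,\mathrm{Hz}}{1\,\mathrm{Hz}} \approx 2\times10^{14}
in line with Q-factors exceeding 10^11 at the present 1 kHz linewidth and above 10^14 at the targeted 1 Hz linewidth.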
Max ERC Funding
2 499 958 €
Duration
Start date: 2014-11-01, End date: 2019-10-31
Project acronym OSIRIS
Project Open silicon based research platform for emerging devices
Researcher (PI) Lars Mikael Östling
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE7, ERC-2008-AdG
Summary The OSIRIS proposal will address the crucial and ultimately strategic area for future emerging nanoelectronics, i.e. how structures and devices will actually be fabricated as physical dimensions approach a few nanometers minimum feature size. The project title is Open silicon based research platform for emerging devices and indicates that many of the future emerging devices will be based on a silicon fabrication base platform but may not be fully based on silicon as the active semiconductor material. Over the past 10 years this research team has established a versatile fabrication technology platform in excellent condition to open up a variety of new technologies to explore nanometer minimum feature size in realizable, electrically repeatable device structures.
The proposed project has five different focus areas outlined. It covers a broad range of critical research issues that can be foreseen as groundbreaking topics for the period beyond 2015. The different topics addressed are:
1) Three dimensional FET nanostructures based on SiNW and GeNW with advanced configuration.
2) New applications of SiNW with built-in strain for fast silicon-based optoelectronic devices.
3) Low frequency noise in advanced nanoelectronic structures
4) THz devices for IR-detection
5) Bio-sensor nanoelectronics for extreme bio-molecule sensitivity and real time detection of DNA.
These areas are carefully chosen to assemble the right mix of topics with predictable research success and a few areas that can be called high gain/high risk. In particular, focus areas 2 and 4 have great potential impact if successful, but also carry a somewhat higher risk of proving difficult to implement in future devices. In no case is there any risk that the research will not generate high-quality scientific results.
Max ERC Funding
1 999 500 €
Duration
Start date: 2009-06-01, End date: 2014-05-31
Project acronym OTEGS
Project Organic Thermoelectric Generators
Researcher (PI) Xavier Dominique Etienne Crispin
Host Institution (HI) LINKOPINGS UNIVERSITET
Call Details Starting Grant (StG), PE3, ERC-2012-StG_20111012
Summary At the moment, there is no viable technology to produce electricity from natural heat sources (T<200°C) and from 50% of the waste heat (electricity production, industries, buildings and transports) stored in large volumes of warm fluids (T<200°C). To extract heat from large volumes of fluids, the thermoelectric generators would need to cover large areas in newly designed heat exchangers. To develop into a viable technology platform, thermoelectric devices must be fabricated over large areas via low-cost processes. But no thermoelectric material exists for this purpose.
Recently, the applicant has discovered that the low-cost conducting polymer poly(ethylene dioxythiophene) possesses a figure-of-merit ZT=0.25 at room temperature. Conducting polymers can be processed from solution, are flexible, and possess an intrinsically low thermal conductivity. This combination of unique properties motivates further investigations to reveal the true potential of organic materials for thermoelectric applications: this is the essence of this project.
My goal is to organize an interdisciplinary team of researchers focused on the characterization, understanding, design and fabrication of p- and n-doped organic-based thermoelectric materials; and the demonstration of those materials in organic thermoelectric generators (OTEGs). Firstly, we will create the first generation of efficient organic thermoelectric materials with ZT> 0.8 at room temperature: (i) by optimizing not only the power factor but also the thermal conductivity; (ii) by demonstrating that a large power factor is obtained in inorganic-organic nanocomposites. Secondly, we will optimize thermoelectrochemical cells by considering various types of electrolytes.
The research activities proposed are at the cutting edge of materials science and involve chemical synthesis, interface studies, thermal physics, electrical, electrochemical and structural characterization, and device physics. The project is hosted at Linköping University, which holds a world-leading position in polymer electronics research.
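To illustrate why raising ZT from 0.25 towards 0.8 matters for the low-grade heat sources targeted above, the standard maximum conversion efficiency of a thermoelectric generator (a textbook expression, evaluated here for an illustrative T_h = 400 K and T_c = 300 K, treating the quoted room-temperature ZT values as constant over the gradient) is
\eta_{\max} = \frac{T_h - T_c}{T_h}\,\frac{\sqrt{1+ZT}-1}{\sqrt{1+ZT}+T_c/T_h}
which gives roughly 1.6% at ZT = 0.25 and about 4.1% at ZT = 0.8, against a Carnot limit of 25% for this temperature pair.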
Max ERC Funding
1 453 690 €
Duration
Start date: 2013-04-01, End date: 2018-03-31
Project acronym OutflowMagn
Project Magnetic fields and the outflows during the formation and evolution of stars
Researcher (PI) Wouter Henricus Theodorus Vlemmings
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary The outflows of young and old stars play a crucial role in the cycle of matter in galaxies. Stars and planetary systems are formed through complex physical processes during the collapse of gas clouds, with outflows a required ingredient. At the end of a star's life, stellar outflows are the main source of heavy elements that are essential for the formation of stars, planets and life. Magnetic fields are one of the key factors governing, in particular, the often observed collimated outflows. They might also be a key ingredient in driving stellar mass loss and are potentially essential for stabilizing accretion disks of, in particular, massive proto-stars. Only polarization observations at different spatial scales are able to measure the strength and structure of magnetic fields during the launching of outflows from young and old stars. Because stars in these evolutionary phases are highly obscured by dusty envelopes, their magnetic fields are best probed through observations of molecules and dust at submillimeter and radio wavelengths. In addition to its role, the origin of the magnetic field in these stellar phases is also still unknown, and to determine it multi-wavelength observations are essential. The proposed research group will use state-of-the-art submillimeter and radio instruments, integrated with self-consistent radiative transfer and magneto-hydrodynamic models, to examine the role and origin of magnetic fields during star formation and in the outflows from evolved stars. The group will search for planets around evolved stars to answer the elusive question of the origin of their magnetic field and determine the connection between the galactic magnetic field and that responsible for the formation of jets and potentially disks around young proto-stars. This fundamental new work, for which a dedicated research group is essential, will reveal the importance of magnetism during star formation as well as in driving and shaping the mass loss of evolved stars.
Max ERC Funding
2 000 000 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym OXLEET
Project Oxidation via low-energy electron transfer. Development of green oxidation methodology via a biomimetic approach
Researcher (PI) Jan Erling Bäckvall
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Advanced Grant (AdG), PE5, ERC-2009-AdG
Summary Oxidation reactions are of fundamental importance in Nature and are key transformations in organic synthesis. There is currently a need from society to replace waste-producing expensive oxidants by environmentally benign oxidants in industrial oxidation reactions. The aim of the proposed research is to develop novel green oxidation methodology that also involves hydrogen transfer reactions. In the oxidation reactions the goal is to use molecular oxygen (air) or hydrogen peroxide as the oxidants. In the present project new catalytic oxidations via low-energy electron transfer will be developed. The catalytic reactions obtained can be used for racemization of alcohols and amines and for oxygen- and hydrogen peroxide-driven oxidations of various substrates. Examples of some reactions that will be studied are oxidative palladium-catalyzed C-C bond formation and metal-catalyzed C-H oxidation including dehydrogenation reactions with iron and ruthenium. Coupled catalytic systems where electron transfer mediators (ETMs) facilitate electron transfer from the reduced catalyst to molecular oxygen (hydrogen peroxide) will be studied. Highly efficient reoxidation systems will be designed by covalently linking two electron transfer mediators (ETMs). The intramolecular electron transfer in these hybrid ETM catalysts will significantly increase the rate of oxidation reactions. The research will lead to development of more efficient reoxidation systems based on molecular oxygen and hydrogen peroxide, as well as more versatile racemization catalysts for alcohols and amines.
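Schematically, the coupled catalytic systems described above can be sketched as a cascade of redox couples in which the substrate is oxidized by the metal catalyst and the electrons are relayed stepwise to the terminal oxidant (a generic, illustrative scheme; ML_n denotes an unspecified metal catalyst and ETM an electron transfer mediator, with each ox/red couple treated as a two-electron couple for bookkeeping):
\mathrm{SubH_2} + \mathrm{ML}_n^{\mathrm{ox}} \rightarrow \mathrm{Sub} + \mathrm{ML}_n^{\mathrm{red}} + 2\,\mathrm{H}^{+}
\mathrm{ML}_n^{\mathrm{red}} + \mathrm{ETM}^{\mathrm{ox}} \rightarrow \mathrm{ML}_n^{\mathrm{ox}} + \mathrm{ETM}^{\mathrm{red}}
\mathrm{ETM}^{\mathrm{red}} + \tfrac{1}{2}\,\mathrm{O}_2 + 2\,\mathrm{H}^{+} \rightarrow \mathrm{ETM}^{\mathrm{ox}} + \mathrm{H_2O}
The net reaction is SubH2 + 1/2 O2 \rightarrow Sub + H2O; each step is a comparatively low-barrier electron transfer, which is what allows air or hydrogen peroxide to serve as the terminal oxidant, and covalently linking two ETMs as proposed is intended to accelerate the relay step.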
Max ERC Funding
1 722 000 €
Duration
Start date: 2010-01-01, End date: 2015-12-31
Project acronym PALP
Project Physics of Atoms with Attosecond Light Pulses
Researcher (PI) Anne L'huillier Wahlström
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE2, ERC-2013-ADG
Summary The field of attosecond science is now entering the second decade of its existence, with good prospects for breakthroughs in a number of areas. We want to take the next step in this development: from mastering the generation and control of attosecond pulses to breaking new ground, starting with the simplest systems, atoms. The aim of the present application is to advance the emerging new research field “Ultrafast Atomic Physics”, where one- or two-electron wave packets are created by absorption of attosecond pulse(s) and analyzed or controlled by another short pulse. Our project can be divided into three parts:
1. Interferometric measurements using tunable attosecond pulses
How long does it take for an electron to escape its potential?
We will measure photoemission time delays for several atomic systems, using a tunable attosecond pulse source. This type of measurements will be extended to multiple ionization and excitation processes, using coincidence measurements to disentangle the different channels and infrared ionization for analysis.
2. XUV pump/XUV probe experiments using intense attosecond pulses
How long does it take for an atom to become an ion once a hole has been created?
Using intense attosecond pulses and the possibility to do XUV pump/ XUV probe experiments, we will study the transition between nonsequential double ionization, where the photons are absorbed simultaneously and all electrons emitted at the same time and sequential ionization where electrons are emitted one at a time.
3. "Complete" attosecond experiments using high-repetition-rate attosecond pulses
We foresee a paradigm shift in attosecond science with the new high-repetition-rate systems based on optical parametric chirped pulse amplification, which are now coming of age. We want to combine coincidence measurements with angular detection, allowing us to characterize (two-particle) electronic wave packets both in time and in momentum and to study their quantum-mechanical properties.
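For reference, in interferometric schemes of the kind alluded to in part 1 (the sketch below assumes a RABBITT-type measurement, i.e. an attosecond pulse train combined with a weak IR dressing field, which the summary does not name explicitly), the photoemission delay is read out from the phase of the sideband oscillation as the XUV-IR delay \tau is scanned:
S_{2q}(\tau) \propto \cos\!\left(2\omega_{\mathrm{IR}}\tau - \Delta\phi_{2q}\right), \qquad \tau_{\mathrm{A}} \approx \frac{\Delta\phi_{\mathrm{atomic}}}{2\omega_{\mathrm{IR}}}
Here \Delta\phi_{2q} contains both the harmonic (attochirp) phase and the atomic contribution; comparing two species or two ionization channels cancels the common harmonic phase and isolates the relative photoemission time delay.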
Max ERC Funding
2 047 000 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym PARADIGM
Project New Paradigm in the Design of Degradable Polymeric Materials - Macroscopic Performance Translated to all Levels of Order
Researcher (PI) Ann-Christine Albertsson
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE8, ERC-2009-AdG
Summary A new generation of polymeric materials is urgently needed that does not behave like traditional commodity plastics in terms of environmental interaction, degradation pattern, fragmentation tendency, and biological persistency. I herein propose a new paradigm in the design of polymeric materials: a retro-structural approach in which the macroscopic performance is translated to every scale level of structural order, so that appropriate molecular recognition motifs are identified and subsequently generated synthetically in a bottom-up procedure. Inspiration on how to design such materials is best drawn from Nature, which is unsurpassed in its ability to combine molecular building blocks into perfectly designed, versatile super- and supramolecular structures with well-defined properties, disassembly patterns, and biological functions. A closer look into the structural build-up of biological materials gives important clues on how to design synthetic functional materials with desirable environmental interaction. In addition to advanced synthesis, surface modification and processing, the materials and their degradation behavior will be thoroughly characterized using traditional characterization techniques in combination with the latest spectroscopic and imaging techniques. I have chosen to focus on two areas that stand out as highly prioritized in maintaining or even raising our quality of life: sustainable materials for commodity applications and tissue engineering systems in biomaterials science. This is a bold, high-risk proposal which, if successful, will have a ground-breaking influence on how we design polymeric materials.
Max ERC Funding
2 500 000 €
Duration
Start date: 2010-03-01, End date: 2016-02-29
Project acronym PEBBLE2PLANET
Project From pebbles to planets: towards new horizons in the formation of planets
Researcher (PI) Anders Johansen
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE9, ERC-2011-StG_20101014
Summary "The goal of this ERC Starting Grant proposal is to make significant advances in our understanding of how planetesimals and gas giant planets form. I propose an ambitious research programme dedicated to answering three key questions at the frontier of planet formation theory: - How do mm-sized particles grow past the bouncing barrier? - What is the Initial Mass Function of planetesimals? - How do the cores of gas giants form and evolve?} I will address these questions using a combination of novel ideas and computer simulations to model three critical stages of planet formation: 1) the growth of pebbles into rocks and boulders by coagulation and vapour condensation, 2) the gravitational collapse of clumps of rocks and boulders into planetesimals with an array of sizes, and 3) the long term growth of planetesimals as they grow to become cores of gas giants by accreting pebbles embedded in the gas. These investigations will form an important theoretical foundation for understanding the next generation of observations of protoplanetary disc pebbles, planetesimal belts, and planetary systems. The self-consistent models for the formation of planets resulting from this proposal will shed light on the spatial distribution of pebbles in gas discs around young stars (observable with the ALMA telescopes), on the initial state of planetesimal belts (crucial for understanding the evolution of debris discs observable with JWST and the asteroid and Kuiper belts), and on the formation and evolution of the wealth of exoplanetary systems detected in the near future (by astrometry with the Gaia satellite, by ground-based radial velocity surveys, and by direct imaging with E-ELT)."
Max ERC Funding
1 332 467 €
Duration
Start date: 2012-01-01, End date: 2016-12-31
Project acronym PHOTOCHROMES
Project Photochromic Systems for Solid State Molecular Electronic Devices and Light-Activated Cancer Drugs
Researcher (PI) Joakim Andréasson
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Starting Grant (StG), PE4, ERC-2007-StG
Summary Photochromic molecules, or photochromes, can be reversibly isomerized between two thermally stable forms by exposure to light of different wavelengths. Upon isomerization, properties such as excitation energies, redox properties, charge distribution, and structure experience significant changes. These changes can be harnessed to switch “on” or “off” the action of a variety of photophysical processes in the photochromic constructs, e.g., energy and electron transfer. Until now, the focus of my research has been to show proof of principle for a large selection of molecule-based, photonically controlled logic devices (solution based), with the functional basis in the switching of the transfer processes mentioned above. Now, I wish to extend the study to include experiments in the solid state, e.g., polymer matrices. Taking the step into solid state chemistry is not only a prerequisite for any real-world application. It will also allow for experiments that cannot be performed in fluid solution, such as aligning molecules in a stretched film for chemistry with polarized light, and immobilization of molecules for selective addressing in a three-dimensional array of volume elements. Furthermore, I intend to investigate the possibility to photonically control the membrane-penetrating and DNA-binding abilities of photochromes, aiming, in a long-term perspective, at light-activated cancer drugs. Because both the structure and the charge distribution of a photochrome may change drastically upon isomerization, one of the two isomeric forms is often suitable for penetrating a membrane. Inside the membrane, e.g., in a cell, the photochrome can be photo-isomerized to a structure with high affinity for strong binding to DNA. Upon binding, transcription is inhibited and the cell dies. If desired, pH-sensitivity and two-photon processes could be used to further increase the selectivity in addressing very specific regions of the body, such as a tumor.
Max ERC Funding
1 000 000 €
Duration
Start date: 2008-09-01, End date: 2013-08-31
Project acronym PLANETESYS
Project The next-generation planet formation model
Researcher (PI) Anders JOHANSEN
Host Institution (HI) LUNDS UNIVERSITET
Call Details Consolidator Grant (CoG), PE9, ERC-2016-COG
Summary The goal of this ERC Consolidator Grant proposal is to make significant contributions to our understanding of the formation of planetary systems and the chemical composition of planets. I will achieve this by developing a planet formation model that integrates the most relevant physical processes and combines the newly discovered pebble accretion mechanism with gravitational interaction between a high number of growing embryos. Exploiting the results of the computer simulations will allow me to address three major, outstanding research questions in the study of planets and their formation:
* What are the dominant physical processes that shape planetary systems?
* How are solids flash-heated in protoplanetary discs?
* What are the conditions for forming habitable planets?
I will follow the chemical composition of solid bodies in a protoplanetary disc as they grow from dust grains to fully fledged planets. This will shed light on the formation pathways of all major planetary classes – from terrestrial planets through super-Earths to ice giants and gas giants – in orbital configurations acquired under the combined effects of planetary growth, migration and gravitational interaction between the developing planets. I will examine the role of the CO iceline as a nursery for planetary embryos that grow and migrate to form cold gas giants akin to Jupiter and Saturn in our Solar System. I will also explore the formation of the mysterious chondrules – widespread in primitive meteorites – by lightning discharge during planetesimal formation and address the role of chondrules for planet formation. Finally, I will simulate the delivery of life-essential volatiles to terrestrial planets and super-Earths in the habitable zone, considering the simultaneous growth of rocky and icy planetary embryos and gravitational stirring by migrating giant planets, for a wide range of planetary system architectures.
Max ERC Funding
1 985 818 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym PLASMA
Project Running away and radiating
Researcher (PI) Tünde-Maria Fülöp
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Consolidator Grant (CoG), PE2, ERC-2014-CoG
Summary Particle acceleration and radiation in plasmas have a wide variety of applications, ranging from cancer therapy and lightning initiation to the improved design of fusion devices for large-scale energy production. The goal of this project is to build a flexible ensemble of theoretical and numerical models that describes the acceleration processes and the resulting fast-particle dynamics in two focus areas: magnetic fusion plasmas and laser-produced plasmas. This interdisciplinary approach is a new way of studying charged-particle acceleration. It will lead to a deeper understanding of the complex interactions that characterise fast-particle behaviour in plasmas. Plasmas are complex systems, with many kinds of interacting electromagnetic (EM) waves and charged particles. For such a system it is infeasible to build one model which captures both the small-scale physics and the large-scale phenomena. Therefore we aim to develop several complementary models, in one common framework, and make sure they agree in overlapping regions. The common framework will be built layer by layer, using models derived from first principles in a systematic way, with theory closely linked to numerics and validated by experimental observations. The key object of study is the evolution of the velocity-space particle distribution in time and space. The main challenge is the strong coupling between the distribution and the EM field, which requires models with self-consistent coupling of Maxwell’s equations and kinetic equations. For the latter we will use Vlasov-Fokker-Planck solvers extended with advanced collision operators. Interesting aspects include non-Maxwellian distributions, instabilities, shock-wave formation and avalanches. The resulting theoretical framework and the corresponding code suite will be a novel instrument for advanced studies of charged-particle acceleration. Due to the generality of our approach, the applicability will reach far beyond the two focus areas.
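For context, the kinetic description invoked above is, in its generic textbook form (notation standard, not specific to the project's solvers), the Vlasov equation for each species a supplemented by a Fokker-Planck collision operator,

\[
\frac{\partial f_a}{\partial t} + \mathbf{v}\cdot\nabla_{\mathbf{x}} f_a + \frac{q_a}{m_a}\left(\mathbf{E} + \mathbf{v}\times\mathbf{B}\right)\cdot\nabla_{\mathbf{v}} f_a = \sum_b C_{ab}\!\left[f_a, f_b\right],
\]

solved self-consistently with Maxwell's equations, whose charge and current sources are velocity-space moments of the distributions $f_a(\mathbf{x},\mathbf{v},t)$.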
Max ERC Funding
1 948 750 €
Duration
Start date: 2015-10-01, End date: 2020-09-30
Project acronym PrecisionNuclei
Project Strong interactions for precision nuclear physics
Researcher (PI) Andreas EKSTRÖM
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Starting Grant (StG), PE2, ERC-2017-STG
Summary Nuclear physics is a cornerstone in our scientific endeavour to understand the universe. Indeed, atomic nuclei bring us closer to studying both the stellar explosions in the macrocosmos, where the elements are formed, and the fundamental symmetries of the microcosmos. Having access to a precise description of the interactions between protons and neutrons would provide a key to new knowledge across 20 orders of magnitude, from neutrinos to neutron stars. Despite a century of the finest efforts, a systematic description of strongly interacting matter at low energies is still lacking. Successful theoretical approaches, such as mean-field and shell models, rely on uncontrolled approximations that severely limit their predictive power in regions where the model has not been adjusted.
In this project I will develop a novel methodology to use experimental information from heavy atomic nuclei in the construction of nuclear interactions from chiral effective field theory. I expect this approach to enable me and my team to make precise ab initio predictions of various nuclear observables in a wide mass range, from hydrogen to lead, as well as for infinite nuclear matter. I will apply Bayesian regression and methods from machine learning to quantify the statistical and systematic uncertainties of the theoretical predictions. The novelty and challenge in this project lie in synthesising (i) the design of nuclear interactions, (ii) ab initio calculations of nuclei, and (iii) statistical inference in the confrontation between theory and experimental data. This alignment of methods, harboured within the same project, will create a clear scientific advantage and allow me to tackle the following big research question: how can atomic nuclei be described in chiral effective field theories of quantum chromodynamics?
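In its generic form (the specific likelihood, priors, and emulators are part of the project and not specified here), the Bayesian uncertainty quantification referred to above amounts to inferring the low-energy constants $\boldsymbol{\theta}$ of the chiral interaction from data $\mathbf{D}$ and propagating the posterior to new observables $y$:

\[
p(\boldsymbol{\theta}\mid\mathbf{D}) \propto p(\mathbf{D}\mid\boldsymbol{\theta})\,p(\boldsymbol{\theta}),
\qquad
p(y\mid\mathbf{D}) = \int p(y\mid\boldsymbol{\theta})\,p(\boldsymbol{\theta}\mid\mathbf{D})\,\mathrm{d}\boldsymbol{\theta},
\]

so that every prediction carries a statistical uncertainty inherited from the fit, on top of the systematic uncertainty of the truncated effective field theory expansion.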
Max ERC Funding
1 499 085 €
Duration
Start date: 2018-02-01, End date: 2023-01-31
Project acronym PROMETHEUS
Project Flame nanoengineering for antibacterial medical devices
Researcher (PI) Georgios SOTIRIOU
Host Institution (HI) KAROLINSKA INSTITUTET
Call Details Starting Grant (StG), PE8, ERC-2017-STG
Summary Engineers in nanotechnology research labs have been quite innovative over the last decade in designing nanoscale materials for medicine. However, very few of these exciting discoveries are translated into commercial medical products today. The main reasons for this are two inherent limitations of most nanomanufacture processes: scalability and reproducibility. There is too little knowledge on how well the unique properties associated with nanoparticles are maintained during their large-scale production, while often poor reproducibility hinders their successful use. A key goal here is to utilize a nanomanufacture process famous for its scalability and reproducibility, flame aerosol reactors that produce commodity powders at tons per hour, and advance the knowledge for synthesis of complex nanoparticles and their direct integration in medical devices. Our aim is to develop the next generation of antibacterial medical devices to fight antimicrobial resistance, a highly understudied field. Antimicrobial resistance constitutes the most serious public health threat today and is estimated to become the leading cause of human deaths within 30 years.
We focus on flame direct nanoparticle deposition on substrates, combining nanoparticle production and functional layer deposition in a single step, with close attention to product nanoparticle properties and device assembly, extending beyond the simple commodity powders of the past. Specific targets here are two devices: a) a hybrid drug microneedle patch with photothermal nanoparticles to fight life-threatening skin infections from drug-resistant bacteria, and b) smart nanocoatings on implants providing both osteogenic and self-triggered antibacterial properties. The engineering approach to the development of antibacterial devices will provide insight into the basic physicochemical principles to assist in commercialization, while the outcome of this research will help the fight against antibiotic resistance, improving public health worldwide.
Max ERC Funding
1 812 500 €
Duration
Start date: 2018-03-01, End date: 2023-02-28
Project acronym PROMISE
Project Origins of the Molecular Cloud Structure
Researcher (PI) Jouni Tapani Kainulainen
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Starting Grant (StG), PE9, ERC-2014-STG
Summary Understanding the physical processes that control the life-cycle of the interstellar medium (ISM) is one of the key themes in the astrophysics of galaxies today. This importance originates from the role of the ISM as the birthplace of new stars, and therefore as an indivisible component of galaxy evolution. Exactly how the conversion of the ISM to stars takes place is intricately linked to how the internal structure of the cold, molecular clouds in the ISM forms and evolves. Despite this pivotal role, our picture of molecular cloud structure has a fundamental shortcoming: it is based largely on observations of low-mass molecular clouds. Yet it is the massive, giant molecular clouds (GMCs) in which most stars form and which impact the ISM of galaxies most. I present a program that will fill this gap and make profound progress in the field. We have developed a new observational technique that provides an unparalleled view of the structure of young GMCs. I have also developed a powerful tool to study the most important structural characteristics of molecular clouds, e.g., the probability distribution of volume densities, which has not been accessible before. With this program, the full potential of these tools will be put into use. We will produce a unique, high-fidelity column density data set for a statistically interesting volume in the Galaxy, including thousands of molecular clouds. The data set will be unmatched in its quality and extent, providing an unprecedented basis for statistical studies. We will then connect this outstanding observational view with state-of-the-art numerical simulations. This approach allows us to address the key question in the field: which processes drive the structure formation in massive molecular clouds, and how do they do it? Most crucially, we will create a new, observationally constrained framework for the evolution of molecular cloud structure over the entire mass range of molecular clouds and star formation in the ISM.
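As a point of reference for the density statistics mentioned above (a standard result for isothermal, supersonically turbulent gas, not a claim about the project's data), the volume-density PDF is commonly modelled as log-normal in $s \equiv \ln(\rho/\rho_0)$:

\[
p(s)\,\mathrm{d}s = \frac{1}{\sqrt{2\pi\sigma_s^2}}\exp\!\left[-\frac{(s-s_0)^2}{2\sigma_s^2}\right]\mathrm{d}s,
\qquad s_0 = -\tfrac{1}{2}\sigma_s^2,
\]

with a width $\sigma_s$ that grows with the turbulent Mach number; deviations from this form, such as power-law tails at high density, are precisely the kind of structural signatures such tools are designed to measure.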
Max ERC Funding
1 266 750 €
Duration
Start date: 2016-02-01, End date: 2021-01-31
Project acronym PROPHET
Project Simplifying Development and Deployment of High-Performance, Reliable Distributed Systems
Researcher (PI) Dejan Kostic
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Starting Grant (StG), PE6, ERC-2010-StG_20091028
Summary Distributed systems form the foundation of our society's infrastructure. Unfortunately, they suffer from a number of problems. First, they are time-consuming to develop, because it is difficult for the programmer to envision all possible deployment environments and design adaptation mechanisms that will achieve high performance in all scenarios. Second, the code is complex, due to the numerous outcomes that have to be accounted for at development time and the need to reimplement state and network models. Third, distributed systems are unreliable, because of the difficulties of programming a system that runs over an asynchronous network and handles all possible failure scenarios. If left unchecked, these problems will keep plaguing existing systems and hinder the development of a new generation of distributed services.
We propose a radically new approach to simplifying development and deployment of high-performance, reliable distributed systems. The key insight is in creating a new programming model and architecture that leverages the increases in per-node computational power, bandwidth and storage to achieve this goal. Instead of resolving difficult deployment choices at coding time, the programmer merely specifies the choices and the objectives that should be satisfied. The new runtime then resolves the choices during live execution so as to maximize the objectives. To accomplish this task, the runtime uses a groundbreaking combination of state-space exploration, simulation, behavior prediction, performance modeling, and program steering. In addition, our approach reuses the effort spent in distributed system testing by transmitting a behavior summary to the runtime to further speed up choice resolution.
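To make the proposed division of labour concrete, the following is a deliberately minimal, hypothetical sketch; the class and function names are invented for illustration and are not PROPHET's actual API. The programmer declares a set of deployment choices and an objective, and a runtime resolves the choice during live execution by measuring the objective.

```python
# Hypothetical sketch, not PROPHET's actual API: the programmer declares the
# choices and the objective; the runtime resolves the choice at execution time.
import random


class ChoiceRuntime:
    """Resolves a declared deployment choice by maximizing a stated objective."""

    def __init__(self, options, objective, trial_runs=5):
        self.options = options        # candidate values, e.g. batch sizes
        self.objective = objective    # callable: option -> score (higher is better)
        self.trial_runs = trial_runs  # how many measurements per option

    def resolve(self):
        """Score every option a few times during live execution and pick the best."""
        scores = {}
        for opt in self.options:
            samples = [self.objective(opt) for _ in range(self.trial_runs)]
            scores[opt] = sum(samples) / len(samples)
        return max(scores, key=scores.get)


def simulated_throughput(batch_size):
    """Stand-in objective: messages per second of a simulated batched transfer."""
    latency = 0.001 * batch_size + random.uniform(0.001, 0.01)  # seconds per batch
    return batch_size / latency


if __name__ == "__main__":
    runtime = ChoiceRuntime(options=[1, 8, 64, 512], objective=simulated_throughput)
    print("runtime selected batch size:", runtime.resolve())
```

The point of the sketch is only the split of responsibilities: choices and objectives are stated at construction time, while measurement, prediction and steering happen in the runtime, as described above.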
Max ERC Funding
1 450 000 €
Duration
Start date: 2011-02-01, End date: 2016-12-31
Project acronym PROSECUTOR
Project Programming Language-Based Security To Rescue
Researcher (PI) Andreas Sabelfeld
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Starting Grant (StG), PE6, ERC-2012-StG_20111012
Summary It is alarming that the society's critical infrastructures are not
fully prepared to meet the challenge of information security. Modern
computing systems are increasingly extensible, inter-connected, and
mobile. However, exactly these trends make systems more vulnerable to
attacks. A particularly exposed infrastructure is the world-wide web
infrastructure, where allowing the mere possibility of fetching a web
page opens up opportunities for delivering potentially malicious
executable content past current security mechanisms such as
firewalls. A critical challenge is to secure the computing
infrastructures without losing the benefits of the trends.
It is our firm belief that attacks will continue succeeding unless a
fundamental security solution, one that focuses on the security of the
actual applications (code), is devised. To this end, we are convinced
that application-level security can be best enforced, *by
construction*, at the level of programming languages.
ProSecuToR will develop the technology of *programming language-based
security* in order to secure computing infrastructures.
Language-based security is an innovative approach for enforcing
security by construction. The project will deliver policies and
enforcement mechanisms for protecting who can see and who can modify
sensitive data. Security policies will be expressible by the
programmer at the construction phase. We will devise a policy
framework capable of expressing fine-grained application-level
security policies. We will build practical enforcement mechanisms to
enforce the policies for expressive languages. Enforcement mechanisms
will be fully automatic, preventing dangerous programs from executing
whenever there is a possibility of compromising desired security
properties. The practicality will be demonstrated by building robust
web applications. ProSecuToR is expected to lead to breakthroughs in
*securing web mashups* and *end-to-end web application security*.
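To illustrate the kind of policy the project targets (who can see sensitive data), here is a toy, dynamic information-flow sketch in Python; the names and the two-level label lattice are invented for illustration and do not represent ProSecuToR's actual enforcement mechanisms, which are to be built into programming languages rather than bolted on as a library.

```python
# Toy illustration of information-flow enforcement (not ProSecuToR's mechanism):
# values carry confidentiality labels, labels propagate through computations,
# and secret-labelled data is blocked from reaching a public sink.
from dataclasses import dataclass

PUBLIC, SECRET = 0, 1


@dataclass(frozen=True)
class Labeled:
    value: object
    label: int  # PUBLIC or SECRET


def combine(a: Labeled, b: Labeled, op):
    """The result is as confidential as the most confidential input (label join)."""
    return Labeled(op(a.value, b.value), max(a.label, b.label))


def public_sink(x: Labeled):
    """Release data publicly only if its label permits it."""
    if x.label == SECRET:
        raise PermissionError("policy violation: secret data flowing to a public sink")
    print("released:", x.value)


if __name__ == "__main__":
    salary = Labeled(52000, SECRET)
    bonus = Labeled(1000, PUBLIC)
    total = combine(salary, bonus, lambda a, b: a + b)  # total inherits SECRET

    public_sink(bonus)      # allowed: PUBLIC data
    try:
        public_sink(total)  # blocked: would leak SECRET data
    except PermissionError as err:
        print(err)
```

A language-based enforcement of the kind proposed here would, in addition, catch implicit flows through control flow, which a simple label-tracking library like this one does not.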
Max ERC Funding
1 500 000 €
Duration
Start date: 2013-01-01, End date: 2017-12-31
Project acronym ProtonPump
Project Structural mechanism coupling the reduction of oxygen to proton pumping in living cells
Researcher (PI) Richard Neutze
Host Institution (HI) GOETEBORGS UNIVERSITET
Call Details Advanced Grant (AdG), PE4, ERC-2017-ADG
Summary Every breath you take delivers oxygen to mitochondria within the cells of your body. Mitochondria are energy transducing organelles that accept electrons liberated from the food that you eat in order to generate a transmembrane proton concentration gradient. Cytochrome c oxidase is an integral membrane protein complex in the mitochondria that accepts four electrons and reduces molecular oxygen to two water molecules while simultaneously pumping protons against a transmembrane potential. Cytochrome c oxidase homologues are found in almost all living organisms. Because oxygen is the final destination of the transferred electrons, this enzyme family is referred to as the terminal oxidases. Crystal structures of terminal oxidases have been known for more than two decades and these enzymes have been studied with virtually all biophysical and biochemical methods. Despite this scrutiny, it is unknown how redox reactions at the enzyme’s active site are coupled to proton pumping. Here I aim to create a three dimensional movie that reveals how proton exchange between key amino acid residues is controlled by the movements of electrons within the enzyme. This work will utilize state-of-the-art methods of time-resolved serial crystallography, time-resolved wide angle X-ray scattering and time-resolved X-ray emission spectroscopy at European X-ray free electron lasers (XFELs) and synchrotron radiation facilities to observe structural changes in terminal oxidases with time. I will develop new approaches for rapidly delivering oxygen or electrons into the protein’s active site in order to initiate the catalytic cycle in microcrystals and in solution. This project will yield completely new insight into one of the most important chemical reactions in biology while opening up the field of time-resolved structural studies of proteins beyond a handful of naturally occurring light-driven systems.
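For reference, the overall reaction catalysed by cytochrome c oxidase is commonly written with four protons consumed in water formation and four additional protons pumped across the membrane per O2 (the canonical stoichiometry quoted for the mitochondrial enzyme):

\[
\mathrm{O_2} + 4\,e^{-} + 8\,\mathrm{H^{+}_{in}} \longrightarrow 2\,\mathrm{H_2O} + 4\,\mathrm{H^{+}_{out}},
\]

and it is the structural coupling between the four one-electron reduction steps and these proton-translocation events that the time-resolved experiments are designed to film.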
Max ERC Funding
2 500 000 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym PSOPA
Project Phase-sensitive optical parametric amplifiers
Researcher (PI) Peter Avo Andrekson
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Advanced Grant (AdG), PE7, ERC-2011-ADG_20110209
Summary Optical amplifiers are essential in optical communication systems, where they compensate for the loss induced by the transmission fiber and thereby ensure the integrity of the information being transmitted, as well as in other applications such as spectroscopy.
This research proposal deals with phase-sensitive optical parametric amplifiers (PSA) that have unique and superior properties compared with all other optical amplifiers, most notably the potential of noiseless amplification, very broad optical bandwidth, and being an enabler of a range of ultrafast all-optical functionalities. In communication, there is an urgent need to develop new technologies that can break the ‘nonlinear Shannon capacity limit’, which is considered a serious barrier for continued capacity increase needed to meet the exponentially growing demand for bandwidth. The use of PSAs is expected to be an essential part of this development.
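As a point of reference (the idealized textbook picture, not a description of the project's specific devices), a degenerate PSA amplifies one quadrature of the field by $e^{r}$ and de-amplifies the orthogonal one by $e^{-r}$, so the gain depends on the relative phase $\theta$ between signal and pump,

\[
G(\theta) = e^{2r}\cos^{2}\theta + e^{-2r}\sin^{2}\theta ,
\]

which is why such amplifiers can in principle reach a 0 dB noise figure for the in-phase quadrature, in contrast to the 3 dB quantum limit of conventional phase-insensitive amplifiers.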
The objective is to unleash the unexplored potential of PSAs by generating knowledge and implementing experimental demonstrations that go substantially beyond the current state of the art. This involves a mix of engineering and scientific challenges with telecom and non-telecom applications in mind. We will leverage advances in other areas, e.g. low-loss photonic crystal fibers and highly nonlinear materials, to realize compact PSAs with unprecedented performance. Specifically, we will demonstrate:
• Significant merits (reach, spectral efficiency, capacity) of PSAs in optical transmission systems
• High coherence, low noise lasers by utilizing ultralow noise amplifier as gain element
• Very broad gain bandwidth, low noise PSAs using specially tailored nonlinear gain medium
• Compact (hybrid integration compatible) PSA using new nonlinear materials
• Novel ultrafast all-optical operations/signal processing using PSAs
• Capability of PSAs for detection of very weak optical signals, e.g. in quantum optics
Max ERC Funding
2 499 264 €
Duration
Start date: 2012-03-01, End date: 2017-02-28