Project acronym 1D-Engine
Project 1D-electrons coupled to dissipation: a novel approach for understanding and engineering superconducting materials and devices
Researcher (PI) Adrian KANTIAN
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), PE3, ERC-2017-STG
Summary Correlated electrons are at the forefront of condensed matter theory. Interacting quasi-1D electrons have seen vast progress in analytical and numerical theory, and thus in fundamental understanding and quantitative prediction. Yet, in the 1D limit fluctuations preclude important technological use, particularly of superconductors. In contrast, high-Tc superconductors in 2D/3D are not precluded by fluctuations, but lack a fundamental theory, making prediction and engineering of their properties, a major goal in physics, very difficult. This project aims to combine the advantages of both areas by making major progress in the theory of quasi-1D electrons coupled to an electron bath, in part building on recent breakthroughs (with the PI's extensive involvement) in simulating 1D and 2D electrons with parallelized density matrix renormalization group (pDMRG) numerics. Such theory will fundamentally advance the study of open electron systems, and show how to use 1D materials as elements of new superconducting (SC) devices and materials: 1) It will enable a new state of matter, 1D electrons with true SC order. Fluctuations from an electronic liquid, such as graphene, could also enable nanoscale wires to appear SC at high temperatures. 2) A new approach for the deliberate engineering of a high-Tc superconductor. In 1D, how electrons pair via repulsive interactions is understood and can be predicted. Stabilization by a reservoir - formed by a parallel array of many such 1D systems - offers a superconductor for which all factors setting Tc are known and can be optimized. 3) Many existing superconductors with repulsive electron pairing, all presently not understood, can be cast as 1D electrons coupled to a bath. Developing chain-DMFT theory based on pDMRG will allow these materials' SC properties to be simulated and understood for the first time.
4) The insights gained will be translated to 2D superconductors to study how they could be enhanced by contact with electronic liquids.
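The fluctuation problem the proposal starts from can be stated with a textbook Luttinger-liquid result (notation and exponent conventions vary across the literature): in an isolated 1D wire, superconducting pair correlations decay algebraically rather than saturating,

```latex
\langle \mathcal{O}^{\dagger}_{\mathrm{SC}}(x)\,\mathcal{O}_{\mathrm{SC}}(0)\rangle \;\sim\; |x|^{-1/K},
```

where $K$ is the Luttinger parameter of the wire. True long-range SC order would require this correlator to approach a nonzero constant as $x \to \infty$, which a strictly 1D system cannot achieve; coupling the wire to a higher-dimensional electronic bath, as proposed here, is what can cut off the power-law decay.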
Max ERC Funding
1 491 013 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym 3DWATERWAVES
Project Mathematical aspects of three-dimensional water waves with vorticity
Researcher (PI) Erik Torsten Wahlén
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE1, ERC-2015-STG
Summary The goal of this project is to develop a mathematical theory for steady three-dimensional water waves with vorticity. The mathematical model consists of the incompressible Euler equations with a free surface, and vorticity is important for modelling the interaction of surface waves with non-uniform currents. In the two-dimensional case, there has been a lot of progress on water waves with vorticity in the last decade. This progress has mainly been based on the stream function formulation, in which the problem is reformulated as a nonlinear elliptic free boundary problem. An analogue of this formulation is not available in three dimensions, and the theory has therefore so far been restricted to irrotational flow. In this project we seek to go beyond this restriction using two different approaches. In the first approach we will adapt methods which have been used to construct three-dimensional ideal flows with vorticity in domains with a fixed boundary to the free boundary context (for example Beltrami flows). In the second approach we will develop methods which are new even in the case of a fixed boundary, by performing a detailed study of the structure of the equations close to a given shear flow using ideas from infinite-dimensional bifurcation theory. This involves handling infinitely many resonances.
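For orientation, the 2D stream function formulation that the project seeks to generalize can be sketched as follows (one common normalization; sign conventions for the vorticity function $\gamma$ differ between authors). For a steady wave over a flat bed at $y = -d$ with free surface $y = \eta(x)$, the stream function $\psi$ satisfies

```latex
\Delta\psi = \gamma(\psi) \quad \text{in } -d < y < \eta(x), \qquad
\psi = 0 \ \text{ on } y = \eta(x), \qquad
\psi = m \ \text{ on } y = -d,
```

together with a Bernoulli-type condition on the unknown surface,

```latex
|\nabla\psi|^2 + 2g\,(y + d) = Q \quad \text{on } y = \eta(x),
```

which is precisely the nonlinear elliptic free boundary problem described above; no analogous scalar formulation is available in three dimensions.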
Max ERC Funding
1 203 627 €
Duration
Start date: 2016-03-01, End date: 2021-02-28
Project acronym A-DATADRIVE-B
Project Advanced Data-Driven Black-box modelling
Researcher (PI) Johan Adelia K Suykens
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE7, ERC-2011-ADG_20110209
Summary Making accurate predictions is a crucial factor in many systems (such as in modelling energy consumption, power load forecasting, traffic networks, process industry, environmental modelling, biomedicine, brain-machine interfaces) for cost savings, efficiency, health, safety and organizational purposes. In this proposal we aim at realizing a new generation of more advanced black-box modelling techniques for estimating predictive models from measured data. We will study different optimization modelling frameworks in order to obtain improved black-box modelling approaches. This will be done by specifying models through constrained optimization problems, studying different candidate core models (parametric models, support vector machines and kernel methods) together with additional sets of constraints and regularization mechanisms. Different candidate mathematical frameworks will be considered, with models that possess primal and (Lagrange) dual model representations, functional analysis in reproducing kernel Hilbert spaces, operator splitting and optimization in Banach spaces. Several aspects that are relevant to black-box models will be studied, including incorporation of prior knowledge, structured dynamical systems, tensorial data representations, interpretability and sparsity, and general-purpose optimization algorithms. The methods should be suitable for handling larger data sets and high-dimensional input spaces. The final goal is also to realize a next-generation software tool (including symbolic generation of models and handling different supervised and unsupervised learning tasks, static and dynamic systems) that can be generically applied to data from different application areas. The proposal A-DATADRIVE-B aims at getting end-users connected to the more advanced methods through a user-friendly data-driven black-box modelling tool. The methods and tool will be tested in connection with several real-life applications.
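The primal-dual kernel machinery named above can be illustrated with a minimal kernel ridge regression sketch (a generic stand-in for the project's kernel methods, not its actual toolchain; the kernel width, regularization value, and toy data are all illustrative assumptions):

```python
import numpy as np

def rbf_kernel(X1, X2, gamma=1.0):
    """Gaussian RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_krr(X, y, lam=1e-2, gamma=1.0):
    """Dual (Lagrange) solution of kernel ridge regression: (K + lam*I) alpha = y."""
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def predict_krr(X_train, alpha, X_new, gamma=1.0):
    """Dual model representation: f(x) = sum_i alpha_i k(x, x_i)."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy black-box modelling task: recover y = sin(x) from noisy samples.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(80)
alpha = fit_krr(X, y)
y_hat = predict_krr(X, alpha, X)
```

The model is never written out in the primal (weight-space) form; everything is expressed through the kernel matrix and the dual variables alpha, which is the representation the proposal builds on.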
Max ERC Funding
2 485 800 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym ABACUS
Project Advancing Behavioral and Cognitive Understanding of Speech
Researcher (PI) Bart De Boer
Host Institution (HI) VRIJE UNIVERSITEIT BRUSSEL
Call Details Starting Grant (StG), SH4, ERC-2011-StG_20101124
Summary I intend to investigate what cognitive mechanisms give us combinatorial speech. Combinatorial speech is the ability to make new words using pre-existing speech sounds. Humans are the only apes that can do this, yet we do not know how our brains do it, nor how exactly we differ from other apes. Using new experimental techniques to study human behavior and new computational techniques to model human cognition, I will find out how we deal with combinatorial speech.
The experimental part will study individual and cultural learning. Experimental cultural learning is a new technique that simulates cultural evolution in the laboratory. Two types of cultural learning will be used: iterated learning, which simulates language transfer across generations, and social coordination, which simulates emergence of norms in a language community. Using the two types of cultural learning together with individual learning experiments will help to zero in, from three angles, on how humans deal with combinatorial speech. In addition it will make a methodological contribution by comparing the strengths and weaknesses of the three methods.
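A deterministic, mean-field caricature of such an iterated-learning chain makes the mechanism concrete (a toy model for illustration, not the experimental design: sampling noise is replaced by its expectation, and each learner is a MAP estimator with a Beta prior; all parameter values are assumptions):

```python
def iterate_language(p0, n_gen=30, n_utterances=10, a=2.0, b=5.0):
    """Mean-field caricature of an iterated-learning transmission chain.

    Each generation hears n_utterances examples of a binary linguistic
    variant (used with frequency p by its teacher) and re-estimates p by
    MAP inference under a Beta(a, b) prior; its estimate becomes the next
    teacher's p. Replacing the sampled count by its expectation makes the
    dynamics deterministic and exposes the attractor.
    """
    p = p0
    history = [p]
    for _ in range(n_gen):
        k = n_utterances * p                          # expected count of the variant
        p = (k + a - 1) / (n_utterances + a + b - 2)  # MAP estimate under Beta(a, b)
        history.append(p)
    return history

# Whatever the first teacher does, the chain drifts to the prior mode
# (a - 1) / (a + b - 2) = 0.2: transmission amplifies weak individual biases.
traj = iterate_language(p0=0.9)
```

This is the core logic behind using transmission chains in the laboratory: repeated learning bottlenecks turn small cognitive biases into strong population-level regularities.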
The computer modeling part will formalize hypotheses about how our brains deal with combinatorial speech. Two models will be built: a high-level model that will establish the basic algorithms with which combinatorial speech is learned and reproduced, and a neural model that will establish in more detail how the algorithms are implemented in the brain. In addition, the models, through increasing understanding of how humans deal with speech, will help bridge the performance gap between human and computer speech recognition.
The project will advance science in four ways: it will provide insight into how our unique ability for using combinatorial speech works, it will tell us how this is implemented in the brain, it will extend the novel methodology of experimental cultural learning and it will create new computer models for dealing with human speech.
Max ERC Funding
1 276 620 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ACCOPT
Project ACelerated COnvex OPTimization
Researcher (PI) Yurii NESTEROV
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE1, ERC-2017-ADG
Summary The amazing rate of progress in computer technologies and telecommunications presents many new challenges for Optimization Theory. New problems are usually very big in size, very special in structure and possibly have a distributed data support. This makes them unsolvable by the standard optimization methods. In these situations, old theoretical models, based on hidden Black-Box information, cannot work. New theoretical and algorithmic solutions are urgently needed. In this project we will concentrate on the development of fast optimization methods for problems of big and very big size. All the new methods will be endowed with provable efficiency guarantees for large classes of optimization problems arising in practical applications. Our main tool is the acceleration technique developed for the standard Black-Box methods as applied to smooth convex functions. However, we will have to adapt it to deal with different situations.
The first line of development will be based on the smoothing technique as applied to non-smooth functions. We propose to substantially extend this approach to generate approximate solutions in relative scale. The second line of research will be related to applying acceleration techniques to second-order methods minimizing functions with sparse Hessians. Finally, we aim to develop fast gradient methods for huge-scale problems. The size of these problems is so big that even the usual vector operations are extremely expensive. Thus, we propose to develop new methods with sublinear iteration costs. In our approach, the main source of improvement will be the proper use of problem structure.
Our overall aim is to be able to solve in a routine way many important problems which currently look unsolvable. Moreover, the theoretical development of Convex Optimization will reach a state where there is no gap between theory and practice: the theoretically most efficient methods will definitely outperform any homebred heuristics.
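The acceleration technique referred to here is, in its basic smooth unconstrained form, Nesterov's accelerated gradient method, which improves the convergence rate from O(1/k) to O(1/k^2) for L-smooth convex functions. A minimal sketch (FISTA-style momentum sequence; the quadratic test problem is chosen purely for illustration):

```python
import numpy as np

def accelerated_gradient(grad, x0, L, n_iter=100):
    """Nesterov's accelerated gradient method for an L-smooth convex function.

    grad: gradient oracle; x0: starting point; L: Lipschitz constant of grad.
    Guarantees f(x_k) - f* <= 2 * L * ||x0 - x*||^2 / (k + 1)^2.
    """
    x, y, t = x0.copy(), x0.copy(), 1.0
    for _ in range(n_iter):
        x_next = y - grad(y) / L                         # gradient step at the extrapolated point
        t_next = (1 + np.sqrt(1 + 4 * t ** 2)) / 2       # momentum coefficient sequence
        y = x_next + ((t - 1) / t_next) * (x_next - x)   # extrapolation (the "acceleration")
        x, t = x_next, t_next
    return x

# Minimize f(x) = 0.5 * x^T A x - b^T x, a smooth convex toy problem.
A = np.diag([1.0, 10.0, 100.0])     # L = 100 (largest eigenvalue)
b = np.array([1.0, 2.0, 3.0])
x_star = np.linalg.solve(A, b)
x_hat = accelerated_gradient(lambda v: A @ v - b, np.zeros(3), L=100.0, n_iter=1000)
```

The single extrapolation line is the whole trick: plain gradient descent is recovered by setting y = x_next, and the project's research questions concern how to transplant this step to smoothed non-smooth, second-order, and huge-scale settings.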
Max ERC Funding
2 090 038 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ActiveWindFarms
Project Active Wind Farms: Optimization and Control of Atmospheric Energy Extraction in Gigawatt Wind Farms
Researcher (PI) Johan Meyers
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE8, ERC-2012-StG_20111012
Summary With the recognition that wind energy will become an important contributor to the world’s energy portfolio, several wind farms with a capacity of over 1 gigawatt are in the planning phase. In the past, engineering of wind farms focused on a bottom-up approach, in which atmospheric wind availability was considered to be fixed by climate and weather. However, farms of gigawatt size slow down the Atmospheric Boundary Layer (ABL) as a whole, reducing the availability of wind at turbine hub height. In Denmark’s large offshore farms, this leads to underperformance of turbines which can reach levels of 40%–50% compared to the same turbine in a lone-standing case. For large wind farms, the vertical structure and turbulence physics of the flow in the ABL become crucial ingredients in their design and operation. This introduces a new set of scientific challenges related to the design and control of large wind farms. The major ambition of the present research proposal is to employ optimal control techniques to control the interaction between large wind farms and the ABL, and optimize overall farm-power extraction. Individual turbines are used as flow actuators by dynamically pitching their blades on time scales ranging between 10 and 500 seconds. The application of such control efforts to the atmospheric boundary layer has never been attempted before, and introduces flow control on a physical scale which is currently unprecedented. The PI possesses a unique combination of expertise and tools enabling these developments: efficient parallel large-eddy simulations of wind farms, multi-scale turbine modeling, and gradient-based optimization in large optimization-parameter spaces using adjoint formulations. To ensure maximum impact on the wind-engineering field, the project aims at optimal control, experimental wind-tunnel validation, and at including multi-disciplinary aspects related to structural mechanics, power quality, and controller design.
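The farm-level coupling that motivates coordinated control can be illustrated with a deliberately crude two-turbine toy (a Jensen top-hat wake and actuator-disc power coefficients stand in for the proposal's large-eddy simulations; the spacing and wake decay constant are illustrative assumptions):

```python
import numpy as np

def farm_power(a1, a2, spacing=7.0, k=0.05):
    """Total power of two aligned turbines with a Jensen (top-hat) wake.

    a1, a2: axial induction factors; spacing: downstream distance in rotor
    diameters; k: wake decay constant. Power coefficients follow actuator-disc
    theory, Cp = 4a(1-a)^2, and the inflow speed is normalised to 1.
    """
    cp = lambda a: 4 * a * (1 - a) ** 2
    deficit = 2 * a1 / (1 + 2 * k * spacing) ** 2   # Jensen wake velocity deficit
    u2 = 1.0 - deficit                              # wind speed seen by turbine 2
    return cp(a1) + cp(a2) * u2 ** 3                # power scales with u^3

# Greedy control: each turbine at its single-turbine (Betz) optimum a = 1/3.
greedy = farm_power(1 / 3, 1 / 3)
# Coordinated control: derate the upstream turbine, searched on a grid.
grid = np.linspace(0.05, 0.4, 200)
coordinated = max(farm_power(a1, 1 / 3) for a1 in grid)
```

Even on this toy, derating the upstream turbine below its individual optimum raises total farm power; this is the qualitative effect the proposed ABL-scale optimal control exploits, with dynamic blade pitching playing the role of the induction factor here.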
Max ERC Funding
1 499 241 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym ADDICTIONCIRCUITS
Project Drug addiction: molecular changes in reward and aversion circuits
Researcher (PI) Nils David Engblom
Host Institution (HI) LINKOPINGS UNIVERSITET
Call Details Starting Grant (StG), LS5, ERC-2010-StG_20091118
Summary Our affective and motivational state is important for our decisions, actions and quality of life. Many pathological conditions affect this state. For example, addictive drugs hyperactivate the reward system and trigger a strong motivation for continued drug intake, whereas many somatic and psychiatric diseases lead to an aversive state, characterized by loss of motivation. I will study specific neural circuits and mechanisms underlying reward and aversion, and how pathological signaling in these systems can trigger relapse in drug addiction.
Given the important role of the dopaminergic neurons in the midbrain for many aspects of reward signaling, I will study how synaptic plasticity in these cells, and in their target neurons in the striatum, contributes to relapse in drug seeking. I will also study the circuits underlying aversion. Little is known about these circuits, but my hypothesis is that an important component of aversion is signaled by a specific neuronal population in the brainstem parabrachial nucleus, projecting to the central amygdala. We will test this hypothesis and also determine how this aversion circuit contributes to the persistence of addiction and to relapse.
To dissect this complicated system, I am developing new genetic methods for manipulating and visualizing specific functional circuits in the mouse brain. My unique combination of state-of-the-art competence in transgenics and cutting edge knowledge in the anatomy and functional organization of the circuits behind reward and aversion should allow me to decode these systems, linking discrete circuits to behavior.
Collectively, the results will indicate how signals encoding aversion and reward are integrated to control addictive behavior and they may identify novel avenues for treatment of drug addiction as well as aversion-related symptoms affecting patients with chronic inflammatory conditions and cancer.
Max ERC Funding
1 500 000 €
Duration
Start date: 2010-10-01, End date: 2015-09-30
Project acronym AEROSOL
Project Astrochemistry of old stars: direct probing of unique chemical laboratories
Researcher (PI) Leen Katrien Els Decin
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Consolidator Grant (CoG), PE9, ERC-2014-CoG
Summary The gas and dust in the interstellar medium (ISM) drive the chemical evolution of galaxies, the formation of stars and planets, and the synthesis of complex prebiotic molecules. The prime birth places for this interstellar material are the winds of evolved (super)giant stars. These winds are unique chemical laboratories, in which a large variety of gas and dust species radially expand away from the star.
Recent progress on the observations of these winds has been impressive thanks to Herschel and ALMA. The next challenge is to unravel the wealth of chemical information contained in these data. This is an ambitious task since (1) a plethora of physical and chemical processes interact in a complex way, (2) laboratory data to interpret these interactions are lacking, and (3) theoretical tools to analyse the data do not meet current needs.
To boost the knowledge of the physics and chemistry characterizing these winds, I propose a world-leading multi-disciplinary project combining (1) high-quality data, (2) novel theoretical wind models, and (3) targeted laboratory experiments. The aim is to pinpoint the dominant chemical pathways, unravel the transition from gas-phase to dust species, elucidate the role of clumps on the overall wind structure, and study the reciprocal effect between various dynamical and chemical phenomena.
Now is the right time for this ambitious project thanks to the availability of (1) high-quality multi-wavelength data, including ALMA and Herschel data of the PI, (2) supercomputers enabling a homogeneous analysis of the data using sophisticated theoretical wind models, and (3) novel laboratory equipment to measure the gas-phase reaction rates of key species.
This project will have far-reaching impact on (1) the field of evolved stars, (2) the understanding of the chemical lifecycle of the ISM, (3) chemical studies of dynamically more complex systems, such as exoplanets, protostars, supernovae etc., and (4) it will guide new instrument development.
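For orientation, the radially expanding winds in question are, to lowest order, described by steady spherical mass conservation (a standard textbook relation quoted here for context, not a result of this project):

```latex
\rho(r) \;=\; \frac{\dot M}{4\pi r^2\, v(r)},
```

where $\dot M$ is the stellar mass-loss rate and $v(r)$ the wind velocity profile; the gas-phase and dust chemistry the project targets unfolds along this outward-dropping density profile.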
Max ERC Funding
2 605 897 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym AEROSPACEPHYS
Project Multiphysics models and simulations for reacting and plasma flows applied to the space exploration program
Researcher (PI) Thierry Edouard Bertrand Magin
Host Institution (HI) INSTITUT VON KARMAN DE DYNAMIQUE DES FLUIDES
Call Details Starting Grant (StG), PE8, ERC-2010-StG_20091028
Summary Space exploration is one of the boldest and most exciting endeavors that humanity has undertaken, and it holds enormous promise for the future. Our next challenges in the conquest of space include bringing samples back to Earth by means of robotic missions and continuing the manned exploration program, which aims at sending human beings to Mars and bringing them home safely. Inaccurate prediction of the heat flux to the surface of the spacecraft heat shield can be fatal for the crew or doom a robotic mission. This quantity is estimated during the design phase. An accurate prediction is a particularly complex task, as it requires modelling the following phenomena, each a potential “mission killer”: 1) Radiation of the plasma in the shock layer, 2) Complex surface chemistry on the thermal protection material, 3) Flow transition from laminar to turbulent. Our poor understanding of the coupled mechanisms of radiation, ablation, and transition leads to difficulties in heat-flux prediction. To avoid failure and ensure the safety of the astronauts and payload, engineers resort to “safety factors” to determine the thickness of the heat shield, at the expense of the mass of embarked payload. Thinking out of the box and basic research are thus necessary to advance the models that will better define the environment and requirements for the design and safe operation of tomorrow’s space vehicles and planetary probes for manned space exploration. The three basic ingredients for predictive science are: 1) Physico-chemical models, 2) Computational methods, 3) Experimental data. We propose to follow a complementary approach for prediction. 
The proposed research aims at: “Integrating new advanced physico-chemical models and computational methods, based on a multidisciplinary approach developed together with physicists, chemists, and applied mathematicians, to create a top-notch multiphysics and multiscale numerical platform for simulations of planetary atmosphere entries, crucial to the new challenges of the manned space exploration program. Experimental data will also be used for validation, following state-of-the-art uncertainty quantification methods.”
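The contrast drawn above — blanket safety factors versus uncertainty-quantified prediction — can be illustrated with a toy Monte Carlo calculation. The surrogate model and all input distributions below are hypothetical placeholders (a real analysis would propagate uncertainty through a coupled multiphysics simulation), but they show how a quantified high percentile of the predicted heat flux can be far less conservative than a fixed safety factor:

```python
import random
import statistics

def predicted_heat_flux(radiation, catalysis, transition):
    # Toy surrogate: the three "mission killer" contributions are taken
    # as simply additive, purely for illustration.
    return radiation + catalysis + transition

random.seed(42)
samples = []
for _ in range(10_000):
    # Hypothetical uncertain inputs (MW/m^2); not real entry conditions.
    radiation = random.gauss(1.0, 0.2)
    catalysis = random.gauss(0.5, 0.15)
    transition = random.gauss(0.3, 0.1)
    samples.append(predicted_heat_flux(radiation, catalysis, transition))

nominal = predicted_heat_flux(1.0, 0.5, 0.3)
p99 = statistics.quantiles(samples, n=100)[98]  # ~99th percentile

print(f"nominal prediction:        {nominal:.2f} MW/m^2")
print(f"UQ 99th percentile:        {p99:.2f} MW/m^2")
print(f"blanket safety factor x2:  {2 * nominal:.2f} MW/m^2")
```

In this sketch the uncertainty-quantified 99th percentile sits well below twice the nominal value, i.e. a validated model with quantified uncertainty permits a thinner heat shield, freeing mass for payload.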
Max ERC Funding
1 494 892 €
Duration
Start date: 2010-09-01, End date: 2015-08-31
Project acronym AFRIVAL
Project African river basins: catchment-scale carbon fluxes and transformations
Researcher (PI) Steven Bouillon
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE10, ERC-2009-StG
Summary This proposal aims to fundamentally improve our understanding of the role of tropical freshwater ecosystems in carbon (C) cycling on the catchment scale. It uses an unprecedented combination of state-of-the-art proxies, such as stable isotope, 14C, and biomarker signatures to characterize organic matter and radiogenic isotope signatures to determine particle residence times, as well as field measurements of relevant biogeochemical processes. We focus on tropical systems since there is a striking lack of data on such systems, even though riverine C transport is thought to be disproportionately high in tropical areas. Furthermore, the presence of landscape-scale contrasts in vegetation (in particular, C3 vs. C4 plants) is an important asset for the use of stable isotopes as natural tracers of C cycling processes on this scale. Freshwater ecosystems are an important component of the global C cycle, and the primary link between terrestrial and marine ecosystems. Recent estimates indicate that ~2 Pg C yr-1 (Pg = petagram) enter freshwater systems, i.e., about twice the estimated global terrestrial C sink. More than half of this is thought to be remineralized before it reaches the coastal zone, and for the Amazon basin this has even been suggested to be ~90% of the lateral C inputs. The question of how general these patterns are is a matter of debate, and assessing the mechanisms determining the degree of processing versus transport of organic carbon in lakes and river systems is critical to further constrain their role in the global C cycle. This proposal provides an interdisciplinary approach to describe and quantify catchment-scale C transport and cycling in tropical river basins. 
Besides conceptual and methodological advances, and a significant expansion of our dataset on C processes in such systems, new data gathered in this project are likely to provide exciting and novel hypotheses on the functioning of freshwater systems and their linkage to the terrestrial C budget.
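The budget figures quoted in the summary imply a simple partitioning that can be made explicit. A back-of-the-envelope sketch using only the numbers stated in the text (~2 Pg C yr-1 of lateral input; >50% remineralized globally, ~90% suggested for the Amazon basin):

```python
def coastal_export(lateral_input_pg, remineralized_fraction):
    """Carbon reaching the coastal zone (Pg C yr^-1) after
    in-transit remineralization in lakes and rivers."""
    return lateral_input_pg * (1.0 - remineralized_fraction)

lateral_input = 2.0  # Pg C yr^-1, global estimate quoted in the proposal

for label, f in [("global (>50% remineralized)", 0.5),
                 ("Amazon-like (~90% remineralized)", 0.9)]:
    exported = coastal_export(lateral_input, f)
    print(f"{label}: {exported:.2f} Pg C yr^-1 exported to the coast")
```

Under these figures, at most ~1 Pg C yr-1 reaches the coastal zone globally, and an Amazon-like remineralization fraction would cut the export to ~0.2 Pg C yr-1 per 2 Pg of input, which is why constraining the processing-versus-transport balance matters for the global C budget.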
Max ERC Funding
1 745 262 €
Duration
Start date: 2009-10-01, End date: 2014-09-30