Project acronym A-DATADRIVE-B
Project Advanced Data-Driven Black-box modelling
Researcher (PI) Johan Adelia K Suykens
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE7, ERC-2011-ADG_20110209
Summary Making accurate predictions is a crucial factor in many systems (such as in modelling energy consumption, power load forecasting, traffic networks, process industry, environmental modelling, biomedicine, brain-machine interfaces) for cost savings, efficiency, health, safety and organizational purposes. In this proposal we aim at realizing a new generation of more advanced black-box modelling techniques for estimating predictive models from measured data. We will study different optimization modelling frameworks in order to obtain improved black-box modelling approaches. This will be done by specifying models through constrained optimization problems, studying different candidate core models (parametric models, support vector machines and kernel methods) together with additional sets of constraints and regularization mechanisms. Different candidate mathematical frameworks will be considered, with models that possess primal and (Lagrange) dual model representations, functional analysis in reproducing kernel Hilbert spaces, operator splitting and optimization in Banach spaces. Several aspects that are relevant to black-box models will be studied, including the incorporation of prior knowledge, structured dynamical systems, tensorial data representations, interpretability and sparsity, and general-purpose optimization algorithms. The methods should be suitable for handling larger data sets and high-dimensional input spaces. The final goal is also to realize a next-generation software tool (including symbolic generation of models and handling different supervised and unsupervised learning tasks, static and dynamic systems) that can be generically applied to data from different application areas. The proposal A-DATADRIVE-B aims at getting end-users connected to these more advanced methods through a user-friendly data-driven black-box modelling tool. The methods and tool will be tested in connection with several real-life applications.
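As an informal illustration of the primal and (Lagrange) dual model representations mentioned above, a least-squares support-vector-machine regression problem can be sketched as follows; the notation (w, b, e_i, gamma, phi, K, alpha_i) is generic and stands in for the broader class of core models to be studied, not for the project's specific formulations:

```latex
% Sketch of a primal-dual kernel model (LS-SVM-style regression); illustrative only.
\begin{align}
  \min_{w,\,b,\,e}\ \ & \tfrac{1}{2}\,w^{\top}w \;+\; \tfrac{\gamma}{2}\sum_{i=1}^{N} e_i^{2}
  \quad\text{s.t.}\quad y_i = w^{\top}\varphi(x_i) + b + e_i,\ \ i=1,\dots,N
  && \text{(primal)}\\
  \hat{y}(x) \;=\; & \sum_{i=1}^{N}\alpha_i\,K(x,x_i) + b,
  \qquad K(x,x_i)=\varphi(x)^{\top}\varphi(x_i)
  && \text{(dual)}
\end{align}
```

Additional constraints and regularization mechanisms of the kind listed in the summary would enter such a template as extra terms in the objective or extra constraints, with the dual kernel expansion following from the Lagrangian.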
Max ERC Funding
2 485 800 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym ABACUS
Project Advancing Behavioral and Cognitive Understanding of Speech
Researcher (PI) Bart De Boer
Host Institution (HI) VRIJE UNIVERSITEIT BRUSSEL
Call Details Starting Grant (StG), SH4, ERC-2011-StG_20101124
Summary I intend to investigate what cognitive mechanisms give us combinatorial speech. Combinatorial speech is the ability to make new words using pre-existing speech sounds. Humans are the only apes that can do this, yet we do not know how our brains do it, nor how exactly we differ from other apes. Using new experimental techniques to study human behavior and new computational techniques to model human cognition, I will find out how we deal with combinatorial speech.
The experimental part will study individual and cultural learning. Experimental cultural learning is a new technique that simulates cultural evolution in the laboratory. Two types of cultural learning will be used: iterated learning, which simulates language transfer across generations, and social coordination, which simulates emergence of norms in a language community. Using the two types of cultural learning together with individual learning experiments will help to zero in, from three angles, on how humans deal with combinatorial speech. In addition it will make a methodological contribution by comparing the strengths and weaknesses of the three methods.
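As a rough illustration of the iterated-learning component referred to above, the toy simulation below passes a small meaning-signal lexicon through a chain of learners, each of whom observes only a limited sample (a "bottleneck") of the previous generation's productions; the meaning space, sound inventory and bottleneck size are invented for the example and do not reflect the project's actual experimental design:

```python
"""Toy iterated-learning chain (illustrative sketch only)."""
import random

MEANINGS = range(8)       # hypothetical meaning space
SOUNDS = "ptkaiu"         # hypothetical inventory of speech sounds
BOTTLENECK = 5            # meaning/signal pairs each learner gets to observe

def random_signal(length=3):
    return "".join(random.choice(SOUNDS) for _ in range(length))

def learn(observations):
    """Memorise observed meaning/signal pairs; invent signals for unseen meanings."""
    lexicon = dict(observations)
    for m in MEANINGS:
        lexicon.setdefault(m, random_signal())
    return lexicon

def transmit(lexicon):
    """Pass a limited sample of the current language to the next learner."""
    sample = random.sample(list(MEANINGS), BOTTLENECK)
    return [(m, lexicon[m]) for m in sample]

# Seed the chain with a random proto-language and iterate across generations.
language = {m: random_signal() for m in MEANINGS}
for generation in range(10):
    language = learn(transmit(language))
print(language)
```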
The computer modeling part will formalize hypotheses about how our brains deal with combinatorial speech. Two models will be built: a high-level model that will establish the basic algorithms with which combinatorial speech is learned and reproduced, and a neural model that will establish in more detail how the algorithms are implemented in the brain. In addition, the models, through increasing understanding of how humans deal with speech, will help bridge the performance gap between human and computer speech recognition.
The project will advance science in four ways: it will provide insight into how our unique ability for using combinatorial speech works, it will tell us how this is implemented in the brain, it will extend the novel methodology of experimental cultural learning and it will create new computer models for dealing with human speech.
Max ERC Funding
1 276 620 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ACCOPT
Project ACelerated COnvex OPTimization
Researcher (PI) Yurii NESTEROV
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE1, ERC-2017-ADG
Summary The amazing rate of progress in computer technologies and telecommunications presents many new challenges for Optimization Theory. New problems are usually very big in size, very special in structure and possibly have distributed data support. This makes them unsolvable by standard optimization methods. In these situations, old theoretical models, based on hidden Black-Box information, cannot work. New theoretical and algorithmic solutions are urgently needed. In this project we will concentrate on the development of fast optimization methods for problems of big and very big size. All the new methods will be endowed with provable efficiency guarantees for large classes of optimization problems arising in practical applications. Our main tool is the acceleration technique developed for the standard Black-Box methods as applied to smooth convex functions. However, we will have to adapt it to deal with different situations.
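For reference, the classical accelerated gradient scheme for a smooth convex function f with an L-Lipschitz gradient can be written as follows; this is the standard textbook acceleration technique referred to above, not the new methods to be developed in the project:

```latex
% Accelerated gradient method for smooth convex f (standard scheme, for reference).
\begin{align}
  x_{k+1} &= y_k - \tfrac{1}{L}\,\nabla f(y_k),\\
  t_{k+1} &= \tfrac{1 + \sqrt{1 + 4\,t_k^{2}}}{2}, \qquad t_1 = 1,\\
  y_{k+1} &= x_{k+1} + \tfrac{t_k - 1}{t_{k+1}}\,\big(x_{k+1} - x_k\big).
\end{align}
```

This scheme attains f(x_k) - f* = O(1/k^2), compared with O(1/k) for plain gradient descent, which is the kind of provable efficiency guarantee the project seeks to extend to non-smooth, second-order and huge-scale settings.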
The first line of development will be based on the smoothing technique as applied to non-smooth functions. We propose to substantially extend this approach to generate approximate solutions in relative scale. The second line of research will be related to applying acceleration techniques to second-order methods minimizing functions with sparse Hessians. Finally, we aim to develop fast gradient methods for huge-scale problems. The size of these problems is so big that even the usual vector operations are extremely expensive. Thus, we propose to develop new methods with sublinear iteration costs. In our approach, the main source of improvement will be the proper use of problem structure.
Our overall aim is to be able to solve, in a routine way, many important problems which currently look unsolvable. Moreover, the theoretical development of Convex Optimization will reach the state where there is no gap between theory and practice: the theoretically most efficient methods will definitely outperform any home-grown heuristics.
Max ERC Funding
2 090 038 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym AcTafactors
Project AcTafactors: Tumor Necrosis Factor-based immuno-cytokines with superior therapeutic indexes
Researcher (PI) Jan Honoré L Tavernier
Host Institution (HI) VIB
Call Details Proof of Concept (PoC), ERC-2015-PoC, ERC-2015-PoC
Summary Tumor Necrosis Factor (TNF) is a homotrimeric pro-inflammatory cytokine that was originally discovered based on its extraordinary antitumor activity. However, its shock-inducing properties, causing hypotension, leukopenia and multiple organ failure, prevented its systemic use in cancer treatment. With this proof-of-concept study we want to evaluate a novel class of cell-targeted TNFs with strongly reduced systemic toxicities (AcTafactors). In these engineered immuno-cytokines, single-chain TNFs that harbor mutations to reduce the affinity for their receptor(s) are fused to a cell-specific targeting domain. Whilst almost no biological activity is observed on non-targeted cells, thus preventing systemic toxicity, avidity effects at the targeted cell membrane lead to recovery of over 90% of the TNF signaling activity. In this project we propose a lead optimization program to further improve the lead AcTafactors identified in the context of the ERC Advanced Grant project and to evaluate the resulting molecules for their ability to target the tumor (neo)vasculature in clinically relevant murine tumor models. The pre-clinical proof-of-concept we aim for represents a first step towards clinical development and ultimately potential market approval of an effective AcTafactor anti-cancer therapy.
Max ERC Funding
149 320 €
Duration
Start date: 2015-11-01, End date: 2017-04-30
Project acronym ActiveWindFarms
Project Active Wind Farms: Optimization and Control of Atmospheric Energy Extraction in Gigawatt Wind Farms
Researcher (PI) Johan Meyers
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE8, ERC-2012-StG_20111012
Summary With the recognition that wind energy will become an important contributor to the world’s energy portfolio, several wind farms with a capacity of over 1 gigawatt are in the planning phase. In the past, engineering of wind farms focused on a bottom-up approach, in which atmospheric wind availability was considered to be fixed by climate and weather. However, farms of gigawatt size slow down the Atmospheric Boundary Layer (ABL) as a whole, reducing the availability of wind at turbine hub height. In Denmark’s large off-shore farms, this leads to underperformance of turbines that can reach levels of 40%–50% compared to the same turbine operating in isolation. For large wind farms, the vertical structure and turbulence physics of the flow in the ABL become crucial ingredients in their design and operation. This introduces a new set of scientific challenges related to the design and control of large wind farms. The major ambition of the present research proposal is to employ optimal control techniques to control the interaction between large wind farms and the ABL, and optimize overall farm-power extraction. Individual turbines are used as flow actuators by dynamically pitching their blades on time scales ranging from 10 to 500 seconds. The application of such control efforts on the atmospheric boundary layer has never been attempted before, and introduces flow control on a physical scale which is currently unprecedented. The PI possesses a unique combination of expertise and tools enabling these developments: efficient parallel large-eddy simulations of wind farms, multi-scale turbine modeling, and gradient-based optimization in large optimization-parameter spaces using adjoint formulations. To ensure a maximum impact on the wind-engineering field, the project aims at optimal control, experimental wind-tunnel validation, and the inclusion of multi-disciplinary aspects related to structural mechanics, power quality, and controller design.
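Schematically, the optimal control problem described above amounts to maximizing time-integrated farm power over the turbine controls subject to the flow equations; the notation below (theta for the blade-pitch controls, u for the ABL flow state, and N(u, theta) = 0 for the filtered Navier-Stokes equations of the large-eddy simulation) is generic and only sketches the general form of such a formulation:

```latex
% Generic PDE-constrained optimal control formulation (schematic, not the project's exact model).
\begin{align}
  \max_{\theta(t)}\ \ J(\theta) \;=\; \int_{0}^{T} P_{\mathrm{farm}}\big(u(\theta),\theta\big)\,\mathrm{d}t
  \qquad \text{s.t.}\qquad \mathcal{N}\big(u(\theta),\theta\big) = 0 .
\end{align}
```

In an adjoint formulation of this kind, the gradient of J with respect to theta is obtained from one additional (adjoint) simulation per iteration, essentially independently of the number of control parameters, which is what makes gradient-based optimization feasible in the very large parameter spaces mentioned above.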
Max ERC Funding
1 499 241 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym AD-VIP
Project Alzheimer’s disease and AAV9: Use of a virus-based delivery system for vectored immunoprophylaxis in dementia.
Researcher (PI) MATTHEW GUY HOLT
Host Institution (HI) VIB
Call Details Proof of Concept (PoC), PC1, ERC-2015-PoC
Summary Alzheimer’s disease (AD) is the most common form of dementia in the Western World, representing an economic and social cost of billions of euros a year. Given the changing demographics of society, these costs will only increase over the coming decades.
Amyloid plaques, composed of amyloid beta peptide (Abeta), are a defining characteristic of AD. Evidence now suggests that Abeta is central to disease pathogenesis due to its toxicity, which leads to cell loss and eventual cognitive decline. Abeta is generated by proteolytic cleavage of amyloid precursor protein, a process that involves the protein BACE1.
Knock-down of BACE1 is sufficient to prevent amyloid pathology and cognitive deficits in transgenic mouse models of AD, so BACE1 is an attractive target for therapeutic intervention. Although many small molecule inhibitors of BACE1 have been developed, many have problems with imperfect selectivity, posing a substantial risk for off-target toxicity in vivo. In contrast, antibody-based therapeutics provide an attractive alternative given their excellent molecular selectivity. However, the success of antibody therapies in AD is limited by the blood brain barrier, which limits antibody entry into the brain from the systemic circulation.
Recent studies have shown that adeno-associated virus serotype 9 (AAV9) effectively crosses the blood brain barrier. Here, we propose evaluating the use of AAV9 as a delivery system for a highly specific and potent inhibitory nanobody targeted against BACE1 as a treatment for AD.
Max ERC Funding
150 000 €
Duration
Start date: 2016-12-01, End date: 2018-05-31
Project acronym ADAPTEM
Project Adaptive transmission electron microscopy: development of a programmable phase plate
Researcher (PI) Johan VERBEECK
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Proof of Concept (PoC), ERC-2017-PoC
Summary Adaptive optics, the technology to dynamically program the phase of optical waves, has sparked an avalanche of scientific discoveries and innovations in light optics applications. Nowadays, the phase of optical waves can be dynamically programmed, enabling research on exotic optical beams and unprecedented control over the performance of optical instruments. Although electron waves carry many similarities to their optical counterparts, a generic programmable phase plate for electrons is still missing. This project aims at developing a prototype of a programmable electrostatic phase plate that allows the user to freely change the phase of electron waves, and at demonstrating it to potential licensees for further upscaling and introduction to the market. The target of this PoC project is the realization of a tunable, easy-to-use 5x5-pixel prototype that will demonstrate the potential of adaptive optics in electron microscopy. Its realization will be based on lithographic technology to allow for future upscaling. It is expected that such a phase plate can dramatically increase the information obtained at a given electron dose, limiting the detrimental effects of beam damage that currently hinder the use of electron microscopy in, e.g., life sciences. As such, it has the potential to disrupt the electron microscopy market with novel applications, while at the same time reducing cost and complexity and increasing the potential for fully automated instruments.
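To illustrate what a pixelated programmable phase plate does to a wave, the short sketch below imprints a 5x5 grid of programmable phase shifts on a plane wave and computes the resulting far-field pattern; the 5x5 grid matches the prototype described above, but the sampling, random phase values and plane-wave illumination are assumptions made purely for illustration:

```python
"""Toy model of a 5x5-pixel programmable phase plate acting on a plane wave (sketch only)."""
import numpy as np

N = 256                                             # samples across the aperture (assumed)
pixels = np.random.uniform(0, 2 * np.pi, (5, 5))    # programmable phase per pixel (arbitrary here)

# Upsample the 5x5 phase pattern onto the wave sampling grid.
phase = np.kron(pixels, np.ones((N // 5 + 1, N // 5 + 1)))[:N, :N]

psi_in = np.ones((N, N), dtype=complex)             # incoming plane wave
psi_out = psi_in * np.exp(1j * phase)               # exit wave after the phase plate

# Far-field (diffraction) intensity of the shaped beam.
far_field = np.abs(np.fft.fftshift(np.fft.fft2(psi_out))) ** 2
print(far_field.shape, far_field.max())
```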
Max ERC Funding
148 500 €
Duration
Start date: 2018-03-01, End date: 2019-08-31
Project acronym AEROSOL
Project Astrochemistry of old stars: direct probing of unique chemical laboratories
Researcher (PI) Leen Katrien Els Decin
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Consolidator Grant (CoG), PE9, ERC-2014-CoG
Summary The gas and dust in the interstellar medium (ISM) drive the chemical evolution of galaxies, the formation of stars and planets, and the synthesis of complex prebiotic molecules. The prime birth places for this interstellar material are the winds of evolved (super)giant stars. These winds are unique chemical laboratories, in which a large variety of gas and dust species radially expand away from the star.
Recent progress on the observations of these winds has been impressive thanks to Herschel and ALMA. The next challenge is to unravel the wealth of chemical information contained in these data. This is an ambitious task since (1) a plethora of physical and chemical processes interact in a complex way, (2) laboratory data to interpret these interactions are lacking, and (3) theoretical tools to analyse the data do not meet current needs.
To boost the knowledge of the physics and chemistry characterizing these winds, I propose a world-leading multi-disciplinary project combining (1) high-quality data, (2) novel theoretical wind models, and (3) targeted laboratory experiments. The aim is to pinpoint the dominant chemical pathways, unravel the transition from gas-phase to dust species, elucidate the role of clumps on the overall wind structure, and study the reciprocal effect between various dynamical and chemical phenomena.
Now is the right time for this ambitious project thanks to the availability of (1) high-quality multi-wavelength data, including ALMA and Herschel data of the PI, (2) supercomputers enabling a homogeneous analysis of the data using sophisticated theoretical wind models, and (3) novel laboratory equipment to measure the gas-phase reaction rates of key species.
This project will have far-reaching impact on (1) the field of evolved stars, (2) the understanding of the chemical lifecycle of the ISM, (3) chemical studies of dynamically more complex systems, such as exoplanets, protostars, supernovae etc., and (4) it will guide new instrument development.
Max ERC Funding
2 605 897 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym AEROSPACEPHYS
Project Multiphysics models and simulations for reacting and plasma flows applied to the space exploration program
Researcher (PI) Thierry Edouard Bertrand Magin
Host Institution (HI) INSTITUT VON KARMAN DE DYNAMIQUE DES FLUIDES
Call Details Starting Grant (StG), PE8, ERC-2010-StG_20091028
Summary Space exploration is one of the boldest and most exciting endeavors that humanity has undertaken, and it holds enormous promise for the future. Our next challenges in space exploration include bringing samples back to Earth by means of robotic missions and continuing the manned exploration program, which aims at sending human beings to Mars and bringing them home safely. Inaccurate prediction of the heat flux to the surface of the spacecraft heat shield can be fatal for the crew or the success of a robotic mission. This quantity is estimated during the design phase. An accurate prediction is a particularly complex task, as it requires modelling of the following phenomena, which are potential “mission killers”: 1) Radiation of the plasma in the shock layer, 2) Complex surface chemistry on the thermal protection material, 3) Flow transition from laminar to turbulent. Our poor understanding of the coupled mechanisms of radiation, ablation, and transition leads to difficulties in flux prediction. To avoid failure and ensure the safety of the astronauts and payload, engineers resort to “safety factors” to determine the thickness of the heat shield, at the expense of the mass of embarked payload. Thinking out of the box and basic research are thus necessary for advancement of the models that will better define the environment and requirements for the design and safe operation of tomorrow’s space vehicles and planetary probes for manned space exploration. The three basic ingredients for predictive science are: 1) Physico-chemical models, 2) Computational methods, 3) Experimental data. We propose to follow a complementary approach for prediction. The proposed research aims at “integrating new advanced physico-chemical models and computational methods, based on a multidisciplinary approach developed together with physicists, chemists, and applied mathematicians, to create a top-notch multiphysics and multiscale numerical platform for simulations of planetary atmosphere entries, crucial to the new challenges of the manned space exploration program. Experimental data will also be used for validation, following state-of-the-art uncertainty quantification methods.”
Max ERC Funding
1 494 892 €
Duration
Start date: 2010-09-01, End date: 2015-08-31
Project acronym AfricanWomen
Project Women in Africa
Researcher (PI) catherine GUIRKINGER
Host Institution (HI) UNIVERSITE DE NAMUR ASBL
Call Details Starting Grant (StG), SH1, ERC-2017-STG
Summary Rates of domestic violence and the relative risk of premature death for women are higher in sub-Saharan Africa than in any other region. Yet we know remarkably little about the economic forces, incentives and constraints that drive discrimination against women in this region, making it hard to identify policy levers to address the problem. This project will help fill this gap.
I will investigate gender discrimination from two complementary perspectives. First, through the lens of economic history, I will investigate the forces driving trends in women’s relative well-being since slavery. To quantify the evolution of well-being of sub-Saharan women relative to men, I will use three types of historical data: anthropometric indicators (relative height), vital statistics (to compute numbers of missing women), and outcomes of formal and informal family law disputes. I will then investigate how major economic developments and changes in family laws differentially affected women’s welfare across ethnic groups with different norms on women’s roles and rights.
Second, using intra-household economic models, I will provide new insights into domestic violence and gender bias in access to crucial resources in present-day Africa. I will develop a new household model that incorporates gender identity and endogenous outside options to explore the relationship between women’s empowerment and the use of violence. Using the notion of strategic delegation, I will propose a new rationale for the separation of budgets often observed in African households and generate predictions of how improvements in women’s outside options affect welfare. Finally, with first-hand data, I will investigate intra-household differences in nutrition and work effort in times of food shortage from the points of view of efficiency and equity. I will use activity trackers as an innovative means of collecting high-quality data on work effort and thus overcome data limitations restricting the existing literature.
Max ERC Funding
1 499 313 €
Duration
Start date: 2018-08-01, End date: 2023-07-31