Project acronym ACCOPT
Project ACelerated COnvex OPTimization
Researcher (PI) Yurii NESTEROV
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE1, ERC-2017-ADG
Summary The rapid progress in computer technologies and telecommunications presents many new challenges for Optimization Theory. New problems are usually very large in size, very special in structure, and possibly have distributed data support. This makes them unsolvable by standard optimization methods. In these situations, the old theoretical models, based on hidden Black-Box information, cannot work. New theoretical and algorithmic solutions are urgently needed. In this project we will concentrate on the development of fast optimization methods for problems of large and very large size. All the new methods will be endowed with provable efficiency guarantees for large classes of optimization problems arising in practical applications. Our main tool is the acceleration technique developed for the standard Black-Box methods as applied to smooth convex functions. However, we will have to adapt it to deal with different situations.
The first line of development will be based on the smoothing technique as applied to non-smooth functions. We propose to substantially extend this approach to generate approximate solutions in relative scale. The second line of research will be related to applying acceleration techniques to second-order methods minimizing functions with sparse Hessians. Finally, we aim to develop fast gradient methods for huge-scale problems. The size of these problems is so large that even the usual vector operations are extremely expensive. Thus, we propose to develop new methods with sublinear iteration costs. In our approach, the main source of improvement will be the proper use of problem structure.
Our overall aim is to be able to solve in a routine way many important problems which currently look unsolvable. Moreover, the theoretical development of Convex Optimization will reach a state where there is no gap between theory and practice: the theoretically most efficient methods will definitely outperform any home-grown heuristics.
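The acceleration technique for smooth convex functions mentioned in this summary is Nesterov's accelerated gradient method. A minimal sketch on a toy quadratic (an illustration of the classical method only, not of the new methods the project proposes; the test problem, step size and iteration count are assumptions):

```python
import numpy as np

def nesterov_agd(grad, x0, L, iters=500):
    """Nesterov's accelerated gradient method for an L-smooth convex function.

    grad -- gradient oracle of the objective
    L    -- Lipschitz constant of the gradient (sets the step size 1/L)
    """
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L                           # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # momentum schedule
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # extrapolation (momentum) step
        x, t = x_next, t_next
    return x

# Toy problem: minimize f(x) = 0.5 x^T A x - b^T x; the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
L = float(np.linalg.eigvalsh(A).max())  # smoothness constant = largest eigenvalue of A
x_star = nesterov_agd(lambda x: A @ x - b, np.zeros(2), L, iters=2000)
```

The method achieves the optimal O(1/k^2) worst-case rate for this Black-Box problem class, versus O(1/k) for plain gradient descent.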
Max ERC Funding
2 090 038 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ActiveWindFarms
Project Active Wind Farms: Optimization and Control of Atmospheric Energy Extraction in Gigawatt Wind Farms
Researcher (PI) Johan Meyers
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE8, ERC-2012-StG_20111012
Summary With the recognition that wind energy will become an important contributor to the world’s energy portfolio, several wind farms with a capacity of over 1 gigawatt are in the planning phase. In the past, the engineering of wind farms focused on a bottom-up approach, in which atmospheric wind availability was considered to be fixed by climate and weather. However, farms of gigawatt size slow down the Atmospheric Boundary Layer (ABL) as a whole, reducing the availability of wind at turbine hub height. In Denmark’s large off-shore farms, this leads to turbine underperformance that can reach levels of 40%–50% compared to the same turbine in a lone-standing case. For large wind farms, the vertical structure and turbulence physics of the flow in the ABL become crucial ingredients in their design and operation. This introduces a new set of scientific challenges related to the design and control of large wind farms. The major ambition of the present research proposal is to employ optimal control techniques to control the interaction between large wind farms and the ABL, and to optimize overall farm-power extraction. Individual turbines are used as flow actuators by dynamically pitching their blades on time scales ranging from 10 to 500 seconds. The application of such control efforts to the atmospheric boundary layer has never been attempted before, and introduces flow control on a physical scale that is currently unprecedented. The PI possesses a unique combination of expertise and tools enabling these developments: efficient parallel large-eddy simulations of wind farms, multi-scale turbine modeling, and gradient-based optimization in large optimization-parameter spaces using adjoint formulations. To ensure maximum impact on the wind-engineering field, the project aims at optimal control, experimental wind-tunnel validation, and the inclusion of multi-disciplinary aspects related to structural mechanics, power quality, and controller design.
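The adjoint formulations mentioned above deliver the gradient of an objective with respect to many control parameters at the cost of a single extra (adjoint) solve. A minimal sketch on a hypothetical steady-state constraint A(p)u = f (the matrices and objective are illustrative toys, not the project's large-eddy simulation model), verified against a finite-difference check:

```python
import numpy as np

# Toy steady-state constraint A(p) u = f, with A(p) = A0 + p*I (hypothetical values).
A0 = np.array([[4.0, 1.0], [1.0, 3.0]])
f = np.array([1.0, 2.0])
u_target = np.array([0.2, 0.5])

def objective(p):
    """J(p) = 0.5 * ||u(p) - u_target||^2, with u(p) solving A(p) u = f."""
    u = np.linalg.solve(A0 + p * np.eye(2), f)
    return 0.5 * np.sum((u - u_target) ** 2)

def adjoint_gradient(p):
    """dJ/dp via one forward solve and one adjoint solve."""
    A = A0 + p * np.eye(2)
    u = np.linalg.solve(A, f)
    lam = np.linalg.solve(A.T, u - u_target)  # adjoint equation: A^T lam = dJ/du
    return -lam @ u                           # dJ/dp = -lam^T (dA/dp) u, with dA/dp = I

p = 0.3
g_adj = adjoint_gradient(p)
eps = 1e-6
g_fd = (objective(p + eps) - objective(p - eps)) / (2 * eps)  # finite-difference check
```

The same structure scales to thousands of parameters: the adjoint solve cost is independent of the parameter count, which is what makes gradient-based optimization of full wind-farm controls tractable.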
Max ERC Funding
1 499 241 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym AEROSPACEPHYS
Project Multiphysics models and simulations for reacting and plasma flows applied to the space exploration program
Researcher (PI) Thierry Edouard Bertrand Magin
Host Institution (HI) INSTITUT VON KARMAN DE DYNAMIQUE DES FLUIDES
Call Details Starting Grant (StG), PE8, ERC-2010-StG_20091028
Summary Space exploration is one of the boldest and most exciting endeavors that humanity has undertaken, and it holds enormous promise for the future. Our next challenges in the conquest of space include bringing samples back to Earth by means of robotic missions and continuing the manned exploration program, which aims at sending human beings to Mars and bringing them home safely. Inaccurate prediction of the heat flux to the surface of the spacecraft heat shield can be fatal for the crew or for the success of a robotic mission. This quantity is estimated during the design phase. An accurate prediction is a particularly complex task, requiring the modelling of the following phenomena, which are potential “mission killers”: 1) radiation of the plasma in the shock layer, 2) complex surface chemistry on the thermal protection material, 3) flow transition from laminar to turbulent. Our poor understanding of the coupled mechanisms of radiation, ablation, and transition leads to difficulties in flux prediction. To avoid failure and ensure the safety of the astronauts and payload, engineers resort to “safety factors” to determine the thickness of the heat shield, at the expense of the mass of embarked payload. Thinking out of the box and basic research are thus necessary to advance the models that will better define the environment and requirements for the design and safe operation of tomorrow’s space vehicles and planetary probes for manned space exploration. The three basic ingredients for predictive science are: 1) physico-chemical models, 2) computational methods, 3) experimental data. We propose to follow a complementary approach for prediction.
The proposed research aims at: “Integrating new advanced physico-chemical models and computational methods, based on a multidisciplinary approach developed together with physicists, chemists, and applied mathematicians, to create a top-notch multiphysics and multiscale numerical platform for simulations of planetary atmosphere entries, crucial to the new challenges of the manned space exploration program. Experimental data will also be used for validation, following state-of-the-art uncertainty quantification methods.”
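To give a sense of the heat-flux estimates and safety factors discussed above, here is a minimal sketch using the classical Sutton-Graves engineering correlation for stagnation-point convective heating in Earth air. The entry condition and the safety factor of 1.5 are illustrative assumptions, not the project's models, which target the far more detailed coupled radiation/ablation/transition physics:

```python
import math

# Sutton-Graves correlation for stagnation-point convective heat flux in Earth air:
#   q = k * sqrt(rho / R_n) * V**3   [W/m^2]
# with k ~ 1.7415e-4 in SI units (commonly quoted constant for air).
K_AIR = 1.7415e-4

def stagnation_heat_flux(rho, R_n, V):
    """rho: freestream density [kg/m^3]; R_n: nose radius [m]; V: velocity [m/s]."""
    return K_AIR * math.sqrt(rho / R_n) * V ** 3

# Illustrative (hypothetical) entry condition, with a design margin on the prediction.
q = stagnation_heat_flux(rho=3e-4, R_n=0.5, V=11000.0)  # convective flux, W/m^2
q_design = 1.5 * q  # hypothetical "safety factor" of 1.5 used to size the heat shield
```

Correlations like this only capture the convective part; the margin between q and q_design is exactly the conservatism that better coupled models aim to reduce.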
Max ERC Funding
1 494 892 €
Duration
Start date: 2010-09-01, End date: 2015-08-31
Project acronym ALUFIX
Project Friction stir processing based local damage mitigation and healing in aluminium alloys
Researcher (PI) Aude SIMAR
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), PE8, ERC-2016-STG
Summary ALUFIX proposes an original strategy for the development of aluminium-based materials involving damage mitigation and extrinsic self-healing concepts, exploiting the new opportunities of the solid-state friction stir process. Friction stir processing locally extrudes and drags material from the front to the back and around the tool pin. It involves short durations at moderate temperatures (typically 80% of the melting temperature), fast cooling rates and large plastic deformations, leading to far-from-equilibrium microstructures. The idea is that commercial aluminium alloys can be locally improved and healed in regions of stress concentration where damage is likely to occur. Self-healing in metal-based materials is still in its infancy, and existing strategies can hardly be extended to applications. Friction stir processing can enhance the damage and fatigue resistance of aluminium alloys by microstructure homogenisation and refinement. In parallel, friction stir processing can be used to integrate secondary phases in an aluminium matrix. In the ALUFIX project, healing phases will thus be integrated in aluminium in addition to refining and homogenising the microstructure. The “local stress management strategy” favours crack closure and crack deviation at the sub-millimetre scale thanks to a controlled residual stress field. The “transient liquid healing agent” strategy involves the in-situ generation of an out-of-equilibrium compositionally graded microstructure at the aluminium/healing-agent interface, capable of liquid-phase healing after a thermal treatment. Along the way, a variety of new scientific questions concerning the damage mechanisms will have to be addressed.
Max ERC Funding
1 497 447 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym BRIDGE
Project Biomimetic process design for tissue regeneration: from bench to bedside via in silico modelling
Researcher (PI) Liesbet Geris
Host Institution (HI) UNIVERSITE DE LIEGE
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary "Tissue engineering (TE), the interdisciplinary field combining biomedical and engineering sciences in the search for functional man-made organ replacements, has key issues with the quantity and quality of the generated products. Protocols followed in the lab are mainly trial-and-error based, requiring a large number of manual interventions and lacking clear early time-point quality criteria to guide the process. As a result, these processes are very hard to scale up to industrial production levels. BRIDGE aims to fortify the engineering aspects of the TE field by adding a higher level of understanding and control to the manufacturing process (MP) through the use of in silico models. BRIDGE will focus on the bone TE field to provide proof of concept for its in silico approach.
The combination of the applicant's well-received published and ongoing work on a wide range of modelling tools in the bone field with the state-of-the-art experimental techniques present in the TE lab of the additional participant allows envisaging the following innovations and impact:
1. proof-of-concept of the use of an in silico blue-print for the design and control of a robust modular TE MP;
2. model-derived optimised culture conditions for patient derived cell populations increasing modular robustness of in vitro chondrogenesis/endochondral ossification;
3. in silico identification of a limited set of in vitro biomarkers that is predictive of the in vivo outcome;
4. model-derived optimised culture conditions increasing quantity and quality of the in vivo outcome of the TE MP;
5. incorporation of congenital defects in the in silico MP design, constituting a further validation of BRIDGE’s in silico approach and a necessary step towards personalised medical care.
We believe that the systematic – and unprecedented – integration of (bone) TE and mathematical modelling, as proposed in BRIDGE, is required to come to a rationalized, engineering approach to design and control bone TE MPs."
Max ERC Funding
1 191 440 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym CAPS
Project Capillary suspensions: a novel route for versatile, cost efficient and environmentally friendly material design
Researcher (PI) Erin Crystal Koos
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE8, ERC-2013-StG
Summary A wide variety of materials, including coatings and adhesives, emerging materials for nanotechnology products, as well as everyday food products, are processed or delivered as suspensions. The flow properties of such suspensions must be finely adjusted according to the demands of the respective processing techniques; even the feel of cosmetics and the perception of food products are highly influenced by their rheological properties. The recently developed capillary suspensions concept has the potential to revolutionize product formulations and material design. When a small amount (less than 1%) of a second immiscible liquid is added to the continuous phase of a suspension, the rheological properties of the mixture are dramatically altered from a fluid-like to a gel-like state, or from a weak to a strong gel, and the strength can be tuned over a wide range covering orders of magnitude. Capillary suspensions can be used to create smart, tunable fluids, stabilize mixtures that would otherwise phase separate, and significantly reduce the amount of organic or polymeric additives, and the strong particle network can be used as a precursor for the manufacturing of cost-efficient porous ceramics and foams with unprecedented properties.
This project will investigate the factors determining capillary suspension formation, the strength of these admixtures as a function of those factors, and how capillary suspensions respond to external forces. Only such a fundamental understanding of network formation in capillary suspensions, on both the micro- and macroscopic scale, will allow for the design of sophisticated new materials. The main objectives of this proposal are to quantify and predict the strength of these admixtures and then use this information to design a variety of new materials in very different application areas including, e.g., porous materials, water-based coatings, ultra-low-fat foods, and conductive films.
Max ERC Funding
1 489 618 €
Duration
Start date: 2013-08-01, End date: 2018-07-31
Project acronym CO2LIFE
Project BIOMIMETIC FIXATION OF CO2 AS SOURCE OF SALTS AND GLUCOSE
Researcher (PI) Patricia LUIS ALCONERO
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), PE8, ERC-2017-STG
Summary The continued increase in the atmospheric concentration of CO2 due to anthropogenic emissions is leading to significant changes in climate, with industry accounting for one-third of all the energy used globally and for almost 40% of worldwide CO2 emissions. Fast action is required to decrease the concentration of this greenhouse gas in the atmosphere, which has currently reached 400 ppm. Among the technological possibilities on the table to reduce CO2 emissions, carbon capture and storage in geological deposits is one of the main strategies being applied. However, the final objective of this strategy is to remove CO2 without considering the enormous potential of this molecule as a source of carbon for the production of valuable compounds. Nature has developed an effective and equilibrated mechanism to concentrate CO2 and fix the inorganic carbon into organic material (e.g., glucose) by means of enzymatic action. Mimicking Nature and taking advantage of millions of years of evolution should be considered a basic starting point in the development of smart and highly effective processes. In addition, the use of amino-acid salts for CO2 capture is envisaged as a potential approach to recover CO2 in the form of (bi)carbonates.
The project CO2LIFE has the overall objective of developing a chemical process that converts carbon dioxide into valuable molecules using membrane technology. The strategy followed in this project is two-fold: i) a CO2 membrane-based absorption-crystallization process based on amino-acid salts, and ii) CO2 conversion into glucose or salts using enzymes as catalysts supported on or retained by membranes. The final product, i.e. (bi)carbonates or glucose, is of great interest to the (bio)chemical industry; thus, new CO2 emissions are avoided and the carbon cycle is closed. This project will provide a technological solution at industrial scale for the removal and reutilization of CO2.
Max ERC Funding
1 302 710 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym COCOON
Project Conformal coating of nanoporous materials
Researcher (PI) Christophe Detavernier
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), PE8, ERC-2009-StG
Summary CONTEXT - Nanoporous structures are used in applications such as catalysis, molecular separation, fuel cells, and dye-sensitized solar cells. Given the near-molecular size of the porous network, it is extremely challenging to modify the interior surface of the pores after the nanoporous material has been synthesized.
THIS PROPOSAL - Atomic Layer Deposition (ALD) is envisioned as a novel technique for creating catalytically active sites and for controlling the pore size distribution in nanoporous materials. ALD is a self-limited growth method characterized by alternating exposures of the growing film to precursor vapours, resulting in the sequential deposition of (sub)monolayers. It provides atomic-level control of thickness and composition, and is currently used in micro-electronics to grow films into structures with aspect ratios of up to 100:1. We aim to make the fundamental breakthroughs necessary to enable atomic layer deposition to engineer the composition, size and shape of the interior surface of nanoporous materials with aspect ratios in excess of 10,000:1.
POTENTIAL IMPACT - Achieving these objectives will enable atomic-level engineering of the interior surface of any porous material. We plan to focus on three specific applications where our results will have both medium- and long-term impact:
- Engineering the composition of pore walls using ALD, e.g. to create catalytic sites (e.g. Al for acid sites, Ti for redox sites, or Pt, Pd or Ni)
- chemical functionalization of the pore walls with atomic level control can result in breakthrough applications in the fields of catalysis and sensors.
- Atomic level control of the size of nanopores through ALD controlling the pore size distribution of molecular sieves can potentially lead to breakthrough applications in molecular separation and filtration.
- Nanocasting: replication of a mesoporous template by means of ALD can result in the mass-scale production of nanotubes.
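The self-limited growth that makes ALD suitable for such extreme aspect ratios can be illustrated with a toy saturation model: per-cycle growth saturates with precursor exposure, so film thickness is set by the number of cycles rather than by exposure time. The rate constant and growth-per-cycle values below are illustrative assumptions, not measured ALD parameters.

```python
import math

def ald_thickness(cycles, exposure, k=5.0, gpc_max=0.1):
    """Toy model of self-limited ALD growth.

    Each cycle deposits at most gpc_max nm (growth per cycle);
    surface coverage saturates with precursor exposure following a
    Langmuir-type law 1 - exp(-k * exposure). All numbers are
    illustrative, not measured ALD parameters.
    """
    growth_per_cycle = gpc_max * (1.0 - math.exp(-k * exposure))
    return cycles * growth_per_cycle

# Once the exposure is long enough to saturate the surface, doubling it
# barely changes the thickness -- the hallmark of self-limited growth;
# thickness instead scales linearly with the number of cycles.
t_saturated = ald_thickness(100, exposure=2.0)
t_double = ald_thickness(100, exposure=4.0)
```

In a real ALD process the saturation behaviour inside a deep pore is further limited by precursor transport, which is precisely what makes the very high aspect ratios targeted here challenging.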
Max ERC Funding
1 432 800 €
Duration
Start date: 2010-01-01, End date: 2014-12-31
Project acronym COSMOS
Project Semiparametric Inference for Complex and Structural Models in Survival Analysis
Researcher (PI) Ingrid VAN KEILEGOM
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE1, ERC-2015-AdG
Summary In survival analysis investigators are interested in modelling and analysing the time until an event happens. It often happens that the available data are right censored, which means that only a lower bound of the time of interest is observed. This feature substantially complicates the statistical analysis of this kind of data. The aim of this project is to solve a number of open problems related to time-to-event data, whose solutions would represent a major step forward in the area of survival analysis.
The project has three objectives:
[1] Cure models take into account that a certain fraction of the subjects under study will never experience the event of interest. Because of the complex nature of these models, many problems are still open and rigorous theory is rather scarce in this area. Our goal is to fill this gap, which will be a challenging but important task.
[2] Copulas are nowadays widespread in many areas of statistics. However, they can contribute more substantially to resolving a number of the outstanding issues in survival analysis, such as quantile regression and dependent censoring. Finding answers to these open questions would open up new horizons for a wide variety of problems.
[3] We wish to develop new methods for drawing correct inference in some of the common models in survival analysis in the presence of endogeneity or measurement errors. The present methodology has serious shortcomings, and we would like to propose, develop and validate new methods that would be a major breakthrough if successful.
The above objectives will be achieved by using mostly semiparametric models. The development of mathematical properties under these models is often a challenging task, as complex tools from the theory of empirical processes and semiparametric efficiency are required. The project will therefore require an innovative combination of highly complex mathematical skills and cutting-edge results from modern theory for semiparametric models.
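As a concrete illustration of how right-censored observations enter an estimator, here is a minimal, self-contained sketch of the classical Kaplan-Meier survival curve (standard textbook material, not a method proposed by this project); the data are invented for illustration.

```python
def kaplan_meier(times, events):
    """Kaplan-Meier estimate of the survival function S(t).

    times  -- observed times (event time, or censoring time)
    events -- 1 if the event was observed, 0 if right-censored
    Returns a list of (time, S(t)) pairs at the observed event times.
    """
    # Sort observations by time; censored subjects leave the risk set
    # without contributing an event.
    order = sorted(range(len(times)), key=lambda i: times[i])
    at_risk = len(times)
    surv = 1.0
    curve = []
    i = 0
    while i < len(order):
        t = times[order[i]]
        deaths = 0
        n_at_t = 0
        # Group tied observations at the same time point.
        while i < len(order) and times[order[i]] == t:
            deaths += events[order[i]]
            n_at_t += 1
            i += 1
        if deaths:
            surv *= 1.0 - deaths / at_risk
            curve.append((t, surv))
        at_risk -= n_at_t
    return curve

# Illustrative data: events[i] == 0 marks a right-censored subject,
# for whom only a lower bound on the event time is known.
times = [2, 3, 3, 5, 8]
events = [1, 1, 0, 1, 0]
print(kaplan_meier(times, events))
```

The censored subjects shrink the risk set without triggering a drop in the curve, which is exactly the asymmetry between events and lower bounds described in the summary above.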
Max ERC Funding
2 318 750 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym CRAMIS
Project Critical phenomena in random matrix theory and integrable systems
Researcher (PI) Tom Claeys
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), PE1, ERC-2012-StG_20111012
Summary The main goal of the project is to create a research group on critical phenomena in random matrix theory and integrable systems at the Université Catholique de Louvain, where the PI was recently appointed.
Random matrix ensembles, integrable partial differential equations and Toeplitz determinants will be the main research topics in the project. These three models show intimate connections and they all share certain properties that are, to a large extent, universal. In the recent past it has been shown that Painlevé equations play an important and universal role in the description of critical behaviour in each of these areas. In random matrix theory, they describe the local correlations between eigenvalues in appropriate double scaling limits; for integrable partial differential equations such as the Korteweg-de Vries equation and the nonlinear Schrödinger equation, they arise near points of gradient catastrophe in the small dispersion limit; for Toeplitz determinants they describe phase transitions for underlying models in statistical physics.
The aim of the project is to study new types of critical behaviour and to obtain a better understanding of the remarkable similarities between random matrices on one hand and integrable partial differential equations on the other hand. The focus will be on asymptotic questions, and one of the tools we plan to use is the Deift/Zhou steepest descent method to obtain asymptotics for Riemann-Hilbert problems. Although many of the problems in this project have their origin or motivation in mathematical physics, the proposed techniques are mostly based on complex and classical analysis.
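The local eigenvalue correlations mentioned above can be explored numerically. The sketch below samples Gaussian Unitary Ensemble (GUE) matrices and collects bulk nearest-neighbour spacings, which exhibit the level repulsion captured by the Wigner surmise; it is a rough illustration (no spectral unfolding is performed), not part of the project's methodology.

```python
import numpy as np

def gue_spacings(n=200, trials=50, seed=0):
    """Sample bulk nearest-neighbour eigenvalue spacings of GUE matrices.

    A GUE matrix is H = (A + A*) / 2 with A having i.i.d. standard
    complex Gaussian entries. Spacings are taken from the middle half of
    the spectrum and normalised to unit mean, so they can be compared
    with the Wigner surmise p(s) = (32/pi^2) s^2 exp(-4 s^2 / pi).
    """
    rng = np.random.default_rng(seed)
    spacings = []
    for _ in range(trials):
        a = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
        h = (a + a.conj().T) / 2
        eig = np.linalg.eigvalsh(h)           # sorted ascending
        mid = eig[n // 4 : 3 * n // 4]        # bulk of the spectrum
        spacings.extend(np.diff(mid))
    s = np.array(spacings)
    return s / s.mean()                       # unit-mean normalisation

s = gue_spacings()
# Level repulsion: very small spacings are rare. For a Poisson process
# with unit-mean spacings, about 10% would fall below 0.1; for GUE the
# s^2 factor in the Wigner surmise suppresses them almost entirely.
print((s < 0.1).mean())
```

A careful comparison would unfold the spectrum first; the crude bulk restriction used here is enough to make the quadratic level repulsion visible.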
Max ERC Funding
1 130 400 €
Duration
Start date: 2012-08-01, End date: 2017-07-31
Project acronym FHiCuNCAG
Project Foundations for Higher and Curved Noncommutative Algebraic Geometry
Researcher (PI) Wendy Joy Lowen
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Consolidator Grant (CoG), PE1, ERC-2018-COG
Summary With this research programme, inspired by open problems within noncommutative algebraic geometry (NCAG) as well as by actual developments in algebraic topology, it is our aim to lay out new foundations for NCAG. On the one hand, the categorical approach to geometry put forth in NCAG has seen a wide range of applications both in mathematics and in theoretical physics. On the other hand, algebraic topology has received a vast impetus from the development of higher topos theory by Lurie and others. The current project is aimed at cross-fertilisation between the two subjects, in particular through the development of “higher linear topos theory”. We will approach the higher structure on Hochschild type complexes from two angles. Firstly, focusing on intrinsic incarnations of spaces as large categories, we will use the tensor products developed jointly with Ramos González and Shoikhet to obtain a “large version” of the Deligne conjecture. Secondly, focusing on concrete representations, we will develop new operadic techniques in order to endow complexes like the Gerstenhaber-Schack complex for prestacks (due to Dinh Van-Lowen) and the deformation complexes for monoidal categories and pasting diagrams (due to Shrestha and Yetter) with new combinatorial structure. In another direction, we will move from Hochschild cohomology of abelian categories (in the sense of Lowen-Van den Bergh) to Mac Lane cohomology for exact categories (in the sense of Kaledin-Lowen), extending the scope of NCAG to “non-linear deformations”. One of the mysteries in algebraic deformation theory is the curvature problem: in the process of deformation we are brought to the boundaries of NCAG territory through the introduction of a curvature component which disables the standard approaches to cohomology. Eventually, it is our goal to set up a new framework for NCAG which incorporates curved objects, drawing inspiration from the realm of higher categories.
Max ERC Funding
1 171 360 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym HEXTREME
Project Hexahedral mesh generation in real time
Researcher (PI) Jean-François REMACLE
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE8, ERC-2015-AdG
Summary Over one million finite element analyses are performed in engineering offices every day, and finite elements come with the price of mesh generation. This proposal aims at creating two breakthroughs in the art of mesh generation that will be directly beneficial to the finite element community at large. The first challenge of HEXTREME is to take advantage of the massively multi-threaded nature of modern computers and to parallelize all aspects of the mesh generation process at a fine-grain level. Reducing the meshing time by more than one order of magnitude is an ambitious objective: if minutes can become seconds, then success in this research would radically change the way in which engineers deal with mesh generation. This project then proposes an innovative approach to overcoming the major difficulty associated with mesh generation: it aims at providing a fast and reliable solution to the problem of conforming hexahedral mesh generation. Quadrilateral meshes in 2D and hexahedral meshes in 3D are usually considered to be superior to triangular/tetrahedral meshes. Even though direct tetrahedral meshing techniques have reached a level of robustness that allows us to treat general 3D domains, there may never exist a direct algorithm for building unstructured hex-meshes in general 3D domains. In HEXTREME, an indirect approach is envisaged that relies on recent developments in various domains of applied mathematics and computer science such as graph theory, combinatorial optimization and computational geometry. The methodology that is proposed for hex meshing is finally extended to the difficult problem of boundary layer meshing. Mesh generation is one important step of the engineering analysis process. Yet, a mesh is a tool and not an aim. A specific task of the project is dedicated to the interaction with research partners that are committed to beta-test the results of HEXTREME.
All the results of HEXTREME will be provided as open source in Gmsh.
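For readers unfamiliar with hexahedral meshes, the trivial structured case can be written in a few lines: nodes on a regular grid and eight-node connectivity per cell. This sketch only illustrates the data structure; the open problem HEXTREME targets is unstructured hex meshing of general 3D domains, for which no such direct construction exists.

```python
def box_hex_mesh(nx, ny, nz):
    """Structured hexahedral mesh of the unit cube.

    Returns (nodes, hexes): nodes is a list of (x, y, z) grid points,
    hexes is a list of 8-tuples of node indices in the usual
    bottom-face / top-face ordering. This is the trivial structured
    case only.
    """
    def idx(i, j, k):
        # Lexicographic node numbering on the (nx+1) x (ny+1) x (nz+1) grid.
        return (k * (ny + 1) + j) * (nx + 1) + i

    nodes = [(i / nx, j / ny, k / nz)
             for k in range(nz + 1)
             for j in range(ny + 1)
             for i in range(nx + 1)]
    hexes = []
    for k in range(nz):
        for j in range(ny):
            for i in range(nx):
                hexes.append((idx(i, j, k), idx(i + 1, j, k),
                              idx(i + 1, j + 1, k), idx(i, j + 1, k),
                              idx(i, j, k + 1), idx(i + 1, j, k + 1),
                              idx(i + 1, j + 1, k + 1), idx(i, j + 1, k + 1)))
    return nodes, hexes

nodes, hexes = box_hex_mesh(2, 2, 2)
print(len(nodes), len(hexes))   # 27 nodes, 8 hexahedra
```

The contrast between this ten-line construction and the unsolved general case is exactly why an indirect, combinatorial approach is proposed in the summary above.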
Max ERC Funding
2 244 238 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym HHNCDMIR
Project Hochschild cohomology, non-commutative deformations and mirror symmetry
Researcher (PI) Wendy Lowen
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Starting Grant (StG), PE1, ERC-2010-StG_20091028
Summary "Our research programme addresses several interesting current issues in non-commutative algebraic geometry, and important links with symplectic geometry and algebraic topology. Non-commutative algebraic geometry is concerned with the study of algebraic objects in geometric ways. One of the basic philosophies is that, in analogy with (derived) categories of (quasi-)coherent sheaves over schemes and (derived) module categories, non-commutative spaces can be represented by suitable abelian or triangulated categories. This point of view has proven extremely useful in non-commutative algebra, algebraic geometry and more recently in string theory thanks to the Homological Mirror Symmetry conjecture. One of our main aims is to set up a deformation framework for non-commutative spaces represented by ""enhanced"" triangulated categories, encompassing both the non-commutative schemes represented by derived abelian categories and the derived-affine spaces, represented by dg algebras. This framework should clarify and resolve some of the important problems known to exist in the deformation theory of derived-affine spaces. It should moreover be applicable to Fukaya-type categories, and yield a new way of proving and interpreting instances of ""deformed mirror symmetry"". This theory will be developed in interaction with concrete applications of the abelian deformation theory developed in our earlier work, and with the development of new decomposition and comparison techniques for Hochschild cohomology. By understanding the links between the different theories and fields of application, we aim to achieve an interdisciplinary understanding of non-commutative spaces using abelian and triangulated structures."
Max ERC Funding
703 080 €
Duration
Start date: 2010-10-01, End date: 2016-09-30
Project acronym i-CaD
Project Innovative Catalyst Design for Large-Scale, Sustainable Processes
Researcher (PI) Joris Wilfried Maria Cornelius Thybaut
Host Institution (HI) UNIVERSITEIT GENT
Call Details Consolidator Grant (CoG), PE8, ERC-2013-CoG
Summary A systematic and novel, multi-scale model based catalyst design methodology will be developed. The fundamental nature of the models used is unprecedented and will represent a breakthrough compared to the more commonly applied statistical, correlative relationships. The methodology will focus on the intrinsic kinetics of (potentially) large-scale processes for the conversion of renewable feeds into fuels and chemicals. Non-ideal behaviour, caused by mass and heat transfer limitations or particular reactor hydrodynamics, will be explicitly accounted for when simulating or optimizing industrial-scale applications. The selected model reactions are situated in the area of biomass upgrading to fuels and chemicals: fast pyrolysis oil stabilization, glycerol hydrogenolysis and selective oxidation of (bio)ethanol to acetaldehyde.
For the first time, a systematic microkinetic modelling methodology will be developed for oxygenates conversion. In particular, stereochemistry in catalysis will be assessed. Two types of descriptors will be quantified: kinetic descriptors that are catalyst independent, and catalyst descriptors that specifically account for the effect of the catalyst properties on the reaction kinetics. The latter will be optimized in terms of reactant conversion, product yield or selectivity. Fundamental relationships will be established between the catalyst descriptors, as determined by microkinetic modelling, and independently measured catalyst properties or synthesis parameters. These innovative relationships make it possible to provide the desired rational feedback from optimal descriptor values towards synthesis parameters for a new catalyst generation. Their fundamental character will guarantee adequate extrapolative properties that can be exploited for the identification of a groundbreaking next catalyst generation.
Max ERC Funding
1 999 877 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym INSITE
Project Development and use of an integrated in silico-in vitro mesofluidics system for tissue engineering
Researcher (PI) Liesbet Laura J GERIS
Host Institution (HI) UNIVERSITE DE LIEGE
Call Details Consolidator Grant (CoG), PE8, ERC-2017-COG
Summary Tissue Engineering (TE) refers to the branch of medicine that aims to replace or regenerate functional tissue or organs using man-made living implants. As the field is moving towards more complex TE constructs with sophisticated functionalities, there is a lack of dedicated in vitro devices that allow testing the response of the complex construct as a whole, prior to implantation. Additionally, the knowledge accumulated from mechanistic and empirical in vitro and in vivo studies is often underused in the development of novel constructs due to a lack of integration of all the data in a single, in silico, platform.
The INSITE project aims to address both challenges by developing a new mesofluidics set-up for in vitro testing of TE constructs and by developing dedicated multiscale and multiphysics models that aggregate the available data and use these to design complex constructs and proper mesofluidics settings for in vitro testing. The combination of these in silico and in vitro approaches will lead to an integrated knowledge-rich mesofluidics system that provides an in vivo-like time-varying in vitro environment. The system will emulate the in vivo environment present at the (early) stages of bone regeneration including the vascularization process and the innate immune response. A proof of concept will be delivered for complex TE constructs for large bone defects and infected fractures.
To realize this project, the applicant can draw on her well-published track record and extensive network in the fields of in silico medicine and skeletal TE. If successful, INSITE will generate a shift from in vivo to in vitro work and hence a transformation of the classical R&D pipeline. Using this system will allow for a maximum of relevant in vitro research prior to the in vivo phase, which is highly needed in academia and industry with the increasing ethical (3R), financial and regulatory constraints.
Max ERC Funding
2 161 750 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym INTERDIFFUSION
Project Unraveling Interdiffusion Effects at Material Interfaces -- Learning from Tensors of Microstructure Evolution Simulations
Researcher (PI) Nele Marie Moelans
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE8, ERC-2016-STG
Summary Multi-materials, combining various materials with different functionalities, are increasingly desired in engineering applications. Reliable material assembly is a great challenge in the development of innovative technologies. The interdiffusion microstructures formed at material interfaces are critical for the performance of the product. However, as more and more elements are involved, their complexity increases and their variety becomes immense. Furthermore, interdiffusion microstructures evolve during processing and in use of the device. Experimental testing of the long-term evolution in assembled devices is extremely time-consuming. The current level of materials models and simulation techniques does not allow in silico (or computer aided) design of multi-component material assemblies, since the parameter space is much too large.
With this project, I aim at a breakthrough in computational materials science, using tensor decomposition techniques emerging in data analysis to efficiently guide high-throughput interdiffusion microstructure simulation studies. The measurable outcomes aimed at are:
1) a high-performance computing software that allows to compute the effect of a huge number of material and process parameters, sufficiently large for reliable in-silico design of multi-materials, on the interdiffusion microstructure evolution, based on a tractable number of simulations, and
2) decomposed tensor descriptions for important multi-material systems enabling reliable computation of interdiffusion microstructure characteristics using a single computer.
If successful, the outcomes of this project will significantly accelerate the design of innovative multi-materials. My expertise in microstructure simulations and multi-component materials, and my access to collaborations with top experts in tensor decomposition techniques and materials characterization, are crucial to reaching this ambitious aim.
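The core idea, replacing an exhaustive parameter sweep by a low-rank decomposition of the simulation-outcome tensor, can be illustrated with a minimal sketch. This is not the project's software: the synthetic rank-2 tensor, the chosen rank, and the use of a truncated SVD of one matricization (the simplest relative of CP/Tucker decompositions) are all illustrative assumptions.

```python
import numpy as np

# Synthetic "simulation outcome" tensor indexed by
# (parameter A, parameter B, time); rank-2 by construction.
rng = np.random.default_rng(0)
a1, b1, t1 = rng.random(20), rng.random(15), rng.random(30)
a2, b2, t2 = rng.random(20), rng.random(15), rng.random(30)
T = np.einsum('i,j,k->ijk', a1, b1, t1) + np.einsum('i,j,k->ijk', a2, b2, t2)

# Mode-1 matricization followed by a truncated SVD: a minimal
# low-rank surrogate; genuine tensor methods (CP/Tucker) generalise this.
M = T.reshape(T.shape[0], -1)          # unfold along parameter A
U, s, Vt = np.linalg.svd(M, full_matrices=False)
r = 2                                   # retained rank
M_hat = (U[:, :r] * s[:r]) @ Vt[:r]     # rank-r reconstruction

rel_err = np.linalg.norm(M - M_hat) / np.linalg.norm(M)
print(f"relative reconstruction error at rank {r}: {rel_err:.2e}")
```

Because the underlying tensor is exactly rank 2, the rank-2 surrogate reproduces it to machine precision: a compact decomposed description stands in for the full data set, which is the effect the project seeks at much larger scale.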
Max ERC Funding
1 496 875 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym INVPROB
Project Inverse Problems
Researcher (PI) Lassi Juhani Päivärinta
Host Institution (HI) TALLINNA TEHNIKAULIKOOL
Call Details Advanced Grant (AdG), PE1, ERC-2010-AdG_20100224
Summary Inverse problems constitute an interdisciplinary field of science concentrating on the mathematical theory and practical interpretation of indirect measurements. Their applications include medical imaging, atmospheric remote sensing, industrial process monitoring, and astronomical imaging. The common feature is extreme sensitivity to measurement noise. Computerized tomography, MRI, and exploration of the interior of the Earth using earthquake data are typical inverse problems where mathematics has played an important role. By using the methods of inverse problems it is possible to bring modern mathematics to a vast number of applied fields. Genuine scientific innovations found in mathematical research, say in geometry, stochastics, or analysis, can be brought to real-life applications through modelling. The solutions are often found by combining recent theoretical and computational advances. The study of inverse problems is one of the most active and fastest-growing areas of modern applied mathematics, and arguably the most interdisciplinary field of mathematics, or even of science in general.
The exciting but high-risk problems in the research plan of the PI include the mathematics of invisibility cloaking, invisible patterns, practical algorithms for imaging, and random quantum systems. Progress on these problems could have a considerable impact in applications such as the construction of metamaterials for invisible optic fibre cables, scopes for MRI devices, and early screening for breast cancer. Progress here necessitates international collaboration, which will be realized in upcoming programs on inverse problems: the PI is involved in organizing semester programs on inverse problems at MSRI in 2010, the Isaac Newton Institute in 2011, and the Mittag-Leffler Institute in 2012.
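The defining difficulty named above, extreme sensitivity to measurement noise, and the standard remedy, regularization, can be seen in a minimal deconvolution sketch. This is illustrative only: the Gaussian blur operator, noise level, and Tikhonov penalty weight are assumptions, not part of the proposal.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50
# Ill-conditioned forward operator: a row-normalized Gaussian blur.
x = np.arange(n)
A = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 2.0) ** 2)
A /= A.sum(axis=1, keepdims=True)

f_true = np.zeros(n)
f_true[20:30] = 1.0                          # "true" interior profile
g = A @ f_true + 1e-3 * rng.standard_normal(n)  # noisy indirect data

# Naive inversion amplifies the noise enormously...
f_naive = np.linalg.solve(A, g)
# ...while Tikhonov regularization trades a little bias for stability.
lam = 1e-2
f_tik = np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ g)

err_naive = np.linalg.norm(f_naive - f_true)
err_tik = np.linalg.norm(f_tik - f_true)
```

Even a 0.1% noise level ruins the naive reconstruction, while the regularized estimate stays close to the true profile; choosing the penalty well, and replacing it by problem-adapted structure, is where the mathematics enters.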
Max ERC Funding
1 800 000 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym MADPII
Project Multiscale Analysis and Design for Process Intensification and Innovation
Researcher (PI) Guy B.M.M. Marin
Host Institution (HI) UNIVERSITEIT GENT
Call Details Advanced Grant (AdG), PE8, ERC-2011-ADG_20110209
Summary The current pressures on the major industrial players have necessitated a more urgent push for increased productivity, process efficiency, and waste reduction, i.e. process intensification. Future sizable improvements in these entrenched industrial processes will require completely novel production technologies, fundamental analysis and modeling methods, or a combination of both. This proposal approaches this challenge by using multiscale modeling and experimentation on three fronts: (1) detailed analysis of industrial processes to generate new fundamental chemical understanding, (2) multiscale modeling and evaluation of high-volume chemical processes built on that fundamental understanding, and (3) demonstration of the practical applicability of the multiscale approach and its use to critically examine novel technologies in the context of industrial processes. The novel-technology portion of this proposal will focus on a class known as rotating bed reactors in a static geometry (RBR-SG). We will investigate three processes that could benefit from RBR-SG technology: (1) fast pyrolysis of biomass, (2) gasification of biomass, and (3) short-contact-time catalytic partial oxidation of light hydrocarbons. Experimental reactor and kinetic work and validated computational fluid dynamics (CFD) modeling of the processes mentioned above will be used. We will construct two RBR-SG units; heat transfer, adsorption, and pyrolysis gas/solid experiments will be performed in one, while non-reacting flow tests with other phase combinations will be performed in the other. Detailed kinetic models will provide novel insights into the reaction dynamics and impact other research and technologies. The combination of kinetic and CFD models will clearly demonstrate the benefits of a multiscale approach, will definitively identify the process(es) benefitting most from RBR-SG technology, and will enable a first design of the RBR-SG based on our results.
Max ERC Funding
2 494 700 €
Duration
Start date: 2012-05-01, End date: 2017-04-30
Project acronym MAtrix
Project In silico and in vitro Models of Angiogenesis: unravelling the role of the extracellular matrix
Researcher (PI) Hans Pol S Van Oosterwyck
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE8, ERC-2012-StG_20111012
Summary Angiogenesis, the formation of new blood vessels from the existing vasculature, is a process that is fundamental to normal tissue growth, wound repair and disease. The control of angiogenesis is of utmost importance for tissue regenerative therapies as well as cancer treatment; however, it remains a challenge. The extracellular matrix (ECM) is one of the key controlling factors of angiogenesis, yet the mechanisms through which the ECM exerts its influence are poorly understood. MAtrix will create unprecedented opportunities for unraveling the role of the ECM in angiogenesis. It will do so by creating a highly innovative, multiscale in silico model that provides quantitative, subcellular resolution of cell-matrix interaction, which is key to the understanding of cell migration. In this way, MAtrix goes substantially beyond the state of the art in computational models of angiogenesis. It will integrate mechanisms of ECM-mediated cell migration and relate them to intracellular regulatory mechanisms of angiogenesis.
Apart from its innovation in terms of computational modelling, MAtrix’ impact is related to its interdisciplinarity, involving computer simulations and in vitro experiments. This will make it possible to investigate research hypotheses on the role of the ECM in angiogenesis that are generated by the in silico model. State-of-the-art technologies (fluorescence microscopy, cell and ECM mechanics, biomaterials design) will be applied, in conjunction with the in silico model, to quantify cell-ECM mechanical interaction at a subcellular level and the dynamics of cell migration. In vitro experiments will be performed for a broad range of biomaterials and their characteristics. In this way, MAtrix will deliver a proof of concept that an in silico model can help in identifying and prioritising biomaterials characteristics relevant for angiogenesis. MAtrix’ findings can have a major impact on the development of therapies that aim to control the angiogenic response.
Max ERC Funding
1 497 400 €
Duration
Start date: 2013-04-01, End date: 2018-03-31
Project acronym MAZEST
Project M- and Z-estimation in semiparametric statistics : applications in various fields
Researcher (PI) Ingrid Van Keilegom
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), PE1, ERC-2007-StG
Summary The area of semiparametric statistics is, in comparison to the areas of fully parametric or nonparametric statistics, relatively unexplored and still in full development. Semiparametric models offer a valid alternative to purely parametric ones, which are known to be sensitive to incorrect model specification, and to completely nonparametric models, which often suffer from lack of precision and power. A drawback of semiparametric models so far is, however, that establishing mathematical properties under these models is often much harder than under the other two types of models. The present project addresses this difficulty in part by presenting and applying a general method to prove the asymptotic properties of estimators for a wide spectrum of semiparametric models. The objectives of this project are twofold. On the one hand, we will apply a general theory developed by Chen, Linton and Van Keilegom (2003) for a class of semiparametric Z-estimation problems to a number of novel research ideas coming from a broad range of areas in statistics. On the other hand, we will show that some estimation problems are not covered by this theory; for these we consider a more general class of semiparametric estimators (so-called M-estimators) and develop a general theory for this class of estimators. This theory will open new horizons for a wide variety of problems in semiparametric statistics. The project requires highly complex mathematical skills and cutting-edge results from modern empirical process theory.
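The Z-estimation idea, defining an estimator as the root of an estimating equation rather than as the maximizer of a likelihood, can be sketched with a robust location estimator. This is illustrative only: the Huber score function, its tuning constant, the contaminated sample, and the bisection solver are assumptions, not material from the proposal.

```python
import numpy as np

rng = np.random.default_rng(2)
# 200 well-behaved observations plus two gross outliers.
x = np.concatenate([rng.normal(0.0, 1.0, 200), [50.0, 60.0]])

def huber_psi(u, c=1.345):
    """Huber score: linear near zero, clipped beyond +/- c."""
    return np.clip(u, -c, c)

# Z-estimation: find theta solving sum_i psi(x_i - theta) = 0.
# The sum is monotone decreasing in theta, so bisection converges.
lo, hi = x.min(), x.max()
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if huber_psi(x - mid).sum() > 0:
        lo = mid           # theta too small: move right
    else:
        hi = mid           # theta too large: move left
theta_hat = 0.5 * (lo + hi)
```

Because the score is bounded, the two outliers shift `theta_hat` only slightly, whereas the sample mean is dragged far from the true center; replacing the score by one whose root has no closed form is exactly the setting where the asymptotic theory mentioned above is needed.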
Max ERC Funding
750 000 €
Duration
Start date: 2008-07-01, End date: 2014-06-30