Project acronym A-DATADRIVE-B
Project Advanced Data-Driven Black-box modelling
Researcher (PI) Johan Adelia K Suykens
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE7, ERC-2011-ADG_20110209
Summary Making accurate predictions is a crucial factor in many systems (such as modelling energy consumption, power load forecasting, traffic networks, the process industry, environmental modelling, biomedicine, and brain-machine interfaces) for cost savings, efficiency, health, safety and organizational purposes. In this proposal we aim to realize a new generation of more advanced black-box modelling techniques for estimating predictive models from measured data. We will study different optimization modelling frameworks in order to obtain improved black-box modelling approaches. This will be done by specifying models as constrained optimization problems, studying different candidate core models (parametric models, support vector machines and kernel methods) together with additional sets of constraints and regularization mechanisms. Different candidate mathematical frameworks will be considered, with models that possess primal and (Lagrange) dual model representations, functional analysis in reproducing kernel Hilbert spaces, operator splitting, and optimization in Banach spaces. Several aspects relevant to black-box models will be studied, including the incorporation of prior knowledge, structured dynamical systems, tensorial data representations, interpretability and sparsity, and general-purpose optimization algorithms. The methods should be suitable for handling larger data sets and high-dimensional input spaces. The final goal is also to realize a next-generation software tool (including symbolic generation of models and handling of different supervised and unsupervised learning tasks, static and dynamic systems) that can be generically applied to data from different application areas. The proposal A-DATADRIVE-B aims to connect end-users to these more advanced methods through a user-friendly data-driven black-box modelling tool. The methods and tool will be tested in connection with several real-life applications.
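As an illustration of the kernel-based core models with primal and Lagrange-dual representations mentioned above, the following is a minimal sketch of a least-squares support-vector-machine style regression model solved in its dual form; the toy data, RBF kernel bandwidth and regularization constant are illustrative choices, not project specifics.

```python
# Minimal sketch of a kernel model with a Lagrange-dual representation, in the spirit
# of the SVM / kernel-method core models mentioned in the abstract (here a least-squares
# SVM style formulation). Data and hyperparameters are illustrative.
import numpy as np

def rbf_kernel(A, B, sigma=0.5):
    # Gaussian (RBF) kernel matrix between the row vectors of A and B
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

# toy 1-D regression data
rng = np.random.default_rng(0)
X = np.linspace(0, 1, 50)[:, None]
y = np.sin(2 * np.pi * X[:, 0]) + 0.1 * rng.standard_normal(50)

gamma = 100.0                      # regularization constant (illustrative)
K = rbf_kernel(X, X)
n = len(y)

# Dual (Lagrange) linear system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(n) / gamma
rhs = np.concatenate(([0.0], y))
sol = np.linalg.solve(A, rhs)
b, alpha = sol[0], sol[1:]

# Dual model representation: f(x) = sum_i alpha_i * k(x, x_i) + b
X_test = np.linspace(0, 1, 200)[:, None]
y_hat = rbf_kernel(X_test, X) @ alpha + b
print(y_hat[:5])
```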
Max ERC Funding
2 485 800 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym ActiveWindFarms
Project Active Wind Farms: Optimization and Control of Atmospheric Energy Extraction in Gigawatt Wind Farms
Researcher (PI) Johan Meyers
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE8, ERC-2012-StG_20111012
Summary With the recognition that wind energy will become an important contributor to the world’s energy portfolio, several wind farms with a capacity of over 1 gigawatt are in the planning phase. In the past, engineering of wind farms focused on a bottom-up approach, in which atmospheric wind availability was considered to be fixed by climate and weather. However, farms of gigawatt size slow down the Atmospheric Boundary Layer (ABL) as a whole, reducing the availability of wind at turbine hub height. In Denmark’s large offshore farms, this leads to underperformance of turbines that can reach 40%–50% compared to the same turbine in a lone-standing case. For large wind farms, the vertical structure and turbulence physics of the flow in the ABL become crucial ingredients in their design and operation. This introduces a new set of scientific challenges related to the design and control of large wind farms. The major ambition of the present research proposal is to employ optimal control techniques to control the interaction between large wind farms and the ABL, and optimize overall farm-power extraction. Individual turbines are used as flow actuators by dynamically pitching their blades on time scales ranging from 10 to 500 seconds. The application of such control efforts to the atmospheric boundary layer has never been attempted before, and introduces flow control on a physical scale which is currently unprecedented. The PI possesses a unique combination of expertise and tools enabling these developments: efficient parallel large-eddy simulations of wind farms, multi-scale turbine modeling, and gradient-based optimization in large optimization-parameter spaces using adjoint formulations. To ensure a maximum impact on the wind-engineering field, the project aims at optimal control, experimental wind-tunnel validation, and at including multi-disciplinary aspects related to structural mechanics, power quality, and controller design.
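As a minimal illustration of the adjoint-based gradient evaluation underlying the proposed optimal control, the sketch below computes the gradient of a cost functional for a toy scalar linear system with a backward adjoint sweep and checks it against a finite difference; the system and cost are illustrative stand-ins, not a wind-farm or large-eddy model.

```python
# Toy version of gradient computation with an adjoint recursion, the technique used for
# gradient-based optimization in large parameter spaces. The scalar linear system and
# quadratic cost below are illustrative only.
import numpy as np

a, b, q, r, N = 0.9, 0.5, 1.0, 0.1, 20
x0 = 1.0
u = 0.1 * np.ones(N)                 # control sequence (optimization variables)

def simulate(u):
    x = np.empty(N + 1); x[0] = x0
    for k in range(N):
        x[k + 1] = a * x[k] + b * u[k]
    return x

def cost(u):
    x = simulate(u)
    return q * np.sum(x[:-1] ** 2) + r * np.sum(u ** 2) + q * x[-1] ** 2

def adjoint_gradient(u):
    x = simulate(u)
    lam = np.empty(N + 1)
    lam[N] = 2 * q * x[N]            # terminal condition of the adjoint
    for k in range(N - 1, -1, -1):   # backward (adjoint) sweep
        lam[k] = 2 * q * x[k] + a * lam[k + 1]
    return 2 * r * u + b * lam[1:]   # dJ/du_k for every control

g = adjoint_gradient(u)
# check one component against a finite difference
eps = 1e-6
u_p = u.copy(); u_p[3] += eps
print(g[3], (cost(u_p) - cost(u)) / eps)
```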
Max ERC Funding
1 499 241 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym AEROSOL
Project Astrochemistry of old stars: direct probing of unique chemical laboratories
Researcher (PI) Leen Katrien Els Decin
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Consolidator Grant (CoG), PE9, ERC-2014-CoG
Summary The gas and dust in the interstellar medium (ISM) drive the chemical evolution of galaxies, the formation of stars and planets, and the synthesis of complex prebiotic molecules. The prime birth places for this interstellar material are the winds of evolved (super)giant stars. These winds are unique chemical laboratories, in which a large variety of gas and dust species radially expand away from the star.
Recent progress on the observations of these winds has been impressive thanks to Herschel and ALMA. The next challenge is to unravel the wealth of chemical information contained in these data. This is an ambitious task since (1) a plethora of physical and chemical processes interact in a complex way, (2) laboratory data to interpret these interactions are lacking, and (3) theoretical tools to analyse the data do not meet current needs.
To boost the knowledge of the physics and chemistry characterizing these winds, I propose a world-leading multi-disciplinary project combining (1) high-quality data, (2) novel theoretical wind models, and (3) targeted laboratory experiments. The aim is to pinpoint the dominant chemical pathways, unravel the transition from gas-phase to dust species, elucidate the role of clumps on the overall wind structure, and study the reciprocal effect between various dynamical and chemical phenomena.
Now is the right time for this ambitious project thanks to the availability of (1) high-quality multi-wavelength data, including ALMA and Herschel data of the PI, (2) supercomputers enabling a homogeneous analysis of the data using sophisticated theoretical wind models, and (3) novel laboratory equipment to measure the gas-phase reaction rates of key species.
This project will have far-reaching impact on (1) the field of evolved stars, (2) the understanding of the chemical lifecycle of the ISM, (3) chemical studies of dynamically more complex systems, such as exoplanets, protostars, supernovae etc., and (4) it will guide new instrument development.
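As a minimal sketch of the simplest description of such a radially expanding wind, the snippet below evaluates the mass-conservation density profile rho(r) = Mdot / (4 pi r^2 v) for a smooth, constant-velocity spherical outflow; the mass-loss rate, wind speed and stellar radius are illustrative AGB-like values, not results of this project.

```python
# Density profile of a smooth, spherically symmetric, constant-velocity wind from mass
# conservation: rho(r) = Mdot / (4 * pi * r^2 * v). All numbers are illustrative.
import math

M_SUN = 1.989e30          # kg
YEAR = 3.156e7            # s
R_STAR = 2.0e11           # stellar radius (m), illustrative AGB-like value

mdot = 1e-5 * M_SUN / YEAR   # mass-loss rate (kg/s), illustrative
v = 10e3                     # wind velocity (m/s), illustrative

def density(r):
    """Gas density (kg/m^3) at radius r for a constant-velocity spherical wind."""
    return mdot / (4 * math.pi * r ** 2 * v)

for n in (2, 10, 100, 1000):       # radii in units of the stellar radius
    print(f"r = {n:5d} R_star -> rho = {density(n * R_STAR):.2e} kg/m^3")
```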
Max ERC Funding
2 605 897 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym AEROSPACEPHYS
Project Multiphysics models and simulations for reacting and plasma flows applied to the space exploration program
Researcher (PI) Thierry Edouard Bertrand Magin
Host Institution (HI) INSTITUT VON KARMAN DE DYNAMIQUE DES FLUIDES
Call Details Starting Grant (StG), PE8, ERC-2010-StG_20091028
Summary Space exploration is one of the boldest and most exciting endeavors that humanity has undertaken, and it holds enormous promise for the future. The next challenges in the conquest of space include bringing samples back to Earth by means of robotic missions and continuing the manned exploration program, which aims at sending human beings to Mars and bringing them home safely. Inaccurate prediction of the heat flux to the surface of the spacecraft heat shield can be fatal for the crew or for the success of a robotic mission. This quantity is estimated during the design phase. An accurate prediction is a particularly complex task, as it requires modelling the following phenomena, each a potential “mission killer”: 1) radiation of the plasma in the shock layer, 2) complex surface chemistry on the thermal protection material, 3) flow transition from laminar to turbulent. Our poor understanding of the coupled mechanisms of radiation, ablation, and transition leads to difficulties in heat-flux prediction. To avoid failure and ensure the safety of the astronauts and payload, engineers resort to “safety factors” when determining the thickness of the heat shield, at the expense of the mass of embarked payload. Thinking out of the box and basic research are thus necessary to advance the models that will better define the environment and requirements for the design and safe operation of tomorrow’s space vehicles and planetary probes for manned space exploration. The three basic ingredients for predictive science are: 1) physico-chemical models, 2) computational methods, 3) experimental data. We propose to follow a complementary approach for prediction. The proposed research aims at integrating new advanced physico-chemical models and computational methods, based on a multidisciplinary approach developed together with physicists, chemists, and applied mathematicians, to create a top-notch multiphysics and multiscale numerical platform for simulations of planetary atmosphere entries, crucial to the new challenges of the manned space exploration program. Experimental data will also be used for validation, following state-of-the-art uncertainty quantification methods.
Max ERC Funding
1 494 892 €
Duration
Start date: 2010-09-01, End date: 2015-08-31
Project acronym AFRIVAL
Project African river basins: catchment-scale carbon fluxes and transformations
Researcher (PI) Steven Bouillon
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE10, ERC-2009-StG
Summary This proposal aims to fundamentally improve our understanding of the role of tropical freshwater ecosystems in carbon (C) cycling on the catchment scale. It uses an unprecedented combination of state-of-the-art proxies such as stable isotope, 14C and biomarker signatures to characterize organic matter, radiogenic isotope signatures to determine particle residence times, as well as field measurements of relevant biogeochemical processes. We focus on tropical systems since there is a striking lack of data on such systems, even though riverine C transport is thought to be disproportionately high in tropical areas. Furthermore, the presence of landscape-scale contrasts in vegetation (in particular, C3 vs. C4 plants) is an important asset in the use of stable isotopes as natural tracers of C cycling processes on this scale. Freshwater ecosystems are an important component in the global C cycle, and the primary link between terrestrial and marine ecosystems. Recent estimates indicate that ~2 Pg C y-1 (Pg = petagram) enter freshwater systems, i.e., about twice the estimated global terrestrial C sink. More than half of this is thought to be remineralized before it reaches the coastal zone, and for the Amazon basin this has even been suggested to be ~90% of the lateral C inputs. How general these patterns are is a matter of debate, and assessing the mechanisms determining the degree of processing versus transport of organic carbon in lakes and river systems is critical to further constrain their role in the global C cycle. This proposal provides an interdisciplinary approach to describe and quantify catchment-scale C transport and cycling in tropical river basins. Besides conceptual and methodological advances, and a significant expansion of our dataset on C processes in such systems, new data gathered in this project are likely to provide exciting and novel hypotheses on the functioning of freshwater systems and their linkage to the terrestrial C budget.
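As a minimal sketch of how the C3/C4 vegetation contrast serves as a natural tracer, the snippet below applies a standard two-end-member stable-isotope mixing calculation; the delta-13C end-member values are typical literature figures used purely for illustration.

```python
# Two-end-member stable-isotope mixing: the fraction of organic carbon derived from C4
# vegetation, given delta-13C of the sample and of the two vegetation end-members.
# End-member values are typical illustrative figures, not project measurements.
def c4_fraction(delta_sample, delta_c3=-27.0, delta_c4=-12.0):
    """Fraction of organic carbon derived from C4 vegetation (all inputs in per mil)."""
    return (delta_sample - delta_c3) / (delta_c4 - delta_c3)

for d in (-25.0, -20.0, -15.0):
    print(f"d13C = {d:+.1f} per mil -> C4 fraction = {c4_fraction(d):.2f}")
```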
Max ERC Funding
1 745 262 €
Duration
Start date: 2009-10-01, End date: 2014-09-30
Project acronym ALUFIX
Project Friction stir processing based local damage mitigation and healing in aluminium alloys
Researcher (PI) Aude SIMAR
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), PE8, ERC-2016-STG
Summary ALUFIX proposes an original strategy for the development of aluminium-based materials involving damage mitigation and extrinsic self-healing concepts exploiting the new opportunities of the solid-state friction stir process. Friction stir processing locally extrudes and drags material from the front to the back and around the tool pin. It involves short durations at moderate temperatures (typically 80% of the melting temperature), fast cooling rates and large plastic deformations leading to far out-of-equilibrium microstructures. The idea is that commercial aluminium alloys can be locally improved and healed in regions of stress concentration where damage is likely to occur. Self-healing in metal-based materials is still in its infancy and existing strategies can hardly be extended to applications. Friction stir processing can enhance the damage and fatigue resistance of aluminium alloys by microstructure homogenisation and refinement. In parallel, friction stir processing can be used to integrate secondary phases in an aluminium matrix. In the ALUFIX project, healing phases will thus be integrated in aluminium in addition to refining and homogenising the microstructure. The “local stress management strategy” favours crack closure and crack deviation at the sub-millimetre scale thanks to a controlled residual stress field. The “transient liquid healing agent” strategy involves the in-situ generation of an out-of-equilibrium compositionally graded microstructure at the aluminium/healing agent interface capable of liquid-phase healing after a thermal treatment. Along the way, a variety of new scientific questions concerning the damage mechanisms will have to be addressed.
Max ERC Funding
1 497 447 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym ATTO
Project A new concept for ultra-high capacity wireless networks
Researcher (PI) Piet DEMEESTER
Host Institution (HI) UNIVERSITEIT GENT
Call Details Advanced Grant (AdG), PE7, ERC-2015-AdG
Summary The project will address the following key question:
How can we provide fibre-like connectivity to moving objects (robots, humans) with the following characteristics: a very high dedicated bitrate of 100 Gb/s per object, very low latency of <10 μs, very high reliability of 99.999%, and a very high density of more than one object per m², all at low power consumption?
Achieving this would be groundbreaking and it requires a completely new and high-risk approach: applying close proximity wireless communications using low interference ultra-small cells (called “ATTO-cells”) integrated in floors and connected to antennas on the (parallel) floor-facing surface of ground moving objects. This makes it possible to obtain very high densities with very good channel conditions. The technological challenges involved are groundbreaking in mobile networking (overall architecture, handover with extremely low latencies), wireless subsystems (60 GHz substrate integrated waveguide-based distributed antenna systems connected to RF transceivers integrated in floors, low crosstalk between ATTO-cells) and optical interconnect subsystems (simple non-blocking optical coherent remote selection of ATTO-cells, transparent low power 100 Gb/s coherent optical / RF transceiver interconnection using analogue equalization and symbol interleaving to support 4x4 MIMO). By providing this unique communication infrastructure in high density settings, the ATTO concept will not only support the highly demanding future 5G services (UHD streaming, cloud computing and storage, augmented and virtual reality, a range of IoT services, etc.), but also even more demanding services, that are challenging our imagination such as mobile robot swarms or brain computer interfaces with PFlops computing capabilities.
This new concept for ultra-high capacity wireless networks will open up many more opportunities in reconfigurable robot factories, intelligent hospitals, flexible offices, dense public spaces, etc.
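A back-of-the-envelope sketch of the aggregate capacity implied by the targets quoted above (100 Gb/s per object at a density of at least one object per m²); the floor area used here is an arbitrary illustrative value.

```python
# Aggregate area throughput implied by the stated ATTO targets; the floor area is an
# illustrative choice, not a figure from the proposal.
bitrate_per_object_gbps = 100      # dedicated rate per moving object (from the targets)
object_density_per_m2 = 1          # lower bound on object density (from the targets)
floor_area_m2 = 500                # illustrative factory-floor area

aggregate_tbps = bitrate_per_object_gbps * object_density_per_m2 * floor_area_m2 / 1000
print(f"Aggregate throughput over {floor_area_m2} m^2: {aggregate_tbps:.0f} Tb/s")
```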
Max ERC Funding
2 496 250 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym BIOTENSORS
Project Biomedical Data Fusion using Tensor based Blind Source Separation
Researcher (PI) Sabine Jeanne A Van Huffel
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Summary: the quest for a general functional tensor framework for blind source separation
Our overall objective is the development of a general functional framework for solving tensor based blind source separation (BSS) problems in biomedical data fusion, using tensor decompositions (TDs) as the basic core. We claim that TDs will allow the extraction of fairly complicated sources of biomedical activity from fairly complicated sets of uni- and multimodal data. The power of the new techniques will be demonstrated for three well-chosen representative biomedical applications for which extensive expertise and fully validated datasets are available in the PI’s team, namely:
• Metabolite quantification and brain tumour tissue typing using Magnetic Resonance Spectroscopic Imaging,
• Functional monitoring including seizure detection and polysomnography,
• Cognitive brain functioning and seizure zone localization using simultaneous Electroencephalography-functional MR Imaging integration.
Solving these challenging problems requires that algorithmic progress is made in several directions:
• Algorithms need to be based on multilinear extensions of numerical linear algebra.
• New grounds for separation, such as representability in a given function class, need to be explored.
• Prior knowledge needs to be exploited via appropriate health relevant constraints.
• Biomedical data fusion requires the combination of TDs, coupled via relevant constraints.
• Algorithms for TD updating are important for continuous long-term patient monitoring.
The algorithms are eventually integrated in an easy-to-use open source software platform that is general enough for use in other BSS applications.
Having been involved in biomedical signal processing over a period of 20 years, the PI has a good overview of the field and the opportunities. By working directly at the forefront in close collaboration with the clinical scientists who actually use our software, we can have a huge impact."
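As a minimal illustration of the tensor decompositions (TDs) that form the basic core of the proposed framework, the sketch below fits a rank-R canonical polyadic (CP) decomposition to a synthetic 3-way tensor by alternating least squares; the tensor, rank and number of sweeps are illustrative and unrelated to the project's biomedical data.

```python
# Rank-R canonical polyadic (CP) decomposition of a 3-way tensor fitted by alternating
# least squares, a basic instance of the tensor decompositions referred to above.
# The synthetic tensor, rank and iteration count are illustrative.
import numpy as np

rng = np.random.default_rng(1)
I, J, K, R = 20, 15, 10, 3

# build a synthetic low-rank tensor plus a little noise
A0 = rng.standard_normal((I, R))
B0 = rng.standard_normal((J, R))
C0 = rng.standard_normal((K, R))
T = np.einsum('ir,jr,kr->ijk', A0, B0, C0) + 0.01 * rng.standard_normal((I, J, K))

def khatri_rao(U, V):
    # column-wise Kronecker product, shape (U.shape[0] * V.shape[0], R)
    return (U[:, None, :] * V[None, :, :]).reshape(-1, U.shape[1])

# random initialization of the factor matrices
A, B, C = (rng.standard_normal((n, R)) for n in (I, J, K))

for _ in range(50):  # alternating least-squares sweeps
    A = np.linalg.lstsq(khatri_rao(B, C), T.reshape(I, -1).T, rcond=None)[0].T
    B = np.linalg.lstsq(khatri_rao(A, C), T.transpose(1, 0, 2).reshape(J, -1).T, rcond=None)[0].T
    C = np.linalg.lstsq(khatri_rao(A, B), T.transpose(2, 0, 1).reshape(K, -1).T, rcond=None)[0].T

T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
print("relative fit error:", np.linalg.norm(T - T_hat) / np.linalg.norm(T))
```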
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym BOSS-WAVES
Project Back-reaction Of Solar plaSma to WAVES
Researcher (PI) Tom VAN DOORSSELAERE
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Consolidator Grant (CoG), PE9, ERC-2016-COG
Summary "The solar coronal heating problem is a long-standing astrophysical problem. The slow DC (reconnection) heating models are well developed in detailed 3D numerical simulations. The fast AC (wave) heating mechanisms have traditionally been neglected since there were no wave observations.
Since 2007, we know that the solar atmosphere is filled with transverse waves, but still we have no adequate models (except for my own 1D analytical models) for their dissipation and plasma heating by these waves. We urgently need to know the contribution of these waves to the coronal heating problem.
In BOSS-WAVES, I will innovate the AC wave heating models by utilising novel 3D numerical simulations of propagating transverse waves. From previous results in my team, I know that the inclusion of the back-reaction of the solar plasma is crucial in understanding the energy dissipation: the wave heating leads to chromospheric evaporation and plasma mixing (by the Kelvin-Helmholtz instability).
BOSS-WAVES will bring the AC heating models to the same level as state-of-the-art DC heating models.
The high-risk, high-gain goals are (1) to create a coronal loop heated by waves, starting from an "empty" corona, by evaporating chromospheric material, and (2) to pioneer models for whole active regions heated by transverse waves."
Max ERC Funding
1 991 960 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym BRIDGE
Project Biomimetic process design for tissue regeneration: from bench to bedside via in silico modelling
Researcher (PI) Liesbet Geris
Host Institution (HI) UNIVERSITE DE LIEGE
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary "Tissue engineering (TE), the interdisciplinary field combining biomedical and engineering sciences in the search for functional man-made organ replacements, has key issues with the quantity and quality of the generated products. Protocols followed in the lab are mainly trial and error based, requiring a huge amount of manual interventions and lacking clear early time-point quality criteria to guide the process. As a result, these processes are very hard to scale up to industrial production levels. BRIDGE aims to fortify the engineering aspects of the TE field by adding a higher level of understanding and control to the manufacturing process (MP) through the use of in silico models. BRIDGE will focus on the bone TE field to provide proof of concept for its in silico approach.
The combination of the applicant's well-received published and ongoing work on a wide range of modelling tools in the bone field with the state-of-the-art experimental techniques present in the TE lab of the additional participant allows envisaging the following innovations and impact:
1. proof-of-concept of the use of an in silico blue-print for the design and control of a robust modular TE MP;
2. model-derived optimised culture conditions for patient derived cell populations increasing modular robustness of in vitro chondrogenesis/endochondral ossification;
3. in silico identification of a limited set of in vitro biomarkers that is predictive of the in vivo outcome;
4. model-derived optimised culture conditions increasing quantity and quality of the in vivo outcome of the TE MP;
5. incorporation of congenital defects in the in silico MP design, constituting a further validation of BRIDGE’s in silico approach and a necessary step towards personalised medical care.
We believe that the systematic – and unprecedented – integration of (bone) TE and mathematical modelling, as proposed in BRIDGE, is required to come to a rationalized, engineering approach to design and control bone TE MPs."
Max ERC Funding
1 191 440 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym CAPS
Project Capillary suspensions: a novel route for versatile, cost efficient and environmentally friendly material design
Researcher (PI) Erin Crystal Koos
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE8, ERC-2013-StG
Summary A wide variety of materials, including coatings and adhesives, emerging materials for nanotechnology products, as well as everyday food products, are processed or delivered as suspensions. The flow properties of such suspensions must be finely adjusted according to the demands of the respective processing techniques; even the feel of cosmetics and the perception of food products are highly influenced by their rheological properties. The recently developed capillary suspensions concept has the potential to revolutionize product formulations and material design. When a small amount (less than 1%) of a second immiscible liquid is added to the continuous phase of a suspension, the rheological properties of the mixture are dramatically altered from a fluid-like to a gel-like state or from a weak to a strong gel, and the strength can be tuned over a wide range covering orders of magnitude. Capillary suspensions can be used to create smart, tunable fluids, to stabilize mixtures that would otherwise phase separate, and to significantly reduce the amount of organic or polymeric additives, while the strong particle network can serve as a precursor for the manufacturing of cost-efficient porous ceramics and foams with unprecedented properties.
This project will investigate the factors determining capillary suspension formation, the strength of these admixtures as a function of those factors, and how capillary suspensions respond to external forces. Only such a fundamental understanding of the network formation in capillary suspensions on both the micro- and macroscopic scale will allow for the design of sophisticated new materials. The main objectives of this proposal are to quantify and predict the strength of these admixtures and then use this information to design a variety of new materials in very different application areas including, e.g., porous materials, water-based coatings, ultra-low-fat foods, and conductive films.
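To give a feel for why a minute amount of bridging liquid can gel a suspension, the sketch below uses a commonly cited textbook estimate of the capillary bridge force between two equal spheres in contact, F ≈ 2πγR cosθ, and compares it with the particle weight; the particle size, density, surface tension and contact angle are illustrative values.

```python
# Textbook estimate of the pendular capillary bridge force between two equal spheres in
# contact, F ~ 2*pi*gamma*R*cos(theta), compared with the weight of one particle.
# All input values are illustrative, not project data.
import math

R = 1e-6            # particle radius (m), illustrative
gamma = 0.072       # surface tension of the bridging liquid (N/m), roughly water
theta = 0.0         # contact angle (rad), perfectly wetting for illustration
rho = 2000.0        # particle density (kg/m^3), illustrative
g = 9.81

F_cap = 2 * math.pi * gamma * R * math.cos(theta)      # capillary bridge force
weight = rho * (4 / 3) * math.pi * R ** 3 * g          # gravitational force on one particle
print(f"capillary force : {F_cap:.2e} N")
print(f"particle weight : {weight:.2e} N")
print(f"ratio           : {F_cap / weight:.1e}")
```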
Max ERC Funding
1 489 618 €
Duration
Start date: 2013-08-01, End date: 2018-07-31
Project acronym Cathedral
Project Post-Snowden Circuits and Design Methods for Security
Researcher (PI) Ingrid VERBAUWHEDE
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE7, ERC-2015-AdG
Summary Summary: Comprehensive set of circuits and design methods to create next generation electronic circuits with strong built-in trust and security.
Electronics are integrating into and invading the human environment at an amazing speed, in what is called the Internet-of-Things and, next, the Internet-of-Everything. This creates huge security problems. Distributed (e.g. body) sensors often pick up very private data, which is sent digitally into the cloud over wireless and wired links. Protection of this data relies on high-quality cryptographic algorithms and protocols. The nodes need to be cheap and lightweight, making them very vulnerable to eavesdropping and abuse. Moreover, post-Snowden, society realizes that the attack capabilities of intelligence agencies, and probably soon of organized crime and other hackers, are orders of magnitude stronger than imagined. Thus there is a strong demand to re-establish trust in ICT systems.
In this proposal we focus on the root of trust: the digital hardware. The overall objective is to provide fundamental enabling technologies for secure trustworthy digital circuits which can be applied in a wide range of applications. To master complexity, digital hardware design is traditionally split into different abstraction layers. We revisit these abstraction layers from a security viewpoint: we look at process variations to the benefit of security, standard cell compatible digital design flow with security as design objective, hardware IP blocks for next generation cryptographic algorithms and protocols (e.g. authenticated encryption schemes, post-quantum public key schemes), integration into embedded HW/SW platforms, and methods to provide trust evidence to higher levels of abstraction. To strengthen the security we investigate the links between the layers. Finally an embedded application is selected as design driver, the security evaluation of which will be fed back to the individual layers.
Max ERC Funding
2 369 250 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym CHINA
Project Trade, Productivity, and Firm Capabilities in China's Manufacturing Sector
Researcher (PI) Johannes Van Biesebroeck
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), SH1, ERC-2009-StG
Summary China's economy has expanded at breakneck speed to become the 3rd largest trading country in the world and the largest recipient of foreign direct investment (FDI). Entry into the WTO in 2001 was a landmark event in this ongoing process, and I propose to study several channels through which it spurred China's industrial development. Crucially, I will take an integrated view of the different ways in which Chinese and Western firms interact: through trade flows, as suppliers or competitors, FDI, or knowledge transfers. First, I investigate the existence and magnitude of a causal link from the trade reforms to productivity growth. Second, I look for evidence of capability upgrading, such as increased production efficiency, an ability to produce higher quality products, or to introduce new products by innovating. Third, I study the mechanisms for the impact of trade and FDI on local firms, in particular assessing the relative importance of increased market competition and the transfer of know-how from foreign firms. For this analysis, I draw heavily on a unique data set. Information on the universe of Chinese manufacturing firms is being linked to the universe of Chinese trade transactions. These are unique research tools on their own, but as a linked data set, the only comparable one in the world is for the U.S. economy. The Chinese data have the advantage of containing detailed information on FDI, distinguishing between ordinary and processing trade, and including information on innovation, such as R&D and sales of new goods. Answering the above questions is important for other developing countries wanting to learn from China's experience and for Western firms assessing how quickly Chinese firms will become viable suppliers of sophisticated inputs or direct competitors. By estimating models that are explicitly derived from new theories, I advance the literature at the intersection of international and development economics, industrial organization, and economic geography.
Max ERC Funding
944 940 €
Duration
Start date: 2010-02-01, End date: 2016-01-31
Project acronym COCOON
Project Conformal coating of nanoporous materials
Researcher (PI) Christophe Detavernier
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), PE8, ERC-2009-StG
Summary CONTEXT - Nanoporous structures are used for applications in catalysis, molecular separation, fuel cells, dye-sensitized solar cells, etc. Given the near molecular size of the porous network, it is extremely challenging to modify the interior surface of the pores after the nanoporous material has been synthesized.
THIS PROPOSAL - Atomic Layer Deposition (ALD) is envisioned as a novel technique for creating catalytically active sites and for controlling the pore size distribution in nanoporous materials. ALD is a self-limited growth method that is characterized by alternating exposure of the growing film to precursor vapours, resulting in the sequential deposition of (sub)monolayers. It provides atomic level control of thickness and composition, and is currently used in micro-electronics to grow films into structures with aspect ratios of up to 100 / 1. We aim to make the fundamental breakthroughs necessary to enable atomic layer deposition to engineer the composition, size and shape of the interior surface of nanoporous materials with aspect ratios in excess of 10,000 / 1.
POTENTIAL IMPACT Achieving these objectives will enable atomic level engineering of the interior surface of any porous material. We plan to focus on three specific applications where our results will have both medium and long term impacts:
- Engineering the composition of pore walls using ALD, e.g. to create catalytic sites (e.g. Al for acid sites, Ti for redox sites, or Pt, Pd or Ni): chemical functionalization of the pore walls with atomic level control can result in breakthrough applications in the fields of catalysis and sensors.
- Atomic level control of the size of nanopores through ALD: controlling the pore size distribution of molecular sieves can potentially lead to breakthrough applications in molecular separation and filtration (see the sketch after this list).
- Nanocasting: replication of a mesoporous template by means of ALD can result in the mass-scale production of nanotubes.
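As a minimal numerical sketch of the pore-size tuning idea referred to above, the snippet below assumes each self-limited ALD cycle adds one conformal (sub)monolayer to the pore wall, shrinking the pore diameter by twice the growth per cycle; the numbers are illustrative, not measured values.

```python
# Pore narrowing by conformal ALD: each cycle deposits a uniform layer on the pore wall,
# reducing the diameter by twice the growth-per-cycle. Illustrative numbers only.
def pore_diameter_after(d0_nm, growth_per_cycle_nm, n_cycles):
    """Remaining pore diameter (nm) after n conformal ALD cycles; 0 once pinched off."""
    return max(d0_nm - 2 * growth_per_cycle_nm * n_cycles, 0.0)

d0, gpc = 2.0, 0.1          # 2 nm pore, 0.1 nm deposited per cycle (illustrative)
for n in (0, 2, 5, 8, 10):
    print(f"{n:2d} cycles -> pore diameter {pore_diameter_after(d0, gpc, n):.1f} nm")
```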
Max ERC Funding
1 432 800 €
Duration
Start date: 2010-01-01, End date: 2014-12-31
Project acronym COGNIMUND
Project Cognitive Image Understanding: Image representations and Multimodal learning
Researcher (PI) Tinne Tuytelaars
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE6, ERC-2009-StG
Summary One of the primary and most appealing goals of computer vision is to automatically understand the content of images on a cognitive level. Ultimately we want to have computers interpret images as we humans do, recognizing all the objects, scenes, and people as well as their relations as they appear in natural images or video. With this project, I want to advance the state of the art in this field in two directions, which I believe to be crucial to build the next generation of image understanding tools. First, novel more robust yet descriptive image representations will be designed, that incorporate the intrinsic structure of images. These should already go a long way towards removing irrelevant sources of variability while capturing the essence of the image content. I believe the importance of further research into image representations is currently underestimated within the research community, yet I claim this is a crucial step with lots of opportunities good learning cannot easily make up for bad features. Second, weakly supervised methods to learn from multimodal input (especially the combination of images and text) will be investigated, making it possible to leverage the large amount of weak annotations available via the internet. This is essential if we want to scale the methods to a larger number of object categories (several hundreds instead of a few tens). As more data can be used for training, such weakly supervised methods might in the end even come on par with or outperform supervised schemes. Here we will call upon the latest results in semi-supervised learning, datamining, and computational linguistics.
Max ERC Funding
1 538 380 €
Duration
Start date: 2010-02-01, End date: 2015-01-31
Project acronym COLORAMAP
Project Constrained Low-Rank Matrix Approximations: Theoretical and Algorithmic Developments for Practitioners
Researcher (PI) Nicolas Benoit P Gillis
Host Institution (HI) UNIVERSITE DE MONS
Call Details Starting Grant (StG), PE6, ERC-2015-STG
Summary Low-rank matrix approximation (LRA) techniques such as principal component analysis (PCA) are powerful tools for the representation and analysis of high dimensional data, and are used in a wide variety of areas such as machine learning, signal and image processing, data mining, and optimization. Without any constraints and using the least squares error, LRA can be solved via the singular value decomposition. However, in practice, this model is often not suitable mainly because (i) the data might be contaminated with outliers, missing data and non-Gaussian noise, and (ii) the low-rank factors of the decomposition might have to satisfy some specific constraints. Hence, in recent years, many variants of LRA have been introduced, using different constraints on the factors and using different objective functions to assess the quality of the approximation; e.g., sparse PCA, PCA with missing data, independent component analysis and nonnegative matrix factorization. Although these new constrained LRA models have become very popular and standard in some fields, there is still a significant gap between theory and practice. In this project, our goal is to reduce this gap by attacking the problem in an integrated way making connections between LRA variants, and by using four very different but complementary perspectives: (1) computational complexity issues, (2) provably correct algorithms, (3) heuristics for difficult instances, and (4) application-oriented aspects. This unified and multi-disciplinary approach will enable us to understand these problems better, to develop and analyze new and existing algorithms and to then use them for applications. Our ultimate goal is to provide practitioners with new tools and to allow them to decide which method to use in which situation and to know what to expect from it.
Max ERC Funding
1 291 750 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym COLOURATOM
Project Colouring Atoms in 3 Dimensions
Researcher (PI) Sara Bals
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Starting Grant (StG), PE4, ERC-2013-StG
Summary "Matter is a three dimensional (3D) agglomeration of atoms. The properties of materials are determined by the positions of the atoms, their chemical nature and the bonding between them. If we are able to determine these parameters in 3D, we will be able to provide the necessary input for predicting the properties and we can guide the synthesis and development of new nanomaterials.
The aim of this project is therefore to provide a complete 3D characterisation of complex hetero-nanosystems down to the atomic scale. The combination of advanced aberration corrected electron microscopy and novel 3D reconstruction algorithms is envisioned as a groundbreaking new approach to quantify the position AND the colour (chemical nature and bonding) of each individual atom in 3D for any given nanomaterial.
So far, 3D imaging at the atomic scale has only been carried out for model-like systems. Measuring the position and the colour of the atoms in a complex nanomaterial can therefore be considered an extremely challenging goal that will lead to a wealth of new information. Our objectives will enable 3D strain measurements at the atomic scale, localisation of atomic vacancies and interface characterisation in hetero-nanocrystals or hybrid soft-hard matter nanocompounds. Quantification of the oxidation states of surface atoms and of 3D surface relaxation will yield new insights concerning preferential functionalities.
Although these goals already go beyond the state of the art, we plan to break fundamental limits and completely eliminate the need to tilt the sample for electron tomography. Especially for beam-sensitive materials, this technique, so-called "multi-detector stereoscopy", can be considered a groundbreaking approach to obtain 3D information at the atomic scale. As an ultimate ambition, we will investigate the dynamic behaviour of ultra-small binary clusters.
Max ERC Funding
1 461 466 €
Duration
Start date: 2013-12-01, End date: 2018-11-30
Project acronym COSMOS
Project Semiparametric Inference for Complex and Structural Models in Survival Analysis
Researcher (PI) Ingrid VAN KEILEGOM
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE1, ERC-2015-AdG
Summary In survival analysis investigators are interested in modeling and analysing the time until an event happens. It often happens that the available data are right censored, which means that only a lower bound of the time of interest is observed. This feature complicates substantially the statistical analysis of this kind of data. The aim of this project is to solve a number of open problems related to time-to-event data, that would represent a major step forward in the area of survival analysis.
The project has three objectives:
[1] Cure models take into account that a certain fraction of the subjects under study will never experience the event of interest. Because of the complex nature of these models, many problems are still open and rigorous theory is rather scarce in this area. Our goal is to fill this gap, which will be a challenging but important task.
[2] Copulas are nowadays widespread in many areas of statistics. However, they can contribute more substantially to resolving a number of the outstanding issues in survival analysis, such as quantile regression and dependent censoring. Finding answers to these open questions would open up new horizons for a wide variety of problems.
[3] We wish to develop new methods for doing correct inference in some of the common models in survival analysis in the presence of endogeneity or measurement errors. The present methodology has serious shortcomings, and we would like to propose, develop and validate new methods, that would be a major breakthrough if successful.
The above objectives will be achieved by using mostly semiparametric models. The development of mathematical properties under these models is often a challenging task, as complex tools from the theory on empirical processes and semiparametric efficiency are required. The project will therefore require an innovative combination of highly complex mathematical skills and cutting edge results from modern theory for semiparametric models.
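As a purely illustrative sketch of how right-censored data are handled in the simplest setting (a minimal example based on the standard Kaplan-Meier estimator, not on anything specific to this project), the following Python/NumPy code estimates a survival function when some subjects are only known to have survived beyond their censoring time.

import numpy as np

def kaplan_meier(times, observed):
    """Kaplan-Meier estimate of the survival function S(t).
    times: observed times; observed: True if the event occurred, False if right censored."""
    times = np.asarray(times, dtype=float)
    observed = np.asarray(observed, dtype=bool)
    survival = 1.0
    curve = []
    for t in np.unique(times[observed]):          # distinct event times, in order
        at_risk = np.sum(times >= t)               # still under observation just before t
        d = np.sum((times == t) & observed)        # events occurring at t
        survival *= 1.0 - d / at_risk
        curve.append((t, survival))
    return curve

# Example: 8 subjects; observed=False means the subject was censored at that time
times    = [2, 3, 3, 5, 6, 7, 9, 10]
observed = [True, True, False, True, False, True, True, False]
for t, s in kaplan_meier(times, observed):
    print(f"t = {t:4.1f}  S(t) = {s:.3f}")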
Max ERC Funding
2 318 750 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym COUNTATOMS
Project Counting Atoms in nanomaterials
Researcher (PI) Gustaaf Van Tendeloo
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Advanced Grant (AdG), PE5, ERC-2009-AdG
Summary COUNTING ATOMS IN NANOMATERIALS Advanced electron microscopy for solid state materials has evolved from a qualitative imaging setup to a quantitative scientific technique. This will allow us not only to probe and better understand the fundamental behaviour of (nano) materials at an atomic level but also to guide technology towards new horizons. The installation in 2009 of a new and unique electron microscope with a real space resolution of 50 pm and an energy resolution of 100 meV will make it possible to perform unique experiments. We believe that the position of atoms at an interface or at a surface can be determined with a precision of 1 pm; this precision is essential as input for modelling the materials properties. It will be first applied to explain the fascinating behaviour of multilayer ceramic materials. The new experimental limits will also allow us to literally count the number of atoms within an atomic columns; particularly counting the number of foreign atoms. This will not only require experimental skills, but also theoretical support. A real challenge is probing the magnetic and electronic information of a single atom column. According to theory this would be possible using ultra high resolution. This new probing technique will be of extreme importance for e.g. spintronics. Modern (nano) technology more and more requires information in 3 dimensions (3D), rather than in 2D. This is possible through electron tomography; this technique will be optimised in order to obtain sub nanometer precision. A final challenge is the study of the interface between soft matter (bio- or organic materials) and hard matter. This was hitherto impossible because of the radiation damage of the electron beam. With the possibility to lower the voltage to 80 kV and possibly 50 kV, maintaining more or less the resolution, we will hopefully be able to probe the active sites for catalysis.
Max ERC Funding
2 000 160 €
Duration
Start date: 2010-01-01, End date: 2014-12-31
Project acronym CRAMIS
Project Critical phenomena in random matrix theory and integrable systems
Researcher (PI) Tom Claeys
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), PE1, ERC-2012-StG_20111012
Summary The main goal of the project is to create a research group on critical phenomena in random matrix theory and integrable systems at the Université Catholique de Louvain, where the PI was recently appointed.
Random matrix ensembles, integrable partial differential equations and Toeplitz determinants will be the main research topics in the project. Those three models show intimate connections and they all share certain properties that are, to a large extent, universal. In the recent past it has been shown that Painlevé equations play an important and universal role in the description of critical behaviour in each of these areas. In random matrix theory, they describe the local correlations between eigenvalues in appropriate double scaling limits; for integrable partial differential equations such as the Korteweg-de Vries equation and the nonlinear Schrödinger equation, they arise near points of gradient catastrophe in the small dispersion limit; for Toeplitz determinants they describe phase transitions for underlying models in statistical physics.
The aim of the project is to study new types of critical behaviour and to obtain a better understanding of the remarkable similarities between random matrices on one hand and integrable partial differential equations on the other hand. The focus will be on asymptotic questions, and one of the tools we plan to use is the Deift/Zhou steepest descent method to obtain asymptotics for Riemann-Hilbert problems. Although many of the problems in this project have their origin or motivation in mathematical physics, the proposed techniques are mostly based on complex and classical analysis.
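The local eigenvalue correlations referred to above can already be seen in a simple numerical experiment; the sketch below (a hypothetical illustration in Python/NumPy, not the asymptotic methodology of the project) samples a matrix from the Gaussian Unitary Ensemble and inspects the nearest-neighbour spacings in the bulk of the spectrum, where level repulsion suppresses small gaps.

import numpy as np

rng = np.random.default_rng(1)
n = 500

# Sample a GUE-type matrix: H = (A + A^H) / 2 with i.i.d. complex Gaussian entries
A = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
H = (A + A.conj().T) / 2
eigs = np.linalg.eigvalsh(H)

# Nearest-neighbour spacings in the bulk, crudely unfolded by rescaling to unit mean
bulk = eigs[n // 4 : 3 * n // 4]
spacings = np.diff(bulk)
spacings /= spacings.mean()

# Very small spacings are strongly suppressed, unlike for independent random points
print("fraction of spacings below 0.1:", np.mean(spacings < 0.1))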
Max ERC Funding
1 130 400 €
Duration
Start date: 2012-08-01, End date: 2017-07-31