Project acronym BIOTENSORS
Project Biomedical Data Fusion using Tensor based Blind Source Separation
Researcher (PI) Sabine Jeanne A Van Huffel
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary The quest for a general functional tensor framework for blind source separation
Our overall objective is the development of a general functional framework for solving tensor-based blind source separation (BSS) problems in biomedical data fusion, with tensor decompositions (TDs) as the basic core. We claim that TDs will allow the extraction of fairly complicated sources of biomedical activity from fairly complicated sets of uni- and multimodal data. The power of the new techniques will be demonstrated for three well-chosen representative biomedical applications for which extensive expertise and fully validated datasets are available in the PI’s team, namely:
• Metabolite quantification and brain tumour tissue typing using Magnetic Resonance Spectroscopic Imaging,
• Functional monitoring including seizure detection and polysomnography,
• Cognitive brain functioning and seizure zone localization using simultaneous Electroencephalography-functional MR Imaging integration.
Solving these challenging problems requires that algorithmic progress is made in several directions:
• Algorithms need to be based on multilinear extensions of numerical linear algebra.
• New grounds for separation, such as representability in a given function class, need to be explored.
• Prior knowledge needs to be exploited via appropriate health-relevant constraints.
• Biomedical data fusion requires the combination of TDs, coupled via relevant constraints.
• Algorithms for TD updating are important for continuous long-term patient monitoring.
The algorithms are eventually integrated in an easy-to-use open source software platform that is general enough for use in other BSS applications.
Having been involved in biomedical signal processing over a period of 20 years, the PI has a good overview of the field and the opportunities. By working directly at the forefront in close collaboration with the clinical scientists who actually use our software, we can have a huge impact.
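The "basic core" the summary refers to, a tensor decomposition, can be illustrated with a minimal canonical polyadic decomposition (CPD) computed by alternating least squares. This is a generic toy sketch in NumPy, not the project's software; the names `cpd_als` and `khatri_rao` are our own.

```python
import numpy as np

def khatri_rao(A, B):
    # Column-wise Khatri-Rao (Kronecker) product of two factor matrices.
    R = A.shape[1]
    return np.einsum('ir,jr->ijr', A, B).reshape(-1, R)

def cpd_als(T, R, n_iter=200, seed=0):
    """Rank-R canonical polyadic decomposition of a 3-way tensor via
    alternating least squares: T ~ sum_r a_r (outer) b_r (outer) c_r."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))
    T1 = T.reshape(I, J * K)                     # mode-1 unfolding
    T2 = T.transpose(1, 0, 2).reshape(J, I * K)  # mode-2 unfolding
    T3 = T.transpose(2, 0, 1).reshape(K, I * J)  # mode-3 unfolding
    for _ in range(n_iter):
        # Solve a linear least-squares problem for one factor at a time.
        A = T1 @ np.linalg.pinv(khatri_rao(B, C).T)
        B = T2 @ np.linalg.pinv(khatri_rao(A, C).T)
        C = T3 @ np.linalg.pinv(khatri_rao(A, B).T)
    return A, B, C
```

In BSS terms, the columns of the recovered factors play the role of the separated sources; real applications add the constraints, coupling and updating schemes the summary lists.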
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym CAPS
Project Capillary suspensions: a novel route for versatile, cost efficient and environmentally friendly material design
Researcher (PI) Erin Crystal Koos
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), PE8, ERC-2013-StG
Summary A wide variety of materials, including coatings and adhesives, emerging materials for nanotechnology products, and everyday food products, are processed or delivered as suspensions. The flow properties of such suspensions must be finely adjusted to the demands of the respective processing techniques; even the feel of cosmetics and the perception of food products are highly influenced by their rheological properties. The recently developed capillary suspensions concept has the potential to revolutionize product formulations and material design. When a small amount (less than 1%) of a second, immiscible liquid is added to the continuous phase of a suspension, the rheological properties of the mixture are dramatically altered from a fluid-like to a gel-like state, or from a weak to a strong gel, and the strength can be tuned over a range covering orders of magnitude. Capillary suspensions can be used to create smart, tunable fluids, stabilize mixtures that would otherwise phase separate, and significantly reduce the amount of organic or polymeric additives, while the strong particle network can serve as a precursor for the manufacturing of cost-efficient porous ceramics and foams with unprecedented properties.
This project will investigate the influence of factors determining capillary suspension formation, the strength of these admixtures as a function of these aspects, and how capillary suspensions depend on external forces. Only such a fundamental understanding of the network formation in capillary suspensions on both the micro- and macroscopic scale will allow for the design of sophisticated new materials. The main objectives of this proposal are to quantify and predict the strength of these admixtures and then use this information to design a variety of new materials in very different application areas including, e.g., porous materials, water-based coatings, ultra low fat foods, and conductive films.
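The network strength described above comes from pendular liquid bridges between particles, and its order of magnitude can be estimated from the classic limiting bridge force F = 2πγR cos θ for two equal spheres in contact. A back-of-the-envelope sketch; the numbers below are illustrative, not from the project:

```python
import numpy as np

def capillary_bridge_force(radius, surface_tension, contact_angle_deg):
    """Limiting capillary force of a pendular bridge between two equal
    spheres in contact (small bridge volume): F = 2*pi*gamma*R*cos(theta).
    radius in m, surface_tension in N/m."""
    theta = np.radians(contact_angle_deg)
    return 2.0 * np.pi * surface_tension * radius * np.cos(theta)

# Illustrative: 1 um particles (R = 0.5 um) bridged by water
# (gamma = 72 mN/m) at a 30 degree contact angle.
F = capillary_bridge_force(0.5e-6, 0.072, 30.0)  # on the order of 0.2 uN
```

For micron-sized particles this force exceeds the particle weight by many orders of magnitude, which is why so little secondary liquid suffices to turn a fluid suspension into a gel.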
Max ERC Funding
1 489 618 €
Duration
Start date: 2013-08-01, End date: 2018-07-31
Project acronym COLOURATOM
Project Colouring Atoms in 3 Dimensions
Researcher (PI) Sara Bals
Host Institution (HI) UNIVERSITEIT ANTWERPEN
Call Details Starting Grant (StG), PE4, ERC-2013-StG
Summary Matter is a three-dimensional (3D) agglomeration of atoms. The properties of materials are determined by the positions of the atoms, their chemical nature and the bonding between them. If we are able to determine these parameters in 3D, we will be able to provide the necessary input for predicting the properties and we can guide the synthesis and development of new nanomaterials.
The aim of this project is therefore to provide a complete 3D characterisation of complex hetero-nanosystems down to the atomic scale. The combination of advanced aberration corrected electron microscopy and novel 3D reconstruction algorithms is envisioned as a groundbreaking new approach to quantify the position AND the colour (chemical nature and bonding) of each individual atom in 3D for any given nanomaterial.
So far, only 3D imaging at the atomic scale was carried out for model-like systems. Measuring the position and the colour of the atoms in a complex nanomaterial can therefore be considered as an extremely challenging goal that will lead to a wealth of new information. Our objectives will enable 3D strain measurements at the atomic scale, localisation of atomic vacancies and interface characterisation in hetero-nanocrystals or hybrid soft-hard matter nanocompounds. Quantification of the oxidation states of surface atoms and of 3D surface relaxation will yield new insights concerning preferential functionalities.
Although these goals already go beyond the state of the art, we plan to break fundamental limits and completely eliminate the need to tilt the sample for electron tomography. Especially for beam-sensitive materials, this technique, so-called "multi-detector stereoscopy", can be considered a groundbreaking approach to obtain 3D information at the atomic scale. As an ultimate ambition, we will investigate the dynamic behaviour of ultra-small binary clusters.
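Computationally, electron tomography reduces to solving a large linear system A x = b relating the unknown voxel values x to the measured projections b. A toy version of SIRT, one of the standard iterative reconstruction schemes, conveys the idea; this generic sketch is not the project's reconstruction code:

```python
import numpy as np

def sirt(A, b, n_iter=3000):
    """Simultaneous Iterative Reconstruction Technique for A x = b,
    where A is a (nonnegative) projection matrix: a Landweber-type
    iteration preconditioned by inverse row and column sums."""
    R = 1.0 / np.maximum(A.sum(axis=1), 1e-12)  # inverse row sums
    C = 1.0 / np.maximum(A.sum(axis=0), 1e-12)  # inverse column sums
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        # Backproject the current residual, suitably normalized.
        x += C * (A.T @ (R * (b - A @ x)))
    return x
```

Real tomography uses sparse projection operators over millions of voxels; the normalization by row and column sums is what makes the iteration stable for nonnegative projection matrices.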
Summary
"Matter is a three dimensional (3D) agglomeration of atoms. The properties of materials are determined by the positions of the atoms, their chemical nature and the bonding between them. If we are able to determine these parameters in 3D, we will be able to provide the necessary input for predicting the properties and we can guide the synthesis and development of new nanomaterials.
The aim of this project is therefore to provide a complete 3D characterisation of complex hetero-nanosystems down to the atomic scale. The combination of advanced aberration corrected electron microscopy and novel 3D reconstruction algorithms is envisioned as a groundbreaking new approach to quantify the position AND the colour (chemical nature and bonding) of each individual atom in 3D for any given nanomaterial.
So far, only 3D imaging at the atomic scale was carried out for model-like systems. Measuring the position and the colour of the atoms in a complex nanomaterial can therefore be considered as an extremely challenging goal that will lead to a wealth of new information. Our objectives will enable 3D strain measurements at the atomic scale, localisation of atomic vacancies and interface characterisation in hetero-nanocrystals or hybrid soft-hard matter nanocompounds. Quantification of the oxidation states of surface atoms and of 3D surface relaxation will yield new insights concerning preferential functionalities.
Although these goals already go beyond the state-of-the-art, we plan to break fundamental limits and completely eliminate the need to tilt the sample for electron tomography. Especially for beam sensitive materials, this technique, so-called ""multi-detector stereoscopy"", can be considered as a groundbreaking approach to obtain 3D information at the atomic scale. As an ultimate ambition, we will investigate the dynamic behaviour of ultra-small binary clusters."
Max ERC Funding
1 461 466 €
Duration
Start date: 2013-12-01, End date: 2018-11-30
Project acronym FEEC-A
Project Finite Element Exterior Calculus and Applications
Researcher (PI) Ragnar Winther
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE1, ERC-2013-ADG
Summary The finite element method is one of the most successful techniques for designing numerical methods for systems of partial differential equations (PDEs). It is not only a methodology for developing numerical algorithms, but also a mathematical framework in which to explore their behavior. The finite element exterior calculus (FEEC) provides a new structure that produces a deeper understanding of the finite element method and its connections to the partial differential equation being approximated. The goal is to develop discretizations which are compatible with the geometric, topological, and algebraic structures which underlie well-posedness of the partial differential equation. The phrase FEEC was first used in a paper the PI wrote for Acta Numerica in 2006, together with his coworkers, D.N. Arnold and R.S. Falk. The general philosophy of FEEC has led to the design of new algorithms and software developments, also in areas beyond the direct application of the theory. The present project will be devoted to further development of the foundations of FEEC, and to direct or indirect use of FEEC in specific applications. The ambition is to set the scene for a number of new research directions based on FEEC by giving ground-breaking contributions to its foundation. The aim is also to use FEEC as a tool, or a guideline, to extend the foundation of numerical PDEs to a variety of problems for which this foundation does not exist. The more application-oriented parts of the project include topics like numerical methods for elasticity, its generalizations to more general models in materials science such as viscoelasticity, poroelasticity, and liquid crystals, and the applications of these models to CO2 storage and deformations of the spinal cord.
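For readers unfamiliar with the finite element method that FEEC generalizes, its simplest instance, piecewise-linear elements for the 1D Poisson problem, fits in a few lines. This is a textbook sketch (names are our own), unrelated to the project's own developments:

```python
import numpy as np

def poisson_fem_1d(n, f):
    """Solve -u'' = f on (0,1) with u(0) = u(1) = 0 using n piecewise
    linear finite elements on a uniform mesh. Returns the interior
    nodes and the nodal values of the approximate solution."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    # P1 stiffness matrix int(phi_i' phi_j'): tridiagonal (-1, 2, -1)/h.
    K = (np.diag(2.0 * np.ones(n - 1))
         - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h
    # Load vector int(f phi_i) approximated by trapezoid quadrature.
    F = h * f(x[1:-1])
    return x[1:-1], np.linalg.solve(K, F)
```

FEEC's contribution is to explain, via the geometric and algebraic structure of the PDE, which generalizations of such element spaces remain stable for systems like elasticity or electromagnetism.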
Max ERC Funding
2 059 687 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym FOREFRONT
Project Frontiers of Extended Formulations
Researcher (PI) Samuel Fiorini
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary Linear programming has proved to be an invaluable tool both in theory and practice. Semidefinite programming surpasses linear programming in terms of expressivity while remaining tractable. This project investigates the modeling power of linear and semidefinite programming in the context of combinatorial optimization. Within the emerging framework of extended formulations (EFs), I seek a decisive answer to the following question: which problems can be modeled by a linear or semidefinite program, when the number of constraints and variables is limited? EFs are based on the idea that one should choose the "right" variables to model a problem. By extending the set of variables of a problem by a few carefully chosen variables, the number of constraints can in some cases dramatically decrease, making the problem easier to solve. Despite previous high-quality research, the theory of EFs is still at square one. This project aims at (i) transforming our current zero-dimensional state of knowledge into a truly three-dimensional state of knowledge by pushing the boundaries of EFs in three directions (models, types and problems); (ii) using EFs as a lens on complexity by proving strong consequences of important conjectures such as P != NP, and leveraging strong connections to geometry to make progress on the log-rank conjecture. The proposed methodology is: (i) experiment-aided; (ii) interdisciplinary; (iii) constructive.
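The canonical example behind the "right variables" idea is the l1-ball (cross-polytope): as a polytope in R^n it has 2^n facet inequalities, but adding n auxiliary variables y_i with -y_i <= x_i <= y_i and sum(y) <= 1 gives an extended formulation with only O(n) constraints. A small sketch contrasting the two descriptions (illustrative code, not from the project):

```python
import numpy as np

def in_l1_ball_facets(x):
    """Membership test using all 2^n facet inequalities s . x <= 1,
    one per sign vector s in {-1, +1}^n."""
    n = len(x)
    for s in range(2 ** n):
        signs = np.array([1.0 if (s >> i) & 1 else -1.0 for i in range(n)])
        if signs @ x > 1.0 + 1e-12:
            return False
    return True

def in_l1_ball_extended(x):
    """Membership via the O(n)-constraint extended formulation:
    there exists y with -y <= x <= y and sum(y) <= 1. The minimal
    feasible y is |x|, so the test reduces to sum(|x|) <= 1."""
    return bool(np.abs(x).sum() <= 1.0 + 1e-12)
```

Both tests describe the same projected polytope; the extended one trades exponentially many inequalities for a handful of extra variables, which is exactly the trade-off the project studies.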
Max ERC Funding
1 455 479 €
Duration
Start date: 2014-09-01, End date: 2019-08-31
Project acronym FORSIED
Project Formalizing Subjective Interestingness in Exploratory Data Mining
Researcher (PI) Tijl De Bie
Host Institution (HI) UNIVERSITEIT GENT
Call Details Consolidator Grant (CoG), PE6, ERC-2013-CoG
Summary The rate at which research labs, enterprises and governments accumulate data is high and rapidly increasing. Often, these data are collected for no specific purpose, or they turn out to be useful for unanticipated purposes: companies constantly look for new ways to monetize their customer databases; governments mine various databases to detect tax fraud; security agencies mine and cross-associate numerous heterogeneous information streams from publicly accessible and classified databases to understand and detect security threats. The objective in such Exploratory Data Mining (EDM) tasks is typically ill-defined, i.e. it is unclear how to formalize how interesting a pattern extracted from the data is. As a result, EDM is often a slow process of trial and error.
During this fellowship we aim to develop the mathematical principles of what makes a pattern interesting in a very subjective sense. Crucial in this endeavour will be research into automatic mechanisms to model and duly consider the prior beliefs and expectations of the user for whom the EDM patterns are intended, thus relieving the users of the complex task to attempt to formalize themselves what makes a pattern interesting to them.
This project will represent a radical change in how EDM research is done. Currently, researchers typically imagine a specific purpose for the patterns, try to formalize interestingness of such patterns given that purpose, and design an algorithm to mine them. However, given the variety of users, this strategy has led to a multitude of algorithms. As a result, users need to be data mining experts to understand which algorithm applies to their situation. To resolve this, we will develop a theoretically solid framework for the design of EDM systems that model the user's beliefs and expectations as much as the data itself, so as to maximize the amount of useful information transmitted to the user. This will ultimately bring the power of EDM within reach of the non-expert.
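One concrete way to make subjective interestingness precise, in the spirit sketched above, is to fit a maximum-entropy background model to the user's prior beliefs and score a pattern by its self-information (negative log-probability) under that model: the more a pattern contradicts the prior, the more bits it conveys. A deliberately simplified toy sketch for all-ones tiles in a binary matrix, where the prior encodes only the column densities (our own illustration, not the project's framework):

```python
import numpy as np

def tile_self_information(D, rows, cols):
    """Self-information (in bits) of observing an all-ones tile spanning
    the given rows and columns of the binary matrix D, under a
    maximum-entropy background model constrained only by the column
    densities of D (i.e. independent Bernoulli cells per column).
    Higher values = more surprising given those prior expectations."""
    p = D.mean(axis=0).clip(1e-9, 1 - 1e-9)  # background cell probabilities
    # Each tile cell (i, j) contributes -log2 p_j; sum over the tile.
    return -np.log2(p[np.asarray(cols)]).sum() * len(rows)
```

A full treatment would divide this score by a description length of the pattern and update the background model as patterns are shown, so that redundant patterns stop being interesting.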
Summary
"The rate at which research labs, enterprises and governments accumulate data is high and fast increasing. Often, these data are collected for no specific purpose, or they turn out to be useful for unanticipated purposes: Companies constantly look for new ways to monetize their customer databases; Governments mine various databases to detect tax fraud; Security agencies mine and cross-associate numerous heterogeneous information streams from publicly accessible and classified databases to understand and detect security threats. The objective in such Exploratory Data Mining (EDM) tasks is typically ill-defined, i.e. it is unclear how to formalize how interesting a pattern extracted from the data is. As a result, EDM is often a slow process of trial and error.
During this fellowship we aim to develop the mathematical principles of what makes a pattern interesting in a very subjective sense. Crucial in this endeavour will be research into automatic mechanisms to model and duly consider the prior beliefs and expectations of the user for whom the EDM patterns are intended, thus relieving the users of the complex task to attempt to formalize themselves what makes a pattern interesting to them.
This project will represent a radical change in how EDM research is done. Currently, researchers typically imagine a specific purpose for the patterns, try to formalize interestingness of such patterns given that purpose, and design an algorithm to mine them. However, given the variety of users, this strategy has led to a multitude of algorithms. As a result, users need to be data mining experts to understand which algorithm applies to their situation. To resolve this, we will develop a theoretically solid framework for the design of EDM systems that model the user's beliefs and expectations as much as the data itself, so as to maximize the amount of useful information transmitted to the user. This will ultimately bring the power of EDM within reach of the non-expert."
Max ERC Funding
1 549 315 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym HOLOBHC
Project Holography for realistic black holes and cosmologies
Researcher (PI) Geoffrey Gaston Joseph Jean-Vincent Compère
Host Institution (HI) UNIVERSITE LIBRE DE BRUXELLES
Call Details Starting Grant (StG), PE2, ERC-2013-StG
Summary String theory provides a consistent framework that combines quantum mechanics and gravity. Two grand challenges of fundamental physics - building realistic models of black holes and cosmologies - can be addressed in this framework thanks to novel holographic methods.
Recent astrophysical evidence indicates that some black holes rotate extremely fast, as close as 98% to the extremality bound. No quantum gravity model for such black holes has been formulated so far. My first objective is building the first model in string theory of an extremal black hole. Taking on this challenge is made possible thanks to recent advances in a remarkable duality known as the gauge/gravity correspondence. If successful, this program will pave the way to a description of quantum gravity effects that have been conjectured to occur close to the horizon of very fast rotating black holes.
Supernovae detection has established that our universe is starting a phase of accelerated expansion. This brings a pressing need to better understand still enigmatic features of de Sitter spacetime that models our universe at late times. My second objective is to derive new universal properties of the cosmological horizon of de Sitter spacetime using tools inspired from the gauge/gravity correspondence. These results will contribute to understand its remarkable entropy, which, according to the standard model of cosmology, bounds the entropy of our observable universe.
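The 98%-of-extremality figure above translates directly into near-extremal horizon geometry through the standard Kerr formulas r_± = M ± sqrt(M² − a²) and Ω_H = a / (r_+² + a²). A quick check in geometric units (G = c = 1; textbook formulas, the function name is our own):

```python
import numpy as np

def kerr_horizon(M, chi):
    """Outer-horizon radius r_+ and horizon angular velocity Omega_H
    of a Kerr black hole with mass M and dimensionless spin
    chi = a/M <= 1, in geometric units G = c = 1."""
    a = chi * M
    r_plus = M + np.sqrt(M**2 - a**2)
    omega_h = a / (r_plus**2 + a**2)
    return r_plus, omega_h

# Near-extremal spin a/M = 0.98: the horizon sits close to r_+ = M,
# far inside the Schwarzschild value r_+ = 2M.
r_plus, omega_h = kerr_horizon(1.0, 0.98)
```

At chi = 0.98 the horizon radius is about 1.2 M, already most of the way from the Schwarzschild limit (2 M) to the extremal limit (M), which is why such black holes are natural targets for near-horizon holographic descriptions.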
Max ERC Funding
1 020 084 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym HoloQosmos
Project Holographic Quantum Cosmology
Researcher (PI) Thomas Hertog
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary The current theory of cosmic inflation is largely based on classical physics. This undermines its predictivity in a world that is fundamentally quantum mechanical. With this project we will develop a novel approach towards a quantum theory of inflation. We will do this by introducing holographic techniques in cosmology. The notion of holography is the most profound conceptual breakthrough that has emerged from fundamental high-energy physics in recent years. It postulates that (quantum) gravitational systems such as the universe as a whole have a precise 'holographic' description in terms of quantum field theories defined on their boundary. Our aim is to develop a holographic framework for quantum cosmology. We will then apply this to three areas of theoretical cosmology where a quantum approach is of critical importance. First, we will put forward a holographic description of inflation that clarifies its microphysical origin and is rigorously predictive. Using this we will derive the distinct observational signatures of novel, truly holographic models of the early universe where inflation has no description in terms of classical cosmic evolution. Second, we will apply holographic cosmology to improve our understanding of eternal inflation. This is a phase deep into inflation where quantum effects dominate the evolution and affect the universe’s global structure. Finally, we will work towards generalizing our holographic models of the primordial universe to include the radiation, matter and vacuum eras. The resulting unification of cosmic history in terms of a single holographic boundary theory may lead to intriguing predictions of correlations between early and late time observables, tying together the universe’s origin with its ultimate fate. Our project has the potential to revolutionize our perspective on cosmology and to further deepen the fruitful interaction between cosmology and high-energy physics.
Max ERC Funding
1 995 900 €
Duration
Start date: 2014-08-01, End date: 2019-07-31
Project acronym i-CaD
Project Innovative Catalyst Design for Large-Scale, Sustainable Processes
Researcher (PI) Joris Wilfried Maria Cornelius Thybaut
Host Institution (HI) UNIVERSITEIT GENT
Call Details Consolidator Grant (CoG), PE8, ERC-2013-CoG
Summary A systematic and novel, multi-scale, model-based catalyst design methodology will be developed. The fundamental nature of the models used is unprecedented and will represent a breakthrough compared to the more commonly applied statistical, correlative relationships. The methodology will focus on the intrinsic kinetics of (potentially) large-scale processes for the conversion of renewable feeds into fuels and chemicals. Non-ideal behaviour, caused by mass and heat transfer limitations or particular reactor hydrodynamics, will be explicitly accounted for when simulating or optimizing industrial-scale applications. The selected model reactions are situated in the area of biomass upgrading to fuels and chemicals: fast pyrolysis oil stabilization, glycerol hydrogenolysis and selective oxidation of (bio)ethanol to acetaldehyde.
For the first time, a systematic microkinetic modelling methodology will be developed for oxygenate conversion. In particular, stereochemistry in catalysis will be assessed. Two types of descriptors will be quantified: kinetic descriptors that are catalyst independent and catalyst descriptors that specifically account for the effect of the catalyst properties on the reaction kinetics. The latter will be optimized in terms of reactant conversion, product yield or selectivity. Fundamental relationships will be established between the catalyst descriptors as determined by microkinetic modelling and independently measured catalyst properties or synthesis parameters. These innovative relationships allow providing the desired, rational feedback from optimal descriptor values towards synthesis parameters for a new catalyst generation. Their fundamental character will guarantee adequate extrapolative properties that can be exploited for the identification of a groundbreaking next catalyst generation.
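The descriptor-based optimization above can be illustrated with a minimal sketch. All numbers and functional forms below are hypothetical stand-ins, not the project's actual microkinetic models: a single catalyst descriptor (an adsorption enthalpy dH) enters both the surface coverage and, via a linear BEP-type scaling relation, the reaction barrier, producing a Sabatier-type volcano whose maximum is the "optimal descriptor value" to feed back towards catalyst synthesis.

```python
import math

R = 8.314e-3   # gas constant, kJ/(mol*K)
T = 500.0      # temperature, K (illustrative)

def rate(dH, Ea0=80.0, alpha=0.5, A=1.0e8, K0=1.0e-4):
    """Toy turnover rate as a function of one catalyst descriptor dH (kJ/mol).

    Stronger binding (more negative dH) raises the surface coverage but,
    through the assumed BEP scaling, also raises the surface reaction
    barrier, so the rate passes through a volcano maximum.
    """
    K = K0 * math.exp(-dH / (R * T))      # adsorption equilibrium constant
    theta = K / (1.0 + K)                 # Langmuir surface coverage
    Ea = Ea0 - alpha * dH                 # BEP scaling: barrier tracks binding
    k = A * math.exp(-Ea / (R * T))       # surface rate constant
    return k * theta

# Scan the descriptor and locate the volcano maximum: the optimal
# descriptor value that a design methodology would map back onto
# measurable catalyst properties or synthesis parameters.
grid = [(-120.0 + i, rate(-120.0 + i)) for i in range(121)]
dH_opt, r_opt = max(grid, key=lambda p: p[1])
```

In the actual project the descriptor set is multidimensional and the objective is conversion, yield or selectivity from a full reactor model; the sketch only shows the shape of the feedback loop from optimal descriptor values to a next catalyst generation.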
Max ERC Funding
1 999 877 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym INTERFERE
Project Sparse Signal Coding for Interference-based Imaging Modalities
Researcher (PI) Peter Schelkens
Host Institution (HI) VRIJE UNIVERSITEIT BRUSSEL
Call Details Consolidator Grant (CoG), PE7, ERC-2013-CoG
Summary Since its invention in 1948 by Dennis Gabor, holography has held the promise to empower full parallax 3D visualisation. Though the trajectory has been significantly longer than expected, recent developments in photonics, microelectronics and computer engineering have led to the prospect of realizing within a decade dynamic full parallax holography with acceptable rendering quality and viewing angle. Unfortunately projections – based on the current state-of-the-art and expected evolution in the underlying “hardware” technologies – still predict exascale computing power and terabytes-per-second data rates.
Since dynamic digital holography requires huge amounts of pixels to be sensed, transmitted and represented, sparse signal representations hold great promise for reducing the computational complexity and bandwidth usage. INTERFERE will design a generic source coding methodology and architecture to facilitate the exploitation of sparse signal representations for dynamic, full parallax, large viewing angle digital holography and, more generally, interference-based modalities, with the ambition to reduce the signal processing bottlenecks while simultaneously exploiting human visual system characteristics.
Realizing these research objectives – with a strong focus on advanced signal representations, associated source coding methodologies and visual quality modelling – will provide a breakthrough with respect to complexity reduction and thus the realisation of full-parallax, wide viewing angle dynamic digital holography, and will benefit the earlier mentioned adjacent scientific fields. Intermediate results or components will have serendipitous effects on other scientific disciplines and open new horizons for markets such as – but not limited to – medical imaging, biophotonics, life sciences, public safety, digital holographic microscopy, holographic biomedical sensors, data storage and metrology, illustrating the high-gain potential of INTERFERE.
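The premise that sparse representations can tame holographic data rates can be sketched in miniature. The example below is an illustrative toy, not INTERFERE's actual codec: a synthetic 1-D fringe pattern (a crude stand-in for one row of an off-axis hologram) is nearly sparse in the Fourier domain, so keeping only a handful of the largest-magnitude transform coefficients retains almost all of the signal energy while discarding most of the data that would otherwise need to be transmitted.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256
x = np.arange(N)
# Sum of a few plane-wave fringes plus weak noise: interference patterns
# of this kind concentrate their energy in a few Fourier coefficients.
signal = (np.cos(2 * np.pi * 17 * x / N)
          + 0.5 * np.cos(2 * np.pi * 43 * x / N)
          + 0.02 * rng.standard_normal(N))

coeffs = np.fft.fft(signal)
k = 8                                    # coefficients kept, out of 256
idx = np.argsort(np.abs(coeffs))[-k:]    # largest-magnitude bins
sparse = np.zeros_like(coeffs)
sparse[idx] = coeffs[idx]
recon = np.fft.ifft(sparse).real

# Fraction of the original signal energy retained by the sparse code.
energy_kept = float(np.sum(recon**2) / np.sum(signal**2))
```

A real holographic codec must of course handle 2-D complex wavefields, choose representations far better matched to interference fringes than a plain FFT, and couple the coefficient selection to human visual system models; the sketch only shows why sparsity buys the rate reduction the summary refers to.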
Max ERC Funding
1 992 615 €
Duration
Start date: 2014-06-01, End date: 2019-05-31