Project acronym 2D-4-CO2
Project DESIGNING 2D NANOSHEETS FOR CO2 REDUCTION AND INTEGRATION INTO vdW HETEROSTRUCTURES FOR ARTIFICIAL PHOTOSYNTHESIS
Researcher (PI) Damien VOIRY
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Starting Grant (StG), PE8, ERC-2018-STG
Summary The CO2 reduction reaction (CO2RR) holds great promise for the conversion of the greenhouse gas carbon dioxide into chemical fuels. The absence of catalytic materials demonstrating high performance and high selectivity currently hampers practical demonstration. CO2RR is also limited by the low solubility of CO2 in the electrolyte solution, so electrocatalytic reactions in the gas phase using gas diffusion electrodes would be preferred. 2D materials have recently emerged as a novel class of electrocatalytic materials thanks to their rich structures and electronic properties. The synthesis of novel 2D catalysts and their implementation into photocatalytic systems would be a major step towards the development of devices for storing solar energy in the form of chemical fuels. With 2D-4-CO2, I propose to: 1) develop a novel class of CO2RR catalysts based on conducting 2D nanosheets and 2) demonstrate photocatalytic conversion of CO2 into chemical fuels using structure-engineered gas diffusion electrodes made of 2D conducting catalysts. To reach this goal, the first objective of 2D-4-CO2 is to provide guidelines for the development of novel cutting-edge 2D catalysts for CO2 conversion into chemical fuels. This will be made possible by a multidisciplinary approach based on 2D materials engineering, advanced methods of characterization and novel designs of gas diffusion electrodes for the reduction of CO2 in the gas phase. The second objective is to develop practical photocatalytic systems using van der Waals (vdW) heterostructures for the efficient conversion of CO2 into chemical fuels. vdW heterostructures will consist of rational designs of 2D materials and 2D-like materials deposited by atomic layer deposition in order to achieve highly efficient light conversion and prolonged stability. This project will not only enable a deeper understanding of the CO2RR but will also provide practical strategies for large-scale application of CO2RR for solar fuel production.
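The fuel-forming chemistry at stake can be illustrated by two standard two-electron CO2 reduction half-reactions (textbook electrochemistry, not specific to this proposal):

```latex
\mathrm{CO_2 + 2H^+ + 2e^- \rightarrow CO + H_2O}
\qquad
\mathrm{CO_2 + 2H^+ + 2e^- \rightarrow HCOOH}
```

Which product dominates depends on the catalyst's selectivity, which is precisely the property the project seeks to engineer in 2D nanosheets.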
Max ERC Funding
1 499 931 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym 2DNANOCAPS
Project Next Generation of 2D-Nanomaterials: Enabling Supercapacitor Development
Researcher (PI) Valeria Nicolosi
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Starting Grant (StG), PE8, ERC-2011-StG_20101014
Summary Climate change and the decreasing availability of fossil fuels require society to move towards sustainable and renewable resources. 2DNanoCaps will focus on electrochemical energy storage, specifically supercapacitors. In terms of performance, supercapacitors fill the gap between batteries and classical capacitors. Whereas batteries possess high energy density but low power density, supercapacitors possess high power density but low energy density. Efforts are currently dedicated to moving supercapacitors towards both high energy density and high power density. Improvements have been achieved in the last few years thanks to new electrode nanomaterials and the design of new hybrid faradaic/capacitive systems. We recognize, however, that we are reaching a new limit beyond which we will only see small incremental improvements. The main reasons for this are the intrinsic difficulty of handling and processing materials at the nanoscale and the lack of communication across different scientific disciplines. I plan to use a multidisciplinary approach, in which novel nanomaterials, existing knowledge of nanoscale processing and established expertise in device fabrication and testing will be brought together to create more efficient supercapacitor technologies. 2DNanoCaps will exploit liquid-phase-exfoliated two-dimensional nanomaterials such as transition metal oxides, layered metal chalcogenides and graphene as electrode materials. Electrodes will be ultra-thin (capacitance and electrode thickness are inversely proportional), conductive, and have high dielectric constants. Intercalation of ions between the assembled 2D flakes will also be achievable, providing pseudo-capacitance. The research proposed here will initially be based on fundamental laboratory studies, recognising that these hold the key to achieving a step change in supercapacitors, but it also includes scaling-up and hybridisation as final objectives.
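The parenthetical inverse relation between capacitance and electrode thickness follows the classical parallel-plate picture (a simplification: charge storage in real supercapacitors occurs in the electric double layer, but the same geometric scaling motivates ultra-thin electrodes):

```latex
C = \frac{\varepsilon_0 \, \varepsilon_r \, A}{d}
```

where $A$ is the electrode area, $d$ the separation, and $\varepsilon_r$ the relative permittivity of the dielectric; halving $d$ doubles $C$.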
Max ERC Funding
1 501 296 €
Duration
Start date: 2011-10-01, End date: 2016-09-30
Project acronym 2G-CSAFE
Project Combustion of Sustainable Alternative Fuels for Engines used in aeronautics and automotives
Researcher (PI) Philippe Dagaut
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE8, ERC-2011-ADG_20110209
Summary This project aims to promote sustainable combustion technologies for transport via validation of advanced combustion kinetic models obtained using sophisticated new laboratory experiments, engines, and theoretical computations, breaking through the current frontier of knowledge. It will focus on the unexplored kinetics of ignition and combustion of 2nd-generation (2G) biofuels and their blends with conventional fuels, which should provide energy security and sustainability to Europe. The motivation is that no accurate kinetic models are available for the ignition, oxidation and combustion of 2G biofuels, and improved ignition control is needed for new compression ignition engines. Crucial information is missing: data from well-characterised experiments on combustion-generated pollutants and data on key intermediates for fuel ignition in new engines.
To provide that knowledge, new well-instrumented complementary experiments and kinetic modelling will be used. Measurements of key intermediates, stable species, and pollutants will be performed. New ignition control strategies will be designed, opening new technological horizons. Kinetic modelling will be used to rationalise the results. Due to the complexity of 2G biofuels and their unusual composition, innovative surrogates will be designed. Kinetic models for surrogate fuels will be generalised for extension to other compounds. The experimental results, together with ab initio and detailed modelling, will serve to characterise the kinetics of ignition, combustion, and pollutant formation of fuels including 2G biofuels, and will provide relevant data and models.
This research is risky because it is (i) the first effort to measure radicals by reactor/CRDS coupling, (ii) the first effort to use a μ-channel reactor to build ignition databases for conventional fuels and biofuels, and (iii) the first effort to design and use controlled generation and injection of reactive species to control ignition/combustion in compression ignition engines.
Max ERC Funding
2 498 450 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym 321
Project from Cubic To Linear complexity in computational electromagnetics
Researcher (PI) Francesco Paolo ANDRIULLI
Host Institution (HI) POLITECNICO DI TORINO
Call Details Consolidator Grant (CoG), PE7, ERC-2016-COG
Summary Computational Electromagnetics (CEM) is the scientific field at the origin of the new modeling and simulation tools required by the constantly arising design challenges of emerging and future technologies in applied electromagnetics. As in many other technological fields, however, the trend in emerging electromagnetic engineering technologies is towards miniaturized, higher-density and multi-scale scenarios. Computationally speaking, this translates into a steep increase in the number of degrees of freedom. Given that the design cost (the cost of a multi-right-hand-side problem dominated by matrix inversion) can scale as badly as cubically with these degrees of freedom, this fact, as pointed out by many, will significantly compromise the practical impact of CEM on future and emerging technologies.
For this reason, the CEM scientific community has for years been looking for an FFT-like paradigm shift: a dynamic fast direct solver providing a design cost that would scale only linearly with the degrees of freedom. Such a fast solver is considered today a Holy Grail of the discipline.
The Grand Challenge of 321 will be to tackle this Holy Grail in Computational Electromagnetics by investigating a dynamic Fast Direct Solver for Maxwell Problems that would run with linear instead of cubic complexity for an arbitrary number and configuration of degrees of freedom.
The failure of all previous attempts will be overcome by a game-changing transformation of the classical CEM problem that will leverage a recent breakthrough of the PI. Starting from this, the project will investigate an entirely new algorithmic paradigm to achieve this grand challenge.
The impact of the FFT’s quadratic-to-quasilinear paradigm shift shows how computational complexity reductions can be groundbreaking for applications. The cubic-to-linear paradigm shift, which the 321 project will aim for, promises a similarly disruptive impact on electromagnetic science and technology.
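The scale of the targeted speedup can be sketched with asymptotic operation counts: a dense direct solve grows as O(N^3) in the number of degrees of freedom N, while the solver 321 aims for would grow as O(N). The counts below are illustrative proxies (constants and lower-order terms omitted), not measured timings of any actual solver.

```python
def dense_direct_cost(n: int) -> int:
    """Asymptotic operation count of a dense direct solve, ~N^3."""
    return n ** 3

def linear_solver_cost(n: int) -> int:
    """Asymptotic operation count of a linear-complexity solver, ~N."""
    return n

# The cubic/linear ratio grows as N^2: each 10x increase in degrees
# of freedom widens the gap by another factor of 100.
for n in (1_000, 100_000, 10_000_000):
    ratio = dense_direct_cost(n) // linear_solver_cost(n)
    print(f"N = {n:>12,}  cubic/linear ratio = {ratio:,}")
```

At N = 10^7 degrees of freedom, a realistic size for multi-scale problems, the asymptotic gap is already fourteen orders of magnitude, which is why the community regards such a solver as a Holy Grail.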
Max ERC Funding
2 000 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym 3D Reloaded
Project 3D Reloaded: Novel Algorithms for 3D Shape Inference and Analysis
Researcher (PI) Daniel Cremers
Host Institution (HI) TECHNISCHE UNIVERSITAET MUENCHEN
Call Details Consolidator Grant (CoG), PE6, ERC-2014-CoG
Summary Despite their amazing success, we believe that computer vision algorithms have only scratched the surface of what can be done in terms of modeling and understanding our world from images. We believe that novel image analysis techniques will be a major enabler and driving force behind next-generation technologies, enhancing everyday life and opening up radically new possibilities. And we believe that the key to achieving this is to develop algorithms for reconstructing and analyzing the 3D structure of our world.
In this project, we will focus on three lines of research:
A) We will develop algorithms for 3D reconstruction from standard color cameras and from RGB-D cameras. In particular, we will promote real-time-capable direct and dense methods. In contrast to the classical two-stage approach of sparse feature-point based motion estimation and subsequent dense reconstruction, these methods optimally exploit all color information to jointly estimate dense geometry and camera motion.
B) We will develop algorithms for 3D shape analysis, including rigid and non-rigid matching, decomposition and interpretation of 3D shapes. We will focus on algorithms which are optimal or near-optimal. One of the major computational challenges lies in generalizing existing 2D shape analysis techniques to shapes in 3D and 4D (temporal evolutions of 3D shape).
C) We will develop shape priors for 3D reconstruction. These can be learned from sample shapes or acquired during the reconstruction process. For example, when reconstructing a large office, algorithms may exploit the geometric self-similarity of the scene, storing a model of a chair and its multiple instances only once rather than multiple times.
Advancing the state of the art in geometric reconstruction and geometric analysis will have a profound impact well beyond computer vision. We strongly believe that we have the necessary competence to pursue this project. Preliminary results have been well received by the community.
Max ERC Funding
2 000 000 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym 3D-E
Project 3D Engineered Environments for Regenerative Medicine
Researcher (PI) Ruth Elizabeth Cameron
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE8, ERC-2012-ADG_20120216
Summary This proposal develops a unified, underpinning technology to create novel, complex and biomimetic 3D environments for the control of tissue growth. As director of the Cambridge Centre for Medical Materials, I have recently been approached by medical colleagues to help solve important problems in the separate therapeutic areas of breast cancer, cardiac disease and blood disorders. In each case, the solution lies in complex 3D engineered environments for cell culture. These colleagues make it clear that existing 3D scaffolds fail to provide the required complex orientational and spatial anisotropy, and are limited in their ability to impart appropriate biochemical and mechanical cues.
I have a strong track record in this area. A particular success has been the use of a freeze-drying technology to make collagen-based porous implants for the cartilage-bone interface in the knee, which has now been commercialised. The novelty of this proposal lies in broadening the established scientific base of this technology to enable biomacromolecular structures with:
(A) controlled and complex pore orientation to mimic many normal multi-oriented tissue structures
(B) compositional and positional control to match varying local biochemical environments,
(C) the attachment of novel peptides designed to control cell behaviour, and
(D) mechanical control at both a local and macroscopic level to provide mechanical cues for cells.
These will be complemented by the development of
(E) robust characterisation methodologies for the structures created.
These advances will then be employed in each of the medical areas above.
This approach is highly interdisciplinary. Existing working relationships with experts in each medical field will guarantee expertise and licensed facilities in the required biological disciplines. Funds for this proposal would therefore establish a rich hub of mutually beneficial research and opportunities for cross-disciplinary sharing of expertise.
Max ERC Funding
2 486 267 €
Duration
Start date: 2013-04-01, End date: 2018-03-31
Project acronym 3D-FABRIC
Project 3D Flow Analysis in Bijels Reconfigured for Interfacial Catalysis
Researcher (PI) Martin F. HAASE
Host Institution (HI) UNIVERSITEIT UTRECHT
Call Details Starting Grant (StG), PE8, ERC-2018-STG
Summary The objective of this proposal is to determine the unknown criteria for convective cross-flow in bicontinuous interfacially jammed emulsion gels (bijels). Based on this, we will answer the question: Can continuously operated interfacial catalysis be realized in bijel cross-flow reactors? Demonstrating this potential will introduce a broadly applicable chemical technology, replacing wasteful chemical processes that require organic solvents. We will achieve our objective in three steps:
(a) Control over bijel structure and properties. Bijels will be formed with a selection of functional inorganic colloidal particles. Nanoparticle surface modifications will be developed and extensively characterized. General principles for the parameters determining bijel structures and properties will be established based on confocal and electron microscopy characterization. These principles will enable unprecedented control over bijel formation and will allow for designing desired properties.
(b) Convective flow in bijels. The mechanical strength of bijels will be tailored and measured. With mechanically robust bijels, the influence of the size and organization of oil/water channels on convective mass transfer will be investigated. To this end, a bijel mass transfer apparatus fabricated by 3D printing of bijel fibers and soft photolithography will be introduced. In conjunction with the following objective, the analysis of convective flows in bijels will facilitate a thorough description of their structure/function relationships.
(c) Biphasic chemical reactions in STrIPS bijel cross-flow reactors. First, continuous extraction in bijels will be realized. Next, conditions for carrying out continuously operated phase-transfer catalysis of well-known model reactions in bijels will be determined. Both processes will be characterized in situ and in three dimensions by confocal microscopy of fluorescent phase-transfer reactions in transparent bijels.
Max ERC Funding
1 905 000 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym 3D-nanoMorph
Project Label-free 3D morphological nanoscopy for studying sub-cellular dynamics in live cancer cells with high spatio-temporal resolution
Researcher (PI) Krishna AGARWAL
Host Institution (HI) UNIVERSITETET I TROMSOE - NORGES ARKTISKE UNIVERSITET
Call Details Starting Grant (StG), PE7, ERC-2018-STG
Summary Label-free optical nanoscopy, free from photobleaching and photochemical toxicity of fluorescence labels and yielding 3D morphological resolution of <50 nm, is the future of live cell imaging. 3D-nanoMorph breaks the diffraction barrier and shifts the paradigm in label-free nanoscopy, providing isotropic 3D resolution of <50 nm. To achieve this, 3D-nanoMorph performs non-linear inverse scattering for the first time in nanoscopy and decodes scattering between sub-cellular structures (organelles).
3D-nanoMorph innovatively devises complementary roles for the light measurement system and the computational nanoscopy algorithm. A novel illumination system and a novel light collection system together enable measurement of only the most relevant intensity component and create a fresh perspective on label-free measurements. A new computational nanoscopy approach employs non-linear inverse scattering. Harnessing non-linear inverse scattering for resolution enhancement opens new possibilities in label-free 3D nanoscopy.
I will apply 3D-nanoMorph to study organelle degradation (autophagy) in live cancer cells over extended durations with high spatial and temporal resolution, presently limited by the lack of high-resolution label-free 3D morphological nanoscopy. Successful 3D mapping of the nanoscale biological process of autophagy will open new avenues for cancer treatment and showcase 3D-nanoMorph for wider applications.
My cross-disciplinary expertise of 14 years, spanning inverse problems, electromagnetism, optical microscopy, integrated optics and live-cell nanoscopy, paves the way for the successful implementation of 3D-nanoMorph.
Max ERC Funding
1 499 999 €
Duration
Start date: 2019-07-01, End date: 2024-06-30
Project acronym 3D2DPrint
Project 3D Printing of Novel 2D Nanomaterials: Adding Advanced 2D Functionalities to Revolutionary Tailored 3D Manufacturing
Researcher (PI) Valeria Nicolosi
Host Institution (HI) THE PROVOST, FELLOWS, FOUNDATION SCHOLARS & THE OTHER MEMBERS OF BOARD OF THE COLLEGE OF THE HOLY & UNDIVIDED TRINITY OF QUEEN ELIZABETH NEAR DUBLIN
Call Details Consolidator Grant (CoG), PE8, ERC-2015-CoG
Summary My vision is to establish, within the framework of an ERC CoG, a multidisciplinary group which will work in concert towards pioneering the integration of novel 2-Dimensional nanomaterials with novel additive fabrication techniques to develop a unique class of energy storage devices.
Batteries and supercapacitors are two very complementary types of energy storage devices. Batteries store much higher energy densities; supercapacitors, on the other hand, hold one tenth of the electricity per unit of volume or weight as compared to batteries but can achieve much higher power densities. Technology is currently striving to improve the power density of batteries and the energy density of supercapacitors. To do so it is imperative to develop new materials, chemistries and manufacturing strategies.
3D2DPrint aims to develop micro-energy devices (both supercapacitors and batteries), technologies particularly relevant in the context of the emergent industry of micro-electro-mechanical systems and constantly downsized electronics. We plan to use novel two-dimensional (2D) nanomaterials obtained by liquid-phase exfoliation. This method offers a new, economical and easy way to prepare inks of a variety of 2D systems, allowing a wide device-performance window to be produced through elegant and simple constituent control at the point of fabrication. 3D2DPrint will use our expertise and know-how to allow development of advanced AM methods to integrate dissimilar nanomaterial blends and/or “hybrids” into fully embedded 3D-printed energy storage devices, with the ultimate objective of realising a range of products that contain the above-described nanomaterial subcomponent devices, electrical connections and traditional micro-fabricated subcomponents (if needed), ideally using a single tool.
Max ERC Funding
2 499 942 €
Duration
Start date: 2016-10-01, End date: 2021-09-30
Project acronym 3DAddChip
Project Additive manufacturing of 2D nanomaterials for on-chip technologies
Researcher (PI) Cecilia Mattevi
Host Institution (HI) IMPERIAL COLLEGE OF SCIENCE TECHNOLOGY AND MEDICINE
Call Details Consolidator Grant (CoG), PE8, ERC-2018-COG
Summary The realization of “the internet of things” is inevitably constrained by the level of miniaturization that can be achieved in electronic devices. A variety of technologies are now going through a process of miniaturization, from micro-electromechanical systems (MEMS) to biomedical sensors and actuators. The ultimate goal is to combine several components in an individual multifunctional platform, realizing on-chip technology. Devices have to be constrained to small footprints and exhibit high performance. Thus, the miniaturization process requires the introduction of new manufacturing processes to fabricate devices in 3D space over small areas. 3D printing via robocasting is emerging as a new manufacturing technique, which allows shaping virtually any material, from polymers to ceramics and metals, into complex architectures.
The goal of this research is to establish a 3D printing paradigm to produce miniaturized complex-shape devices with diversified functions for on-chip technologies adaptable to “smart environments” such as flexible substrates, smart textiles and biomedical sensors. The elementary building blocks of the devices will be two-dimensional nanomaterials, which present unique optical, electrical, chemical and mechanical properties. The synergistic combination of the intrinsic characteristics of the 2D nanomaterials and the specific 3D architecture will enable advanced performance of the 3D printed objects. This research programme will demonstrate 3D miniaturized energy storage and energy conversion units fabricated with inks produced using a pilot plant. These units are essential components of any on-chip platform as they ensure energy autonomy via self-powering. Ultimately, this research will initiate new technologies based on miniaturized 3D devices.
Max ERC Funding
1 999 968 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym 4D-EEG
Project 4D-EEG: A new tool to investigate the spatial and temporal activity patterns in the brain
Researcher (PI) Franciscus C.T. Van Der Helm
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Advanced Grant (AdG), PE7, ERC-2011-ADG_20110209
Summary Our first goal is to develop a new tool to determine brain activity with high temporal (< 1 msec) and spatial (about 2 mm) resolution, with a focus on motor control. High-density EEG (up to 256 electrodes) will be used for EEG source localization. Advanced force-controlled robot manipulators will be used to impose continuous force perturbations to the joints. Advanced closed-loop system identification algorithms will identify the dynamic EEG response of multiple brain areas to the perturbation, leading to a functional interpretation of EEG. The propagation of the signal in time and 3D space through the cortex can be monitored: 4D-EEG. Preliminary experiments with EEG localization have shown that continuous force perturbations result in a better signal-to-noise ratio and coherence than the current method using transient perturbations.
4D-EEG will be a direct measure of the neural activity in the brain with an excellent temporal response and easy to use in combination with motor control tasks. The new 4D-EEG method is expected to provide a breakthrough in comparison to functional MRI (fMRI) when elucidating the meaning of cortical map plasticity in motor learning.
Our second goal is to generate and validate new hypotheses about the longitudinal relationship between motor learning and cortical map plasticity by clinically using 4D-EEG in an intensive, repeated-measurement design in patients suffering from a stroke. The application of 4D-EEG combined with haptic robots will allow us to discover how dynamics in cortical map plasticity relate to upper-limb recovery after stroke, in terms of neural repair and of the behavioral compensation strategies used while performing meaningful motor tasks. The non-invasive 4D-EEG technique combined with haptic robots will open a window onto what and how patients (re)learn when showing motor recovery after stroke, allowing us to develop more effective patient-tailored therapies in neuro-rehabilitation.
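The closed-loop identification idea described here — estimating the frequency response from a continuous perturbation to a measured channel via cross-spectral densities — can be sketched on synthetic data. The 10 Hz low-pass "cortical" filter, the noise level and all other parameters below are illustrative assumptions, not project values:

```python
import numpy as np
from scipy import signal

fs = 256.0                                 # assumed EEG sampling rate, Hz
rng = np.random.default_rng(1)
u = rng.standard_normal(16384)             # continuous (white) force perturbation
b, a = signal.butter(2, 10.0, fs=fs)       # hypothetical low-pass "cortical" response
y = signal.lfilter(b, a, u) + 0.1 * rng.standard_normal(len(u))  # noisy channel

# Welch-averaged spectra: H(f) = Puy(f) / Puu(f) estimates the transfer function
f, Puu = signal.welch(u, fs=fs, nperseg=1024)
_, Puy = signal.csd(u, y, fs=fs, nperseg=1024)
H = Puy / Puu
_, Cuy = signal.coherence(u, y, fs=fs, nperseg=1024)

# With a persistent continuous perturbation, coherence is high in the pass band
band = (f > 1) & (f < 8)
print(Cuy[band].mean(), np.abs(H[band]).mean())
```

The high in-band coherence illustrates why continuous perturbations can give a better signal-to-noise ratio than transient ones: every segment of the record contributes to the spectral averages.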
Max ERC Funding
3 477 202 €
Duration
Start date: 2012-06-01, End date: 2017-05-31
Project acronym 5D-NanoTrack
Project Five-Dimensional Localization Microscopy for Sub-Cellular Dynamics
Researcher (PI) Yoav SHECHTMAN
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE7, ERC-2018-STG
Summary The sub-cellular processes that control the most critical aspects of life occur in three-dimensions (3D), and are intrinsically dynamic. While super-resolution microscopy has revolutionized cellular imaging in recent years, our current capability to observe the dynamics of life on the nanoscale is still extremely limited, due to inherent trade-offs between spatial, temporal and spectral resolution using existing approaches.
We propose to develop and demonstrate an optical microscopy methodology that would enable live sub-cellular observation in unprecedented detail. Making use of multicolor 3D point-spread-function (PSF) engineering, a technique I have recently developed, we will be able to simultaneously track multiple markers inside live cells, at high speed and in five-dimensions (3D, time, and color).
Multicolor 3D PSF engineering holds the potential of being a uniquely powerful method for 5D tracking. However, it is not yet applicable to live-cell imaging, due to significant bottlenecks in optical engineering and signal processing, which we plan to overcome in this project. Importantly, we will also demonstrate the efficacy of our method using a challenging biological application: real-time visualization of chromatin dynamics - the spatiotemporal organization of DNA. This is a highly suitable problem due to its fundamental importance, its role in a variety of cellular processes, and the lack of appropriate tools for studying it.
The project is divided into 3 aims:
1. Technology development: diffractive-element design for multicolor 3D PSFs.
2. System design: volumetric tracking of dense emitters.
3. Live-cell measurements: chromatin dynamics.
Looking ahead, here we create the imaging tools that pave the way towards the holy grail of chromatin visualization: dynamic observation of the 3D positions of the ~3 billion DNA base-pairs in a live human cell. Beyond that, our results will be applicable to numerous 3D micro/nanoscale tracking applications.
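As a toy illustration of how an engineered PSF can encode a third coordinate, the sketch below localizes a synthetic elliptical-Gaussian spot: the intensity centroid gives the lateral position, and the width ratio serves as a monotonic axial cue, as in astigmatic 3D localization. The model and every parameter are illustrative, not the project's actual multicolor PSFs:

```python
import numpy as np

def spot(x0, y0, sx, sy, n=33, photons=2000.0):
    # Synthetic astigmatic PSF: elliptical Gaussian whose ellipticity encodes depth
    yy, xx = np.mgrid[0:n, 0:n].astype(float)
    g = np.exp(-((xx - x0) ** 2) / (2 * sx**2) - ((yy - y0) ** 2) / (2 * sy**2))
    return photons * g / g.sum()

def localize(img):
    # Centroid gives lateral (x, y); second moments give widths -> axial cue
    yy, xx = np.mgrid[0:img.shape[0], 0:img.shape[1]].astype(float)
    w = img / img.sum()
    x, y = (w * xx).sum(), (w * yy).sum()
    sx = np.sqrt((w * (xx - x) ** 2).sum())
    sy = np.sqrt((w * (yy - y) ** 2).sum())
    return x, y, sx / sy        # width ratio is a monotonic proxy for z

img = spot(16.3, 15.7, sx=2.0, sy=3.0)
x, y, r = localize(img)
print(x, y, r)
```

In practice one would fit the engineered PSF model by maximum likelihood rather than moments, but the principle — shaping the PSF so that its image carries the extra coordinate — is the same.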
Max ERC Funding
1 802 500 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym A-DATADRIVE-B
Project Advanced Data-Driven Black-box modelling
Researcher (PI) Johan Adelia K Suykens
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE7, ERC-2011-ADG_20110209
Summary Making accurate predictions is a crucial factor in many systems (such as in modelling energy consumption, power load forecasting, traffic networks, process industry, environmental modelling, biomedicine, brain-machine interfaces) for cost savings, efficiency, health, safety and organizational purposes. In this proposal we aim at realizing a new generation of more advanced black-box modelling techniques for estimating predictive models from measured data. We will study different optimization modelling frameworks in order to obtain improved black-box modelling approaches. This will be done by specifying models through constrained optimization problems by studying different candidate core models (parametric models, support vector machines and kernel methods) together with additional sets of constraints and regularization mechanisms. Different candidate mathematical frameworks will be considered with models that possess primal and (Lagrange) dual model representations, functional analysis in reproducing kernel Hilbert spaces, operator splitting and optimization in Banach spaces. Several aspects that are relevant to black-box models will be studied including incorporation of prior knowledge, structured dynamical systems, tensorial data representations, interpretability and sparsity, and general purpose optimization algorithms. The methods should be suitable for handling larger data sets and high dimensional input spaces. The final goal is also to realize a next generation software tool (including symbolic generation of models and handling different supervised and unsupervised learning tasks, static and dynamic systems) that can be generically applied to data from different application areas. The proposal A-DATADRIVE-B aims at getting end-users connected to the more advanced methods through a user-friendly data-driven black-box modelling tool. The methods and tool will be tested in connection to several real-life applications.
Max ERC Funding
2 485 800 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym AAATSI
Project Advanced Antenna Architecture for THZ Sensing Instruments
Researcher (PI) Andrea Neto
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Starting Grant (StG), PE7, ERC-2011-StG_20101014
Summary The Tera-Hertz portion of the spectrum presents unique potentials for advanced applications. Currently the THz spectrum is revealing the mechanisms at the origin of our universe and provides the means to monitor the health of our planet via satellite based sensing of critical gases. Potentially time domain sensing of the THz spectrum will be the ideal tool for a vast variety of medical and security applications.
Presently, systems in the THz regime are extremely expensive, and consequently the THz spectrum is still the domain of only niche (expensive) scientific applications. The main problems are the lack of power and sensitivity. The wide unused THz spectral bandwidth is, in itself, the only widely available resource that can compensate for these problems in the future. But, so far, when scientists try to really use the bandwidth, they run into an insurmountable physical limit: antenna dispersion. Antenna dispersion modifies the signal’s spectrum in a wavelength-dependent manner in all types of radiation, but is particularly deleterious to THz signals because the spectrum is too wide and, with foreseeable technology, cannot be digitized.
The goal of this proposal is to introduce breakthrough antenna technology that will eliminate the dispersion bottleneck and revolutionize time-domain sensing and spectroscopic space science. Achieving these goals, the project will pole-vault THz imaging technology into the 21st century and develop critically important enabling technologies which will satisfy the electrical engineering needs of the next 30 years and, in the long run, will enable multi-terabit wireless communications.
In order to achieve these goals, I will first build upon two major breakthrough radiation mechanisms that I pioneered: Leaky Lenses and Connected Arrays. Eventually, ultra-wideband imaging arrays constituted by thousands of components will be designed on the basis of the new theoretical findings and demonstrated.
Max ERC Funding
1 499 487 €
Duration
Start date: 2011-11-01, End date: 2017-10-31
Project acronym AArteMIS
Project Aneurysmal Arterial Mechanics: Into the Structure
Researcher (PI) Pierre Joseph Badel
Host Institution (HI) ASSOCIATION POUR LA RECHERCHE ET LE DEVELOPPEMENT DES METHODES ET PROCESSUS INDUSTRIELS
Call Details Starting Grant (StG), PE8, ERC-2014-STG
Summary The rupture of an Aortic Aneurysm (AA), which is often lethal, is a mechanical phenomenon that occurs when the wall stress state exceeds the local strength of the tissue. Our current understanding of arterial rupture mechanisms is poor, and the physics taking place at the microscopic scale in these collagenous structures remains an open area of research. Understanding, modelling, and quantifying the micro-mechanisms which drive the mechanical response of such tissue and locally trigger rupture represents the most challenging and promising pathway towards predictive diagnosis and personalized care of AA.
The PI's group was recently able to detect, in advance, at the macro-scale, rupture-prone areas in bulging arterial tissues. The next step is to get into the details of the arterial microstructure to elucidate the underlying mechanisms.
Through the achievements of AArteMIS, the local mechanical state of the fibrous microstructure of the tissue, especially close to its rupture state, will be quantitatively analyzed from multi-photon confocal microscopy and numerically reconstructed to establish quantitative micro-scale rupture criteria. AArteMIS will also address developing micro-macro models which are based on the collected quantitative data.
The entire project will be completed through collaboration with medical doctors and engineers, experts in all required fields for the success of AArteMIS.
AArteMIS is expected to open longed-for pathways for research in soft tissue mechanobiology which focuses on cell environment and to enable essential clinical applications for the quantitative assessment of AA rupture risk. It will significantly contribute to understanding fatal vascular events and improving cardiovascular treatments. It will provide a tremendous source of data and inspiration for subsequent applications and research by answering the most fundamental questions on AA rupture behaviour enabling ground-breaking clinical changes to take place.
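The macroscopic rupture criterion underlying the project — wall stress exceeding local tissue strength — can be made concrete with a first-order thin-wall estimate. The Laplace-law formula and all numbers below are textbook illustrations, not the project's micro-scale criteria:

```python
# Illustrative thin-wall (Laplace) estimate of aneurysm wall stress:
#   sigma = P * r / (2 * t)
# Rupture is predicted wherever sigma exceeds the local tissue strength.
P = 16e3           # peak systolic pressure, Pa (~120 mmHg); illustrative
r = 0.027          # aneurysm radius, m (2.7 cm diameter ~5.4 cm); illustrative
t = 1.5e-3         # wall thickness, m; illustrative
sigma = P * r / (2 * t)

strength = 0.5e6   # assumed local wall strength, Pa (order of magnitude only)
print(sigma, sigma > strength)
```

AArteMIS aims precisely at replacing such uniform, macroscopic estimates with rupture criteria resolved at the scale of the fibrous microstructure, where stress and strength vary locally.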
Max ERC Funding
1 499 783 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym ABEP
Project Asset Bubbles and Economic Policy
Researcher (PI) Jaume Ventura Fontanet
Host Institution (HI) Centre de Recerca en Economia Internacional (CREI)
Call Details Advanced Grant (AdG), SH1, ERC-2009-AdG
Summary Advanced capitalist economies experience large and persistent movements in asset prices that are difficult to justify with economic fundamentals. The internet bubble of the 1990s and the real estate market bubble of the 2000s are two recent examples. The predominant view is that these bubbles are a market failure, and are caused by some form of individual irrationality on the part of market participants. This project is based instead on the view that market participants are individually rational, although this does not preclude sometimes collectively sub-optimal outcomes. Bubbles are thus not a source of market failure by themselves but instead arise as a result of a pre-existing market failure, namely, the existence of pockets of dynamically inefficient investments. Under some conditions, bubbles partly solve this problem, increasing market efficiency and welfare. It is also possible, however, that bubbles do not solve the underlying problem and, in addition, create negative side-effects. The main objective of this project is to develop this view of asset bubbles, and produce an empirically relevant macroeconomic framework that allows us to address the following questions: (i) What is the relationship between bubbles and financial market frictions? Special emphasis is given to how the globalization of financial markets and the development of new financial products affect the size and effects of bubbles. (ii) What is the relationship between bubbles, economic growth and unemployment? The theory suggests the presence of virtuous and vicious cycles, as economic growth creates the conditions for bubbles to pop up, while bubbles create incentives for economic growth to happen. (iii) What is the optimal policy to manage bubbles? We need to develop the tools that allow policy makers to sustain those bubbles that have positive effects and burst those that have negative effects.
Max ERC Funding
1 000 000 €
Duration
Start date: 2010-04-01, End date: 2015-03-31
Project acronym ABRSEIST
Project Antibiotic Resistance: Socio-Economic Determinants and the Role of Information and Salience in Treatment Choice
Researcher (PI) Hannes ULLRICH
Host Institution (HI) DEUTSCHES INSTITUT FUR WIRTSCHAFTSFORSCHUNG DIW (INSTITUT FUR KONJUNKTURFORSCHUNG) EV
Call Details Starting Grant (StG), SH1, ERC-2018-STG
Summary Antibiotics have contributed to a tremendous increase in human well-being, saving many millions of lives. However, antibiotics become obsolete the more they are used, as selection pressure promotes the development of resistant bacteria. The World Health Organization has proclaimed antibiotic resistance a major global threat to public health. Today, 700,000 deaths per year are due to untreatable infections. To win the battle against antibiotic resistance, new policies affecting the supply of and demand for existing and new drugs must be designed. I propose new research to identify and evaluate feasible and effective demand-side policy interventions targeting the relevant decision makers: physicians and patients. ABRSEIST will make use of a broad econometric toolset to identify mechanisms linking antibiotic resistance and consumption, exploiting a unique combination of physician-patient-level antibiotic resistance, treatment, and socio-economic data. Using machine learning methods adapted for causal inference, theory-driven structural econometric analysis, and randomization in the field, it will provide rigorous evidence on effective intervention designs. This research will improve our understanding of how prescribing, resistance, and the effect of antibiotic use on resistance are distributed in the general population, which has important implications for the design of targeted interventions. It will then estimate a structural model of general practitioners’ acquisition and use of information under uncertainty about resistance in prescription choice, allowing counterfactual analysis of information-improving policies such as mandatory diagnostic testing. The large-scale and structural econometric analyses allow flexible identification of physician heterogeneity, which ABRSEIST will exploit to design and evaluate targeted, randomized information nudges in the field. The result will be improved rational use and a toolset applicable in contexts of antibiotic prescribing.
Max ERC Funding
1 498 920 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym ACAP
Project Agency Costs and Asset Pricing
Researcher (PI) Thomas Mariotti
Host Institution (HI) FONDATION JEAN-JACQUES LAFFONT,TOULOUSE SCIENCES ECONOMIQUES
Call Details Starting Grant (StG), SH1, ERC-2007-StG
Summary The main objective of this research project is to contribute to bridging the gap between the two main branches of financial theory, namely corporate finance and asset pricing. It is motivated by the conviction that these two aspects of financial activity should and can be analyzed within a unified framework. This research will borrow from both approaches in order to construct theoretical models that allow one to analyze the design and issuance of financial securities, as well as the dynamics of their valuations. Unlike asset pricing, which takes the prices of fundamentals as given, the goal is to derive security price processes from a precise description of the firm’s operations and internal frictions. Regarding the latter, and in line with traditional corporate finance theory, the analysis will emphasize the role of agency costs within the firm for the design of its securities. But the analysis will be pushed one step further by studying the impact of these agency costs on key financial variables such as stock and bond prices, leverage, book-to-market ratios, default risk, or firms’ holdings of liquidity. One of the contributions of this research project is to show how these variables are interrelated when firms and investors agree upon optimal financial arrangements. The final objective is to derive a rich set of testable asset pricing implications that can eventually be brought to the data.
Max ERC Funding
1 000 000 €
Duration
Start date: 2008-11-01, End date: 2014-10-31
Project acronym ACCORD
Project Algorithms for Complex Collective Decisions on Structured Domains
Researcher (PI) Edith Elkind
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Starting Grant (StG), PE6, ERC-2014-STG
Summary Algorithms for Complex Collective Decisions on Structured Domains.
The aim of this proposal is to substantially advance the field of Computational Social Choice, by developing new tools and methodologies that can be used for making complex group decisions in rich and structured environments. We consider settings where each member of a decision-making body has preferences over a finite set of alternatives, and the goal is to synthesise a collective preference over these alternatives, which may take the form of a partial order over the set of alternatives with a predefined structure: examples include selecting a fixed-size set of alternatives, a ranking of the alternatives, a winner and up to two runners-up, etc. We will formulate desiderata that apply to such preference aggregation procedures, design specific procedures that satisfy as many of these desiderata as possible, and develop efficient algorithms for computing them. As the latter step may be infeasible on general preference domains, we will focus on identifying the least restrictive domains that enable efficient computation, and use real-life preference data to verify whether the associated restrictions are likely to be satisfied in realistic preference aggregation scenarios. Also, we will determine whether our preference aggregation procedures are computationally resistant to malicious behavior. To lower the cognitive burden on the decision-makers, we will extend our procedures to accept partial rankings as inputs. Finally, to further contribute towards bridging the gap between theory and practice of collective decision making, we will provide open-source software implementations of our procedures, and reach out to potential users to obtain feedback on their practical applicability.
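To make the kind of aggregation task described above concrete, here is a minimal illustrative sketch (not taken from the proposal itself): selecting a fixed-size set of alternatives from ranked ballots using Borda scores, one simple instance of a preference aggregation procedure with a predefined output structure. The function name and data layout are assumptions chosen for the example.

```python
from collections import defaultdict

def borda_committee(ballots, k):
    """Pick the k alternatives with the highest total Borda score.

    ballots: list of rankings, each a list of alternatives from best to worst.
    Ties are broken alphabetically for determinism.
    """
    scores = defaultdict(int)
    for ranking in ballots:
        m = len(ranking)
        for pos, alt in enumerate(ranking):
            scores[alt] += m - 1 - pos  # top position earns m-1 points, bottom earns 0
    return sorted(scores, key=lambda a: (-scores[a], a))[:k]

ballots = [
    ["a", "b", "c", "d"],
    ["b", "a", "d", "c"],
    ["a", "c", "b", "d"],
]
print(borda_committee(ballots, 2))  # -> ['a', 'b']
```

Computing this rule is easy; the computational questions the project targets arise for rules (e.g. Kemeny-style rankings) whose exact computation is intractable on unrestricted preference domains.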
Max ERC Funding
1 395 933 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym ACDC
Project Algorithms and Complexity of Highly Decentralized Computations
Researcher (PI) Fabian Daniel Kuhn
Host Institution (HI) ALBERT-LUDWIGS-UNIVERSITAET FREIBURG
Call Details Starting Grant (StG), PE6, ERC-2013-StG
Summary "Many of today's and tomorrow's computer systems are built on top of large-scale networks such as the Internet, the World Wide Web, wireless ad hoc and sensor networks, or peer-to-peer networks. Driven by technological advances, new kinds of networks and applications have become possible and we can safely assume that this trend is going to continue. Modern systems are often envisioned to consist of a potentially large number of individual components that are organized in a completely decentralized way. There is no central authority that controls the topology of the network, how nodes join or leave the system, or in which way nodes communicate with each other. Also, many future distributed applications will be built using wireless devices that communicate via radio.
The general objective of the proposed project is to improve our understanding of the algorithmic and theoretical foundations of decentralized distributed systems. From an algorithmic point of view, decentralized networks and computations pose a number of fascinating and unique challenges that are not present in sequential or more standard distributed systems. As communication is limited and mostly between nearby nodes, each node of a large network can only maintain a very restricted view of the global state of the system. This is particularly true if the network changes dynamically, either because nodes join or leave the system or because the topology changes over time, e.g., due to the mobility of the devices in a wireless network. Nevertheless, the nodes of a network need to coordinate in order to achieve some global goal.
In particular, we plan to study algorithms and lower bounds for basic computation and information dissemination tasks in such systems. In addition, we are particularly interested in the complexity of distributed computations in dynamic and wireless networks."
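As a minimal illustration of the information-dissemination tasks mentioned above (a sketch under assumed names, not code from the project): synchronous flooding, where in each round every informed node forwards the message to all its neighbours. The number of rounds needed equals the source's eccentricity in the network graph, which is one reason locality and dynamics dominate the complexity of such tasks.

```python
def flood(adjacency, source):
    """Simulate synchronous flooding; return the number of rounds
    until every node in the graph has received the message.

    adjacency: dict mapping each node to a list of its neighbours.
    """
    informed = {source}
    rounds = 0
    while len(informed) < len(adjacency):
        # All informed nodes forward simultaneously; collect newly reached nodes.
        frontier = {v for u in informed for v in adjacency[u]} - informed
        if not frontier:  # disconnected graph: the message cannot reach everyone
            raise ValueError("graph is not connected")
        informed |= frontier
        rounds += 1
    return rounds

# A path a-b-c-d: broadcasting from one end takes 3 rounds.
path = {"a": ["b"], "b": ["a", "c"], "c": ["b", "d"], "d": ["c"]}
print(flood(path, "a"))  # -> 3
```

This toy simulation assumes a static topology and reliable links; the project's settings, where the topology changes between rounds or links are wireless, are exactly where such simple bounds break down.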
Max ERC Funding
1 148 000 €
Duration
Start date: 2013-11-01, End date: 2018-10-31