Project acronym B Massive
Project Binary massive black hole astrophysics
Researcher (PI) Alberto SESANA
Host Institution (HI) UNIVERSITA' DEGLI STUDI DI MILANO-BICOCCA
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary Massive black hole binaries (MBHBs) are the most extreme, fascinating yet elusive astrophysical objects in the Universe. Establishing their existence observationally will be a milestone for contemporary astronomy, providing a fundamental missing piece in the puzzle of galaxy formation, piercing through the (hydro)dynamical processes shaping dense galactic nuclei from parsec scales down to the event horizon, and probing gravity in extreme conditions.
We can both see and listen to MBHBs. Remarkably, besides arguably being among the brightest variable objects shining in the Cosmos, MBHBs are also the loudest gravitational wave (GW) sources in the Universe. We shall therefore take advantage of both types of messengers – photons and gravitons – that they are sending to us, which can now be probed by all-sky time-domain surveys and radio pulsar timing arrays (PTAs), respectively.
B MASSIVE leverages a unique, comprehensive approach combining theoretical astrophysics, radio and gravitational-wave astronomy and time-domain surveys with state-of-the-art data analysis techniques to: i) observationally prove the existence of MBHBs, ii) understand and constrain their astrophysics and dynamics, and iii) enable and bring closer in time the direct detection of GWs with PTAs.
As European PTA (EPTA) executive committee member and former International PTA (IPTA) chair, I am a driving force in the development of pulsar timing science worldwide, and the project will build on the profound knowledge, broad vision and wide collaboration network that established me as a world leader in the field of MBHB and GW astrophysics. B MASSIVE is extremely timely: a pulsar timing data set of unprecedented quality is being assembled by the EPTA/IPTA, and time-domain astronomy surveys are at their dawn. In the long term, B MASSIVE will be a fundamental milestone establishing European leadership in the cutting-edge field of MBHB astrophysics in the era of LSST, SKA and LISA.
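As an illustrative aside (standard background, not part of the original abstract): for a population of circular, GW-driven MBHBs, the nanohertz background that PTAs target is conventionally described by a power-law characteristic strain, and the induced timing residuals scale as the strain divided by the GW frequency,
\[
h_c(f) = A_{\mathrm{yr}} \left( \frac{f}{\mathrm{yr}^{-1}} \right)^{-2/3},
\qquad
\delta t \sim \frac{h_c(f)}{2\pi f},
\]
with MBHB population models typically predicting an amplitude A_yr of order 10^{-15} at f = 1 yr^{-1}, which sets the scale of the timing precision PTAs must reach.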
Max ERC Funding
1 532 750 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym BIC
Project Cavitation across scales: following Bubbles from Inception to Collapse
Researcher (PI) Carlo Massimo Casciola
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Advanced Grant (AdG), PE8, ERC-2013-ADG
Summary Cavitation is the formation of vapor cavities inside a liquid due to low pressure. Cavitation is a ubiquitous and destructive phenomenon common to most engineering applications that deal with flowing water. At the same time, the extreme conditions realized in cavitation are increasingly exploited in medicine, chemistry, and biology. What makes cavitation unpredictable is its multiscale nature: nucleation of vapor bubbles depends heavily on micro- and nanoscale details; mesoscale phenomena, such as bubble collapse, determine the relevant macroscopic effects, e.g., cavitation damage. In addition, macroscopic flow conditions, such as turbulence, have a major impact on it.
The objective of the BIC project is to develop the missing multiscale description of cavitation by proposing new integrated numerical methods capable of quantitative predictions. The detailed and physically sound understanding of the multifaceted phenomena involved in cavitation (nucleation, bubble growth, transport, and collapse in turbulent flows) fostered by the BIC project will result in new methods for designing fluid machinery, as well as for therapies in ultrasound medicine and for chemical reactors. The BIC project builds upon the exceptionally broad experience of the PI and of his research group in numerical simulations of flows at different scales, including advanced atomistic simulations of nanoscale wetting phenomena, mesoscale models for multiphase flows, and particle-laden turbulent flows. The envisaged numerical methodologies (free-energy atomistic simulations, phase-field models, and Direct Numerical Simulation of bubble-laden flows) will be supported by targeted experimental activities, designed to validate models and characterize realistic conditions.
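A hedged illustration of the macroscopic end of the problem described above (a sketch under textbook assumptions, not the BIC project's methodology): the classical Rayleigh-Plesset equation tracks the radius of a single spherical bubble driven by the far-field pressure. All constants and the driving pulse below are illustrative values only.

# Minimal Rayleigh-Plesset sketch: one spherical bubble in water driven by a
# hypothetical rarefaction pulse; constants are illustrative, not project data.
import numpy as np
from scipy.integrate import solve_ivp

rho, mu, sigma = 998.0, 1.0e-3, 0.072     # density [kg/m^3], viscosity [Pa s], surface tension [N/m]
p_inf, p_v = 101325.0, 2300.0             # ambient and vapour pressure [Pa]
R0, kappa = 10e-6, 1.4                    # initial radius [m], polytropic exponent of the gas
p_g0 = p_inf + 2.0 * sigma / R0 - p_v     # gas pressure balancing the bubble at R0

def p_far(t):
    # hypothetical driving pressure: a Gaussian rarefaction dip centred at 20 us
    return p_inf - 0.8 * p_inf * np.exp(-((t - 20e-6) / 5e-6) ** 2)

def rhs(t, y):
    R, Rdot = y
    p_bubble = p_v + p_g0 * (R0 / R) ** (3.0 * kappa)
    Rddot = ((p_bubble - p_far(t) - 2.0 * sigma / R - 4.0 * mu * Rdot / R) / rho
             - 1.5 * Rdot ** 2) / R
    return [Rdot, Rddot]

sol = solve_ivp(rhs, (0.0, 60e-6), [R0, 0.0], method="LSODA", max_step=1e-8)
print(f"R_max = {sol.y[0].max() * 1e6:.1f} um, R_min = {sol.y[0].min() * 1e6:.3f} um")

The growth-and-collapse cycle this reproduces is the mesoscale event whose nucleation (microscale) and turbulent transport (macroscale) BIC aims to couple consistently.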
Max ERC Funding
2 491 200 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym BrightEyes
Project Multi-Parameter Live-Cell Observation of Biomolecular Processes with Single-Photon Detector Array
Researcher (PI) Giuseppe Vicidomini
Host Institution (HI) FONDAZIONE ISTITUTO ITALIANO DI TECNOLOGIA
Call Details Consolidator Grant (CoG), PE7, ERC-2018-COG
Summary Fluorescence single-molecule (SM) detection techniques have the potential to provide insights into the complex functions, structures and interactions of individual, specifically labelled biomolecules. However, current SM techniques work properly only when the biomolecule is observed in controlled environments, e.g., immobilized on a glass surface. Observation of biomolecular processes in living (multi)cellular environments – which is fundamental for sound biological conclusions – always comes at a price, such as invasiveness, limitations in the accessible information and constraints on the spatial and temporal scales.
The overall objective of the BrightEyes project is to break the above limitations by creating a novel SM approach, compatible with state-of-the-art biomolecule-labelling protocols, able to track a biomolecule deep inside (multi)cellular environments – with temporal resolution on the microsecond scale and a tracking range of hundreds of micrometres – and simultaneously observe its structural changes and its nano- and micro-environments.
Specifically, by exploiting a novel single-photon detector array, the BrightEyes project will implement an optical system able to continuously (i) track the biomolecule of interest in real time, from which to decode its dynamics and interactions; (ii) measure the fluorescence spectroscopy properties of the nano-environment, such as lifetime, photon-pair correlation and intensity, from which to extract the biochemical properties of the nano-environment, the structural properties of the biomolecule – via SM-FRET and anti-bunching – and the interactions of the biomolecule with other biomolecular species – via STED-FCS; and (iii) visualize the sub-cellular structures within the micro-environment with sub-diffraction spatial resolution – via STED and image scanning microscopy.
This unique paradigm will enable unprecedented studies of biomolecular behaviours, interactions and self-organization under near-physiological conditions.
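A toy illustration of the photon-pair-correlation observable mentioned above (an illustrative sketch on simulated, uncorrelated timestamps, not the BrightEyes analysis pipeline): the second-order correlation g2(tau) can be histogrammed from arrival-time differences between two detector channels, and a dip with g2(0) < 0.5 is the standard signature of a single emitter.

# Toy g2(tau) estimate from photon arrival times on two channels of a detector
# array (Hanbury Brown-Twiss style). Timestamps are simulated Poisson arrivals,
# so the result is flat at ~1; a real single emitter would dip at zero delay.
import numpy as np

rng = np.random.default_rng(0)
rate, T = 5e4, 1.0                       # counts per second per channel, acquisition time [s]
t_a = np.sort(rng.uniform(0.0, T, rng.poisson(rate * T)))
t_b = np.sort(rng.uniform(0.0, T, rng.poisson(rate * T)))

def g2(t_a, t_b, tau_max=2e-6, n_bins=80):
    """Histogram of delays t_b - t_a, normalised so that uncorrelated light gives g2 = 1."""
    edges = np.linspace(-tau_max, tau_max, n_bins + 1)
    counts = np.zeros(n_bins)
    lo = np.searchsorted(t_b, t_a - tau_max)
    hi = np.searchsorted(t_b, t_a + tau_max)
    for ta, i, j in zip(t_a, lo, hi):    # B-channel photons within +/- tau_max of each A photon
        counts += np.histogram(t_b[i:j] - ta, bins=edges)[0]
    accidental = len(t_a) * len(t_b) / T * (edges[1] - edges[0])
    return 0.5 * (edges[:-1] + edges[1:]), counts / accidental

tau, g = g2(t_a, t_b)
print(f"g2 around zero delay: {g[len(g) // 2]:.2f}")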
Max ERC Funding
1 861 250 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym CHRONOS
Project A geochemical clock to measure timescales of volcanic eruptions
Researcher (PI) Diego Perugini
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PERUGIA
Call Details Consolidator Grant (CoG), PE10, ERC-2013-CoG
Summary "The eruption of volcanoes appears one of the most unpredictable phenomena on Earth. Yet the situation is rapidly changing. Quantification of the eruptive record constrains what is possible in a given volcanic system. Timing is the hardest part to quantify.
The main process triggering an eruption is the refilling of a sub-volcanic magma chamber by a new magma coming from depth. This process results in magma mixing and provokes a time-dependent diffusion of chemical elements. Understanding the time elapsed from mixing to eruption is fundamental to discerning pre-eruptive behaviour of volcanoes to mitigate the huge impact of volcanic eruptions on society and the environment.
The CHRONOS project proposes a new method that will cut the Gordian knot of the presently intractable problem of volcanic eruption timing using a surgical approach integrating textural, geochemical and experimental data on magma mixing. I will use the compositional heterogeneity frozen in time in the rocks the same way a broken clock at a crime scene is used to determine the time of the incident. CHRONOS will aim to:
1) be the first study to reproduce magma mixing, by performing unique experiments constrained by natural data and using natural melts, under controlled rheological and fluid-dynamics conditions;
2) obtain unprecedented high-quality data on the time dependence of chemical exchanges during magma mixing;
3) derive empirical relationships linking the extent of chemical exchanges and the mixing timescales;
4) determine timescales of volcanic eruptions combining natural and experimental data.
CHRONOS will open a new window on the physico-chemical processes occurring in the days preceding volcanic eruptions, providing unprecedented information to build the first inventory of eruption timescales for planet Earth. If these timescales can be linked with geophysical signals occurring prior to eruptions, this inventory will have immense value, enabling precise prediction of volcanic eruptions."
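A hedged illustration of the geochemical-clock principle (a textbook scaling argument, not the calibration CHRONOS will derive): mixing juxtaposes two melts of different composition, and Fick's second law then smooths the compositional step into an error-function profile whose width grows with time, so the measured width plus the element's diffusivity dates the mixing event,
\[
C(x,t) = \frac{C_1 + C_2}{2} + \frac{C_2 - C_1}{2}\,\operatorname{erf}\!\left(\frac{x}{2\sqrt{D t}}\right),
\qquad
t \approx \frac{w^2}{4D},
\]
where w is the width of the diffusion zone. With purely illustrative values, w of about 1 mm and D of about 10^{-12} m^2 s^{-1} give t of about 2.5 x 10^5 s, i.e. roughly three days, which is the kind of mixing-to-eruption timescale the summary refers to.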
Max ERC Funding
1 993 813 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym COMANCHE
Project Coherent manipulation and control of heat in solid-state nanostructures: the era of coherent caloritronics
Researcher (PI) Francesco Giazotto
Host Institution (HI) CONSIGLIO NAZIONALE DELLE RICERCHE
Call Details Consolidator Grant (CoG), PE3, ERC-2013-CoG
Summary "Electronic nanodevices have demonstrated to be versatile and effective tools for the investigation of exotic quantum phenomena under controlled and adjustable conditions. Yet, these have enabled to give access to the manipulation of charge flow with unprecedented precision. On the other hand, the wisdom dealing with control, measurements, storage, and conversion of heat in nanoscale devices, the so-called “caloritronics” (from the Latin word “calor”, i.e., heat), despite a number of recent advances is still at its infancy. Although coherence often plays a crucial role in determining the functionalities of nanoelectronic devices very little is known of its role in caloritronics. In such a context, coherent control of heat seems at present still very far from reach, and devising methods to phase-coherently manipulate the thermal current would represent a crucial breakthrough which could open the door to unprecedented possibilities in several fields of science.
Here we propose an original approach to set the experimental ground for the investigation and implementation of a new branch of science, the “coherent caloritronics”, which will take advantage of quantum circuits to phase-coherently manipulate and control the heat current in solid-state nanostructures. To tackle this challenging task our approach will follow three main separate approaches, i.e., the coherent control of heat transported by electrons in Josephson nanocircuits, the coherent manipulation of heat carried by electrons and exchanged between electrons and lattice phonons in superconducting proximity systems,
and finally, the control of the heat exchanged between electrons and photons by coherently tuning the coupling with the electromagnetic environment. We will integrate superconductors with normal-metal or semiconductor electrodes thus exploring new device concepts such as heat transistors, heat diodes, heat splitters, where thermal flux control is achieved thanks to the use of the quantum phase."
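For orientation (a standard theoretical result quoted as background, stated here as an assumption of the relevant physics rather than a claim of the project): in a temperature-biased Josephson junction the electronic heat current contains a phase-dependent interference term,
\[
\dot{Q}(\varphi, T_1, T_2) = \dot{Q}_{\mathrm{qp}}(T_1, T_2) - \dot{Q}_{\mathrm{int}}(T_1, T_2)\cos\varphi ,
\]
where the first term is the usual quasiparticle contribution and the \cos\varphi term encodes the interference between Cooper pairs and quasiparticles; modulating the phase, for instance with a magnetic flux threading a SQUID loop, therefore modulates the heat flow, which is the sense in which heat can be controlled coherently.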
Max ERC Funding
1 754 897 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym COMPASS
Project Colloids with complex interactions: from model atoms to colloidal recognition and bio-inspired self assembly
Researcher (PI) Peter Schurtenberger
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE3, ERC-2013-ADG
Summary Self-assembly is the key construction principle that nature uses so successfully to fabricate its molecular machinery and highly elaborate structures. In this project we will follow nature’s strategies and make a concerted experimental and theoretical effort to study, understand and control self-assembly for a new generation of colloidal building blocks. The starting point will be recent advances in colloid synthesis strategies that have led to a spectacular array of colloids of different shapes, compositions, patterns and functionalities. These allow us to investigate the influence of anisotropy in shape and interactions on aggregation and self-assembly in colloidal suspensions and mixtures. Using responsive particles we will implement colloidal lock-and-key mechanisms and then assemble a library of “colloidal molecules” with well-defined and externally tunable binding sites using microfluidics-based and externally controlled fabrication and sorting principles. We will use them to explore the equilibrium phase behavior of particle systems interacting through a finite number of binding sites. In parallel, we will exploit them to investigate colloid self-assembly into well-defined nanostructures. Here we aim at achieving much more refined control than currently possible by implementing a protein-inspired approach to controlled self-assembly. We combine molecule-like colloidal building blocks that possess directional interactions and externally triggerable specific recognition sites with directed self-assembly, where external fields not only facilitate assembly but also allow fabricating novel structures. We will use the tunable combination of different contributions to the interaction potential between the colloidal building blocks and the ability to create chirality in the assembly to establish the requirements for the controlled formation of tubular shells, and thus create a colloid-based minimal model of synthetic virus capsid proteins.
Max ERC Funding
2 498 040 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym COMPAT
Project Complex Patterns for Strongly Interacting Dynamical Systems
Researcher (PI) Susanna Terracini
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TORINO
Call Details Advanced Grant (AdG), PE1, ERC-2013-ADG
Summary This project focuses on nontrivial solutions of systems of differential equations characterized by strongly nonlinear interactions. We are interested in the effect of the nonlinearities on the emergence of nontrivial self-organized structures. Such patterns correspond to selected solutions of the differential system possessing special symmetries or shadowing particular shapes. We want to understand, from the mathematical point of view, what the main mechanisms involved in the aggregation process are, in terms of the global variational structure of the problem. Following this common thread, we deal both with the classical N-body problem of Celestial Mechanics, where interactions feature attractive singularities, and with competition-diffusion systems, where pattern formation is driven by strongly repulsive forces. More precisely, we are interested in periodic and bounded solutions and parabolic trajectories, with the final aim of building complex motions and possibly obtaining symbolic dynamics for the general N-body problem. On the other hand, we deal with elliptic, parabolic and hyperbolic systems of differential equations with strongly competing interaction terms, modeling both the dynamics of competing populations (Lotka-Volterra systems) and other interesting physical phenomena, among which the phase segregation of solitary waves of Gross-Pitaevskii systems arising in the study of multicomponent Bose-Einstein condensates. In particular, we will study existence, multiplicity and asymptotic expansions of solutions when the competition parameter tends to infinity. We shall also be concerned with optimal partition problems related to linear and nonlinear eigenvalues.
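A concrete example of the second class of systems mentioned above (a representative form from the literature, given as an assumption of the typical setting rather than the project's exact equations): for k densities u_1, ..., u_k, a strongly competing elliptic system reads
\[
-\Delta u_i = f_i(u_i) - \beta\, u_i \sum_{j \neq i} u_j^2 , \qquad i = 1, \dots, k,
\]
with, e.g., f_i(u_i) = \omega_i u_i^3 + \lambda_i u_i in the Gross-Pitaevskii case; in the singular limit of strong competition, \beta \to +\infty, the supports of the components segregate, and the limiting configurations are naturally related to the optimal partition problems for (non)linear eigenvalues mentioned at the end of the summary.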
Max ERC Funding
1 346 145 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym CUSTOMER
Project Customizable Embedded Real-Time Systems: Challenges and Key Techniques
Researcher (PI) Yi WANG
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Advanced Grant (AdG), PE6, ERC-2018-ADG
Summary Today, many industrial products are defined by software and therefore customizable: their functionalities implemented by software can be modified and extended by dynamic software updates on demand. This trend towards customizable products is rapidly expanding into all domains of IT, including Embedded Real-Time Systems (ERTS) deployed in Cyber-Physical Systems such as cars, medical devices etc. However, the current state-of-practice in safety-critical systems allows hardly any modifications once they are put in operation. The lack of techniques to preserve crucial safety conditions for customizable systems severely restricts the benefits of advances in software-defined systems engineering.
CUSTOMER aims to provide the missing paradigm and technology for building and updating ERTS after deployment – subject to stringent timing constraints, dynamic workloads, and limited resources on complex platforms. CUSTOMER explores research areas crossing two fields, Real-Time Computing and Formal Verification, to develop the key techniques enabling (1) dynamic updates of ERTS in the field, (2) incremental updates over the product's lifetime and (3) safe updates by verification, to avoid updates that may compromise system safety.
CUSTOMER will develop a unified model-based framework, supported with tools, for the design, modelling, verification, deployment and update of ERTS, aiming at advancing the research fields by establishing the missing scientific foundation for multiprocessor real-time computing and by providing the next generation of design tools with capability and scalability increased by orders of magnitude compared with state-of-the-art tools, e.g. UPPAAL.
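A hedged, self-contained illustration of the kind of timing guarantee at stake (classical uniprocessor fixed-priority response-time analysis, a textbook baseline and not CUSTOMER's multiprocessor framework): a software update is safe from the timing point of view only if every task's worst-case response time still fits within its deadline, which the fixed-point iteration below checks; the task sets are illustrative.

# Classical response-time analysis for fixed-priority preemptive scheduling on
# a single processor. Each task is (C, T, D): worst-case execution time,
# period, relative deadline, listed from highest to lowest priority.
from math import ceil

def response_time(i, tasks):
    """Smallest fixed point of R = C_i + sum over higher-priority j of ceil(R / T_j) * C_j."""
    C_i, _, D_i = tasks[i]
    R = C_i
    while True:
        R_next = C_i + sum(ceil(R / T_j) * C_j for (C_j, T_j, _) in tasks[:i])
        if R_next == R or R_next > D_i:   # converged, or the deadline is already missed
            return R_next
        R = R_next

def schedulable(tasks):
    return all(response_time(i, tasks) <= tasks[i][2] for i in range(len(tasks)))

base_system = [(1, 4, 4), (2, 6, 6), (3, 12, 12)]            # deployed task set (illustrative)
after_update = base_system + [(6, 20, 20)]                   # hypothetical added functionality
print(schedulable(base_system), schedulable(after_update))   # True False: reject this update

The point of CUSTOMER is precisely that such admission checks must become incremental, multiprocessor-aware and formally verified, which is far beyond this single-processor textbook test.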
Max ERC Funding
2 499 894 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym DEVOCEAN
Project Impact of diatom evolution on the oceans
Researcher (PI) Daniel CONLEY
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE10, ERC-2018-ADG
Summary Motivated by a series of recent discoveries, DEVOCEAN will provide the first comprehensive evaluation of the emergence of diatoms and their impact on the global biogeochemical cycles of silica, carbon and other nutrients that regulate ocean productivity and ultimately climate. I propose that the proliferation of phytoplankton that occurred after the Permian-Triassic extinction, in particular the diatoms, fundamentally influenced oceanic environments through the enhancement of carbon export to depth as part of the biological pump. Although molecular clocks suggest that diatoms evolved over 200 Ma ago, this result has been largely ignored because of the lack of diatoms in the geologic fossil record, and most studies have therefore focused on diversification during the Cenozoic, where abundant diatom fossils are found. Much of the older fossil evidence has likely been destroyed by dissolution during diagenesis, subducted, or concealed deep within the Earth under many layers of rock. DEVOCEAN will provide evidence on diatom evolution and speciation in the geological record by examining formations representing locations in which diatoms are likely to have accumulated in ocean sediments. We will generate robust estimates of the timing and magnitude of dissolved Si drawdown following the origin of diatoms using the isotopic silicon composition of fossil sponge spicules and radiolarians. The project will also provide fundamental new insights into the timing of dissolved Si drawdown and other key events, which reorganized the distribution of carbon and nutrients in seawater, changing energy flows and productivity in the biological communities of the ancient oceans.
Max ERC Funding
2 500 000 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym DIAPASoN
Project Differential Program Semantics
Researcher (PI) Ugo DAL LAGO
Host Institution (HI) ALMA MATER STUDIORUM - UNIVERSITA DI BOLOGNA
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary Traditionally, program semantics is centered around the notion of program identity, that is to say of program equivalence: a program is identified with its meaning, and programs are considered as equal only if their meanings are the same. This view has been extremely fruitful in the past, allowing for a deep understanding of highly interactive forms of computation as embodied by higher-order or concurrent programs. The byproducts of all this lie everywhere in computer science, from programming language design to verification methodologies. The emphasis on equality — as opposed to differences — is not however in line with the way programs are written and structured in modern complex software systems. Subtasks are delegated to pieces of code which behave as expected only up to a certain probability of error, and only if the environment in which they operate makes this possible deviation irrelevant. These aspects have been almost neglected by the program semantics community until recently, and still have a marginal role. DIAPASON's goal is to study differences between programs as a constitutive and informative concept, rather than by way of relations between them. This will be accomplished by generalizing four major frameworks of program semantics, traditionally used for giving semantics to programs, comparing them, proving properties of them, and controlling their usage of resources: logical relations, bisimulation, game semantics, and linear logic.
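A toy illustration of treating program differences quantitatively (an assumption-laden sketch in the probabilistic setting, not one of the four semantic frameworks the project will generalize): two randomized programs that are not equivalent may still be close, and that closeness can be measured, for instance as the total variation distance between their output distributions.

# Two tiny randomized "programs" returning 0 or 1: prog_a is a fair coin,
# prog_b a slightly biased variant. They are not equivalent, but the observable
# difference between them is small and can be quantified.
import random
from collections import Counter

def prog_a(rng):
    return 1 if rng.random() < 0.50 else 0

def prog_b(rng):
    return 1 if rng.random() < 0.48 else 0   # hypothetical approximate replacement

def output_distribution(prog, n=200_000, seed=1):
    rng = random.Random(seed)
    counts = Counter(prog(rng) for _ in range(n))
    return {value: count / n for value, count in counts.items()}

def total_variation(p, q):
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

d = total_variation(output_distribution(prog_a), output_distribution(prog_b))
print(f"estimated distance between the two programs: {d:.3f}")   # roughly 0.02, not 0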
Max ERC Funding
959 562 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym e-NeuroPharma
Project Electronic Neuropharmacology
Researcher (PI) Rolf Magnus BERGGREN
Host Institution (HI) LINKOPINGS UNIVERSITET
Call Details Advanced Grant (AdG), PE5, ERC-2018-ADG
Summary As the population ages, neurodegenerative diseases (ND) will have a devastating impact on individuals and society. Despite enormous research efforts there is still no cure for these diseases, only care! The origin of ND is hugely complex, spanning from the molecular level to systemic processes, and causes malfunctioning of signalling in the central nervous system (CNS). This signalling includes the coupled processing of biochemical and electrical signals; however, current approaches for symptomatic and disease-modifying treatments are all based on biochemical approaches alone.
Organic bioelectronics has arisen as a promising technology providing signal translation, as sensors and modulators, across the biology-technology interface; in particular, it has proven unique in neuronal applications. There is a great opportunity for organic bioelectronics, since it can complement biochemical pharmacology to enable a twinned electric-biochemical therapy for ND and neurological disorders. However, this technology is traditionally manufactured on stand-alone substrates. Even though organic bioelectronics has been manufactured on flexible and soft carriers in the past, current technologies consume space and volume that, when applied to the CNS, rule out close proximity and amalgamation between the bioelectronic technology and CNS components – features that are needed in order to reach high therapeutic efficacy.
e-NeuroPharma includes the development of innovative organic bioelectronics that can be manufactured in vivo within the brain. The overall aim is to evaluate and develop electrodes, delivery devices and sensors that enable a twinned biochemical-electric therapy approach to combat ND and other neurological disorders. e-NeuroPharma will focus on the development of materials that can cross the blood-brain barrier, that self-organize and self-polymerize along CNS components, and that record and regulate electrical, electrochemical and physical parameters relevant to ND and other disorders.
Max ERC Funding
3 237 335 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym FREENERGY
Project Lead-free halide perovskites for the highest efficient solar energy conversion
Researcher (PI) Antonio ABATE
Host Institution (HI) UNIVERSITA DEGLI STUDI DI NAPOLI FEDERICO II
Call Details Starting Grant (StG), PE5, ERC-2018-STG
Summary Achieving zero net carbon emissions by the end of the century is the challenge for capping global warming. The largest share of carbon emissions belongs to the production of electric energy from fossil fuels, which renewable energies are progressively replacing. Sunlight is an ideal renewable energy source, since it is the most abundant and is available worldwide. Photovoltaic solar cells can directly convert sunlight into electric energy by making use of the photovoltaic effect in semiconductors. Halide perovskites are emerging crystalline semiconducting materials with among the strongest light absorption and most effective electric charge generation needed to design the most efficient photovoltaic solar cells. The PI has the ambition to reinvent halide perovskites as an environmentally friendly photovoltaic material, aiming at:
(i) Removing lead: state-of-the-art perovskite solar cells are based on lead, which is on the European Union's list of hazardous substances. The PI will prepare new tin-based perovskites and prove them in highly efficient solar cells.
(ii) Solvent-free crystallisation: organic solvents drive the crystallisation of the perovskite in the most efficient solar cells. However, crystallising the perovskite without using solvents is more environmentally friendly. The PI will establish physical vapour deposition as a solvent-free method for preparing the perovskite and the other materials comprising the solar cell.
(iii) Durable power output: the long-term power output defines the solar energy yield and thus the return on investment. The PI aims to make stable tin-based perovskites addressing the oxidative instability of tin directly.
The quantified target of FREENERGY is to demonstrate a tin-based perovskite solar cell with a power conversion efficiency over 20% and stability over 25 years. The research strategy to enable this disruptive outcome comprises innovative perovskite formulations and unconventional supramolecular interactions.
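For context on the 20% target (the standard definition of power conversion efficiency with purely illustrative numbers, not measured values from this project):
\[
\eta = \frac{J_{\mathrm{sc}}\, V_{\mathrm{oc}}\, \mathrm{FF}}{P_{\mathrm{in}}},
\]
so a tin-based cell delivering, say, J_sc of 30 mA cm^{-2}, V_oc of 0.85 V and FF of 0.78 under standard AM1.5G illumination (P_in = 100 mW cm^{-2}) would reach \eta of about 0.030 x 0.85 x 0.78 / 0.1, i.e. roughly 20%.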
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym GASP
Project GAs Stripping Phenomena in galaxies
Researcher (PI) Bianca Maria POGGIANTI
Host Institution (HI) ISTITUTO NAZIONALE DI ASTROFISICA
Call Details Advanced Grant (AdG), PE9, ERC-2018-ADG
Summary The build-up of galaxies is mainly driven by the availability of gas that can cool and form new stars. Any physical process that is able to alter the gas content of a galaxy therefore has important consequences for its evolution. The study of processes that can remove gas from galaxies is the subject of GASP (GAs Stripping Phenomena in galaxies), an ESO Large Program I am leading. GASP has obtained integral field spectroscopy (IFS) with MUSE of 114 low-z galaxies with masses in the range 10^9-10^11.5 Msun, hosted in X-ray selected clusters, in groups and in filaments. The GASP sample includes the largest existing IFS sample of so-called “jellyfish galaxies”, which have long tails of ionised gas, as well as other galaxies in different stages of ram pressure stripping in clusters and galaxies undergoing gas disturbance due to various phenomena in groups and filaments. GASP has the unique capability to combine the power of spatially resolved observations covering galaxy disks, outskirts and surroundings with the virtues of a statistical study of a significant number of galaxies. The MUSE GASP dataset, combined with ALMA, APEX, JVLA, UVIT and HST follow-up programs, forms the basis for this ERC program. The goal is to accomplish an unprecedented breakthrough in our understanding of jellyfish galaxies, ram pressure stripping, gas removal processes in different environments and their consequences for the stellar history of galaxies. This multi-faceted, coherent program will investigate the physics of the baryonic cycle between the various gas phases (ionised, molecular and neutral) and star formation under extreme conditions, the connection between ram pressure and AGN activity, the quenching of galaxies undergoing gas removal phenomena, and the physics of such phenomena in clusters, groups and filaments. The GASP ERC program will be a game changer in this field of research: there is no previous similar study, nor can there be a comparable one for quite a long time.
Max ERC Funding
2 498 238 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym GEMS
Project General Embedding Models for Spectroscopy
Researcher (PI) Chiara CAPPELLI
Host Institution (HI) SCUOLA NORMALE SUPERIORE
Call Details Consolidator Grant (CoG), PE4, ERC-2018-COG
Summary Recently, there has been a paradigm shift in experimental molecular spectroscopy, with new methods focusing on the study of molecules embedded within complex supramolecular/nanostructured aggregates. In the past, molecular spectroscopy has benefitted from the synergistic development of accurate and cost-effective computational protocols for the simulation of a wide variety of spectroscopies. These methods, however, have been limited to isolated molecules or systems in solution, and are therefore inadequate to describe the spectroscopy of complex nanostructured systems. The aim of GEMS is to bridge this gap and to provide a coherent theoretical description and cost-effective computational tools for the simulation of spectra of molecules interacting with metal nanoparticles, metal nanoaggregates and graphene sheets.
To this end, I will develop a novel frequency-dependent multilayer Quantum Mechanical (QM)/Molecular Mechanics (MM) embedding approach, general enough to be extendable to spectroscopic signals by using the machinery of quantum chemistry and able to treat any kind of plasmonic external environment by resorting to the same theoretical framework, while introducing its specificities through an accurate modelling and parametrization of the classical portion. The model will be interfaced with widely used computational chemistry software packages, so as to maximize its use by the scientific community, and especially by non-specialists.
As pilot applications, GEMS will study the Surface-Enhanced Raman (SERS) spectra of systems that have found applications in the biosensor field, SERS of organic molecules in subnanometre junctions, enhanced infrared (IR) spectra of oligopeptides adsorbed on graphene, Graphene-Enhanced Raman Scattering (GERS) of organic dyes, and the transmission of stereochemical response from a chiral analyte to an achiral molecule in the vicinity of a plasmon resonance of an achiral metallic nanostructure, as measured by Raman Optical Activity (ROA).
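As background on the embedding strategy outlined above (the generic QM/MM partition, stated as a standard starting point and not as the specific frequency-dependent model GEMS will develop): the total energy and the effective Hamiltonian of the quantum region are split as
\[
E = E_{\mathrm{QM}} + E_{\mathrm{MM}} + E_{\mathrm{QM/MM}},
\qquad
\hat{H}_{\mathrm{eff}} = \hat{H}_{\mathrm{QM}} + \hat{V}_{\mathrm{emb}} ,
\]
where \hat{V}_{\mathrm{emb}} collects the interaction of the QM density with the classical environment; the novelty proposed in the summary is to make that classical portion frequency dependent, so that the plasmonic response of nanoparticles or graphene enters the embedding operator and therefore the simulated spectroscopic signals.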
Max ERC Funding
1 609 500 €
Duration
Start date: 2019-06-01, End date: 2024-05-31
Project acronym GRAMS
Project GRavity from Astrophysical to Microscopic Scales
Researcher (PI) Enrico BARAUSSE
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary General Relativity (GR) describes gravity on a huge range of scales, field strengths and velocities. However, despite its successes, GR has been showing its age. Cosmological data support the existence of a Dark Sector, but may also be interpreted as a breakdown of our understanding of gravity. Also, GR is intrinsically incompatible with quantum field theory, and should be replaced, at high energies, by a (still unknown) quantum theory of gravity.
This deadlock may be the prelude to a paradigm change in our understanding of gravity, possibly triggered by the direct observations of neutron stars and black holes by gravitational-wave interferometers. The recent LIGO/Virgo observations, and in particular the coincident detection of electromagnetic and gravitational signals from neutron-star binaries, have already made a huge impact on our theoretical understanding of gravity, by severely constraining several extensions of GR.
GRAMS is a high-risk/high-gain project seeking to push the implications of these observations even further, by exploring whether the existing LIGO/Virgo data, and in particular the absence of non-perturbative deviations from GR, are consistent with gravitational theories built to reproduce the large-scale behaviour of the Universe (i.e. the existence of Dark Energy and/or Dark Matter) while at the same time passing local tests of gravity thanks to non-perturbative screening mechanisms. I will prove that the very act of screening local scales makes gravitational emission in these theories much more involved than in GR, and also intrinsically unlikely to yield results in agreement with existing (and future) gravitational-wave observations. This would be a huge step forward for our understanding of cosmology, as it would rule out a modified-gravity origin for the Dark Sector. Even if this conjecture is incorrect, GRAMS will provide the first numerical-relativity simulations of compact binaries ever performed in gravitational theories of interest for cosmology.
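For orientation only, a prototypical (textbook-level) example of a scalar-tensor theory in which a derivative self-interaction both supports the cosmological background and activates Vainshtein-type screening near compact objects is the cubic Galileon; this is an illustrative sketch and not necessarily among the specific theories the project will simulate:

% Illustrative cubic-Galileon action (assumed example, not the project's theory):
S = \int d^4x\,\sqrt{-g}\left[\frac{M_{\rm Pl}^2}{2}\,R
    - \frac{1}{2}(\partial\phi)^2
    - \frac{1}{\Lambda^3}(\partial\phi)^2\,\Box\phi\right]
    + S_{\rm m}\!\left[A^2(\phi)\,g_{\mu\nu},\,\psi_{\rm m}\right]

Inside the Vainshtein radius the nonlinear term dominates and suppresses the scalar force, which is exactly the non-perturbative regime that makes gravitational emission hard to compute and that motivates dedicated numerical-relativity simulations.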
Max ERC Funding
1 993 920 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym HIGEOM
Project Highly accurate Isogeometric Method
Researcher (PI) Giancarlo Sangalli
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PAVIA
Call Details Consolidator Grant (CoG), PE1, ERC-2013-CoG
Summary "Partial Differential Equations (PDEs) are widely used in science and engineering simulations, often in tight connection with Computer Aided Design (CAD). The Finite Element Method (FEM) is one of the most popular technique for the discretization of PDEs. The IsoGeometric Method (IGM), proposed in 2005 by T.J.R. Hughes et al., aims at improving the interoperability between CAD and FEMs. This is achieved by adopting the CAD mathematical primitives, i.e. Splines and Non-Uniform Rational B-Splines (NURBS), both for geometry and unknown fields representation. The IGM has gained an incredible momentum especially in the engineering community. The use of high-degree, highly smooth NURBS is extremely successful and the IGM outperforms the FEM in most academic benchmarks.
However, we are far from having a satisfactory mathematical understanding of the IGM and, even more importantly, from exploiting its full potential. Until now, the IGM theory and practice have been deeply influenced by finite element analysis. For example, the IGM is implemented resorting to a FEM code design, which is very inefficient for high-degree and high-smoothness NURBS. This has made possible a fast spreading of the IGM, but also limited it to quadratic or cubic NURBS in complex simulations.
The use of higher degree IGM for real-world applications asks for new tools allowing for the efficient construction and solution of the linear system, time integration, flexible local mesh refinement, and so on. These questions need to be approached beyond the FEM framework. This is possible only on solid mathematical grounds, on a new theory of splines and NURBS able to comply with the needs of the IGM.
This project will provide the crucial knowledge and will re-design the IGM to make it a superior, highly accurate and stable methodology, having a significant impact in the field of numerical simulation of PDEs, particularly when accuracy is essential both in geometry and fields representation."
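To make the core ingredient of the summary above concrete, the short sketch below evaluates B-spline basis functions via the Cox-de Boor recursion, the building block of the NURBS representation of both geometry and unknown fields in the IGM. It is a generic illustration under stated assumptions, not the project's software.

# Minimal sketch (generic, not the project's code): Cox-de Boor recursion for
# B-spline basis functions of degree p on a given knot vector.
def bspline_basis(i, p, knots, x):
    """Value of the i-th B-spline basis function of degree p at x."""
    if p == 0:
        return 1.0 if knots[i] <= x < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] > knots[i]:
        left = (x - knots[i]) / (knots[i + p] - knots[i]) * bspline_basis(i, p - 1, knots, x)
    right = 0.0
    if knots[i + p + 1] > knots[i + 1]:
        right = (knots[i + p + 1] - x) / (knots[i + p + 1] - knots[i + 1]) * bspline_basis(i + 1, p - 1, knots, x)
    return left + right

# Cubic (p = 3), C^2-continuous basis on an open knot vector: the high-degree,
# high-smoothness setting where FEM-style implementations become inefficient.
knots = [0, 0, 0, 0, 0.25, 0.5, 0.75, 1, 1, 1, 1]
n_basis = len(knots) - 3 - 1
values = [bspline_basis(i, 3, knots, 0.4) for i in range(n_basis)]
print(values, sum(values))   # partition of unity: the values sum to 1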
Max ERC Funding
928 188 €
Duration
Start date: 2014-06-01, End date: 2019-05-31
Project acronym HY-NANO
Project HYbrid NANOstructured multi-functional interfaces for stable, efficient and eco-friendly photovoltaic devices
Researcher (PI) Giulia GRANCINI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PAVIA
Call Details Starting Grant (StG), PE4, ERC-2018-STG
Summary HY-NANO focuses on one of the current major challenges in Europe: a global transition to a low-carbon society and green economy by 2050. Solar energy can drive a “paradigm shift” in the energy sector with a new low-cost, efficient, and stable technology (3-pillars strategy). Nowadays, low-cost three-dimensional (3D) hybrid perovskite (HP) solar cells are revolutionizing the photovoltaic scene, with stunning power conversion efficiencies beyond 22%. However, poor device stability (due to degradation in contact with water) and dependence on toxic components (lead) substantially hamper their commercialization.
HY-NANO aims to realize a new low-cost and efficient hybrid solar technology combining long-term stability with a reduced environmental impact. Designing and engineering innovative multi-dimensional hybrid interfaces is the core idea. This will be achieved by: 1. design and characterization of new stable and eco-friendly perovskite structures, with tunable composition and dimensionality ranging from 3D to 2D; 2. exploiting new synergistic functions by combining 3D and 2D perovskites into novel stable and efficient multi-dimensional interfaces while addressing the interface physics therein; 3. integrating the hybrid interfaces into highly efficient and stable device architectures engineered “ad hoc”. In addition, I propose the development of new solar cell encapsulants using metal-organic frameworks (MOFs) functionalized as selective lead receptors to minimize the environmental risks associated with the potential release of lead.
My multidisciplinary expertise in advanced material design, cutting-edge photophysical experimental investigations, and solar cell engineering will enable me to successfully target the ambitious goals. HY-NANO is timely and it will generate the new fundamental knowledge that is urgently needed for a scientific and technological breakthrough in materials and devices for near future photovoltaics.
Max ERC Funding
1 499 084 €
Duration
Start date: 2019-07-01, End date: 2024-06-30
Project acronym HYDROCARB
Project Towards a new understanding of carbon processing in freshwaters: methane emission hot spots and carbon burial
Researcher (PI) Sebastian Sobek
Host Institution (HI) UPPSALA UNIVERSITET
Call Details Starting Grant (StG), PE10, ERC-2013-StG
Summary In spite of their small areal extent, inland waters play a vital role in the carbon cycle of the continents, as they emit significant amounts of the greenhouse gases (GHG) carbon dioxide (CO2) and methane (CH4) to the atmosphere, and simultaneously bury more organic carbon (OC) in their sediments than the entire ocean. Particularly in tropical hydropower reservoirs, GHG emissions can be large, mainly owing to high CH4 emission. Moreover, the number of tropical hydropower reservoirs will continue to increase dramatically, due to an urgent need for economic growth and a vast unused hydropower potential in many tropical countries. However, the current understanding of the magnitude of GHG emission, and of the processes regulating it, is insufficient. Here I propose a research program on tropical reservoirs in Brazil that takes advantage of recent developments in both concepts and methodologies to provide unique evaluations of GHG emission and OC burial in tropical reservoirs. In particular, I will test the following hypotheses: 1) Current estimates of reservoir CH4 emission are at least one order of magnitude too low, since they have completely missed the recently discovered existence of gas bubble emission hot spots; 2) The burial of land-derived OC in reservoir sediments offsets a significant share of the GHG emissions; and 3) The sustained, long-term CH4 emission from reservoirs is to a large degree fuelled by primary production of new OC within the reservoir, and may therefore be reduced by management of nutrient supply. The new understanding and the cross-disciplinary methodological approach will constitute a major advance to aquatic science in general, and have strong impacts on the understanding of other aquatic systems at other latitudes as well. In addition, the results will be merged into an existing reservoir GHG risk assessment tool to improve planning, design, management and judgment of hydropower reservoirs.
Max ERC Funding
1 798 227 €
Duration
Start date: 2013-09-01, End date: 2019-08-31
Project acronym HyGate
Project Hydrophobic Gating in nanochannels: understanding single channel mechanisms for designing better nanoscale sensors
Researcher (PI) Alberto GIACOMELLO
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Starting Grant (StG), PE8, ERC-2018-STG
Summary Hydrophobic gating is the phenomenon by which the flux of ions or other molecules through biological ion channels or synthetic nanopores is hindered by the formation of nanoscale bubbles. Recent studies suggest that this is a generic mechanism for the inactivation of a plethora of ion channels, which are all characterized by a strongly hydrophobic interior. The conformation, compliance, and hydrophobicity of the nanochannels – in addition to external parameters such as electric potential, pressure, and the presence of gases – have a dramatic influence on the probability of opening and closing of the gate. This largely unexplored confined phase transition is known to cause low-frequency noise in solid-state nanopores used for DNA sequencing and sensing, limiting their applicability. In biological channels, hydrophobic gating might conspire in determining the high selectivity towards specific ions or molecules, a characteristic sought after in biosensors.
The objective of HyGate is to unravel the fundamental mechanisms of hydrophobic gating in model nanopores and biological ion channels and to exploit this understanding in order to design biosensors with lower noise and higher selectivity. To achieve this ambitious goal, I will deploy the one-of-a-kind simulation and theoretical tools I developed to study vapor nucleation in extreme confinement, which comprise rare-event molecular dynamics and confined nucleation theory. These quantitative tools will be instrumental in designing better biosensors and nanodevices which avoid the formation of nanobubbles or exploit them to achieve exquisite species selectivity. The novel physical insights into the behavior of water in complex nanoconfined environments are expected to inspire radically innovative strategies for nanopore sensing and nanofluidic circuits and to promote a stepwise advancement in the fundamental understanding of hydrophobic gating mechanisms and their influence on bio-electrical cell response.
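To give a flavour of the confined-nucleation-theory side, the sketch below evaluates a macroscopic estimate of the free-energy cost of a vapor domain spanning a cylindrical hydrophobic pore (volume work plus wall and liquid-vapor contributions). The flat-meniscus geometry and every parameter value are illustrative assumptions, not the project's model.

# Minimal sketch (assumed, not the project's model): macroscopic estimate of the
# free energy of a vapor slab of length l in a cylindrical hydrophobic pore.
import math

gamma = 0.072                 # liquid-vapor surface tension of water [N/m], ~300 K
theta = math.radians(110.0)   # contact angle of the hydrophobic wall (assumed)
radius = 1.0e-9               # pore radius [m] (assumed)
dp = 1.0e5                    # liquid-vapor pressure difference P_l - P_v [Pa] (assumed)
kT = 1.380649e-23 * 300.0

def grand_potential(l):
    """Cost of a vapor slab of length l: volume work + wall term + two flat menisci."""
    volume_term = dp * math.pi * radius**2 * l
    wall_term = gamma * math.cos(theta) * 2.0 * math.pi * radius * l   # negative: hydrophobic wall favors drying
    menisci_term = 2.0 * gamma * math.pi * radius**2                   # two liquid-vapor caps
    return volume_term + wall_term + menisci_term

print(f"cost of nucleating the two menisci ~ {grand_potential(1e-12) / kT:6.0f} kT")
print(f"free energy of a fully dried 5 nm pore ~ {grand_potential(5e-9) / kT:6.0f} kT")

In this toy regime the dry (gated) state is thermodynamically favoured but separated from the wet state by a barrier of many kT, which is why rare-event techniques are needed; the atomistic picture addressed by the project is of course far richer.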
Max ERC Funding
1 496 250 €
Duration
Start date: 2019-02-01, End date: 2024-01-31
Project acronym INSTABILITIES
Project Instabilities and nonlocal multiscale modelling of materials
Researcher (PI) Davide Bigoni
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TRENTO
Call Details Advanced Grant (AdG), PE8, ERC-2013-ADG
Summary "Failure in ductile materials results from a multiscale interaction of discrete microstructures hierarchically emerging through subsequent material instabilities and self-organizing into regular patterns (shear band clusters, for instance). The targets of the project are: (i.) to disclose the failure mechanisms of materials through analysis of material instabilities and (ii.) to develop innovative microstructures to be embedded in solids, in order to open new possibilities in the design of ultra-resistant materials and structures.
The link between the two targets is that the micromechanisms developing during failure inspire ways of enhancing the mechanical properties of materials by embedding microstructures. The aim is to provide design tools to obtain groundbreaking and unchallenged mechanical properties employing discrete microstructures, for instance to design a microstructure defining a material working under flutter conditions.
The design of these microstructures will permit the achievement of innovative dynamical properties, defining elastic metamaterials that, for instance, permit the fabrication of flat lenses for elastic waves and exhibit negative refraction and superlensing effects. The objective is the discovery of these effects in mechanics, thus disclosing new horizons in the dynamics of materials.
Microstructures introduce length scales and nonlocal effects in the mechanical modelling, which call for the use of higher-order theories.
The analysis of these effects, usually developed within a phenomenological approach, will be attacked from a fundamental and almost unexplored point of view: the explicit evaluation of nonlocality, related to the microstructure via homogenisation theory."
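To give a concrete, minimal picture of the flutter condition mentioned in the summary above: when a structure is loaded by a follower-type force its stiffness matrix loses symmetry, and beyond a critical load two vibration frequencies coalesce and become complex, so oscillations grow in time. The two-degree-of-freedom toy model below is an illustrative assumption, not one of the project's microstructures.

# Minimal sketch (toy model, assumed): flutter onset when a follower-type load p
# breaks the symmetry of the stiffness matrix and eigenvalues become complex.
import numpy as np

def stiffness(p):
    """Toy stiffness matrix: symmetric for p = 0; the load p breaks symmetry."""
    return np.array([[2.0, p - 1.0],
                     [-1.0, 1.0]])

for p in [0.0, 0.5, 1.0, 1.2, 1.3, 2.0]:
    lam = np.linalg.eigvals(stiffness(p))    # squared frequencies, unit mass matrix
    flutter = np.abs(np.imag(lam)).max() > 1e-9
    print(f"p = {p:3.1f}  eigenvalues = {np.round(lam, 3)}  ->", "flutter" if flutter else "stable")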
Max ERC Funding
2 379 359 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym LINCE
Project Light INduced Cell control by Exogenous organic semiconductors
Researcher (PI) Maria Rosa ANTOGNAZZA
Host Institution (HI) FONDAZIONE ISTITUTO ITALIANO DI TECNOLOGIA
Call Details Starting Grant (StG), PE8, ERC-2018-STG
Summary LINCE will develop light-sensitive devices based on organic semiconductors (OS) for the optical regulation of living cell functions.
The possibility of controlling the activity of biological systems is a timeless mission for neuroscientists, since it allows one both to understand specific functions and to manage dysfunctions. Optical modulation provides, with respect to traditional electrical methods, unprecedented spatio-temporal resolution, lower invasiveness, and higher selectivity. However, the vast majority of animal cells do not bear specific sensitivity to light. The search for new materials capable of optically regulating cell activity is thus an extremely hot topic. OS are ideal candidates, since they are inherently sensitive to visible light and highly biocompatible, sustain both ionic and electronic conduction, and can be functionalized with biomolecules and drugs. Recently, it was reported that polymer-mediated optical excitation efficiently modulates neuronal electrical activity.
LINCE will significantly broaden the application of OS to address key open issues of high biological relevance, in both neuroscience and regenerative medicine. In particular, it will develop new devices for: (i) regulation of astrocyte functions, which are active in many fundamental processes of the central nervous system and in pathological disorders; (ii) control of stem cell differentiation and tissue regeneration; (iii) control of animal behavior, to first assess device biocompatibility and efficacy in vivo. LINCE tools will be sensitive to visible and NIR light, flexible, biocompatible, and easily integrated with any standard physiology set-up. They will combine electrical, chemical and thermal stimuli, offering high spatio-temporal resolution, reversibility, specificity and yield. The combination of all these features is not achievable by current technologies. Overall, LINCE will provide neuroscientists and medical doctors with an unprecedented tool-box for in vitro and in vivo investigations.
Max ERC Funding
1 866 250 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym MAMBA
Project Molecular mechanism of amyloid β aggregation
Researcher (PI) Sara Elisabet Snogerup Linse
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE4, ERC-2013-ADG
Summary Generation of toxic oligomers during the aggregation of amyloid beta peptide (Abeta42) into amyloid fibrils is a central event in Alzheimer disease. Understanding the aggregation process is therefore one important step towards therapy and diagnosis of the disease. We propose a physical chemistry approach with the goal of finding the molecular mechanisms behind the process in terms of the underlying microscopic steps and the molecular driving forces governing each step. We will use methodology developed recently in our laboratory yielding unprecedented reproducibility in the kinetic data. The methodology relies on optimization of every step from production and purification to isolation of highly pure monomeric peptide, and on inertness and minimized area of all surfaces. We will use cell viability studies to detect toxic oligomeric species, and selective radio-labeling experiments to pinpoint the origin of those species. In order to obtain insight into the molecular determinants and the relative role of different kinds of intermolecular interactions for each microscopic step, we will study the concentration-dependent aggregation kinetics as a function of extrinsic and intrinsic parameters. Extrinsic parameters include temperature, salt, pH, biological membranes, other proteins, and low and high Mw inhibitors. Intrinsic parameters include point mutations and sequence extension/truncation. We will perform detailed kinetic studies for each inhibitor to learn which step in the process is inhibited, coupled to cell toxicity assays to learn whether the generation of toxic oligomers is limited. We will use spectroscopic techniques, dynamic light scattering, cryogenic transmission electron microscopy and mass spectrometry coupled to HD exchange to learn about structural transitions as a function of process progression under different conditions chosen to favor different microscopic steps. The results may lead to improved diagnostics and therapeutics of Alzheimer disease.
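For readers unfamiliar with how such kinetic data are analysed, the sketch below integrates a minimal rate-equation model of the kind commonly used for amyloid formation (primary nucleation, elongation and fibril-catalysed secondary nucleation). The rate constants and concentrations are illustrative assumptions, not values from the project, and real analyses involve global fits across many monomer concentrations.

# Minimal sketch (assumed parameters): nucleation-elongation-secondary-nucleation
# rate equations for fibril number (P) and fibril mass (M) concentrations.
m_tot = 3.0e-6          # total monomer concentration [M] (assumed)
k_n, n_c = 3.0e-1, 2    # primary nucleation rate constant and reaction order (assumed)
k_plus = 3.0e6          # elongation rate constant [1/(M s)] (assumed)
k_2, n_2 = 1.0e4, 2     # secondary nucleation rate constant and reaction order (assumed)

dt, t_end = 0.5, 5 * 3600.0
P, M = 0.0, 0.0
trace = []
for step in range(int(t_end / dt)):
    m = max(m_tot - M, 0.0)                       # free monomer left
    dP = k_n * m**n_c + k_2 * m**n_2 * M          # primary + fibril-catalysed nucleation
    dM = 2.0 * k_plus * m * P                     # elongation at both fibril ends
    P, M = P + dt * dP, min(M + dt * dM, m_tot)   # forward Euler step, capped at m_tot
    if step % int(3600 / dt) == 0:
        trace.append(M / m_tot)
print("fraction aggregated at hourly checkpoints:", [round(x, 3) for x in trace])

Fitting which rate constant an inhibitor changes, under such a model, is one standard way of identifying the microscopic step it targets.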
Max ERC Funding
2 499 920 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym MECCA
Project Meeting Challenges in Computer Architecture
Researcher (PI) Per Orvar Stenström
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Computer technology has doubled computational performance every 24 months, over the past several decades. This performance growth rate has been an enabler for the dramatic innovation in information technology that now embraces our society. Before 2004, application developers could exploit this performance growth rate with no effort. However, since 2004 power consumption of computer chips exceeded the allowable limits and from that point and onwards, parallel computer architectures became the norm. Currently, parallelism is completely exposed to application developers and managing it is difficult and time-consuming. This has a serious impact on software productivity that may stall progress in information technology.
Technology forecasts predict that by 2020 there will be hundreds of processors on a computer chip. Apart from managing parallelism, keeping power consumption within allowable limits will remain a key roadblock for maintaining historical performance growth rates. Power efficiency must increase by an order of magnitude in the next ten years to not limit the growth rate. Finally, computer chips are also key components in embedded controllers, where stringent timing responses are mandatory. Delivering predictable and tight response times using parallel architectures is a challenging and unsolved problem.
MECCA takes a novel, interdisciplinary and unconventional approach to address three important challenges facing computer architecture – the three Ps: Parallelism, Power, and Predictability – in a unified framework. Unlike earlier, predominantly disciplinary approaches, MECCA bridges layers in computing systems from the programming language/model, to the compiler, to the run-time/OS, down to the architecture layer. This opens up the exchange of information across layers to manage parallelism and architectural resources transparently to application developers, in order to meet challenging performance, power, and predictability requirements for future computers."
Max ERC Funding
2 379 822 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym METAmorphoses
Project Shapeshifting Metasurfaces for Chemically Selective Augmented Reality
Researcher (PI) Antonio AMBROSIO
Host Institution (HI) FONDAZIONE ISTITUTO ITALIANO DI TECNOLOGIA
Call Details Consolidator Grant (CoG), PE8, ERC-2018-COG
Summary I propose to realize the first shapeshifting optical metasurface that changes its functionality on demand and adapts to changing external conditions. The metasurface may work as a chemically selective lens that allows transmission only of the spectral fingerprint of a specific molecule in the mid-IR wavelength range. The same metasurface can later be turned into an adaptive lens for focusing and detection under the skin. For such an ambitious goal, a radically new approach is needed.
I will realize shapeshifting metasurfaces made of a polymer containing photo-switchable molecules. The surface of such polymers undergoes a morphology re-organization (surface structuring) when illuminated by an external visible light pattern. The polymer will be structured with visible light and the resulting metasurfaces will work in the mid-IR. I will use state-of-the-art optical nano-imaging techniques to investigate the surface structuring phenomenon at the nanoscale in order to achieve full control of the mechanism.
Since the polymer surface can continuously be adjusted with the illuminating visible light, it will be possible to shift from one encoded optical functionality to a completely different one. Once optimized, this completely out-of-the-box approach will be completed by developing a feedback mechanism that allows for self-adjustment of the polymeric metasurface to changing external conditions. This will open endless possibilities in many fields, from medical imaging to security and quality control.
The proposed approach is unprecedented but it is perfectly in line with my research activities, resulting in fact from merging different techniques that I master into a new research field.
My approach is also inexpensive relative to the usual nano-fabrication techniques and immediately compatible with high-volume production, providing a viable platform for lightweight eyewear technology that reflects the views of key industrial players in the field.
Max ERC Funding
2 745 000 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym MICROMOTILITY
Project Multiscale modeling and simulation of biological and artificial locomotion at the micron scale: from metastatic tumor cells and unicellular swimmers to bioinspired microrobots
Researcher (PI) Antonio De Simone
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Advanced Grant (AdG), PE8, ERC-2013-ADG
Summary The project addresses the mechanical bases of cell motility by swimming and crawling, and the possibility of replicating the principles behind them in artificial systems.
The goals are to elucidate some key mechanisms governing bio-locomotion. In particular, actin-based motility of crawling cells and motility by swimming of unicellular organisms will be studied both in general and with reference to concrete model systems.
The study of biological examples of swimming and crawling motility will be used to produce a prototype of a micron-scale bio-inspired motile micro-robot exploiting the miniaturization that becomes possible from the extensive use of active materials.
This is a multi-disciplinary research project. The themes arise from the Mechanics of Soft and Biological Matter. The methods are those of Computational Engineering, and take advantage of innovative techniques from Applied Mathematics. The planned research activities rest on the development of new tools and methods in mathematical modeling, numerical simulation, data acquisition on biological systems, and on the construction of prototype devices.
Max ERC Funding
1 302 270 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym NBEB-SSP
Project Nonparametric Bayes and empirical Bayes for species sampling problems: classical questions, new directions and related issues
Researcher (PI) Stefano FAVARO
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TORINO
Call Details Consolidator Grant (CoG), PE1, ERC-2018-COG
Summary Consider a population of individuals belonging to different species with unknown proportions. Given an initial (observable) random sample from the population, how do we estimate the number of species in the population, or the probability of discovering a new species in one additional sample, or the number of hitherto unseen species that would be observed in additional unobservable samples? These are archetypal examples of a broad class of statistical problems referred to as species sampling problems (SSPs), namely statistical problems in which the objects of inference are functionals involving the unknown species proportions and/or the species frequency counts induced by observable and unobservable samples from the population. SSPs first appeared in ecology, and their importance has grown considerably in recent years, driven by challenging applications in a wide range of leading scientific disciplines, e.g., biosciences and physical sciences, engineering sciences, machine learning, theoretical computer science and information theory.
The objective of this project is the introduction and a thorough investigation of new nonparametric Bayes and empirical Bayes methods for SSPs. The proposed advances will include: i) addressing challenging methodological open problems in classical SSPs under the nonparametric empirical Bayes framework, which is arguably the most developed (and currently the most implemented by practitioners) framework to deal with classical SSPs; ii) fully exploiting and developing the potential of tools from mathematical analysis, combinatorial probability and Bayesian nonparametric statistics to set forth a coherent modern approach to classical SSPs, and then investigating the interplay between this approach and its empirical counterpart; iii) extending the scope of the above studies to more challenging SSPs, and to classes of generalized SSPs, that have emerged recently in the fields of biosciences and physical sciences, machine learning and information theory.
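As a concrete (non-Bayesian) baseline for two of the archetypal questions above, the sketch below computes the classical Good-Turing estimate of the probability that the next observation is a new species and the Chao1 lower bound on species richness. These are standard textbook estimators shown only to fix ideas, not the methods the project will develop.

# Minimal sketch (standard estimators, not the project's methodology):
# Good-Turing discovery probability and the bias-corrected Chao1 richness bound.
from collections import Counter

def species_sampling_summaries(sample):
    counts = Counter(sample)                              # species frequency counts
    n = sum(counts.values())
    f1 = sum(1 for c in counts.values() if c == 1)        # singletons
    f2 = sum(1 for c in counts.values() if c == 2)        # doubletons
    good_turing = f1 / n                                  # P(next draw is a new species)
    chao1 = len(counts) + f1 * (f1 - 1) / (2 * (f2 + 1))  # lower bound on species richness
    return good_turing, chao1

sample = list("aaabbcddddefffghhhhhiij")                  # toy sample of 23 individuals
gt, chao = species_sampling_summaries(sample)
print(f"observed species: {len(set(sample))}, Good-Turing P(new) = {gt:.3f}, Chao1 ~ {chao:.1f}")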
Max ERC Funding
982 930 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym NEFERTITI
Project NEar FiEld cosmology: Re-Tracing Invisible TImes
Researcher (PI) stefania SALVADORI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI FIRENZE
Call Details Starting Grant (StG), PE9, ERC-2018-STG
Summary The goal of NEFERTITI is to make a major step forward in our understanding of the first stars and galaxies by catching the stellar fossils from the early Universe in our Galactic neighborhood. To move beyond the state-of-the-art and study many of these precious fossils, I will adopt a novel approach that integrates theoretical and observational research and that will allow me to fully exploit: i) the huge data-flow from upcoming stellar surveys, ii) my cosmological models, which uniquely link Local data and early cosmic star-formation.
The first stars profoundly influenced the primordial Universe, affecting subsequent stellar generations and the build-up of the first galaxies. In spite of extraordinary progress in theoretical modeling and observational techniques, little is known about their properties, not even their typical mass. A direct exploration of their formation epochs is a tremendous challenge. Even JWST will not see the faint dwarf galaxies where the first stars formed more than 13 billion years ago.
In the Local Group, the living relics of the first stars can be directly observed and used to re-trace the chemical evolution and star-formation of the gas during those “invisible” times. Yet, these early Universe survivors are very rare and difficult to catch. In the present era of wide and deep Local surveys, such as DES, Gaia-ESO, and WEAVE, the total number of stars observed is dramatically increasing. Combining semi-numerical models with radiative transfer codes, I will fully exploit this novel data flow to catch the local stellar fossils and to:
1) constrain the mass distribution of the first stars,
2) uncover the physical processes driving the build-up of the first galaxies.
NEFERTITI will link Near and Far-field cosmology, give new insights into the formation of the Local Group, guide the interpretation of data from future surveys, and pave the way for the exploitation of new generation spectrographs on the E-ELT (MOSAIC, HIRES).
Max ERC Funding
1 180 813 €
Duration
Start date: 2019-05-01, End date: 2024-04-30
Project acronym NEURO-PLASMONICS
Project Neuro-Plasmonics
Researcher (PI) Francesco De Angelis
Host Institution (HI) FONDAZIONE ISTITUTO ITALIANO DI TECNOLOGIA
Call Details Consolidator Grant (CoG), PE3, ERC-2013-CoG
Summary Research on neuronal signaling is the subject of a very large community, but progress faces dense multi-scale dynamics involving signaling at the molecular, cellular and large neuronal network levels. Whereas brain capabilities most likely emerge from large neuronal networks, available electrophysiological methods limit our access to single cells and typically provide only a fragmented observation, on limited spatial/temporal scales. Therefore, broadening the spectrum of scales for observing neuronal signaling within large neuronal networks is a major challenge that can revolutionize our capability of studying the brain and its physio-pathological functions, as well as of deriving bio-inspired concepts to implement artificial systems based on neuronal circuits. We propose the development of an innovative electro-plasmonic multifunctional platform that, by combining different methodologies emerging from distant fields of Science and Technology, will provide a radically new path for real-time neurointerfacing at different scale levels:
1. The molecular scale: 3D plasmonic nanoantennas will give access to information at the molecular level by means of enhanced spectroscopies, with particular regard to time-resolved Raman scattering.
2. The single-neuron scale within neuronal networks: by both in-cell and extra-cell couplings with 3D nanostructures which work at the same time as plasmonic antennas and CMOS 3D nanoelectrodes.
3. The scale of large neuronal networks: by CMOS high-density electrode arrays for spatially and temporally resolving neuronal signaling from thousands of measuring sites.
This is achieved by exploiting an innovative nanofabrication method able to realize 3D nanostructures which can work at the same time as plasmonic nanoantennas and as nanoelectrodes. These structures will be integrated on CMOS multi-electrode arrays designed to manage multiscale measurements from the molecular level up to the network level on several thousand measurement sites.
Max ERC Funding
1 388 000 €
Duration
Start date: 2014-04-01, End date: 2018-03-31
Project acronym NEWIRES
Project Next Generation Semiconductor Nanowires
Researcher (PI) Kimberly Thelander
Host Institution (HI) LUNDS UNIVERSITET
Call Details Starting Grant (StG), PE5, ERC-2013-StG
Summary Semiconductor nanowires composed of III-V materials have enormous potential to add new functionality to electronics and optical applications. However, integration of these promising structures into applications is severely limited by the current near-universal reliance on gold nanoparticles as seeds for nanowire fabrication. Although highly controlled fabrication is achieved, this metal is entirely incompatible with the Si-based electronics industry. It also presents limitations for the extension of nanowire research towards novel materials not existing in bulk. To date, exploration of alternatives has been limited to selective-area and self-seeded processes, both of which have major limitations in terms of size and morphology control, potential to combine materials, and crystal structure tuning. There is also very little understanding of precisely why gold has proven so successful for nanowire growth, and which alternatives may yield comparable or better results. The aim of this project will be to explore alternative nanoparticle seed materials to go beyond the use of gold in III-V nanowire fabrication. This will be achieved using a unique and recently developed capability for aerosol-phase fabrication of highly controlled nanoparticles directly integrated with conventional nanowire fabrication equipment. The primary goal will be to deepen the understanding of the nanowire fabrication process, and the specific advantages (and limitations) of gold as a seed material, in order to develop and optimize alternatives. The use of a wide variety of seed particle materials in nanowire fabrication will greatly broaden the variety of novel structures that can be fabricated. The results will also transform the nanowire fabrication research field, in order to develop important connections between nanowire research and the semiconductor industry, and to greatly improve the viability of nanowire integration into future devices.
Max ERC Funding
1 496 246 €
Duration
Start date: 2013-09-01, End date: 2018-08-31
Project acronym NewTURB
Project New eddy-simulation concepts and methodologies for frontier problems in Turbulence
Researcher (PI) Luca Biferale
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA TOR VERGATA
Call Details Advanced Grant (AdG), PE8, ERC-2013-ADG
Summary Advances in transportation, energy harvesting, chemical processing, climatology, and atmospheric and marine pollution are obstructed by our lack of understanding of turbulence. The turbulent energy transfer toward small scales is characterized by highly non-Gaussian and out-of-equilibrium fluctuations that cannot be described by mean-field theories or traditional closure approximations. State-of-the-art computers and algorithms do not allow us to perform brute-force direct numerical simulations of any realistic turbulent configuration: modelling is mandatory. On the other hand, turbulence models are often strongly limited by our lack of understanding of fundamental mechanisms. As a result, we have a deadlock: turbulence is thought of as theoretically ‘unsolvable’ and computationally ‘intensive’. Indeed, progress using conventional methods has been slow. Last year, however, something new happened. Two unconventional conceptual and numerical methodologies for studying the Navier-Stokes equations appeared, based on: (i) a surgery of nonlinear interactions with different Energy and Helicity contents, (ii) a fractal-Fourier decimation (see the note after this summary). These unexplored tools are potential breakthroughs to unravel the basic mechanisms governing the turbulent transfer in isotropic, anisotropic and bounded flows, e.g. the mechanism behind the growth of small-scale vorticity and the formation/stability of coherent structures, a challenge that has defeated all numerical and theoretical attempts up to now. The ultimate goal of NewTURB is to integrate the fresh knowledge achieved by using these novel numerical instruments to push forward the frontiers of turbulence modelling, exploiting the possibility to reduce the number of degrees of freedom in an innovative way to deliver alternative frontier ‘multiscale eddy-simulation’ methodologies for both unbounded and bounded flows with smooth walls or with heterogeneous landscapes, e.g. flows over a rough surface.
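For orientation, the fractal-Fourier decimation referred to in point (ii) is usually formulated in the literature roughly as follows (our gloss, not part of the proposal): each Fourier mode of the velocity field is kept or discarded once and for all at random, with a wavenumber-dependent survival probability
\[
P(\mathbf{k}) \;=\; \left(\frac{k}{k_0}\right)^{D-3}, \qquad D \le 3,\ \ k > k_0,
\]
so that the number of surviving modes below wavenumber k grows like k^{D} rather than k^{3}, i.e. the Navier-Stokes dynamics is projected onto a fractal set of dimension D in Fourier space.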
Max ERC Funding
1 986 000 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym NINA
Project Nitride-based nanostructured novel thermoelectric thin-film materials
Researcher (PI) Per Daniel Eklund
Host Institution (HI) LINKOPINGS UNIVERSITET
Call Details Starting Grant (StG), PE5, ERC-2013-StG
Summary My recent discovery of the anomalously high thermoelectric power factor of ScN thin films demonstrates that unexpected thermoelectric materials can be found among the early transition-metal and rare-earth nitrides. Corroborated by first-principles calculations, we have well-founded hypotheses that these properties stem from nitrogen vacancies, dopants, and alloying, which introduce controllable sharp features with a large slope at the Fermi level, causing a drastically increased Seebeck coefficient. In-depth fundamental studies are needed to enable property tuning and materials design in these systems, to timely exploit my discovery and break new ground.
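The link between sharp features at the Fermi level and an enhanced Seebeck coefficient invoked here is usually rationalized with the Mott relation for degenerate conductors (our addition, for context):
\[
S \;=\; \frac{\pi^{2} k_{B}^{2} T}{3e}\,\left.\frac{d \ln \sigma(E)}{dE}\right|_{E=E_F},
\]
so any mechanism, such as nitrogen vacancies, dopants or alloying, that steepens the energy dependence of the conductivity sigma(E) at E_F directly increases S and hence the power factor S^2 sigma.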
The project concerns fundamental, primarily experimental, studies on scandium nitride-based and related single-phase and nanostructured films. The overall goal is to understand the complex correlations between electronic, thermal and thermoelectric properties and structural features such as layering, orientation, epitaxy, dopants and lattice defects. Ab initio calculations of band structures, mixing thermodynamics, and properties are integrated with the experimental activities. Novel mechanisms are proposed for drastic reduction of the thermal conductivity with retained high power factor. This will be realized by intentionally introduced secondary phases and artificial nanolaminates; the layering causing discontinuities in the phonon distribution and thus reducing thermal conductivity.
My expertise in thin-film processing and advanced materials characterization places me in a unique position to pursue this novel high-gain approach to thermoelectrics, and an ERC starting grant will be essential in achieving critical mass and consolidating an internationally leading research platform. The scientific impact and vision is in pioneering an understanding of a novel class of thermoelectric materials with potential for thermoelectric devices for widespread use in environmentally friendly energy applications.
Max ERC Funding
1 499 976 €
Duration
Start date: 2013-10-01, End date: 2018-09-30
Project acronym NSHOCK
Project Non classical rarefaction shock-waves in molecularly complex vapours
Researcher (PI) Alberto Guardone
Host Institution (HI) POLITECNICO DI MILANO
Call Details Consolidator Grant (CoG), PE8, ERC-2013-CoG
Summary The expansion of a dilute gas through a gasdynamic convergent-divergent nozzle can occur in three different regimes, depending on the inlet and discharge conditions and on the gas: via a fully subsonic expansion, via a subsonic-supersonic expansion, or via a subsonic-supersonic-subsonic expansion embedding a compression shock wave within the divergent portion of the nozzle. I devised an exact solution procedure for computing nozzle flows of real gases, which allowed me to discover that in molecularly complex fluids eighteen additional flow configurations are possible, each including multiple classical compression shocks as well as non-classical rarefaction ones. Modern thermodynamic models indicate that these exotic regimes can occur in nozzle flows of molecularly complex fluids such as hydrocarbons, siloxanes or perfluorocarbons operating close to the liquid-vapour saturation curve and critical point. The experimental observation of only one of these eighteen flow configurations would be sufficient to prove, for the first time, that non-classical gasdynamic phenomena are indeed possible in the vapour region of a fluid with high molecular complexity.
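For readers outside the dense-gas community, the standard criterion for such non-classical (rarefaction-shock) behaviour, not spelled out in the abstract, is a negative value of the fundamental derivative of gasdynamics in part of the vapour region,
\[
\Gamma \;=\; 1 + \frac{\rho}{c}\left(\frac{\partial c}{\partial \rho}\right)_{s} \;<\; 0,
\]
where rho is the density, c the speed of sound and the derivative is taken at constant entropy; molecularly complex fluids such as heavy siloxanes are the candidates expected to reach Gamma < 0 close to the saturation curve.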
To this purpose, a modification of the blow-down wind tunnel for dense gases at Politecnico di Milano is proposed to allow the use of mixtures of siloxane fluids. Measurements are complemented by numerical simulations of the expected flow field and by state-of-the-art uncertainty quantification techniques. The distinctive feature of the proposed experiment is the adoption of mixtures of siloxanes as working fluids. Mixtures of siloxanes are well known to exhibit a higher stability limit than their pure components, due to the redistribution process occurring at high temperature.
The increased understanding of real-gas dynamics will enable improved design of Organic Rankine Cycle engines, to be used in small-scale energy production from biomass, binary geothermal systems and concentrating solar thermal power plants.
Max ERC Funding
1 485 600 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym OPUS
Project Optical Ultra-Sensor
Researcher (PI) Markus Pollnau
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE7, ERC-2013-ADG
Summary This project aims at pushing the limits of optical sensing on a microchip by orders of magnitude, thereby allowing for ultra-high sensitivity in optical detection and enabling first-time-ever demonstrations of several optical sensing principles on a microchip. My idea is based upon our distributed-feedback lasers in rare-earth-ion-doped aluminum oxide waveguides on a silicon chip with ultra-narrow linewidths of 1 kHz, corresponding to Q-factors exceeding 10^11, intra-cavity laser intensities of several watts over a waveguide cross-section of 2 micrometer, and light interaction lengths reaching 20 km. Optical read-out of the laser frequency and linewidth is achieved by frequency down-conversion via detection of the GHz beat signal of two such lasers positioned in the same waveguide or in parallel waveguides on the same microchip.
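As a sanity check on the quoted numbers (our arithmetic, assuming an emission wavelength of roughly 1.5 um, typical of rare-earth-doped waveguide lasers; the abstract does not state the wavelength), the quality factor follows from the ratio of optical frequency to linewidth:
\[
Q \;=\; \frac{\nu}{\Delta\nu} \;\approx\; \frac{2\times10^{14}\,\mathrm{Hz}}{1\,\mathrm{kHz}} \;\approx\; 2\times10^{11},
\]
consistent with the stated Q > 10^{11}; narrowing the linewidth to 1 Hz, as targeted below, would push Q towards 10^{14}.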
The sensitivity of optical detection is related to the laser linewidth, interaction length, and transverse mode overlap with the measurand; its potential for optically exciting ions or molecules and its optical trapping force are related to the laser intensity. By applying novel concepts, we will decrease the laser linewidth to 1 Hz (Q-factor > 10^14), thereby also significantly increasing the intra-cavity intensity and light interaction length, simplify the read-out by reducing the linewidth separation between two lasers to the MHz regime, and increase the mode interaction with the environment by either increasing its evanescent field or perpendicularly intersecting a nanofluidic channel with the optical waveguide, thereby allowing for unprecedented sensitivity of optical detection on a microchip. We will exploit this dual-wavelength distributed-feedback laser sensor for the first-ever demonstrations of intra-laser-cavity (ILC) optical trapping and detection of nano-sized biological objects in an optofluidic chip, ILC trace-gas detection on a microchip, ILC Raman spectrometry on a microchip, and ILC spectroscopy of single rare-earth ions.
Max ERC Funding
2 499 958 €
Duration
Start date: 2014-11-01, End date: 2019-10-31
Project acronym OutflowMagn
Project Magnetic fields and the outflows during the formation and evolution of stars
Researcher (PI) Wouter Henricus Theodorus Vlemmings
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary The outflows of young and old stars play a crucial role in the cycle of matter in galaxies. Stars and planetary systems are formed through complex physical processes during the collapse of gas clouds, with outflows a required ingredient. At the end of a star's life, stellar outflows are the main source of the heavy elements that are essential for the formation of stars, planets and life. Magnetic fields are one of the key factors governing, in particular, the often observed collimated outflows. They might also be a key ingredient in driving stellar mass loss and are potentially essential for stabilizing the accretion disks of, in particular, massive proto-stars. Only polarization observations at different spatial scales are able to measure the strength and structure of magnetic fields during the launching of outflows from young and old stars. Because stars in these evolutionary phases are highly obscured by dusty envelopes, their magnetic fields are best probed through observations of molecules and dust at submillimeter and radio wavelengths. Moreover, the origin of the magnetic field in these stellar phases is still unknown, and multi-wavelength observations are essential to determine it. The proposed research group will use state-of-the-art submillimeter and radio instruments, integrated with self-consistent radiative transfer and magneto-hydrodynamic models, to examine the role and origin of magnetic fields during star formation and in the outflows from evolved stars. The group will search for planets around evolved stars to answer the elusive question of the origin of their magnetic field and determine the connection between the galactic magnetic field and that responsible for the formation of jets and potentially disks around young proto-stars. This fundamental new work, for which a dedicated research group is essential, will reveal the importance of magnetism during star formation as well as in driving and shaping the mass loss of evolved stars.
Max ERC Funding
2 000 000 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym PAIDEIA
Project PlAsmon InduceD hot Electron extraction with doped semiconductors for Infrared solAr energy
Researcher (PI) Francesco SCOTOGNELLA
Host Institution (HI) POLITECNICO DI MILANO
Call Details Consolidator Grant (CoG), PE8, ERC-2018-COG
Summary Earth is inhabited by an energy hungry human society. The Sun, with a global radiation at the ground level of more than 1 kW/m^2, is our largest source of energy. However, 45% of the total radiation is in the near infrared (NIR) and is not absorbed by most photovoltaic materials.
PAIDEIA focuses on two main concepts aimed at enhancing the capacity of solar energy conversion:
i) plasmon assisted hot carriers extraction from NIR plasmonic materials;
ii) linewidth narrowing in plasmonic nanoparticle films that enhances the lifetime of hot carriers and, thus, boosts the efficiency of light driven carrier extraction.
Instead of metals, which operate mostly in the visible region, we will make use of doped semiconductor nanocrystals (DSNCs) as hot electron extraction materials possessing a plasmonic response tunable in the range 800 nm – 4000 nm. Three different innovative architectures will be used for improved device performance: i) improved Schottky junctions (DSNC/wide band gap semiconductor nanocomposites); ii) ultrathin devices (DSNCs/2D quantum materials); iii) maximized interface DSNC/semiconductor bulk hetero-Schottky junctions.
By combining both concepts in advanced architectures we aim to produce a solar cell device that functions in the NIR with efficiencies of up to 10%. A tandem solar cell that combines the conventional power conversion efficiency, up to ~1100 nm, of a commercial Si solar cell (~20%) with the new PAIDEIA based device is expected to reach a total power conversion efficiency of 30% by extending the width of wavelengths that are converted to the full spectral range delivered by the Sun. PAIDEIA has a deeply fundamental character impacting several areas in the field of nanophysics, nanochemistry and materials processing and, at the same time, having a high impact on the study of solar energy conversion. Finally, PAIDEIA will provide answers to the fundamental questions regarding the physical behaviour of plasmonic/semiconductor interfaces.
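The 30% tandem figure follows from simple additivity of the two sub-cells, under the idealized assumption (ours) that the NIR device converts only photons the Si cell does not absorb and that stacking losses are negligible:
\[
\eta_{\mathrm{tandem}} \;\approx\; \eta_{\mathrm{Si}} + \eta_{\mathrm{NIR}} \;\approx\; 20\% + 10\% \;=\; 30\%.
\]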
Max ERC Funding
1 815 445 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym POTENT
Project Engineering Discoidal Polymeric Nanoconstructs for the Multi-Physics Treatment of Brain Tumors
Researcher (PI) Paolo Decuzzi
Host Institution (HI) FONDAZIONE ISTITUTO ITALIANO DI TECNOLOGIA
Call Details Consolidator Grant (CoG), PE8, ERC-2013-CoG
Summary Despite significant advances in chemotherapy, the effective treatment of malignant masses via systemically injectable agents is still limited by insufficient accumulation at the biological target (<< 10% injected dose per gram tumor) and non-specific sequestration by the reticulo-endothelial system (tumor/liver < 0.1).
The goal of this proposal is to engineer Discoidal Polymeric Nanoconstructs (DPNs) to preferentially target the malignant neovasculature for the delivery of imaging agents, controlled release of therapeutic molecules and thermal energy. The central hypothesis is that the size, shape, surface properties and stiffness (4S parameters) of the DPNs can be controlled during synthesis, and that therapeutic molecules (Temozolomide), Gd(DOTA) complexes and ultra-small Super-Paramagnetic Iron Oxide nanoparticles (USPIOs) can be efficiently incorporated within the DPN polymeric matrix.
This will be achieved by pursuing 3 specific aims: i) synthesis and physico-chemical characterization of poly(lactic-co-glycolic acid)/poly(ethylene glycol) DPNs with multiple 4S combinations; ii) in-silico and in vitro rational selection of DPN configurations with preferential tumor deposition, low macrophage uptake and high loading; and iii) in-vivo testing of the DPN imaging and therapeutic performance in mice bearing Glioblastoma Multiforme (GBM).
The innovation lies in i) using synergistically three different targeting strategies (rational selection of the 4S parameters; magnetic guidance via external magnets acting on the USPIOs; specific ligand-receptor recognition of the tumor neovasculature); ii) combining therapeutic and imaging molecules within the same nanoconstruct; and iii) employing synergistically different therapeutic approaches (molecular and thermal ablation therapies). This would allow us to support minimally invasive screening via clinical imaging and enhance therapeutic efficacy in GBM patients.
Max ERC Funding
2 390 000 €
Duration
Start date: 2014-07-01, End date: 2019-06-30
Project acronym PRO-TOOLKITS
Project Programmable nucleic acid toolkits for cell-free diagnostics and genetically encoded biosensing
Researcher (PI) francesco RICCI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA TOR VERGATA
Call Details Consolidator Grant (CoG), PE4, ERC-2018-COG
Summary WHY: The biological complexity of tumours and the large diversity of diagnostic biomarkers call for the development of innovative analytical tools that can detect multiple targets in a sensitive, specific and low-cost way and allow real-time monitoring of disease pathways and therapeutic effects. Providing such transformative tools requires creative thinking, an innovative approach and the exploration of new research avenues that span different disciplines.
WHAT: The goal of the PRO-TOOLKITS project is to address this need by developing innovative cell-free point of care diagnostic kits and genetically encodable biosensing tools.
HOW: I have oriented my independent career as a P.I. towards the design and development of synthetic nucleic acid-based nanodevices and nanomachines. With the help of an ERC Starting Grant I made ground-breaking contributions to the field of nucleic acid nanotechnology. Motivated by these advancements, I propose to challenge my know-how and expertise by exploring new research avenues that will open exciting possibilities in biosensing applications. The key, ground-breaking IDEA underlying this project is to take advantage of my expertise and harness the advantageous features of synthetic RNA modules that can translate the expression of proteins in controlled in-vitro cell-free systems and can also be genetically encoded in living organisms and function inside the cells. I will develop rationally designed programmable nucleic acid modules that respond to a wide range of molecular markers and environmental stimuli through innovative nature-inspired mechanisms, and that can be orthogonally wired to provide cell-free diagnostic kits and genetically encoded live-cell biosensing tools. The project will provide transformative approaches, methods and tools that will represent a genuine breakthrough in the fields of in-vitro diagnostics, biosensing and synthetic biology.
Max ERC Funding
1 999 375 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym ScalableControl
Project Scalable Control of Interconnected Systems
Researcher (PI) Anders RANTZER
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE7, ERC-2018-ADG
Summary Modern society is critically dependent on large-scale networks for services such as energy supply, transportation and communications. The design and control of such networks is becoming increasingly complex, due to their growing size, heterogeneity and autonomy. A systematic theory and methodology for control of large-scale interconnected systems is therefore needed. In an ambitious effort towards this goal, this project will develop rigorous tools for control synthesis, adaptation and verification.
Many large-scale systems exhibit properties that have not yet been systematically exploited by the control community. One such property is positive (or monotone) system dynamics. This corresponds to the property that all states of a network respond in the same direction when the demand or supply is perturbed in some node. Scalable methods for control of positive systems are starting to be developed, but several fundamental questions remain: How can existing results be extended to scalable synthesis of dynamic controllers? Can results for linear positive systems be extended to nonlinear monotone ones? How about systems with resonances?
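To make the appeal of positive dynamics concrete, here is a minimal sketch (ours, assuming a hypothetical 3-node network; the project's own methods are not specified in the abstract): for a Metzler matrix A, stability is equivalent to the existence of a vector xi > 0 with A xi < 0, i.e. a linear Lyapunov function, so the test reduces to a linear program whose size grows only linearly with the network.

import numpy as np
from scipy.optimize import linprog

# Hypothetical 3-node positive (Metzler) network: dx/dt = A x,
# off-diagonal entries of A are nonnegative.
A = np.array([[-2.0, 1.0, 0.0],
              [0.5, -3.0, 1.0],
              [0.0, 1.0, -1.5]])
n = A.shape[0]
eps = 1e-6

# For Metzler A, asymptotic stability is equivalent to the existence of
# xi > 0 with A @ xi < 0 (linear Lyapunov function V(x) = xi^T x).
# This is a linear feasibility problem, solved here with a dummy objective.
res = linprog(c=np.zeros(n),
              A_ub=A, b_ub=-eps * np.ones(n),   # A xi <= -eps (componentwise)
              bounds=[(eps, None)] * n)          # xi >= eps > 0
print("stability certificate found:", res.success)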
The second focus area, adaptation, takes advantage of recent progress in machine learning, such as statistical concentration bounds and approximate dynamic programming. Adaptation is of fundamental importance for scalability, since high-fidelity models are very expensive to generate manually and hard to maintain. Thirdly, since systematic procedures for control synthesis generally rely on simplified models and idealized assumptions, we will also develop scalable methods to bound the effect of imperfections, such as nonlinearities, time-variations and parameter uncertainty that are not taken into account in the original design.
The research will be carried out in interaction with industry studying a new concept for district heating networks. This collaboration will give access to experimental data from a full scale demonstration plant.
Max ERC Funding
2 500 000 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym SIMONE
Project Single Molecule Nano Electronics (SIMONE)
Researcher (PI) Kasper Moth-Poulsen
Host Institution (HI) CHALMERS TEKNISKA HOEGSKOLA AB
Call Details Starting Grant (StG), PE5, ERC-2013-StG
Summary "The development of micro fabrication and field effect transistors are key enabling technologies for todays information society. It is hard to imagine superfast and omnipresent electronic devices, information technology, the Internet and mobile communication technologies without access to continuously cheaper and miniaturized microprocessors. The giant leaps in performance of microprocessors from the first personal computing machines to todays mobile devices are to a large extent realized via miniaturization of the active components. The ultimate limit of miniaturization of electronic components is the realization of single molecule electronics. Due to fundamental physical limitations, single molecule resolution cannot be achieved using classical top-down lithographic techniques. At the same time, existing surface functionalization schemes do not provide any means of placing a single molecule with high precision at a specific location on a nanostructure. This project has the ambitious goal of establishing the first method ever allowing for self-assembly of multiple single molecule devices in a parallel way and thereby provide the first method ever allowing for multiple individual single molecule components to operate together in the same device.
The impact of the technology platforms described herein goes vastly beyond the field of single molecule electronics and utilization in ultra-sensitive plasmonic biosensors with a digital single molecule response will be explored in parallel with the main roadmaps of the project."
Max ERC Funding
1 500 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym SIREN
Project Stability Islands: Performance Revolution in Machining
Researcher (PI) Gábor Stépán
Host Institution (HI) BUDAPESTI MUSZAKI ES GAZDASAGTUDOMANYI EGYETEM
Call Details Advanced Grant (AdG), PE8, ERC-2013-ADG
Summary "Cutting went through a revolution in the 1990s when high-speed milling (HSM) was introduced: the sculpture-like workpieces produced with high precision and efficiency resulted in one order of magnitude less parts in cars/aircrafts, which kept this traditional technology competitive at the turn of the century. This has been followed by an incremental development when not just the cutting speeds, but depths of cut and feed rates are pushed to limits, too.
The limits are where harmful vibrations occur. Cutting is subject to a special kind of vibration called chatter, which originates in a time delay: the cutting edge interferes with its own past oscillation, recorded on the wavy cut surface of the workpiece. In 1907, the 3rd president of ASME, Taylor, wrote: “Chatter is the most obscure and delicate of all problems facing the machinist”.
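The regenerative mechanism described above is usually written, in its simplest textbook single-degree-of-freedom form (our illustration, not the project's model), as a delay-differential equation
\[
m\,\ddot{x}(t) + c\,\dot{x}(t) + k\,x(t) \;=\; -\,K_c\, w\,\bigl[x(t) - x(t-\tau)\bigr],
\]
where w is the depth of cut, K_c the cutting-force coefficient and tau the tooth-passing (or spindle-revolution) period; the stability boundaries of this equation in the plane of spindle speed versus depth of cut are the classical stability lobes that the project aims to go beyond.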
In spite of the development of the theory of delay-differential equations and nonlinear dynamics, Taylor’s statement remained valid 100 years later when HSM appeared together with a new kind of chatter. The applicant has been among those leading researchers who predicted these phenomena; the experimental/numerical techniques developed in his group are widely used to find parameters, e.g. where milling tools with serrated edges and/or with varying helix angles are advantageous.
The SIREN project aims to find isolated parameter islands with 3-5 times increased cutting efficiency. The work-packages correspond to points of high risk: (1) validated, delay-based nonlinear modelling of the dynamic contact problem between chip and tool; (2) fixation of the tool that is compatible with a dynamically reliable mathematical model of the contact between tool and tool-holder; (3) up-to-date dynamic modelling of the spindle at varying speeds.
High risk originates in the attempt of using distributed delay models, but high gain is expected with robust use of parameter islands where technology reaches a breakthrough in cutting efficiency for the 21st century."
Max ERC Funding
2 573 000 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym SLING
Project Efficient algorithms for sustainable machine learning
Researcher (PI) Lorenzo ROSASCO
Host Institution (HI) UNIVERSITA DEGLI STUDI DI GENOVA
Call Details Consolidator Grant (CoG), PE6, ERC-2018-COG
Summary This project will develop and integrate the latest optimization and statistical advances into a new generation of resource-efficient algorithms for large-scale machine learning. State-of-the-art machine learning methods provide impressive results, opening new perspectives for science, technology, and society. However, they rely on massive computational resources to process huge manually annotated data-sets. The corresponding costs in terms of energy consumption and human efforts are not sustainable.
This project builds on the idea that improving efficiency is key to scaling the ambitions and applicability of machine learning. Achieving efficiency requires overcoming the traditional boundaries between statistics and computation in order to develop new theory and algorithms.
Within a multidisciplinary approach, we will establish a new regularization theory of efficient machine learning.
We will develop models that incorporate budgeted computations, and numerical solutions with resources tailored to the statistical accuracy allowed by the data. Theoretical advances will provide the foundations for novel and sound algorithmic solutions. Close collaborations in diverse applied fields will ensure that our research results and solutions will be apt and immediately applicable to real-world scenarios.
The new algorithms developed in the project will contribute to boost the possibilities of Artificial Intelligence, modeling and decision making in a world of data with ever-increasing size and complexity.
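As one concrete illustration of trading computation for statistical accuracy, here is a minimal sketch (ours; the project's actual algorithms are not specified in the abstract) of Nystrom-subsampled kernel ridge regression, whose cost drops from O(n^3) to O(n m^2) when only m << n landmark points are used.

import numpy as np

# A minimal Nystrom kernel ridge regression sketch: only m << n landmark
# points are used, so computation is traded against statistical accuracy.
def gaussian_kernel(X, Y, sigma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def nystrom_krr(X, y, m=100, lam=1e-3, sigma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=min(m, len(X)), replace=False)  # random landmarks
    Xm = X[idx]
    Knm = gaussian_kernel(X, Xm, sigma)                 # n x m
    Kmm = gaussian_kernel(Xm, Xm, sigma)                # m x m
    # Normal equations of the Nystrom-restricted problem:
    # (Knm^T Knm + n * lam * Kmm) alpha = Knm^T y
    A = Knm.T @ Knm + len(X) * lam * Kmm
    alpha = np.linalg.solve(A + 1e-10 * np.eye(len(idx)), Knm.T @ y)
    return lambda Xnew: gaussian_kernel(Xnew, Xm, sigma) @ alpha

# Usage on synthetic data: fewer landmarks means less computation.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(2000, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=2000)
predict = nystrom_krr(X, y, m=50)
print(predict(np.array([[0.0], [1.0]])))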
Max ERC Funding
1 977 500 €
Duration
Start date: 2019-11-01, End date: 2024-10-31
Project acronym SO-ReCoDi
Project Spectral and Optimization Techniques for Robust Recovery, Combinatorial Constructions, and Distributed Algorithms
Researcher (PI) Luca Trevisan
Host Institution (HI) UNIVERSITA COMMERCIALE LUIGI BOCCONI
Call Details Advanced Grant (AdG), PE6, ERC-2018-ADG
Summary In a recovery problem, we are interested in recovering structure from data that contains a mix of combinatorial structure and random noise. In a robust recovery problem, the data may contain adversarial perturbations as well. A series of recent results in theoretical computer science has led to algorithms based on the convex optimization technique of Semidefinite Programming for several recovery problems motivated by unsupervised machine learning. Can those algorithms be made robust? Sparsifiers are compressed representations of graphs that speed up certain algorithms. The recent proof of the Kadison-Singer conjecture by Marcus, Spielman and Srivastava (MSS) shows that certain kinds of sparsifiers exist, but the proof does not provide an explicit construction. Dynamics and population protocols are simple models of distributed computing that were introduced to study sensor networks and other lightweight distributed systems, and have also been used to model naturally occurring networks. What can and cannot be computed in such models is largely open. We propose an ambitious unifying approach to go beyond the state of the art in these three domains, and provide: robust recovery algorithms for the problems mentioned above; a new connection between sparsifiers and the Szemeredi Regularity Lemma and explicit constructions of the sparsifiers resulting from the MSS work; and an understanding of the ability of simple distributed algorithms to solve community detection problems and to deal with noise and faults. The unification is provided by a common underpinning of spectral methods, random matrix theory, and convex optimization. Such tools are used in technically similar but conceptually very different ways in the three domains. By pursuing these goals together, we will make it more likely that an idea that is natural and simple in one context will translate to an idea that is deep and unexpected in another, increasing the chances of a breakthrough.
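For concreteness, the kind of Semidefinite Programming relaxation alluded to for recovery problems, in the standard balanced two-community (stochastic block model) setting, reads (a textbook formulation, not necessarily the project's):
\[
\max_{X}\ \langle A, X\rangle
\quad\text{s.t.}\quad X \succeq 0,\qquad X_{ii} = 1 \ \ \forall i,\qquad \langle \mathbf{1}\mathbf{1}^{\!\top}, X\rangle = 0,
\]
where A is the adjacency matrix; the planted partition is read off the leading eigenvector of the optimal X, and the robustness question is whether such recovery survives adversarial perturbations of the edges.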
Max ERC Funding
1 971 805 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym SPECGEO
Project Spectral geometric methods in practice
Researcher (PI) Emanuele RODOLA'
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Starting Grant (StG), PE6, ERC-2018-STG
Summary Spectral geometry concerns the study of the geometric properties of data domains, such as surfaces or graphs, via the spectral decomposition of linear operators defined upon them. Due to their valuable properties analogous to Fourier theory, such methods find widespread use in several branches of computer science, ranging from computer vision to machine learning and network analysis.
Despite their pervasive presence, very little effort has been devoted to the design and application of spectral techniques that deal with corrupted, missing, high-dimensional or abstract data undergoing complex transformations. This lack of focus is mainly motivated by the widespread acceptance, supported in part by theoretical results, that an ε-perturbation to the geometry of the data (as small as the removal of a single point) can induce arbitrary changes in the operator’s eigendecomposition – leading to a limited adoption of spectral models in real-world applications. This project challenges this view, contending that this presumption of instability is primarily due to a suboptimal choice of the analytical tools that are currently being employed, and which only provide part of the picture. In fact, strong evidence largely contradicts the expected behavior on real geometric data. The reason behind this apparent inconsistency lies in the different focus of current methods, which provide crude bounds and are directed toward other kinds of perturbation than those observed in real settings.
The ambitious goal of this project is to develop a novel theoretical and computational framework that will fundamentally change the way spectral techniques are constructed, interpreted, and applied. These tools will enable a range of currently infeasible uses of spectral methods on real data. They will deal with strong incompleteness, corruption and cross-modality, and they will be applied to outstanding problems in geometry processing, computer vision, machine learning, and computational biology.
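To make the stability question concrete (purely illustrative, with an arbitrarily chosen graph; this is not part of the proposal), the sketch below compares the low end of the graph Laplacian spectrum of a ring graph before and after removing a single vertex:
```python
# Illustration of the stability question for spectral methods: compare the
# lowest Laplacian eigenvalues of a ring graph before and after deleting a
# single vertex. Graph choice and sizes are hypothetical examples.
import numpy as np

def laplacian(A):
    return np.diag(A.sum(axis=1)) - A                    # combinatorial Laplacian L = D - A

n, k = 200, 8
idx = np.arange(n)
A = np.zeros((n, n))
A[idx, (idx + 1) % n] = A[(idx + 1) % n, idx] = 1        # ring graph
A_pert = np.delete(np.delete(A, 0, axis=0), 0, axis=1)   # remove one vertex

ev_full = np.sort(np.linalg.eigvalsh(laplacian(A)))[:k]
ev_pert = np.sort(np.linalg.eigvalsh(laplacian(A_pert)))[:k]
print("lowest eigenvalues, full ring      :", np.round(ev_full, 4))
print("lowest eigenvalues, vertex removed :", np.round(ev_pert, 4))
```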
Max ERC Funding
1 434 000 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym STARKEY
Project Solving the TP-AGB STAR Conundrum: a KEY to Galaxy Evolution
Researcher (PI) Paola Marigo
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PADOVA
Call Details Consolidator Grant (CoG), PE9, ERC-2013-CoG
Summary "Models of the Thermally Pulsing Asymptotic Giant Branch (TP-AGB) stellar evolutionary phase play a critical role across astrophysics, from the chemical composition of meteorites belonging to the pre-solar nebula up to galaxy evolution in the high-redshift Universe. In spite of its importance, the modelling of TP-AGB is still affected by large uncertainties that propagate into the field of extragalactic astronomy, degrading the predicting power of current population synthesis models of galaxies. The major goal of this proposal is to remedy this persistent condition of uncertainty and controversy. The solution to the TP-AGB star conundrum will be provided by a new approach, which stands on the optimised integration of a) state-of-the-art theoretical tools to account for the complex physics of TP-AGB stars (evolution, nucleosynthesis, pulsation, winds, dust formation, etc.), and b) exceptionally high-quality observations of resolved TP-AGB stellar populations in stars clusters and nearby galaxies (Magellanic Clouds, M31, dwarf galaxies up to 4 Mpc) with reliable measurements of their star formation histories. We will adopt a global calibration method, in which TP-AGB evolution models are required to simultaneously reproduce a set of well-defined observational constraints (distributions of luminosities, colours, pulsation periods, dust mass-loss rates, expansion velocities of dusty envelopes, etc.). This project will deepen our understanding of TP-AGB physics profoundly, and provide wide-spread community benefits as well. We will publicly release well-tested and reliable ``TP-AGB products'', including stellar tracks, isochrones in all photometric systems, and chemical yields for both gas and dust. Eventually these products will be embedded in the stellar population synthesis models that are routinely used to analyse the integrated galaxy observables that probe the extragalactic Universe."
Max ERC Funding
1 930 628 €
Duration
Start date: 2014-05-01, End date: 2019-04-30
Project acronym StrucLim
Project Limits of discrete structures
Researcher (PI) Balazs Szegedy
Host Institution (HI) MAGYAR TUDOMANYOS AKADEMIA RENYI ALFRED MATEMATIKAI KUTATOINTEZET
Call Details Consolidator Grant (CoG), PE1, ERC-2013-CoG
Summary Built on decades of deep research in ergodic theory, Szemeredi's regularity theory and statistical physics, a new subject is emerging whose goal is to study convergence and limits of various structures.
The main idea is to regard very large structures in combinatorics and algebra as approximations of infinite analytic objects. This viewpoint brings new tools from analysis and topology into these subjects. The success of this branch of mathematics has already been demonstrated through numerous applications in computer science, extremal combinatorics, probability theory and group theory. The present research plan addresses a number of open problems in additive combinatorics, ergodic theory, higher order Fourier analysis, extremal combinatorics and random graph theory. These subjects are all interrelated through the limit approach.
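For a concrete sense of the limit viewpoint (an illustrative toy, not part of the research plan), the sketch below estimates edge and triangle densities of Erdős–Rényi graphs of growing size; as n grows they converge to p and p^3, the corresponding densities of the constant graphon W = p:
```python
# Illustration of graph convergence via subgraph densities: for G(n, p) the
# edge density tends to p and the triangle density to p**3, the values taken
# by the constant graphon W = p. Sizes and p are arbitrary illustrative choices.
import numpy as np

def gnp(n, p, rng):
    A = np.triu(rng.random((n, n)) < p, k=1).astype(float)
    return A + A.T

rng = np.random.default_rng(3)
p = 0.3
for n in (50, 200, 800):
    A = gnp(n, p, rng)
    edge_density = A.sum() / (n * (n - 1))
    triangle_density = np.trace(A @ A @ A) / (n * (n - 1) * (n - 2))
    print(f"n={n:4d}  edge density {edge_density:.3f}  "
          f"triangle density {triangle_density:.4f}  (p={p}, p^3={p**3:.4f})")
```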
Max ERC Funding
1 175 200 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym SUPERSPEC
Project Three-dimensional spectral modelling of astrophysical transients : unravelling the nucleosynthetic content of supernovae and kilonovae
Researcher (PI) Anders JERKSTRAND
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Starting Grant (StG), PE9, ERC-2018-STG
Summary Determining the origin of the elements is a fundamental quest in physics and astronomy. Most of the elements in the periodic table are believed to be produced by supernovae and kilonovae. However, this has for decades been little more than a prediction from theory. Now, with a dramatically changing observational situation and new modelling capabilities, it is within our reach to determine the nucleosynthesis production and structure in these transients. To really see what supernovae and kilonovae contain, we must study their spectra in the late, so-called nebular phase, when the inner regions become visible. This project is aimed at establishing the first picture of the origin of the elements by determining the yields from supernovae and kilonovae using such analysis. To do this, new spectral synthesis methods need to be developed considering the necessary microphysical (ejecta chemistry, r-process physics, time-dependent gas state) and macrophysical (3D radiation transport) processes to obtain sufficient accuracy. These tools will then be applied to the first 3D explosion simulations of these transients now becoming available. When applied to the growing library of data emerging from automated surveys and follow-up programs, as well as to the recent first kilonova observations, this will provide a breakthrough in our understanding of these transients. This development will not only allow a determination of cosmic element production, but also allow tests of theories for stellar evolution, nucleosynthesis, and explosion processes. This will in turn have fundamental impact on several fields of astrophysics such as population synthesis, galactic chemical evolution modelling, and understanding of mass transfer in the progenitor systems. It has a strong connection to recent detections of stellar-mass black holes and merging neutron stars by gravitational waves.
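As a cartoon of why the 3D structure of the ejecta shapes nebular spectra (an illustrative toy only, not the project's spectral synthesis code; geometry and numbers are invented), the sketch below samples an optically thin, homologously expanding sphere of emitting material and histograms line-of-sight velocities into an emission-line profile, once with spherical and once with hemispheric emissivity:
```python
# Toy nebular-phase line profile: in the optically thin limit, a line profile
# is the distribution of line-of-sight velocities of the emitting material,
# weighted by emissivity. Geometry, velocities and the asymmetry below are
# invented for illustration and are not results of the project.
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
v_max = 4000.0                                   # km/s, hypothetical ejecta edge

# Sample points uniformly inside a sphere of homologously expanding ejecta.
u = rng.random(n) ** (1 / 3)
vec = rng.normal(size=(n, 3))
vec /= np.linalg.norm(vec, axis=1, keepdims=True)
v = v_max * u[:, None] * vec                     # velocity of each emitting parcel

# Emissivity: uniform sphere vs. one hemisphere enhanced (crude 3D asymmetry).
w_sym = np.ones(n)
w_asym = np.where(v[:, 2] > 0, 2.0, 1.0)

bins = np.linspace(-v_max, v_max, 81)
prof_sym, _ = np.histogram(v[:, 2], bins=bins, weights=w_sym, density=True)
prof_asym, _ = np.histogram(v[:, 2], bins=bins, weights=w_asym, density=True)
centers = 0.5 * (bins[1:] + bins[:-1])
print("peak of symmetric profile at  v =", centers[np.argmax(prof_sym)], "km/s")
print("peak of asymmetric profile at v =", centers[np.argmax(prof_asym)], "km/s")
```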
Max ERC Funding
1 500 000 €
Duration
Start date: 2019-10-01, End date: 2024-09-30
Project acronym SusDrug
Project Sustainable Approach to Drug Discovery
Researcher (PI) David SARLAH
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PAVIA
Call Details Starting Grant (StG), PE5, ERC-2018-STG
Summary Modern drug discovery is facing critical challenges. Rapid advances in human biology are revealing new biomolecular targets and processes, for which existing chemical compound libraries can provide only limited success in the identification of novel bioactive agents. This deficiency has been attributed primarily to the relative lack of structural diversity within the libraries. The three-dimensional world of biological macromolecules has been continuously interrogated with generally similar planar, aromatic, and structurally simple compounds. Contemporary diversity-generating methods have never been implemented for the preparation of large libraries, as an increase in the number of diverse members requires a corresponding increase in the number of synthetic steps, or a continuous supply of different starting materials. This proposal details a strategy for developing a chemically sustainable diversification method, by tapping into our largest source of organic compounds: arenes.
The proposed research aims to develop new methods that can rapidly convert simple aromatic entities into highly functionalized, complex small molecules. By integration of this strategy with many different chemical operations, numerous distinctive and independent dearomative programs will generate a diverse set of multiplex small molecules. This simplicity-to-complexity approach will provide a practical platform for the rapid, controlled access to a functionally diverse set of compounds, ranging from anticancer to anti-infective agents. This research will also deliver methods for dearomative diversification of existing aromatic compound libraries to provide new members with unique physicochemical properties. Given the broad scope of possible dearomative programs that will be developed, and the vast number of aromatic compounds accessible, this will ultimately provide a sustainable source of diverse molecules for the next generation of compound libraries.
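As a small example of how the flat, aromatic character of a compound library can be quantified (a sketch that assumes the open-source RDKit toolkit is available; the molecules are arbitrary and this is not part of the proposal), the code below reports the aromatic-atom fraction and the fraction of sp3 carbons, a number that dearomatization would increase:
```python
# Sketch (assuming the RDKit cheminformatics toolkit): profile how "flat and
# aromatic" a few molecules are via the aromatic-atom fraction and the
# fraction of sp3 carbons (Fsp3). The molecules are arbitrary examples; this
# only illustrates library profiling, not the project's chemistry.
from rdkit import Chem
from rdkit.Chem import rdMolDescriptors

smiles = {
    "benzene":      "c1ccccc1",
    "naphthalene":  "c1ccc2ccccc2c1",
    "cyclohexanol": "OC1CCCCC1",
}

for name, smi in smiles.items():
    mol = Chem.MolFromSmiles(smi)
    aromatic_frac = sum(a.GetIsAromatic() for a in mol.GetAtoms()) / mol.GetNumAtoms()
    fsp3 = rdMolDescriptors.CalcFractionCSP3(mol)
    print(f"{name:12s}  aromatic-atom fraction {aromatic_frac:.2f}  Fsp3 {fsp3:.2f}")
```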
Max ERC Funding
1 400 000 €
Duration
Start date: 2019-07-01, End date: 2024-06-30
Project acronym TECTONIC
Project The physics of Earthquake faulting: learning from laboratory earthquake prediCTiON to Improve forecasts of the spectrum of tectoniC failure modes: TECTONIC
Researcher (PI) Chris MARONE
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Advanced Grant (AdG), PE10, ERC-2018-ADG
Summary Earthquakes represent one of our greatest natural hazards. Even a modest improvement in the ability to forecast devastating events like the 2016 sequence that destroyed the villages of Amatrice and Norcia, Italy, would save thousands of lives and billions of euros. Current efforts to forecast earthquakes are hampered by a lack of reliable lab or field observations. Moreover, even when changes in rock properties prior to failure (precursors) have been found, we have not known enough about the physics to rationally extrapolate lab results to tectonic faults and account for tectonic history, local plate motion, hydrogeology, or the local P/T/chemical environment. However, recent advances show: 1) clear and consistent precursors prior to earthquake-like failure in the lab and 2) that lab earthquakes can be predicted using machine learning (ML). These works show that stick-slip failure events (the lab equivalent of earthquakes) are preceded by a cascade of micro-failure events that radiate elastic energy in a manner that foretells catastrophic failure. Remarkably, ML predicts the failure time and in some cases the magnitude of lab earthquakes. Here, I propose to connect these results with field observations and use ML to search for earthquake precursors and build predictive models for tectonic faulting.
This proposal will support acquisition and analysis of seismic and geodetic data and construction of new lab equipment to unravel earthquake physics, precursors and forecasts. I will use my background in earthquake source theory, ML, fault rheology, and geodesy to address the physics of earthquake precursors, the conditions under which they can be observed for tectonic faults and the extent to which ML can forecast the spectrum of fault slip modes. My multidisciplinary team will train the next generation of researchers in earthquake science and foster a new level of broad community collaboration.
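To make the lab-earthquake forecasting idea tangible (a toy sketch on synthetic data, not the project's models or datasets), the code below extracts rolling statistical features from a synthetic acoustic signal whose variance grows toward each failure and trains a random-forest regressor to predict the remaining time to failure:
```python
# Toy version of lab-earthquake forecasting: build a synthetic "acoustic"
# signal whose fluctuation level grows toward each stick-slip failure, extract
# rolling statistical features, and train a regressor to predict time-to-failure.
# The signal, features and model are illustrative stand-ins only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
cycle, n_cycles, win = 2000, 30, 200
t_to_fail = np.tile(np.arange(cycle, 0, -1), n_cycles).astype(float)
# Noise amplitude (micro-failure activity) ramps up as failure approaches.
signal = rng.normal(scale=1.0 + 5.0 * (1 - t_to_fail / cycle) ** 4)

X, y = [], []
for start in range(0, len(signal) - win, win):
    seg = signal[start:start + win]
    X.append([seg.std(), np.abs(seg).mean(), np.percentile(np.abs(seg), 95)])
    y.append(t_to_fail[start + win - 1])
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"R^2 on held-out windows: {model.score(X_te, y_te):.2f}")
```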
Max ERC Funding
3 459 750 €
Duration
Start date: 2020-01-01, End date: 2024-12-31
Project acronym TERAMICROSYS
Project Terahertz microsystems - Enabling the large-scale exploitation of the terahertz gap
Researcher (PI) Joachim Oberhammer
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Consolidator Grant (CoG), PE7, ERC-2013-CoG
Summary This project envisions the widespread use of THz technology across society, enabled by the proposed THz microsystems, which provide an unprecedented way of creating highly integrated, volume-manufacturable, cost- and energy-efficient, reconfigurable and thus adaptive submillimeter-wave and THz systems. Advanced three-dimensional micromachining is used as the key enabling fabrication technology. In connection with the technology convergence of advancing microwave semiconductor technology according to international technology programmes and roadmaps, the findings of this project are expected to constitute a significant contribution towards the large-scale exploitation of the heavily sought-after frequency space between 100 GHz and 1 THz, the so-called ‘terahertz gap’.
Primary application fields with high impact of the proposed technology are wireless short-range communication links to interconnect future small-cell clouds replacing the current macro-basestation radio access network, and submillimeter-wave/THz sensing with application fields including medical diagnosis, food quality control, agriculture and industrial sensors.
The proposed THz microsystems are based on rectangular waveguide-technology integrated into a multi-wafer stacked silicon substrate, which integrates all passive components needed for completing a submillimeter-wave/THz system around the monolithic-microwave integrated circuits (MMIC). Novel key building blocks investigated in this proposal include platform-integrated sensor and antenna interfaces, micro-electromechanically tuneable filters, phase-shifters, impedance-matching networks and non-galvanic microsystem-to-IC interfaces. The micro-mechanical reconfigurability enables unprecedented adaptive THz systems.
Key outcomes of this project are proof-of-concept prototypes of all key building blocks up to 650 GHz, and of complete THz microsystems implemented for the two key applications telecom links and medical sensors.
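For a sense of the physical scales involved (a back-of-the-envelope sketch using generic textbook waveguide dimensions, not the project's designs), the code below evaluates the TE10 cutoff frequency f_c = c / (2a) for a few standard rectangular waveguide sizes spanning 100 GHz to 1 THz:
```python
# Back-of-the-envelope sketch: TE10 cutoff frequency f_c = c / (2a) for a few
# standard rectangular waveguide bands relevant between 100 GHz and 1 THz.
# The usable single-mode band is conventionally taken as roughly
# 1.25*f_c to 1.9*f_c; exact band edges depend on the standard used.
c = 299_792_458.0  # speed of light, m/s

# WR designation -> broad-wall width a in metres (a = N * 0.254 mm for WR-N)
waveguides = {"WR-10": 2.540e-3, "WR-3.4": 0.8636e-3,
              "WR-1.5": 0.381e-3, "WR-1.0": 0.254e-3}

for name, a in waveguides.items():
    fc = c / (2 * a)
    print(f"{name:6s}  a = {a*1e3:.3f} mm  f_c = {fc/1e9:6.1f} GHz  "
          f"~band {1.25*fc/1e9:6.1f}-{1.9*fc/1e9:6.1f} GHz")
```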
Max ERC Funding
1 727 189 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym TOPSPIN
Project Topotronic multi-dimensional spin Hall nano-oscillator networks
Researcher (PI) Johan Åkerman
Host Institution (HI) GOETEBORGS UNIVERSITET
Call Details Advanced Grant (AdG), PE7, ERC-2018-ADG
Summary TOPSPIN will focus on spin Hall nano-oscillators (SHNOs), which are nano-sized, ultra-tunable, and CMOS-compatible spin-wave-based microwave oscillators. TOPSPIN will push the boundaries of SHNO lithography, frequency, speed, and power consumption by combining topological insulators, which have record-high spin Hall efficiencies, with materials having ultra-high spin wave frequencies. TOPSPIN will reduce the required current densities by 1-2 orders of magnitude compared to the state of the art, making SHNO operating currents approach 1 µA, and increase the SHNO operating frequencies by an order of magnitude, to as high as 300 GHz.
TOPSPIN will use mutually synchronized SHNOs to achieve orders of magnitude higher signal coherence and achieve novel functionality such as pattern matching and neuromorphic computing. TOPSPIN will demonstrate mutual synchronization of up to 1,000 SHNOs in chains, and as many as 1,000,000 SHNOs in very large-scale two-dimensional arrays. Using dipolar coupling between SHNOs fabricated on top of each other, three-dimensional mutual synchronization will also be demonstrated. As the signal coherence increases linearly with the number of mutually synchronized SHNOs the oscillator quality factor will improve by many orders of magnitude. TOPSPIN will also develop such arrays using magnetic tunnel junction stacks thus combining ultra-high coherence with the highest possible microwave output power.
TOPSPIN will demonstrate ultrafast pattern matching and neuromorphic computing using its SHNO networks. It will functionalize SHNOs to exhibit ultra-fast individual voltage controlled tuning and non-volatile tuning of both the SHNO frequency and the inter-SHNO coupling.
TOPSPIN will characterize its SHNOs using novel methods and techniques such as multichannel electrical measurements, time- and phase-resolved Brillouin Light Scattering microscopy, time-resolved Scanning Transmission X-ray Microscopy, and ultrafast pump-probe Transmission Electron Microscopy.
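To illustrate why mutual synchronization of many oscillators improves signal coherence (a generic Kuramoto-style toy, not a model of the actual SHNO physics; all numbers are arbitrary), the sketch below integrates coupled phase oscillators with slightly different natural frequencies and reports the order parameter, which approaches 1 once the coupling dominates the frequency spread:
```python
# Generic Kuramoto-style toy of mutual synchronization: N phase oscillators
# with slightly different natural frequencies lock together once the coupling
# is strong enough, and the order parameter |<exp(i*phi)>| approaches 1.
# This is only a cartoon of synchronization, not an SHNO device model.
import numpy as np

rng = np.random.default_rng(6)
N, dt, steps = 100, 1e-3, 20_000
omega = 2 * np.pi * (9.0 + 0.2 * rng.normal(size=N))   # spread of natural frequencies (a.u.)
phi0 = 2 * np.pi * rng.random(N)                       # random initial phases

def order_parameter(phi):
    return np.abs(np.mean(np.exp(1j * phi)))

for K in (0.0, 5.0, 50.0):                             # coupling strengths (a.u.)
    phi = phi0.copy()
    for _ in range(steps):
        mean_field = np.mean(np.exp(1j * phi))
        # Kuramoto update: dphi_i/dt = omega_i + K * Im(mean_field * exp(-i*phi_i))
        phi += dt * (omega + K * np.imag(mean_field * np.exp(-1j * phi)))
    print(f"K = {K:5.1f}  order parameter after integration: {order_parameter(phi):.2f}")
```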
Max ERC Funding
2 500 000 €
Duration
Start date: 2019-09-01, End date: 2024-08-31