Project acronym 321
Project from Cubic To Linear complexity in computational electromagnetics
Researcher (PI) Francesco Paolo ANDRIULLI
Host Institution (HI) POLITECNICO DI TORINO
Call Details Consolidator Grant (CoG), PE7, ERC-2016-COG
Summary Computational Electromagnetics (CEM) is the scientific field at the origin of all new modeling and simulation tools required by the constantly arising design challenges of emerging and future technologies in applied electromagnetics. As in many other technological fields, however, the trend in all emerging technologies in electromagnetic engineering is going towards miniaturized, higher-density and multi-scale scenarios. Computationally, this translates into a steep increase in the number of degrees of freedom. Given that the design cost (the cost of a multi-right-hand-side problem dominated by matrix inversion) can scale as badly as cubically with these degrees of freedom, this fact, as pointed out by many, will significantly compromise the practical impact of CEM on future and emerging technologies.
For this reason, the CEM scientific community has been looking for years for an FFT-like paradigm shift: a dynamic fast direct solver providing a design cost that would scale only linearly with the degrees of freedom. Such a fast solver is considered today a Holy Grail of the discipline.
The Grand Challenge of 321 will be to tackle this Holy Grail in Computational Electromagnetics by investigating a dynamic Fast Direct Solver for Maxwell Problems that would run in a linear-instead-of-cubic complexity for an arbitrary number and configuration of degrees of freedom.
The failure of all previous attempts will be overcome by a game-changing transformation of the classical CEM problem that will leverage a recent breakthrough of the PI. Starting from this, the project will investigate an entirely new paradigm of impactful algorithms to achieve this grand challenge.
The impact of the FFT’s quadratic-to-linear paradigm shift shows how groundbreaking computational complexity reductions can be for applications. The cubic-to-linear paradigm shift, which the 321 project will aim for, will have a similarly disruptive impact on electromagnetic science and technology.
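The scale of the shift the project targets can be illustrated with textbook asymptotic operation counts (a back-of-envelope sketch using standard figures, not results from the 321 project): a dense direct solve costs on the order of n^3 operations, while an FFT-style algorithm costs on the order of n log n.

```python
import math

# Back-of-envelope asymptotic operation counts (textbook figures,
# not results from the 321 project).

def dense_solve_ops(n):
    """Dense LU factorisation: roughly (2/3) * n^3 floating-point operations."""
    return (2 / 3) * n ** 3

def fft_ops(n):
    """FFT: roughly 5 * n * log2(n) operations (vs ~n^2 for a naive DFT)."""
    return 5 * n * math.log2(n)

# Multiplying the degrees of freedom by 10 multiplies the cubic cost
# by 1000, while a (log-)linear cost grows by only a bit more than 10x.
for n in (10_000, 100_000, 1_000_000):
    print(f"n = {n:>9,}: dense ~{dense_solve_ops(n):.1e} ops, "
          f"FFT ~{fft_ops(n):.1e} ops")
```

This is why a multi-right-hand-side design loop dominated by dense inversion becomes impractical long before an FFT-like linear-cost solver would.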
Max ERC Funding
2 000 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym 3D-nanoMorph
Project Label-free 3D morphological nanoscopy for studying sub-cellular dynamics in live cancer cells with high spatio-temporal resolution
Researcher (PI) Krishna AGARWAL
Host Institution (HI) UNIVERSITETET I TROMSOE - NORGES ARKTISKE UNIVERSITET
Call Details Starting Grant (StG), PE7, ERC-2018-STG
Summary Label-free optical nanoscopy, free from photobleaching and photochemical toxicity of fluorescence labels and yielding 3D morphological resolution of <50 nm, is the future of live cell imaging. 3D-nanoMorph breaks the diffraction barrier and shifts the paradigm in label-free nanoscopy, providing isotropic 3D resolution of <50 nm. To achieve this, 3D-nanoMorph performs non-linear inverse scattering for the first time in nanoscopy and decodes scattering between sub-cellular structures (organelles).
3D-nanoMorph innovatively devises complementary roles for the light measurement system and the computational nanoscopy algorithm. A novel illumination system and a novel light collection system together enable measurement of only the most relevant intensity component and create a fresh perspective on label-free measurements. A new computational nanoscopy approach employs non-linear inverse scattering. Harnessing non-linear inverse scattering for resolution enhancement in nanoscopy opens new possibilities in label-free 3D nanoscopy.
I will apply 3D-nanoMorph to study organelle degradation (autophagy) in live cancer cells over extended durations with high spatial and temporal resolution, presently limited by the lack of high-resolution label-free 3D morphological nanoscopy. Successful 3D mapping of the nanoscale biological process of autophagy will open new avenues for cancer treatment and showcase 3D-nanoMorph for wider applications.
My cross-disciplinary expertise of 14 years, spanning inverse problems, electromagnetism, optical microscopy, integrated optics and live-cell nanoscopy, paves the path for successful implementation of 3D-nanoMorph.
Max ERC Funding
1 499 999 €
Duration
Start date: 2019-07-01, End date: 2024-06-30
Project acronym 3D-QUEST
Project 3D-Quantum Integrated Optical Simulation
Researcher (PI) Fabio Sciarrino
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Starting Grant (StG), PE2, ERC-2012-StG_20111012
Summary Quantum information was born from the merging of classical information and quantum physics. Its main objective is to understand the quantum nature of information and to learn how to process it using physical systems that operate according to the laws of quantum mechanics. Quantum simulation is a fundamental instrument to investigate phenomena of quantum system dynamics, such as quantum transport, particle localization and energy transfer, the quantum-to-classical transition, and even quantum-enhanced computation, all tasks that are hard to simulate with classical approaches. Within this framework, integrated photonic circuits have a strong potential to realize quantum information processing with optical systems.
The aim of 3D-QUEST is to develop and implement quantum simulation by exploiting 3-dimensional integrated photonic circuits. 3D-QUEST is structured to demonstrate the potential of linear optics to implement a computational power beyond that of a classical computer. Such a "hard-to-simulate" scenario arises when multiphoton-multimode platforms are realized. The 3D-QUEST research program will focus on three tasks of growing difficulty.
A-1. To simulate bosonic-fermionic dynamics with integrated optical systems acting on 2 photon entangled states.
A-2. To pave the way towards hard-to-simulate, scalable quantum linear optical circuits by investigating m-port interferometers acting on n-photon states with n>2.
A-3. To exploit 3-dimensional integrated structures for the observation of new quantum optical phenomena and for the quantum simulation of more complex scenarios.
3D-QUEST will exploit the potential of femtosecond laser writing of integrated waveguides. This technique will be adopted to realize 3-dimensional capabilities and high flexibility, bringing optical quantum simulation into a new regime.
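The "hard-to-simulate" regime of multiphoton linear optics is usually connected (via the standard boson-sampling argument, an assumption of this sketch rather than a statement in the abstract) to the fact that n-photon transition amplitudes in an m-port interferometer are matrix permanents, whose exact computation is #P-hard. A minimal illustration of that classical bottleneck, using Ryser's inclusion-exclusion formula:

```python
from itertools import combinations

def permanent(M):
    """Permanent of an n x n matrix via Ryser's formula, O(2^n * n^2).

    Even the best known exact algorithms are exponential in n, which is
    the computational core of the boson-sampling hardness argument.
    """
    n = len(M)
    total = 0.0
    # Inclusion-exclusion over non-empty column subsets S,
    # weighted by the sign (-1)^(n - |S|).
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1.0
            for row in M:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** (n - r) * prod
    return total

# perm([[a, b], [c, d]]) = a*d + b*c (all-plus signs, unlike the determinant).
print(permanent([[1, 2], [3, 4]]))  # → 10.0
```

Sampling from the output distribution of a large multiphoton interferometer therefore has no known efficient classical simulation, which is precisely the regime the 3D circuits target.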
Max ERC Funding
1 474 800 €
Duration
Start date: 2012-08-01, End date: 2017-07-31
Project acronym 3DSPIN
Project 3-Dimensional Maps of the Spinning Nucleon
Researcher (PI) Alessandro Bacchetta
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PAVIA
Call Details Consolidator Grant (CoG), PE2, ERC-2014-CoG
Summary How does the inside of the proton look like? What generates its spin?
3DSPIN will deliver essential information to answer these questions at the frontier of subnuclear physics.
At present, we have detailed maps of the distribution of quarks and gluons in the nucleon in 1D (as a function of their momentum in a single direction). We also know that quark spins account for only about 1/3 of the spin of the nucleon.
3DSPIN will lead the way into a new stage of nucleon mapping, explore the distribution of quarks in full 3D momentum space and obtain unprecedented information on orbital angular momentum.
Goals
1. extract from experimental data the 3D distribution of quarks (in momentum space), as described by Transverse-Momentum Distributions (TMDs);
2. obtain from TMDs information on quark Orbital Angular Momentum (OAM).
Methodology
3DSPIN will implement state-of-the-art fitting procedures to analyze relevant experimental data and extract quark TMDs, similarly to global fits of standard parton distribution functions. Information about quark angular momentum will be obtained through assumptions based on theoretical considerations. The next five years represent an ideal time window to accomplish our goals, thanks to the wealth of expected data from deep-inelastic scattering experiments (COMPASS, Jefferson Lab), hadronic colliders (Fermilab, BNL, LHC), and electron-positron colliders (BELLE, BABAR). The PI has a strong reputation in this field. The group will operate in partnership with the Italian National Institute of Nuclear Physics and in close interaction with leading experts and experimental collaborations worldwide.
Impact
Mapping the 3D structure of chemical compounds has revolutionized chemistry. Similarly, mapping the 3D structure of the nucleon will have a deep impact on our understanding of the fundamental constituents of matter. We will open new perspectives on the dynamics of quarks and gluons and sharpen our view of high-energy processes involving nucleons.
Max ERC Funding
1 509 000 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym 4DPHOTON
Project Beyond Light Imaging: High-Rate Single-Photon Detection in Four Dimensions
Researcher (PI) Massimiliano FIORINI
Host Institution (HI) ISTITUTO NAZIONALE DI FISICA NUCLEARE
Call Details Consolidator Grant (CoG), PE2, ERC-2018-COG
Summary The goal of the 4DPHOTON project is the development and construction of a photon imaging detector with unprecedented performance. The proposed device will be capable of detecting single-photon fluxes of up to one billion photons per second over areas of several square centimetres, and will measure, for each photon, position and time simultaneously with resolutions better than ten microns and a few tens of picoseconds, respectively. These figures of merit will open many important applications, allowing significant advances in particle physics, life sciences and other emerging fields where excellent timing and position resolutions are simultaneously required.
Our goal will be achieved thanks to an application-specific integrated circuit in 65 nm complementary metal-oxide-semiconductor (CMOS) technology that will deliver a timing resolution of a few tens of picoseconds at the pixel level over a few hundred thousand individually active pixel channels, allowing very high photon rates to be detected and the corresponding information to be digitized and transferred to a processing unit.
As a result of the 4DPHOTON project we will remove the constraints that many light imaging applications have due to the lack of precise single-photon information on four dimensions (4D): the three spatial coordinates and time simultaneously. In particular, we will prove the performance of this detector in the field of particle physics, performing the reconstruction of Cherenkov photon rings with a timing resolution of ten picoseconds. With its excellent granularity, timing resolution, rate capability and compactness, this detector will represent a new paradigm for the realisation of future Ring Imaging Cherenkov detectors, capable of achieving high efficiency particle identification in environments with very high particle multiplicities, exploiting time-association of the photon hits.
Max ERC Funding
1 975 000 €
Duration
Start date: 2019-12-01, End date: 2024-11-30
Project acronym ABACUS
Project Ab-initio adiabatic-connection curves for density-functional analysis and construction
Researcher (PI) Trygve Ulf Helgaker
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods and the inexpensive but not systematically refinable Kohn-Sham method of density-functional theory (DFT). The accuracy of Kohn-Sham calculations is determined by the quality of the exchange-correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange-correlation functional is unknown—instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed: the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, being directly guided by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab-initio adiabatic-connection curves, studying the exchange-correlation functional for a fixed density as the electronic interactions are turned on from zero to one. Pilot calculations have indicated the feasibility of this approach in simple cases—here, a comprehensive set of adiabatic-connection curves will be generated and utilized for calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn-Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available—for example, for understanding and modelling the behaviour of the exchange-correlation functional in electromagnetic fields.
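The interaction-strength integration described above is the standard adiabatic-connection formula of DFT (quoted here in its textbook form for orientation, not taken from the proposal itself):

```latex
% Exchange-correlation energy at fixed density \rho, recovered by
% integrating the adiabatic-connection integrand W_\lambda[\rho] as the
% electron-electron interaction strength \lambda is switched on from
% \lambda = 0 (non-interacting Kohn-Sham system) to \lambda = 1
% (fully interacting system), with the density held fixed:
\begin{equation}
  E_{\mathrm{xc}}[\rho] = \int_0^1 W_\lambda[\rho]\,\mathrm{d}\lambda .
\end{equation}
```

Computing $W_\lambda[\rho]$ ab initio at many values of $\lambda$ for a fixed, accurately known density is exactly what yields the adiabatic-connection curves the project will use to calibrate approximate functionals.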
Max ERC Funding
2 017 932 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym AFRICA-GHG
Project AFRICA-GHG: The role of African tropical forests on the Greenhouse Gases balance of the atmosphere
Researcher (PI) Riccardo Valentini
Host Institution (HI) FONDAZIONE CENTRO EURO-MEDITERRANEOSUI CAMBIAMENTI CLIMATICI
Call Details Advanced Grant (AdG), PE10, ERC-2009-AdG
Summary The role of the African continent in the global carbon cycle, and therefore in climate change, is increasingly recognised. Despite the increasingly acknowledged importance of Africa in the global carbon cycle and its high vulnerability to climate change, there is still a lack of studies on the carbon cycle in representative African ecosystems (in particular tropical forests), and on the effects of climate on ecosystem-atmosphere exchange. In the present proposal we want to focus on these specific objectives:
1. Understand the role of African tropical rainforests in the GHG balance of the atmosphere and revise their role in global methane and N2O emissions.
2. Determine the carbon source/sink strength of African tropical rainforests in the pre-industrial era versus the XXth century by temporal reconstruction of biomass growth with biogeochemical markers.
3. Understand and quantify carbon and GHG flux variability across African tropical forests (west-east equatorial belt).
4. Analyse the impact of forest degradation and deforestation on carbon and other GHG emissions.
Max ERC Funding
2 406 950 €
Duration
Start date: 2010-04-01, End date: 2014-12-31
Project acronym AGENSI
Project A Genetic View into Past Sea Ice Variability in the Arctic
Researcher (PI) Stijn DE SCHEPPER
Host Institution (HI) NORCE NORWEGIAN RESEARCH CENTRE AS
Call Details Consolidator Grant (CoG), PE10, ERC-2018-COG
Summary Arctic sea ice decline is the exponent of the rapidly transforming Arctic climate. The ensuing local and global implications can be understood by studying past climate transitions, yet few methods are available to examine past Arctic sea ice cover, severely restricting our understanding of sea ice in the climate system. The decline in Arctic sea ice cover is a ‘canary in the coalmine’ for the state of our climate, and if greenhouse gas emissions remain unchecked, summer sea ice loss may pass a critical threshold that could drastically transform the Arctic. Because historical observations are limited, it is crucial to have reliable proxies for assessing natural sea ice variability, its stability and sensitivity to climate forcing on different time scales. Current proxies address aspects of sea ice variability, but are limited due to a selective fossil record, preservation effects, regional applicability, or being semi-quantitative. With such restraints on our knowledge about natural variations and drivers, major uncertainties about the future remain.
I propose to develop and apply a novel sea ice proxy that exploits genetic information stored in marine sediments, sedimentary ancient DNA (sedaDNA). This innovation uses the genetic signature of phytoplankton communities from surface waters and sea ice as it gets stored in sediments. This wealth of information has not been explored before for reconstructing sea ice conditions. Preliminary results from my cross-disciplinary team indicate that our unconventional approach can provide a detailed, qualitative account of past sea ice ecosystems and quantitative estimates of sea ice parameters. I will address fundamental questions about past Arctic sea ice variability on different timescales, information essential to provide a framework upon which to assess the ecological and socio-economic consequences of a changing Arctic. This new proxy is not limited to sea ice research and can transform the field of paleoceanography.
Max ERC Funding
2 615 858 €
Duration
Start date: 2019-08-01, End date: 2024-07-31
Project acronym AGEnTh
Project Atomic Gauge and Entanglement Theories
Researcher (PI) Marcello DALMONTE
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Starting Grant (StG), PE2, ERC-2017-STG
Summary AGEnTh is an interdisciplinary proposal which aims at theoretically investigating atomic many-body systems (cold atoms and trapped ions) in close connection to concepts from quantum information, condensed matter, and high energy physics. The main goals of this programme are to:
I) Find scalable schemes for the measurement of entanglement properties, and in particular entanglement spectra, by proposing a paradigm shift that accesses entanglement through entanglement Hamiltonians and field theories instead of probing density matrices;
II) Show how atomic gauge theories (including dynamical gauge fields) are ideal candidates for the realization of long-sought, highly-entangled states of matter, in particular topological superconductors supporting parafermion edge modes, and novel classes of quantum spin liquids emerging from clustering;
III) Develop new implementation strategies for the realization of gauge symmetries of paramount importance, such as discrete and SU(N)xSU(2)xU(1) groups, and establish a theoretical framework for the understanding of atomic physics experiments within the light-from-chaos scenario pioneered in particle physics.
These objectives are at the cutting-edge of fundamental science, and represent a coherent effort aimed at underpinning unprecedented regimes of strongly interacting quantum matter by addressing the basic aspects of probing, many-body physics, and implementations. The results are expected to (i) build up and establish qualitatively new synergies between the aforementioned communities, and (ii) stimulate an intense theoretical and experimental activity focused on both entanglement and atomic gauge theories.
In order to achieve these goals, AGEnTh builds: (1) on my background working at the interface between atomic physics and quantum optics on one side, and many-body theory on the other, and (2) on exploratory studies which I carried out to mitigate the conceptual risks associated with its high-risk/high-gain goals.
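As a toy illustration of the quantities targeted in Goal I, the entanglement spectrum of a pure state can be read off from the eigenvalues of a reduced density matrix. The following sketch (purely illustrative, not from the proposal; the state and parameter are hypothetical) computes it for a two-qubit state with NumPy:

```python
import numpy as np

# Toy two-qubit state |psi> = cos(t)|00> + sin(t)|11>  (hypothetical example)
t = np.pi / 6
psi = np.zeros(4)
psi[0] = np.cos(t)  # amplitude of |00>
psi[3] = np.sin(t)  # amplitude of |11>

# Reduced density matrix of qubit A: partial trace over qubit B
rho = np.outer(psi, psi).reshape(2, 2, 2, 2)   # indices (a, b, a', b')
rho_A = np.einsum('ijkj->ik', rho)             # sum over b = b'

# Schmidt weights lambda_i and entanglement spectrum levels -ln(lambda_i)
lam = np.linalg.eigvalsh(rho_A)                # here: 0.25 and 0.75
levels = -np.log(lam[lam > 1e-12])
print(lam, levels)
```

The entanglement Hamiltonian perspective mentioned above amounts to treating `-np.log(rho_A)` itself as an effective Hamiltonian whose spectrum is `levels`.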
Max ERC Funding
1 055 317 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym AIDA
Project An Illumination of the Dark Ages: modeling reionization and interpreting observations
Researcher (PI) Andrei Albert Mesinger
Host Institution (HI) SCUOLA NORMALE SUPERIORE
Call Details Starting Grant (StG), PE9, ERC-2014-STG
Summary "Understanding the dawn of the first galaxies and how their light permeated the early Universe is at the very frontier of modern astrophysical cosmology. Generous resources, including ambitions observational programs, are being devoted to studying these epochs of Cosmic Dawn (CD) and Reionization (EoR). In order to interpret these observations, we propose to build on our widely-used, semi-numeric simulation tool, 21cmFAST, and apply it to observations. Using sub-grid, semi-analytic models, we will incorporate additional physical processes governing the evolution of sources and sinks of ionizing photons. The resulting state-of-the-art simulations will be well poised to interpret topical observations of quasar spectra and the cosmic 21cm signal. They would be both physically-motivated and fast, allowing us to rapidly explore astrophysical parameter space. We will statistically quantify the resulting degeneracies and constraints, providing a robust answer to the question, ""What can we learn from EoR/CD observations?"" As an end goal, these investigations will help us understand when the first generations of galaxies formed, how they drove the EoR, and what are the associated large-scale observational signatures."
Max ERC Funding
1 468 750 €
Duration
Start date: 2015-05-01, End date: 2021-01-31
Project acronym AISENS
Project New generation of high sensitive atom interferometers
Researcher (PI) Marco Fattori
Host Institution (HI) CONSIGLIO NAZIONALE DELLE RICERCHE
Call Details Starting Grant (StG), PE2, ERC-2010-StG_20091028
Summary Interferometers are fundamental tools for the study of nature laws and for the precise measurement and control of the physical world. In the last century, the scientific and technological progress has proceeded in parallel with a constant improvement of interferometric performances. For this reason, the challenge of conceiving and realizing new generations of interferometers with broader ranges of operation and with higher sensitivities is always open and actual.
Despite the introduction of laser devices has deeply improved the way of developing and performing interferometric measurements with light, the atomic matter wave analogous, i.e. the Bose-Einstein condensate (BEC), has not yet triggered any revolution in precision interferometry. However, thanks to recent improvements on the control of the quantum properties of ultra-cold atomic gases, and new original ideas on the creation and manipulation of quantum entangled particles, the field of atom interferometry is now mature to experience a big step forward.
The system I want to realize is a Mach-Zehnder spatial interferometer operating with trapped BECs. Undesired decoherence sources will be suppressed by implementing BECs with tunable interactions in ultra-stable optical potentials. Entangled states will be used to improve the sensitivity of the sensor beyond the standard quantum limit to ideally reach the ultimate, Heisenberg, limit set by quantum mechanics. The resulting apparatus will show unprecedented spatial resolution and will overcome state-of-the-art interferometers with cold (non condensed) atomic gases.
A successful completion of this project will lead to a new generation of interferometers for the immediate application to local inertial measurements with unprecedented resolution. In addition, we expect to develop experimental capabilities which might find application well beyond quantum interferometry and crucially contribute to the broader emerging field of quantum-enhanced technologies.
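The sensitivity gain targeted above can be made concrete: for N uncorrelated atoms the phase uncertainty scales as 1/sqrt(N) (the standard quantum limit), while ideal maximally entangled states can reach 1/N (the Heisenberg limit). These scalings are textbook results, not project data; a minimal numerical comparison:

```python
import math

def sql_phase_uncertainty(n: int) -> float:
    """Standard quantum limit: delta_phi ~ 1/sqrt(N) for N uncorrelated atoms."""
    return 1.0 / math.sqrt(n)

def heisenberg_phase_uncertainty(n: int) -> float:
    """Heisenberg limit: delta_phi ~ 1/N, ideally reachable with entangled states."""
    return 1.0 / n

n = 10_000
print(sql_phase_uncertainty(n))         # 0.01
print(heisenberg_phase_uncertainty(n))  # 0.0001
```

For 10,000 atoms the entanglement-enhanced scheme thus offers, in the ideal case, a factor sqrt(N) = 100 improvement in phase resolution.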
Max ERC Funding
1 068 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym ALLELECHOKER
Project DNA binding proteins for treatment of gain of function mutations
Researcher (PI) Enrico Maria Surace
Host Institution (HI) FONDAZIONE TELETHON
Call Details Starting Grant (StG), LS7, ERC-2012-StG_20111109
Summary Zinc finger (ZF) and transcription activator-like effector (TALE) based technologies are been allowing the tailored design of “artificial” DNA-binding proteins targeted to specific and unique DNA genomic sequences. Coupling DNA binding proteins to effectors domains enables the constitution of DNA binding factors for genomic directed transcriptional modulation or targeted genomic editing. We have demonstrated that pairing a ZF DNA binding protein to the transcriptional repressor Kruppel-associated box enables in vivo, the transcriptional repression of one of the most abundantly expressed gene in mammals, the human rhodopsin gene (RHO). We propose to generate RHO DNA binding silencers (“AlleleChoker”), which inactivate RHO either by transcriptional repression or targeted genome modification, irrespectively to wild-type or mutated alleles (mutational-independent approach), and combine RHO endogenous silencing to RHO replacement (silencing-replacement strategy). With this strategy in principle a single bimodal bio-therapeutic will enable the correction of any photoreceptor disease associated with RHO mutation. Adeno-associated viral (AAV) vector-based delivery will be used for photoreceptors gene transfer. Specifically our objectives are: 1) Construction of transcriptional repressors and nucleases for RHO silencing. Characterization and comparison of RHO silencing mediated by transcriptional repressors (ZFR/ TALER) or nucleases (ZFN/ TALEN) to generate genomic directed inactivation by non-homologous end-joining (NHEJ), and refer these results to RNA interference (RNAi) targeted to RHO; 2) RHO silencing in photoreceptors. to determine genome-wide DNA binding specificity of silencers, chromatin modifications and expression profile on human retinal explants; 3) Tuning silencing and replacement. 
To determine the impact of gene silencing-replacement strategy on disease progression in animal models of autosomal dominant retinitis pigmentosa (adRP) associated to RHO mutations
Max ERC Funding
1 354 840 €
Duration
Start date: 2013-02-01, End date: 2018-01-31
Project acronym AMDROMA
Project Algorithmic and Mechanism Design Research in Online MArkets
Researcher (PI) Stefano LEONARDI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA LA SAPIENZA
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Online markets currently form an important share of the global economy. The Internet hosts classical markets (real-estate, stocks, e-commerce) as well as allowing new markets with previously unknown features (web-based advertisement, viral marketing, digital goods, crowdsourcing, sharing economy). Algorithms play a central role in many decision processes involved in online markets. For example, algorithms run electronic auctions, trade stocks, adjust prices dynamically, and harvest big data to provide economic information. Thus, it is of paramount importance to understand the algorithmic and mechanism design foundations of online markets.
The algorithmic research issues that we consider involve algorithmic mechanism design, online and approximation algorithms, modelling uncertainty in online market design, and large-scale optimization and data mining. The aim of this research project is to combine these fields to address research questions that are central for today's Internet economy. We plan to apply these techniques to solve fundamental algorithmic problems motivated by Internet advertisement, the sharing economy, and online labour marketplaces. While my planned research is centered on foundational work with rigorous design and analysis of algorithms and mechanisms, it will also include, as an important component, empirical validation on large-scale real-life datasets.
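The electronic auctions mentioned above can be illustrated with the sealed-bid second-price (Vickrey) auction, a classical truthful mechanism in which bidding one's true value is a dominant strategy. This minimal sketch is purely illustrative and not part of the proposal; the bidder names are hypothetical:

```python
def vickrey_auction(bids: dict) -> tuple:
    """Second-price sealed-bid auction: the highest bidder wins
    but pays only the second-highest bid, which makes truthful
    bidding a dominant strategy."""
    if len(bids) < 2:
        raise ValueError("need at least two bids")
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    winner = ranked[0][0]       # highest bidder
    price = ranked[1][1]        # second-highest bid
    return winner, price

print(vickrey_auction({"alice": 10, "bob": 7, "carol": 9}))  # ('alice', 9)
```

Note that alice pays 9, not her own bid of 10: her payment is independent of her bid (as long as she wins), which is the source of the mechanism's truthfulness.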
Max ERC Funding
1 780 150 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym AN07AT
Project Understanding computational roles of new neurons generated in the adult hippocampus
Researcher (PI) Ayumu Tashiro
Host Institution (HI) NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET NTNU
Call Details Starting Grant (StG), LS4, ERC-2007-StG
Summary New neurons are continuously generated in certain regions of the adult mammalian brain. One of those regions is the dentate gyrus, a subregion of the hippocampus, which is essential for memory formation. Although these new neurons in the adult dentate gyrus are thought to have an important role in learning and memory, it is largely unclear how new neurons are involved in the information processing and storage underlying memory. Because new neurons constitute a minor portion of an intermingled local neuronal population, simple application of conventional techniques such as multi-unit extracellular recording and pharmacological lesion is not suitable for the functional analysis of new neurons. In this proposed research program, I will combine multi-unit recording and behavioral analysis with virus-mediated, cell-type-specific genetic manipulation of neuronal activity, to investigate computational roles of new neurons in learning and memory. Specifically, I will determine: 1) specific memory processes that require new neurons, 2) dynamic patterns of activity that new neurons express during memory-related behavior, 3) the influence of new neurons on their downstream structures. Further, based on the information obtained from these three lines of study, we will establish a causal relationship between specific memory-related behaviors and specific patterns of activity in new neurons. Solving these issues will cooperatively provide important insight into the computational roles performed by adult neurogenesis. The information on the function of new neurons in the normal brain could contribute to the future development of efficient therapeutic strategies for a variety of brain disorders.
Max ERC Funding
1 991 743 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym ANGIOPLACE
Project Expression and Methylation Status of Genes Regulating Placental Angiogenesis in Normal, Cloned, IVF and Monoparental Sheep Foetuses
Researcher (PI) Grazyna Ewa Ptak
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TERAMO
Call Details Starting Grant (StG), LS7, ERC-2007-StG
Summary Normal placental angiogenesis is critical for embryonic survival and development. Epigenetic modifications, such as methylation of CpG islands, regulate the expression and imprinting of genes. Epigenetic abnormalities have been observed in embryos from assisted reproductive technologies (ART), which could explain the poor placental vascularisation, embryonic/fetal death, and altered fetal growth in these pregnancies. Both cloned (somatic cell nuclear transfer, or SCNT) and monoparental (parthenogenotes, only maternal genes; androgenotes, only paternal genes) embryos provide important models for studying defects in expression and methylation status/imprinting of genes regulating placental function. Our hypothesis is that placental vascular development is compromised during early pregnancy in embryos from ART, in part due to altered expression or imprinting/methylation status of specific genes regulating placental angiogenesis. We will evaluate fetal growth, placental vascular growth, and expression and epigenetic status of genes regulating placental angiogenesis during early pregnancy in 3 Specific Aims: (1) after natural mating; (2) after transfer of biparental embryos from in vitro fertilization, and SCNT; and (3) after transfer of parthenogenetic or androgenetic embryos. These studies will therefore contribute substantially to our understanding of the regulation of placental development and vascularisation during early pregnancy, and could pinpoint the mechanisms contributing to embryonic loss and developmental abnormalities in foetuses from ART. Any or all of these observations will contribute to our understanding of, and also our ability to successfully employ, ART, which is becoming very widespread and important in human medicine as well as in animal production.
Max ERC Funding
363 600 €
Duration
Start date: 2008-10-01, End date: 2012-06-30
Project acronym ANISOTROPIC UNIVERSE
Project The anisotropic universe -- a reality or fluke?
Researcher (PI) Hans Kristian Kamfjord Eriksen
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE9, ERC-2010-StG_20091028
Summary "During the last decade, a strikingly successful cosmological concordance model has been established. With only six free parameters, nearly all observables, comprising millions of data points, may be fitted with outstanding precision. However, in this beautiful picture a few ""blemishes"" have turned up, apparently not consistent with the standard model: While the model predicts that the universe is isotropic (i.e., looks the same in all directions) and homogeneous (i.e., the statistical properties are the same everywhere), subtle hints of the contrary are now seen. For instance, peculiar preferred directions and correlations are observed in the cosmic microwave background; some studies considering nearby galaxies suggest the existence of anomalous large-scale cosmic flows; a study of distant quasars hints towards unexpected large-scale correlations. All of these reports are individually highly intriguing, and together they hint toward a more complicated and interesting universe than previously imagined -- but none of the reports can be considered decisive. One major obstacle in many cases has been the relatively poor data quality.
This is currently about to change, as the next generation of new and far more powerful experiments are coming online. Of special interest to me are Planck, an ESA-funded CMB satellite currently taking data; QUIET, a ground-based CMB polarization experiment located in Chile; and various large-scale structure (LSS) data sets, such as the SDSS and 2dF surveys, and in the future Euclid, a proposed galaxy survey satellite also funded by ESA. By combining the world s best data from both CMB and LSS measurements, I will in the proposed project attempt to settle this question: Is our universe really anisotropic? Or are these recent claims only the results of systematic errors or statistical flukes? If the claims turn out to hold against this tide of new and high-quality data, then cosmology as a whole may need to be re-written."
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym ANOREP
Project Targeting the reproductive biology of the malaria mosquito Anopheles gambiae: from laboratory studies to field applications
Researcher (PI) Flaminia Catteruccia
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PERUGIA
Call Details Starting Grant (StG), LS2, ERC-2010-StG_20091118
Summary Anopheles gambiae mosquitoes are the major vectors of malaria, a disease with devastating consequences for
human health. Novel methods for controlling the natural vector populations are urgently needed, given the
evolution of insecticide resistance in mosquitoes and the lack of novel insecticidals. Understanding the
processes at the bases of mosquito biology may help to roll back malaria. In this proposal, we will target
mosquito reproduction, a major determinant of the An. gambiae vectorial capacity. This will be achieved at
two levels: (i) fundamental research, to provide a deeper knowledge of the processes regulating reproduction
in this species, and (ii) applied research, to identify novel targets and to develop innovative approaches for
the control of natural populations. We will focus our analysis on three major players of mosquito
reproduction: male accessory glands (MAGs), sperm, and spermatheca, in both laboratory and field settings.
We will then translate this information into the identification of inhibitors of mosquito fertility. The
experimental activities will be divided across three objectives. In Objective 1, we will unravel the role of the
MAGs in shaping mosquito fertility and behaviour, by performing a combination of transcriptional and
functional studies that will reveal the multifaceted activities of these tissues. In Objective 2 we will instead
focus on the identification of the male and female factors responsible for sperm viability and function.
Results obtained in both objectives will be validated in field mosquitoes. In Objective 3, we will perform
screens aimed at the identification of inhibitors of mosquito reproductive success. This study will reveal as
yet unknown molecular mechanisms underlying reproductive success in mosquitoes, considerably increasing
our knowledge beyond the state-of-the-art and critically contributing with innovative tools and ideas to the
fight against malaria.
Max ERC Funding
1 500 000 €
Duration
Start date: 2011-01-01, End date: 2015-12-31
Project acronym ANTEGEFI
Project Analytic Techniques for Geometric and Functional Inequalities
Researcher (PI) Nicola Fusco
Host Institution (HI) UNIVERSITA DEGLI STUDI DI NAPOLI FEDERICO II
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary Isoperimetric and Sobolev inequalities are the best known examples of geometric-functional inequalities. In recent years the PI and collaborators have obtained new and sharp quantitative versions of these and other important related inequalities. These results have been obtained by the combined use of classical symmetrization methods, new tools coming from mass transportation theory, deep geometric measure tools and ad hoc symmetrizations. The objective of this project is to further develop these techniques in order to obtain: sharp quantitative versions of the Faber-Krahn inequality, the Gaussian isoperimetric inequality, the Brunn-Minkowski inequality, and the Poincaré and logarithmic Sobolev inequalities; sharp decay rates for the quantitative Sobolev inequalities and the Pólya-Szegő inequality.
Max ERC Funding
600 000 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym AROMA-CFD
Project Advanced Reduced Order Methods with Applications in Computational Fluid Dynamics
Researcher (PI) Gianluigi Rozza
Host Institution (HI) SCUOLA INTERNAZIONALE SUPERIORE DI STUDI AVANZATI DI TRIESTE
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary The aim of AROMA-CFD is to create a team of scientists at SISSA for the development of Advanced Reduced Order Modelling techniques with a focus on Computational Fluid Dynamics (CFD), in order to face and overcome many current limitations of the state of the art and improve the capabilities of reduced order methodologies for more demanding applications in industrial, medical and applied science contexts. AROMA-CFD deals with strong methodological developments in numerical analysis, with a special emphasis on mathematical modelling and extensive exploitation of computational science and engineering. Several tasks have been identified to tackle important problems and open questions in reduced order modelling: the study of bifurcations and instabilities in flows, increasing the Reynolds number while guaranteeing stability, moving towards turbulent flows, and considering complex geometrical parametrizations of shapes as computational domains in extended networks. A reduced computational and geometrical framework will be developed for nonlinear inverse problems, focusing on optimal flow control, shape optimization and uncertainty quantification. Further, all the advanced developments in reduced order modelling for CFD will be delivered for applications in multiphysics, such as fluid-structure interaction problems and general coupled phenomena involving inviscid, viscous and thermal flows, solids and porous media. The advanced framework developed within AROMA-CFD will provide attractive capabilities for several industrial and medical applications (e.g. aeronautical, mechanical, naval, off-shore, wind, sport, biomedical engineering, and cardiovascular surgery), combining high performance computing (in dedicated supercomputing centres) and advanced reduced order modelling (on common devices) to guarantee real-time computing and visualization.
A new open source software library for AROMA-CFD will be created: ITHACA, In real Time Highly Advanced Computational Applications.
Max ERC Funding
1 656 579 €
Duration
Start date: 2016-05-01, End date: 2021-04-30
Project acronym ARS
Project Autonomous Robotic Surgery
Researcher (PI) Paolo FIORINI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI VERONA
Call Details Advanced Grant (AdG), PE7, ERC-2016-ADG
Summary The goal of the ARS project is the derivation of a unified framework for the autonomous execution of robotic tasks in challenging environments in which accurate performance and safety are of paramount importance. We have chosen surgery as the research scenario because of its importance, its intrinsic challenges, and the presence of three factors that make this project feasible and timely. In fact, we have recently concluded the I-SUR project demonstrating the feasibility of autonomous surgical actions, we have access to the first large data set of clinical robotic surgeries made available to researchers, and we will be able to demonstrate the project results on the high-performance surgical robot “da Vinci Research Kit”. The impact of autonomous robots on the workforce is a current subject of discussion, but surgical autonomy will be welcomed by medical personnel, e.g. to carry out simple intervention steps, react faster to unexpected events, or monitor the onset of fatigue. The framework for autonomous robotic surgery will include five main research objectives. The first will address the analysis of robotic surgery data sets to extract action and knowledge models of the intervention. The second objective will focus on planning, which will consist of instantiating the intervention models to a patient-specific anatomy. The third objective will address the design of hybrid controllers for the discrete and continuous parts of the intervention. The fourth research objective will focus on real-time reasoning to assess the intervention state and the overall surgical situation. Finally, the last research objective will address the verification, validation and benchmarking of the autonomous surgical robotic capabilities.
The research results to be achieved by ARS will contribute to paving the way towards enhancing autonomy and operational capabilities of service robots, with the ambitious goal of bridging the gap between robotic and human task execution capability.
Max ERC Funding
2 750 000 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym AUTO-CD
Project COELIAC DISEASE: UNDERSTANDING HOW A FOREIGN PROTEIN DRIVES AUTOANTIBODY FORMATION
Researcher (PI) Ludvig Magne Sollid
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), LS6, ERC-2010-AdG_20100317
Summary The goal of this project is to understand the mechanism by which highly disease-specific autoantibodies are generated in response to exposure to a foreign antigen. IgA autoantibodies reactive with the enzyme transglutaminase 2 (TG2) are typical of coeliac disease (CD). These antibodies are only present in subjects who are HLA-DQ2 or -DQ8 positive, and their production is dependent on dietary gluten exposure. This suggests that CD4+ gluten-reactive T cells, which are found in CD patients and which recognise gluten peptides deamidated by TG2 in the context of DQ2 or DQ8, are implicated in the generation of these autoantibodies. Many small intestinal IgA+ plasma cells express membrane Ig, hence allowing isolation of antigen-specific cells. Whereas control subjects lack anti-TG2 IgA+ plasma cells, on average 10% of the plasma cells of CD patients are specific for TG2. We have sorted single TG2-reactive IgA+ plasma cells, cloned their VH and VL genes and expressed recombinant mAbs. So far we have expressed 26 TG2-specific mAbs. There is a strong bias for VH5-51 usage, and surprisingly the antibodies are modestly mutated. TG2 acts on specific glutamine residues and can either crosslink these to other proteins (transamidation) or hydrolyse the glutamine to a glutamate (deamidation). None of the 18 mAbs tested affected either transamidation or deamidation, leading us to hypothesise that the retained crosslinking ability of TG2 when bound to the membrane Ig of B cells is an integral part of the anti-TG2 response. Four models of how the activation of TG2-specific B cells is facilitated by TG2 crosslinking and the help of gluten-reactive CD4 T cells are proposed. These four models will be extensively tested, including in vivo assays with a newly generated transgenic anti-TG2 immunoglobulin knock-in mouse model.
Max ERC Funding
2 291 045 €
Duration
Start date: 2011-05-01, End date: 2017-04-30
Project acronym B Massive
Project Binary massive black hole astrophysics
Researcher (PI) Alberto SESANA
Host Institution (HI) UNIVERSITA' DEGLI STUDI DI MILANO-BICOCCA
Call Details Consolidator Grant (CoG), PE9, ERC-2018-COG
Summary Massive black hole binaries (MBHBs) are the most extreme, fascinating yet elusive astrophysical objects in the Universe. Establishing their existence observationally will be a milestone for contemporary astronomy, providing a fundamental missing piece in the puzzle of galaxy formation, piercing through the (hydro)dynamical physical processes shaping dense galactic nuclei from parsec scales down to the event horizon, and probing gravity in extreme conditions.
We can both see and listen to MBHBs. Remarkably, besides arguably being among the brightest variable objects shining in the Cosmos, MBHBs are also the loudest gravitational wave (GW) sources in the Universe. As such, we shall take advantage of both types of messengers – photons and gravitons – they are sending to us, which can now be probed by all-sky time-domain surveys and radio pulsar timing arrays (PTAs) respectively.
B MASSIVE leverages a unique comprehensive approach combining theoretical astrophysics, radio and gravitational-wave astronomy and time-domain surveys, with state-of-the-art data analysis techniques to: i) observationally prove the existence of MBHBs, ii) understand and constrain their astrophysics and dynamics, iii) enable and bring closer in time the direct detection of GWs with PTAs.
As a European PTA (EPTA) executive committee member and former International PTA (IPTA) chair, I am a driving force in the development of pulsar timing science world-wide, and the project will build on the profound knowledge, broad vision and wide collaboration network that established me as a world leader in the field of MBHB and GW astrophysics. B MASSIVE is extremely timely; a pulsar timing data set of unprecedented quality is being assembled by the EPTA/IPTA, and time-domain astronomy surveys are at their dawn. In the long term, B MASSIVE will be a fundamental milestone establishing European leadership in the cutting-edge field of MBHB astrophysics in the era of LSST, SKA and LISA.
Max ERC Funding
1 532 750 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym BACKUP
Project Unveiling the relationship between brain connectivity and function by integrated photonics
Researcher (PI) Lorenzo PAVESI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TRENTO
Call Details Advanced Grant (AdG), PE7, ERC-2017-ADG
Summary I will address the fundamental question of what role neuron activity and plasticity play in information processing and storage in the brain. Together with an interdisciplinary team, I will develop a hybrid neuromorphic computing platform. Integrated photonic circuits will be interfaced to both electronic circuits and neuronal circuits (in vitro experiments) to emulate brain functions and develop schemes able to supplement (backup) neuronal functions. The photonic network is based on massive reconfigurable matrices of nonlinear nodes formed by microring resonators, which enter a regime of self-pulsing and chaos through positive optical feedback. These networks resemble the human brain. I will push this analogy further by interfacing the photonic network with neurons to form a hybrid network. Using optogenetics, I will control synaptic strengthening and neuron activity. Deep learning algorithms will model the biological network functionality, initially within a separate artificial network and then in an integrated hybrid artificial-biological network.
My project aims at:
1. Developing a photonic integrated reservoir-computing network (RCN);
2. Developing dynamic memories in photonic integrated circuits using RCN;
3. Developing hybrid interfaces between a neuronal network and a photonic integrated circuit;
4. Developing a hybrid electronic, photonic and biological network that computes jointly;
5. Addressing neuronal network activity by photonic RCN to simulate in vitro memory storage and retrieval;
6. Elaborating the signal from RCN and neuronal circuits in order to cope with plastic changes in pathological brain conditions such as amnesia and epilepsy.
The long-term vision is that hybrid neuromorphic photonic networks will (a) clarify the way the brain thinks, (b) compute beyond the von Neumann paradigm, and (c) control and supplement specific neuronal functions.
Max ERC Funding
2 499 825 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym BEAT
Project The functional interaction of EGFR and beta-catenin signalling in colorectal cancer: Genetics, mechanisms, and therapeutic potential.
Researcher (PI) Andrea BERTOTTI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TORINO
Call Details Consolidator Grant (CoG), LS7, ERC-2016-COG
Summary Monoclonal antibodies against the EGF receptor (EGFR) provide substantive benefit to colorectal cancer (CRC) patients. However, no genetic lesions that robustly predict ‘addiction’ to the EGFR pathway have yet been identified. Further, even in tumours that regress after EGFR blockade, subsets of drug-tolerant cells often linger and foster ‘minimal residual disease’ (MRD), which portends tumour relapse.
Our preliminary evidence suggests that reliance on EGFR activity, as opposed to MRD persistence, could be assisted by genetically-based variations in transcription factor partnerships and activities, gene expression outputs, and biological fates controlled by the WNT/beta-catenin pathway. On such premises, BEAT (Beta-catenin and EGFR Abrogation Therapy) will elucidate the mechanisms of EGFR dependency, and escape from it, with the goal to identify biomarkers for more efficient clinical management of CRC and develop new therapies for MRD eradication.
A multidisciplinary approach will be pursued spanning from integrative gene regulation analyses to functional genomics in vitro, pharmacological experiments in vivo, and clinical investigation, to address whether: (i) specific genetic alterations of the WNT pathway affect anti-EGFR sensitivity; (ii) combined neutralisation of EGFR and WNT signals fuels MRD deterioration; (iii) data from analysis of this synergy can lead to the discovery of clinically meaningful biomarkers with predictive and prognostic significance.
This proposal capitalises on a unique proprietary platform for high-content studies based on a large biobank of viable CRC samples, which ensures strong analytical power and unprecedented biological flexibility. By providing fresh insight into the mechanisms whereby WNT/beta-catenin signalling differentially sustains EGFR dependency or drug tolerance, the project is expected to put forward an innovative reinterpretation of CRC molecular bases and advance the rational application of more effective therapies.
Max ERC Funding
1 793 421 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym bECOMiNG
Project spontaneous Evolution and Clonal heterOgeneity in MoNoclonal Gammopathies: from mechanisms of progression to clinical management
Researcher (PI) Niccolo Bolli
Host Institution (HI) UNIVERSITA DEGLI STUDI DI MILANO
Call Details Consolidator Grant (CoG), LS7, ERC-2018-COG
Summary As an onco-hematologist with strong expertise in genomics, I have contributed significantly to the understanding of multiple myeloma (MM) heterogeneity and its evolution over time, driven by genotypic and phenotypic features carried by different subpopulations of cells. MM is preceded by prevalent, asymptomatic stages that may evolve with variable frequency, not accurately captured by current clinical prognostic scores. Supported by preliminary data, my hypothesis is that the same heterogeneity is present early in the disease course, and that identification of the biological determinants of evolution at this stage will allow better prediction of its evolutionary trajectory, if not its control. In this proposal I will therefore make a sharp change from conventional approaches and move to the early stages of MM, using unique retrospective sample cohorts and ambitious prospective sampling. To identify clonal MM cells in the elderly before a monoclonal gammopathy can be detected, I will collect bone marrow (BM) from hundreds of hip replacement specimens and analyze archived peripheral blood samples from thousands of healthy individuals with years of annotated clinical follow-up. This will identify early genomic alterations that are permissive to disease initiation/evolution and may serve as biomarkers for clinical screening. Through innovative, integrated single-cell genotyping and phenotyping of hundreds of asymptomatic MMs, I will functionally dissect heterogeneity and characterize the BM microenvironment to look for determinants of disease progression. Correlation with clinical outcome and minimally invasive serial sampling of circulating cell-free DNA will identify candidate biological markers to better predict evolution. Lastly, aggressive modelling of candidate early lesions and modifier screens will offer a list of vulnerabilities that could be exploited for rational therapies.
These methodologies will deliver a paradigm for the use of molecularly-driven precision medicine in cancer.
Max ERC Funding
1 998 781 €
Duration
Start date: 2019-03-01, End date: 2024-02-29
Project acronym BIOINOHYB
Project Smart Bioinorganic Hybrids for Nanomedicine
Researcher (PI) Cristiana Di Valentin
Host Institution (HI) UNIVERSITA' DEGLI STUDI DI MILANO-BICOCCA
Call Details Consolidator Grant (CoG), PE5, ERC-2014-CoG
Summary The use of bioinorganic nanohybrids (nanoscale systems based on an inorganic and a biological component) has already resulted in several innovative medical breakthroughs for drug delivery, therapeutics, imaging, diagnosis and biocompatibility. However, researchers still know relatively little about the structure, function and mechanism of these nanodevices. Theoretical investigations of bioinorganic interfaces are mostly limited to force-field approaches, which cannot grasp the details of the physicochemical mechanisms. The BIOINOHYB project proposes to capitalize on recent massively parallelized codes to investigate bioinorganic nanohybrids by advanced quantum chemical methods. This approach will make it possible to master the chemical and electronic interplay between the bio and the inorganic components in the first part of the project, and the interaction of the hybrid systems with light in the second part. The ultimate goal is to provide the design principles for novel, unconventional assemblies with unprecedented functionalities and strong impact potential in nanomedicine.
More specifically, in this project the traditional metallic nanoparticle will be substituted by emerging semiconducting metal oxide nanostructures with photocatalytic or magnetic properties capable of opening totally new horizons in nanomedicine (e.g. photocatalytic therapy, a new class of contrast agents, magnetically guided drug delivery). Potentially efficient linkers will be screened regarding their ability both to anchor surfaces and to bind biomolecules. Different kinds of biomolecules (from oligopeptides and oligonucleotides to small drugs) will be tethered to the activated surface according to the desired functionality. The key computational challenge, requiring the recourse to more sophisticated methods, will be the investigation of the photo-response to light of the assembled bioinorganic systems, also with specific reference to their labelling with fluorescent markers and contrast agents.
Max ERC Funding
1 748 125 €
Duration
Start date: 2016-02-01, End date: 2021-01-31
Project acronym BioLEAP
Project Biotechnological optimization of light use efficiency in algae photobioreactors
Researcher (PI) Tomas Morosinotto
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PADOVA
Call Details Starting Grant (StG), LS9, ERC-2012-StG_20111109
Summary New renewable energy sources are sorely needed to compensate for dwindling fossil fuel reserves and reduce greenhouse gas emissions. Some species of algae have interesting potential as feedstock for the production of biodiesel thanks to their ability to accumulate large amounts of lipids. Strong research efforts are however needed to fulfil this potential and address many issues involving optimization of cultivation systems, biomass harvesting and algae genetic improvement. This proposal aims to address one of these issues, the optimization of algae light use efficiency. Light, in fact, provides the energy supporting algae growth and must be exploited with the highest possible efficiency to achieve sufficient productivity.
In a photobioreactor algae are highly concentrated, and this causes an inhomogeneous light distribution, with a large fraction of the cells exposed to very low light or even kept in the dark. Algae are also actively mixed, so they can abruptly move from darkness to full illumination and vice versa. This proposal aims to assess how the alternation of dark/light cycles affects algae growth and the functionality of the photosynthetic apparatus, both in batch and continuous cultures. In collaboration with the Chemical Engineering department, experimental data will be exploited to build a model describing the photobioreactor, a fundamental tool to improve its design.
The other main goal of this proposal is the isolation of genetically improved strains better suited to the artificial environment of a photobioreactor. A first phase of work setting up transformation protocols will be followed by a second phase of generation and selection of mutants with altered photosynthetic performance. Transcriptome analyses in different light conditions will also be instrumental in identifying genes to be targeted by genetic engineering.
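The inhomogeneous light distribution described above follows, to a first approximation, the Beer–Lambert law: irradiance decays exponentially with depth into the culture. The sketch below illustrates this effect with purely illustrative parameter values (the attenuation coefficient and biomass concentration are assumptions, not figures from the proposal).

```python
import math

def light_profile(i0, k, biomass, depths):
    """Local irradiance I(z) = I0 * exp(-k * C * z) (Beer-Lambert attenuation).

    i0      -- incident irradiance at the reactor surface (umol photons m^-2 s^-1)
    k       -- biomass-specific attenuation coefficient (m^2 g^-1), assumed value
    biomass -- culture concentration C (g m^-3), assumed value
    depths  -- positions z into the culture (m)
    """
    return [i0 * math.exp(-k * biomass * z) for z in depths]

# Illustrative numbers only: in a dense culture light is extinguished within
# centimetres, so mixed cells experience rapid dark/light alternation.
profile = light_profile(i0=1500.0, k=0.05, biomass=2000.0,
                        depths=[0.0, 0.01, 0.02, 0.05])
print([round(p, 1) for p in profile])
```

With these assumed values the irradiance drops below 1% of the surface level within 5 cm, which is why cells circulating in a photobioreactor see abrupt dark/light transitions rather than steady illumination.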
Max ERC Funding
1 257 600 €
Duration
Start date: 2012-10-01, End date: 2017-09-30
Project acronym BioMNP
Project Understanding the interaction between metal nanoparticles and biological membranes
Researcher (PI) Giulia Rossi
Host Institution (HI) UNIVERSITA DEGLI STUDI DI GENOVA
Call Details Starting Grant (StG), PE3, ERC-2015-STG
Summary The BioMNP objective is the molecular-level understanding of the interactions between surface functionalized metal nanoparticles and biological membranes, by means of cutting-edge computational techniques and new molecular models.
Metal nanoparticles (NPs) play increasingly important roles in pharmaceutical and medical technology as diagnostic or therapeutic devices. Metal NPs can nowadays be engineered in a multitude of shapes, sizes and compositions, and they can be decorated with an almost infinite variety of functionalities. Despite such technological advances, there is still poor understanding of the molecular processes that drive the interactions of metal NPs with cells. Cell membranes are the first barrier encountered by NPs entering living organisms. Understanding and controlling the interaction of nanoparticles with biological membranes is therefore of paramount importance for uncovering the molecular basis of the NP biological effects.
BioMNP will go beyond the state of the art by rationalizing the complex interplay of NP size, composition, functionalization and aggregation state during the interaction with model biomembranes. Membranes, in turn, will be modelled at an increasing level of complexity in terms of lipid composition and phase. BioMNP will rely on cutting-edge simulation techniques and facilities, and develop new coarse-grained models grounded on finer-level atomistic simulations, to study the NP-membrane interactions on an extremely large range of length and time scales.
BioMNP will benefit from important and complementary experimental collaborations, will propose interpretations of the available experimental data and make predictions to guide the design of functional, non-toxic metal nanoparticles for biomedical applications. BioMNP aims at answering fundamental questions at the crossroads of physics, biology and chemistry. Its results will have an impact on nanomedicine, toxicology, nanotechnology and material sciences.
Max ERC Funding
1 131 250 €
Duration
Start date: 2016-04-01, End date: 2021-03-31
Project acronym BIOSMA
Project Mathematics for Shape Memory Technologies in Biomechanics
Researcher (PI) Ulisse Stefanelli
Host Institution (HI) CONSIGLIO NAZIONALE DELLE RICERCHE
Call Details Starting Grant (StG), PE1, ERC-2007-StG
Summary Shape Memory Alloys (SMAs) are nowadays widely exploited for the realization of innovative devices and have a great impact on the development of a variety of biomedical applications ranging from orthodontic archwires to vascular stents. The design, realization, and optimization of such devices are quite demanding tasks. Mathematics is involved in this process as a major tool for making the modeling more accurate, the numerical simulations more reliable, and the design more effective. Many material properties of SMAs, such as martensitic reorientation, training, and ferromagnetic behavior, are still to be properly and efficiently addressed. Therefore, new modeling ideas, along with original analytical and numerical techniques, are required. This project is aimed at addressing novel mathematical issues in order to move from experimental materials results toward the solution of real-scale biomechanical engineering problems. The research focus will be multidisciplinary and include modeling, analytic, numerical, and computational issues. Progress in the macroscopic description of SMAs, the computational simulation of real-scale SMA devices, and the optimization of the production processes will contribute to advances toward innovative applications.
Max ERC Funding
700 000 €
Duration
Start date: 2008-09-01, End date: 2013-08-31
Project acronym BISMUTH
Project Breaking Inversion Symmetry in Magnets: Understand via THeory
Researcher (PI) Silvia Picozzi
Host Institution (HI) CONSIGLIO NAZIONALE DELLE RICERCHE
Call Details Starting Grant (StG), PE3, ERC-2007-StG
Summary Multiferroics (i.e. materials where ferroelectricity and magnetism coexist) are presently drawing enormous interest, due to their technologically relevant multifunctional character and to the astoundingly rich playground for fundamental condensed-matter physics they constitute. Here, we put forward several concepts on how to break inversion symmetry and achieve sizable ferroelectricity in collinear magnets; our approach is corroborated via first-principles calculations as tools to quantitatively estimate relevant ferroelectric and magnetic properties as well as to reveal ab initio the main mechanisms behind the dipolar and magnetic orders. In closer detail, we focus on the interplay between ferroelectricity and electronic degrees of freedom in magnets, i.e. on those cases where spin-, orbital- or charge-ordering can be the driving force for a spontaneous polarization to develop. Antiferromagnetism will be considered as a primary mechanism for lifting inversion symmetry; however, the effects of charge disproportionation and orbital ordering will also be studied by examining a wide class of materials, including ortho-manganites with E-type spin arrangement, non-E-type antiferromagnets, nickelates, etc. Finally, as an example of materials design accessible to our ab initio approach, we use “chemistry” to break inversion symmetry by artificially constructing an oxide superlattice and propose a way to switch, via an electric field, from antiferromagnetism to ferrimagnetism. To our knowledge, the link between electronic degrees of freedom and ferroelectricity in collinear magnets is an almost totally unexplored field for ab initio methods; indeed, its clear understanding and optimization would lead to a scientific breakthrough in the multiferroics area.
Technologically, it would pave the way to the materials design of magnetic ferroelectrics with properties persisting above room temperature and, therefore, to a novel generation of electrically controlled spintronic devices.
Max ERC Funding
684 000 €
Duration
Start date: 2008-05-01, End date: 2012-04-30
Project acronym Bits2Cosmology
Project Time-domain Gibbs sampling: From bits to inflationary gravitational waves
Researcher (PI) Hans Kristian ERIKSEN
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), PE9, ERC-2017-COG
Summary The detection of primordial gravitational waves created during the Big Bang ranks among the greatest potential intellectual achievements in modern science. During the last few decades, the instrumental progress necessary to achieve this has been nothing short of breathtaking, and today we are able to measure the microwave sky with better than one-in-a-million precision. However, from the latest ultra-sensitive experiments such as BICEP2 and Planck, it is clear that instrumental sensitivity alone will not be sufficient to make a robust detection of gravitational waves. Contamination in the form of astrophysical radiation from the Milky Way, for instance thermal dust and synchrotron radiation, obscures the cosmological signal by orders of magnitude. Even more critical, though, are second-order interactions between this radiation and the instrument characterization itself, which lead to a highly non-linear and complicated problem.
I propose a ground-breaking solution to this problem that allows for joint estimation of cosmological parameters, astrophysical components, and instrument specifications. The engine of this method is called Gibbs sampling, which I have already applied extremely successfully to basic CMB component separation. The new and critical step is to apply this method to the raw time-ordered observations recorded directly by the instrument, as opposed to pre-processed frequency maps. While representing a ~100-fold increase in input data volume, this step is unavoidable in order to break through the current foreground-induced systematics floor. I will apply this method to the best currently available and future data sets (WMAP, Planck, SPIDER and LiteBIRD), and thereby derive the world's tightest constraint on the amplitude of inflationary gravitational waves. Additionally, the resulting ancillary science in the form of robust cosmological parameters and astrophysical component maps will represent the state of the art in observational cosmology in years to come.
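As a toy illustration of the Gibbs sampling engine named above (and not of the project's actual CMB pipeline), the sketch below alternately draws each variable of a correlated bivariate Gaussian from its full conditional distribution; the correlation and sample counts are illustrative assumptions.

```python
import math
import random

def gibbs_bivariate_normal(rho, n_samples, burn_in=500, seed=0):
    """Gibbs sampler for a zero-mean, unit-variance bivariate normal.

    Each full conditional is itself Gaussian:
        x | y ~ N(rho * y, 1 - rho**2)
        y | x ~ N(rho * x, 1 - rho**2)
    Alternating these draws yields samples from the joint distribution.
    """
    rng = random.Random(seed)
    sd = math.sqrt(1.0 - rho * rho)
    x, y = 0.0, 0.0
    samples = []
    for i in range(burn_in + n_samples):
        x = rng.gauss(rho * y, sd)  # draw x from p(x | y)
        y = rng.gauss(rho * x, sd)  # draw y from p(y | x)
        if i >= burn_in:            # discard initial transient
            samples.append((x, y))
    return samples

samples = gibbs_bivariate_normal(rho=0.8, n_samples=20000)
mean_x = sum(s[0] for s in samples) / len(samples)
corr_num = sum(s[0] * s[1] for s in samples) / len(samples)
print(f"sample mean of x ~ {mean_x:.2f}, sample E[xy] ~ {corr_num:.2f}")
```

The same alternate-conditional-draw idea scales from this two-variable toy to the joint sky, foreground and instrument parameters described in the summary, which is what makes the method attractive for such high-dimensional problems.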
Max ERC Funding
1 999 205 €
Duration
Start date: 2018-04-01, End date: 2023-03-31
Project acronym BIVAQUM
Project Bivariational Approximations in Quantum Mechanics and Applications to Quantum Chemistry
Researcher (PI) Simen Kvaal
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Starting Grant (StG), PE4, ERC-2014-STG
Summary The standard variational principles (VPs) are cornerstones of quantum mechanics, and one can hardly overestimate their usefulness as tools for generating approximations to the time-independent and time-dependent Schrödinger equations. The aim of the proposal is to study and apply a generalization of these, the bivariational principles (BIVPs), which arise naturally when one does not assume a priori that the system Hamiltonian is Hermitian. This unconventional approach may have transformative impact on the development of ab initio methodology, both for electronic structure and dynamics.
The first objective is to establish the mathematical foundation for the BIVPs. This opens up a whole new axis of method development for ab initio approaches. For instance, it is a largely ignored fact that the popular traditional coupled cluster (TCC) method can be neatly formulated with the BIVPs, and TCC is both polynomially scaling with the number of electrons and size-consistent. No “variational” method enjoys these properties simultaneously; indeed, this seems to be incompatible with the standard VPs.
Armed with the BIVPs, the project aims to develop new and understand existing ab initio methods. The second objective is thus a systematic multireference coupled cluster theory (MRCC) based on the BIVPs. This is in itself a novel approach that carries large potential benefits and impact. The third and last objective is an implementation of a new coupled-cluster-type method where the orbitals are bivariational parameters. This gives a size-consistent hierarchy of approximations to multiconfiguration Hartree–Fock.
The PI's broad contact with and background in scientific disciplines such as applied mathematics and nuclear physics, in addition to quantum chemistry, increases the feasibility of the project.
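For readers unfamiliar with the bivariational idea, the following sketch (in notation chosen here, not taken from the proposal) contrasts the standard Rayleigh quotient with the bivariational energy functional, in which the bra and ket are varied independently so that the Hamiltonian need not be assumed Hermitian:

```latex
% Standard variational principle: stationary points of the Rayleigh
% quotient over a single trial state are eigenstates of a Hermitian H.
E[\Psi] = \frac{\langle \Psi | H | \Psi \rangle}{\langle \Psi | \Psi \rangle}

% Bivariational generalization: bra \tilde{\Psi} and ket \Psi are
% treated as independent variational parameters.
\mathcal{E}[\tilde{\Psi}, \Psi]
  = \frac{\langle \tilde{\Psi} | H | \Psi \rangle}
         {\langle \tilde{\Psi} | \Psi \rangle}

% Stationarity with respect to both arguments yields the pair of
% (left and right) eigenvalue problems:
H |\Psi\rangle = E\,|\Psi\rangle,
\qquad
\langle \tilde{\Psi} | H = E\,\langle \tilde{\Psi} |
```

Because the two stationarity conditions decouple into left and right eigenvalue problems, non-Hermitian similarity-transformed Hamiltonians of the kind appearing in coupled cluster theory fit naturally into this framework.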
Max ERC Funding
1 499 572 €
Duration
Start date: 2015-04-01, End date: 2020-03-31
Project acronym BONEPHAGY
Project Defining the role of the FGF – autophagy axis in bone physiology
Researcher (PI) Carmine SETTEMBRE
Host Institution (HI) FONDAZIONE TELETHON
Call Details Starting Grant (StG), LS4, ERC-2016-STG
Summary Autophagy is a fundamental cellular catabolic process dedicated to the degradation and recycling of a variety of intracellular materials. Autophagy plays a significant role in multiple human physio-pathological processes and is now emerging as a critical regulator of skeletal development and homeostasis. We have discovered that during postnatal development in mice, the growth factor FGF18 induces autophagy in the chondrocyte cells of the growth plate to regulate the secretion of type II collagen, a major component of the cartilaginous extracellular matrix. The FGF signaling pathways play crucial roles during skeletal development and maintenance and are deregulated in many skeletal disorders. Hence our findings may offer the unique opportunity to uncover new molecular mechanisms through which FGF pathways regulate skeletal development and maintenance, and to identify new targets for the treatment of FGF-related skeletal disorders. In this grant application we propose to study the role played by the different FGF ligands and receptors in autophagy regulation and to investigate the physiological relevance of these findings in the context of skeletal growth, homeostasis and maintenance. We will also investigate the intracellular machinery that links FGF signalling pathways to the regulation of autophagy. In addition, we have generated preliminary data showing an impairment of autophagy in chondrocyte models of Achondroplasia (ACH) and Thanatophoric dysplasia, two skeletal disorders caused by mutations in FGFR3. We propose to study the role of autophagy in the pathogenesis of FGFR3-related dwarfisms and to explore the pharmacological modulation of autophagy as a new therapeutic approach for achondroplasia. This application, which combines cell biology, mouse genetics and pharmacological approaches, has the potential to shed light on new mechanisms involved in organismal development and homeostasis, which could be targeted to treat bone and cartilage diseases.
Max ERC Funding
1 586 430 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym BPT
Project BEYOND PLATE TECTONICS
Researcher (PI) Trond Helge Torsvik
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE10, ERC-2010-AdG_20100224
Summary Plate tectonics characterises the complex and dynamic evolution of the outer shell of the Earth in terms of rigid plates. These tectonic plates overlie and interact with the Earth's mantle, which is slowly convecting owing to energy released by the decay of radioactive nuclides in the Earth's interior. Even though links between mantle convection and plate tectonics are becoming more evident, notably through subsurface tomographic images, advances in mineral physics and improved absolute plate motion reference frames, there is still no generally accepted mechanism that consistently explains plate tectonics and mantle convection in one framework. We will integrate plate tectonics into mantle dynamics and develop a theory that explains plate motions quantitatively and dynamically. This requires consistent and detailed reconstructions of plate motions through time (Objective 1).
A new model of plate kinematics will be linked to the mantle with the aid of a new global reference frame based on moving hotspots and on palaeomagnetic data. The global reference frame will be corrected for true polar wander in order to develop a global plate motion reference frame with respect to the mantle back to Pangea (ca. 320 million years) and possibly Gondwana assembly (ca. 550 million years). The resulting plate reconstructions will constitute the input to subduction models that are meant to test the consistency between the reference frame and subduction histories. The final outcome will be a novel global subduction reference frame, to be used to unravel links between the surface and deep Earth (Objective 2).
Max ERC Funding
2 499 010 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym BrainBIT
Project All-optical brain-to-brain behaviour and information transfer
Researcher (PI) Francesco PAVONE
Host Institution (HI) UNIVERSITA DEGLI STUDI DI FIRENZE
Call Details Advanced Grant (AdG), PE2, ERC-2015-AdG
Summary Exchange of information between different brains usually takes place through the interaction between bodies and the external environment. The ultimate goal of this project is to establish a novel paradigm of brain-to-brain communication based on direct all-optical recording and controlled stimulation of neuronal activity in different subjects. To pursue this challenging objective, we propose to develop optical technologies well beyond the state of the art for simultaneous neuronal “reading” and “writing” across large volumes and with high spatial and temporal resolution, targeted to the transfer of advantageous behaviour in physiological and pathological conditions.
We will perform whole-brain high-resolution imaging in zebrafish larvae to disentangle the activity patterns related to different tasks. We will then use these patterns as stimulation templates in other larvae to investigate spatio-temporal subject-invariant signatures of specific behavioural states. This ‘pump and probe’ strategy will allow gaining deep insights into the complex relationship between neuronal activity and subject behaviour.
To move towards clinically oriented studies on brain stimulation therapies, we will complement whole-brain experiments in zebrafish with large-area functional imaging and optostimulation in mammals. We will investigate all-optical brain-to-brain information transfer to boost an advantageous behaviour, i.e. motor recovery, in a mouse model of stroke. Mice showing more effective responses to rehabilitation will provide neuronal activity templates to be elicited in other animals, in order to increase rehabilitation efficiency.
We strongly believe that the implementation of new technologies for all-optical transfer of behaviour between different subjects will offer unprecedented views of neuronal activity in healthy and injured brain, paving the way to more effective brain stimulation therapies.
Max ERC Funding
2 370 250 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym BRIDGE
Project Bridging the gap between Gas Emissions and geophysical observations at active volcanoes
Researcher (PI) Alessandro Aiuppa
Host Institution (HI) UNIVERSITA DEGLI STUDI DI PALERMO
Call Details Starting Grant (StG), PE10, ERC-2012-StG_20111012
Summary In spite of their significance for a variety of volcanological questions, gas observations at volcanoes have long lagged behind geophysical studies. This has primarily reflected the inherent technical limitations met by gas geochemists in capturing volcanic gas properties (chemistry and flux) at high rate (1 Hz) and with permanent instrumental arrays. The poor temporal resolution of volcanic gas observations has, in addition, precluded the real-time analysis of fast-occurring volcanic processes, such as those occurring shortly prior to eruptions, thereby limiting the use of gas geochemistry in volcanic hazard assessment. However, the recent progress made by modern multi-component/high-frequency measurement techniques now opens the way for a decisive step beyond the current state of the art.
The BRIDGE research proposal has the ambitious goal of bridging the existing technological gap between geochemical and geophysical observations at volcanoes. This will be achieved by designing, setting up, and deploying in the field innovative instruments for 1 Hz observations of volcanic SO2 and CO2 fluxes. The co-acquired volcanic gas and geophysical information will then be combined within a single interpretative framework, contributing to filling our current gap of knowledge on fast (timescales of seconds to minutes) degassing processes, and to a deeper exploration of the role played by gas exsolution from (and migration through) silicate liquids as an effective source mechanism of the physical signals (e.g., LP and VLP seismicity, and tremor) measured at volcanoes. Finally, this combined volcanic gas-geophysical approach will be used to yield improved modelling and understanding of a variety of volcanic features, including modes and rates of gas separation from magmas, mechanisms of gas flow in conduits, and trigger mechanisms of explosive volcanic eruptions.
Max ERC Funding
1 496 222 €
Duration
Start date: 2012-10-01, End date: 2016-09-30
Project acronym BrightEyes
Project Multi-Parameter Live-Cell Observation of Biomolecular Processes with Single-Photon Detector Array
Researcher (PI) Giuseppe Vicidomini
Host Institution (HI) FONDAZIONE ISTITUTO ITALIANO DI TECNOLOGIA
Call Details Consolidator Grant (CoG), PE7, ERC-2018-COG
Summary Fluorescence single-molecule (SM) detection techniques have the potential to provide insights into the complex functions, structures and interactions of individual, specifically labelled biomolecules. However, current SM techniques work properly only when the biomolecule is observed in controlled environments, e.g., immobilized on a glass surface. Observation of biomolecular processes in living (multi)cellular environments – which is fundamental for sound biological conclusions – always comes with a price, such as invasiveness, limitations in the accessible information, and constraints on the spatial and temporal scales.
The overall objective of the BrightEyes project is to break the above limitations by creating a novel SM approach compatible with state-of-the-art biomolecule-labelling protocols, able to track a biomolecule deep inside (multi)cellular environments – with microsecond-scale temporal resolution and a tracking range of hundreds of micrometres – and simultaneously observe its structural changes and its nano- and micro-environments.
Specifically, by exploiting a novel single-photon detector array, the BrightEyes project will implement an optical system able to continuously (i) track in real time the biomolecule of interest, from which to decode its dynamics and interactions; (ii) measure the fluorescence spectroscopy properties of the nano-environment, such as lifetime, photon-pair correlation and intensity, from which to extract the biochemical properties of the nano-environment, the structural properties of the biomolecule – via SM-FRET and anti-bunching – and the interactions of the biomolecule with other biomolecular species – via STED-FCS; (iii) visualize the sub-cellular structures within the micro-environment with sub-diffraction spatial resolution – via STED and image scanning microscopy.
This unique paradigm will enable unprecedented studies of biomolecular behaviours, interactions and self-organization at near-physiological conditions.
Max ERC Funding
1 861 250 €
Duration
Start date: 2019-09-01, End date: 2024-08-31
Project acronym BRuSH
Project Oral bacteria as determinants for respiratory health
Researcher (PI) Randi BERTELSEN
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Starting Grant (StG), LS7, ERC-2018-STG
Summary The oral cavity is the gateway to the lower respiratory tract, and oral bacteria are likely to play a role in lung health. This may be the case for pathogens as well as for commensal bacteria and the balance between species. The oral bacterial community of patients with periodontitis is dominated by gram-negative bacteria and exhibits higher lipopolysaccharide (LPS) activity than healthy microbiota. Furthermore, bacteria with especially potent pro-inflammatory LPS have been shown to be more common in the lungs of asthmatic than of healthy individuals. The working hypothesis of BRuSH is that microbiome communities dominated by LPS-producing bacteria which induce a particularly strong pro-inflammatory immune response in the host will have a negative effect on respiratory health. I will test this hypothesis in two longitudinally designed population-based lung health studies. I aim to identify whether specific bacterial compositions and types of LPS-producing bacteria in oral and dust samples predict lung function and respiratory health over time, and whether the different types of LPS-producing bacteria affect LPS in saliva and dust. BRuSH will apply functional genome annotation that can assign biological significance to raw bacterial DNA sequences. With this bioinformatics tool I will cluster microbiome data into various LPS producers: bacteria with LPS with strong inflammatory effects and others with weak or antagonistic effects. The epidemiological studies will be supported by mouse models of asthma and cell assays of human bronchial epithelial cells, by exposing mice and bronchial cells to chemically synthesized Lipid A (the component that drives the LPS-induced immune responses) of various potency. The goal of BRuSH is to prove a causal relationship between the oral microbiome and lung health, and to gain knowledge that will enable us to make oral health a feasible target for intervention programs aimed at optimizing lung health and preventing respiratory disease.
Max ERC Funding
1 499 938 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym C4T
Project Climate change across Cenozoic cooling steps reconstructed with clumped isotope thermometry
Researcher (PI) Anna Nele Meckler
Host Institution (HI) UNIVERSITETET I BERGEN
Call Details Starting Grant (StG), PE10, ERC-2014-STG
Summary The Earth's climate system contains a highly complex interplay of numerous components, such as atmospheric greenhouse gases, ice sheets, and ocean circulation. Due to nonlinearities and feedbacks, changes to the system can result in rapid transitions to radically different climate states. In light of rising greenhouse gas levels there is an urgent need to better understand climate at such tipping points. Reconstructions of profound climate changes in the past provide crucial insight into our climate system and help to predict future changes. However, all proxies we use to reconstruct past climate depend on assumptions that, in addition, become increasingly uncertain further back in time. A new kind of temperature proxy, the carbonate ‘clumped isotope’ thermometer, has great potential to overcome these obstacles. The proxy relies on thermodynamic principles, taking advantage of the temperature dependence of the binding strength between different isotopes of carbon and oxygen, which makes it independent of other variables. Yet, widespread application of this technique in paleoceanography is currently prevented by the large sample amounts required, which are difficult to obtain from ocean sediments. If applied to the minute carbonate shells preserved in the sediments, this proxy would allow robust reconstructions of past temperatures in the surface and deep ocean, as well as of global ice volume, far back in time. Here I propose to considerably decrease the sample amount requirements of clumped isotope thermometry, building on recent successful modifications of the method and ideas for further analytical improvements. This will enable my group and me to thoroughly ground-truth the proxy for application in paleoceanography and, for the first time, apply it to aspects of past climate change across major climate transitions, where clumped isotope thermometry can immediately contribute to solving long-standing first-order questions and allow for major progress in the field.
Max ERC Funding
1 877 209 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym CALDER
Project Cryogenic wide-Area Light Detectors with Excellent Resolution
Researcher (PI) Marco Vignati
Host Institution (HI) ISTITUTO NAZIONALE DI FISICA NUCLEARE
Call Details Starting Grant (StG), PE2, ERC-2013-StG
Summary In the comprehension of the fundamental laws of nature, particle physics is now facing two important questions:
1) What is the nature of the neutrino: is it a standard (Dirac) particle or a Majorana particle? The nature of the neutrino plays a crucial role in the global framework of particle interactions and in cosmology. The only practicable way to answer this question is to search for a nuclear process called "neutrinoless double beta decay" (0nuDBD).
2) What is the so-called "dark matter" made of? Astrophysical observations suggest that the largest part of the mass of the Universe is composed of a form of matter other than atoms and known matter constituents. We still do not know what dark matter is made of because its rate of interaction with ordinary matter is very low, making direct experimental detection extremely difficult.
Both 0nuDBD and dark matter interactions are rare processes and can be detected using the same experimental technique. Bolometers are promising devices, and their combination with light detectors provides the identification of interacting particles, a powerful tool to reduce the background.
The goal of CALDER is to realize a new type of light detector to improve the upcoming generation of bolometric experiments. The detectors will be designed to feature unprecedented energy resolution and reliability, to ensure almost complete particle identification. In case of success, CUORE, a 0nuDBD experiment under construction, would gain in sensitivity by up to a factor of 6. LUCIFER, a 0nuDBD experiment already implementing light detection, could also be sensitive to dark matter interactions, thus increasing its research potential. The light detectors will be based on Kinetic Inductance Detectors (KIDs), a new technology that has proved its potential in astrophysical applications but is still new in the field of particle physics and rare event searches.
Max ERC Funding
1 176 758 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym CAPABLE
Project Composite integrated photonic platform by femtosecond laser micromachining
Researcher (PI) Roberto OSELLAME
Host Institution (HI) CONSIGLIO NAZIONALE DELLE RICERCHE
Call Details Advanced Grant (AdG), PE7, ERC-2016-ADG
Summary The quantum technology revolution promises a transformational impact on society and economies worldwide. It will enable breakthrough advancements in fields as diverse as secure communications, computing, metrology, and imaging. Quantum photonics, which has recently received an incredible boost from the use of integrated optical circuits, is an excellent technological platform to enable such a revolution, as it already plays a relevant role in many of the above applications. However, some major technical roadblocks need to be overcome. Currently, the various components required for a complete quantum photonic system are produced on very different materials by dedicated fabrication technologies, as no single material is able to fulfil all the requirements for single-photon generation, manipulation, storage and detection. This project proposes a new hybrid approach to integrated quantum photonic systems based on femtosecond laser microfabrication (FLM), enabling the innovative miniaturization of various components on different materials, but with a single tool and with very favourable integration capabilities.
This project will mainly focus on two major breakthroughs: the first one will be increasing the complexity achievable in the photonic platform and demonstrating unprecedented quantum computation capability; the second one will be the integration in the platform of multiple single-photon quantum memories and their interconnection.
Achievement of these goals will only be possible by taking full advantage of the unique features of FLM, from the possibility to machine very different materials, to the 3D capabilities in waveguide writing and selective material removal.
The successful demonstration and functional validation of this hybrid, integrated photonic platform will represent a significant leap for photonic microsystems in quantum computing and quantum communications.
Max ERC Funding
2 381 875 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym CARBONANOBRIDGE
Project Neuron Networking with Nano Bridges via the Synthesis and Integration of Functionalized Carbon Nanotubes
Researcher (PI) Maurizio Prato
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TRIESTE
Call Details Advanced Grant (AdG), PE5, ERC-2008-AdG
Summary We propose the development of novel nanodevices, such as nanoscale bridges and nanovectors, based on functionalized carbon nanotubes (CNTs) for manipulating neurons and neuronal network activity in vitro. The main aim is to put forward innovative solutions that have the potential to circumvent the problems currently posed by spinal cord lesions or by neurodegenerative diseases. The unifying theme is to use recent advances in chemistry and nanotechnology to gain insight into the functioning of hybrid neuronal/CNT networks, relevant for the development of novel implantable devices to control neuronal signaling and improve synapse formation in a controlled fashion. The proposal's core strategy is to exploit the expertise of the PI in the chemical control of CNT properties to develop devices reaching various degrees of functional integration with the physiological electrical activity of cells and their networks, and to understand how such global dynamics are orchestrated when integrated by different substrates. An unconventional strategy will be the electrical characterization of micro- and nano-patterned substrates by AFM and conductive-tip AFM, both before and after neurons have grown on the substrates. We will also use the capability of AFM to identify critical positions in the neuronal network, while delivering time-dependent chemical stimulations. We will apply nanotechnology to contemporary neuroscience in the perspective of novel neuro-implantable devices and drug nanovectors, engineered to treat neurological and neurodegenerative lesions. The scientific strategy at the core of the proposal is the convergence between nanotechnology, chemistry and neurobiology. Such convergence, beyond helping understand the functioning and malfunctioning of the brain, can stimulate further research in this area and may ultimately lead to a new generation of nanomedicine applications in neurology and to new opportunities for the health care industry.
Max ERC Funding
2 500 000 €
Duration
Start date: 2009-02-01, End date: 2014-01-31
Project acronym CARDIOEPIGEN
Project Epigenetics and microRNAs in Myocardial Function and Disease
Researcher (PI) Gianluigi Condorelli
Host Institution (HI) HUMANITAS MIRASOLE SPA
Call Details Advanced Grant (AdG), LS4, ERC-2011-ADG_20110310
Summary Heart failure (HF) is the ultimate outcome of many cardiovascular diseases. Re-expression of fetal genes in the adult heart contributes to development of HF. Two mechanisms involved in the control of gene expression are epigenetics and microRNAs (miRs). We propose a project on epigenetic and miR-mediated mechanisms leading to HF.
Epigenetics refers to heritable modification of DNA and histones that does not modify the genetic code. Depending on the type of modification and on the site affected, these chemical changes up- or down-regulate transcription of specific genes. Despite it being a major player in gene regulation, epigenetics has been only partly investigated in HF. miRs are regulatory RNAs that target mRNAs for inhibition. Dysregulation of the cardiac miR signature occurs in HF. miR expression may itself be under epigenetic control, constituting a miR-epigenetic regulatory network. To our knowledge, this possibility has not been studied yet.
Our specific hypothesis is that the profile of DNA/histone methylation and the cross-talk between epigenetic enzymes and miRs have fundamental roles in defining the characteristics of cells during cardiac development, and that the dysregulation of these processes determines the deleterious nature of the stressed heart’s gene programme. We will test this first through a genome-wide study of DNA/histone methylation to generate maps of the main methylation modifications occurring in the genome of cardiac cells treated with a pro-hypertrophy regulator and of a HF model. We will then investigate the role of epigenetic enzymes deemed important in HF, through the generation and study of knockout mouse models. Finally, we will test the possible therapeutic potential of modulating epigenetic genes.
We hope to further understand the pathological mechanisms leading to HF and to generate data instrumental to the development of diagnostic and therapeutic strategies for this disease.
Max ERC Funding
2 500 000 €
Duration
Start date: 2012-10-01, End date: 2018-09-30
Project acronym CARDYADS
Project Controlling Cardiomyocyte Dyadic Structure
Researcher (PI) William Edward Louch
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), LS4, ERC-2014-CoG
Summary Contraction and relaxation of cardiac myocytes, and thus the whole heart, are critically dependent on dyads. These functional junctions between t-tubules, which are invaginations of the surface membrane, and the sarcoplasmic reticulum allow efficient control of calcium release into the cytosol, and also its removal. Dyads are formed gradually during development and break down during disease. However, the precise nature of dyadic structure is unclear, even in healthy adult cardiac myocytes, as are the triggers and consequences of altering dyadic integrity. In this proposal, my group will investigate the precise 3-dimensional arrangement of dyads and their proteins during development, adulthood, and heart failure by employing CLEM imaging (PALM and EM tomography). This will be accomplished by developing transgenic mice with fluorescent labels on four dyadic proteins (L-type calcium channel, ryanodine receptor, sodium-calcium exchanger, SERCA), and by imaging tissue from explanted normal and failing human hearts. The signals responsible for controlling dyadic formation, maintenance, and disruption will be determined by performing high-throughput sequencing to identify novel genes involved with these processes in several established model systems. Particular focus will be given to investigating left ventricular wall stress and stretch-dependent gene regulation as controllers of dyadic integrity. Candidate genes will be manipulated in cell models and transgenic animals to promote dyadic formation and maintenance, and reverse dyadic disruption in heart failure. The consequences of dyadic structure for function will be tested experimentally and with mathematical modeling to examine effects on cardiac myocyte calcium homeostasis and whole-heart function. The results of this project are anticipated to yield unprecedented insight into dyadic structure, regulation, and function, and to identify novel therapeutic targets for heart disease patients.
Max ERC Funding
2 000 000 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym CAVE
Project Challenges and Advancements in Virtual Elements
Researcher (PI) Lourenco Beirao da veiga
Host Institution (HI) UNIVERSITA' DEGLI STUDI DI MILANO-BICOCCA
Call Details Consolidator Grant (CoG), PE1, ERC-2015-CoG
Summary The Virtual Element Method (VEM) is a novel technology for the discretization of partial differential equations (PDEs) that shares the same variational background as the Finite Element Method. First and foremost, the VEM responds to the strongly increasing interest in using general polyhedral and polygonal meshes in the approximation of PDEs, without the restriction to tetrahedral or hexahedral grids. By avoiding the explicit integration of the shape functions that span the discrete space and introducing an innovative construction of the stiffness matrices, the VEM acquires very interesting properties and advantages with respect to more standard Galerkin methods, yet still keeps the same coding complexity. For instance, the VEM easily allows for polygonal/polyhedral meshes (even non-conforming) with non-convex elements and possibly with curved faces; it allows for discrete spaces of arbitrary C^k regularity on unstructured meshes.
The main scope of the project is to address the recent theoretical challenges posed by the VEM and to assess whether this promising technology can achieve a breakthrough in applications. First, the theoretical and computational foundations of the VEM will be made stronger. A deeper theoretical insight, supported by wider numerical experience on benchmark problems, will be developed to gain a better understanding of the method's potential and set the foundations for more applicative purposes. Second, we will focus our attention on two challenging and timely problems of practical interest: large deformation elasticity (where the VEM can yield a dramatically more efficient handling of material inclusions, meshing of the domain and grid adaptivity, plus much stronger robustness with respect to large grid distortions) and the cardiac bidomain model (where the VEM can lead to a more accurate domain approximation through MRI data, a flexible refinement/de-refinement procedure along the propagation front, and exact satisfaction of conservation laws).
Max ERC Funding
980 634 €
Duration
Start date: 2016-07-01, End date: 2021-06-30
Project acronym CBCD
Project Understanding the basis of cerebellar and brainstem congenital defects: from clinical and molecular characterisation to the development of a novel neuroembryonic in vitro model
Researcher (PI) Enza Maria Valente
Host Institution (HI) FONDAZIONE SANTA LUCIA
Call Details Starting Grant (StG), LS7, ERC-2010-StG_20091118
Summary Cerebellar and brainstem congenital defects (CBCDs) are heterogeneous disorders with high pre- and post-natal mortality and morbidity. Their genetic basis and pathogenetic mechanisms are largely unknown, hampering patients’ diagnosis and management and family counselling. This project aims to improve current understanding of primary CBCDs through a multidisciplinary approach combining innovative clinical, neuroimaging, molecular and functional studies, articulated in four work packages:
WP1- Clinical and neuroimaging studies: collection of detailed data and biological samples from a large cohort of patients covering a broad spectrum of CBCDs, neuroimaging classification based on magnetic resonance imaging and tractography, genotype-phenotype correlates and follow-up studies.
WP2 - Molecular studies on mendelian CBCDs: high-throughput resequencing of ciliary genes to identify pathogenic mutations and genetic modifiers in patients with ciliopathies, identification of novel disease genes, mutation analysis of genes causative of other mendelian CBCDs.
WP3 - Molecular studies on sporadic CBCDs: identification of cryptic chromosomal rearrangements by high resolution SNP-array analysis, selection and mutation analysis of candidate genes mapping to the rearranged regions.
WP4 - Functional studies: optimisation of a novel neuroembryonic in vitro model derived from mouse embryonic stem cells, to test the role of known and candidate disease genes (from WP2 and 3) on cerebellar and brainstem development, define the pathways in which they are involved and the effect of disease-causative mutations.
This project is expected to improve the current CBCD nosology, identify novel genes and mechanisms involved in cerebellar and brainstem development that are responsible for mendelian or sporadic defects, expand the available tools for pre- and post-natal diagnosis and identify clinical-genetic correlates and prognostic indexes.
Max ERC Funding
1 367 960 €
Duration
Start date: 2011-08-01, End date: 2018-03-31
Project acronym CellKarma
Project Dissecting the regulatory logic of cell fate reprogramming through integrative and single cell genomics
Researcher (PI) Davide CACCHIARELLI
Host Institution (HI) FONDAZIONE TELETHON
Call Details Starting Grant (StG), LS2, ERC-2017-STG
Summary The concept that any cell type, upon delivery of the right “cocktail” of transcription factors, can acquire an identity that it would otherwise never achieve revolutionized the way we approach the study of developmental biology. In light of this, the discovery of induced pluripotent stem cells (IPSCs) and cell fate conversion approaches stimulated new research directions in human regenerative biology. However, the chance to successfully develop patient-tailored therapies is still very limited because reprogramming technologies are applied without a comprehensive understanding of the molecular processes involved.
Here, I propose a multifaceted approach that combines a wide range of cutting-edge integrative genomic strategies to significantly advance our understanding of the regulatory logic driving cell fate decisions during human reprogramming to pluripotency.
To this end, I will utilize single cell transcriptomics to isolate reprogramming intermediates, reconstruct their lineage relationships and define transcriptional regulators responsible for the observed transitions (AIM 1). Next, I will dissect the rules by which transcription factors modulate the activity of promoters and enhancer regions during reprogramming transitions, by applying synthetic biology and genome editing approaches (AIM 2). I will then adopt an alternative approach to identify reprogramming modulators through the analysis of reprogramming-induced mutagenesis events (AIM 3). Finally, I will explore my findings in multiple primary reprogramming approaches to pluripotency, with the ultimate goal of improving the quality of IPSC derivation (AIM 4).
In summary, this project will expose novel determinants and yet unidentified molecular barriers of reprogramming to pluripotency and will be essential to unlock the full potential of reprogramming technologies for shaping cellular identity in vitro and to address pressing challenges of regenerative medicine.
Max ERC Funding
1 497 250 €
Duration
Start date: 2018-03-01, End date: 2023-02-28
Project acronym CGT HEMOPHILIA A
Project Cell and gene therapy based strategies to correct the bleeding phenotype in Hemophilia A
Researcher (PI) Antonia Follenzi
Host Institution (HI) UNIVERSITA DEGLI STUDI DEL PIEMONTE ORIENTALE AMEDEO AVOGADRO
Call Details Starting Grant (StG), LS7, ERC-2010-StG_20091118
Summary Currently, haemophilia A cannot be cured. To prevent major bleeding episodes in haemophilia, human Factor VIII (FVIII) protein must be frequently administered as prophylaxis or on demand. This treatment is complicated by its high cost and by the development of antibodies that neutralize FVIII activity in 20 to 30% of patients. Therefore, permanent solutions in the form of cell and gene therapy are very attractive for haemophilia A. Recently, we demonstrated in a murine model that liver sinusoidal endothelial cells (LSEC) produce and secrete FVIII, although not exclusively. We have also found that these mice can be treated by reconstitution with wild-type bone marrow, indicating that bone marrow-derived cells, of hematopoietic, mesenchymal or even endothelial origin, can produce and secrete FVIII. Based on these findings in mice, I propose that human LSEC, umbilical cord blood cells, and bone marrow cells might be suitable sources of FVIII for cell replacement therapy for haemophilia A. To advance opportunities for cell and gene therapies in haemophilia A and to identify additional cell sources of FVIII, I intend to explore whether replacement of liver endothelium and bone marrow in immunocompromised haemophilia A mice with healthy human cells will provide therapeutic correction. Recently, the possibility of reprogramming mature somatic cells to generate induced pluripotent stem (iPS) cells has enabled the derivation of disease-specific pluripotent cells, thus providing unprecedented experimental platforms to treat human diseases. Therefore, I intend to study whether the generation of patient-specific iPS cells may be applied to cell and gene therapy of coagulation disorders, and in particular to the treatment of haemophilia A. Studies with these novel target cells may significantly impact the future course of haemophilia A by providing proof of feasibility of novel therapy strategies.
Max ERC Funding
1 123 000 €
Duration
Start date: 2011-05-01, End date: 2017-04-30
Project acronym CHEMOSENSORYCIRCUITS
Project Function of Chemosensory Circuits
Researcher (PI) Emre Yaksi
Host Institution (HI) NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET NTNU
Call Details Starting Grant (StG), LS5, ERC-2013-StG
Summary Smell and taste are the least studied of all senses. Very little is known about chemosensory information processing beyond the level of receptor neurons. Every morning we enjoy our coffee thanks to our brain's ability to combine and process multiple sensory modalities. Meanwhile, we can still review a document on our desk by adjusting the weights of the numerous sensory inputs that constantly bombard our brains. Yet the smell of our coffee may remind us of a pleasant weekend breakfast through associative learning and memory. In the proposed project we will explore the function and architecture of the neural circuits involved in olfactory and gustatory information processing, namely the habenula and brainstem. Moreover, we will investigate the fundamental principles underlying multimodal sensory integration and the neural basis of behavior in these highly conserved brain areas.
To achieve these goals we will take an innovative approach, combining two-photon calcium imaging, optogenetics and electrophysiology with the expanding genetic toolbox of a small vertebrate, the zebrafish. This pioneering approach will enable us to design new types of experiments that were unthinkable only a few years ago. Using this unique combination of methods, we will monitor and perturb the activity of functionally distinct elements of habenular and brainstem circuits in vivo. The habenula and brainstem are important in mediating stress/anxiety and eating habits, respectively. Therefore, understanding the neural computations in these brain regions is important for comprehending the neural mechanisms underlying psychological conditions related to anxiety and eating disorders. We anticipate that our results will go beyond the chemical senses and contribute new insights into the understanding of how brain circuits work and interact with the sensory world to shape the neural activity and behavioral outputs of animals.
Max ERC Funding
1 499 471 €
Duration
Start date: 2014-04-01, End date: 2019-03-31
Project acronym CHROMPHYS
Project Physics of the Solar Chromosphere
Researcher (PI) Mats Per-Olof Carlsson
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE9, ERC-2011-ADG_20110209
Summary CHROMPHYS aims at a breakthrough in our understanding of the solar chromosphere by combining the development of sophisticated radiation-magnetohydrodynamic simulations with observations from the upcoming NASA SMEX mission Interface Region Imaging Spectrograph (IRIS).
The enigmatic chromosphere is the transition between the solar surface and the eruptive outer solar atmosphere. The chromosphere harbours and constrains the mass- and energy-loading processes that define the heating of the corona, the acceleration and composition of the solar wind, and the energetics and triggering of solar outbursts (filament eruptions, flares, coronal mass ejections) that govern near-Earth space weather and affect mankind's technological environment.
CHROMPHYS targets the following fundamental physics questions about the chromospheric role in the mass and energy loading of the corona:
- Which types of non-thermal energy dominate in the chromosphere and beyond?
- How does the chromosphere regulate mass and energy supply to the corona and the solar wind?
- How do magnetic flux and matter rise through the chromosphere?
- How does the chromosphere affect the free magnetic energy loading that leads to solar eruptions?
CHROMPHYS proposes to answer these questions by producing a new, physics-based vista of the chromosphere through a three-fold effort:
- develop the techniques of high-resolution numerical MHD physics to the level needed to realistically predict and analyse small-scale chromospheric structure and dynamics,
- optimise and calibrate diverse observational diagnostics by synthesizing these in detail from the simulations, and
- obtain and analyse data from IRIS using these diagnostics complemented by data from other space missions and the best solar telescopes on the ground.
Max ERC Funding
2 487 600 €
Duration
Start date: 2012-01-01, End date: 2016-12-31