Project acronym 19TH-CENTURY_EUCLID
Project Nineteenth-Century Euclid: Geometry and the Literary Imagination from Wordsworth to Wells
Researcher (PI) Alice Jenkins
Host Institution (HI) UNIVERSITY OF GLASGOW
Call Details Starting Grant (StG), SH4, ERC-2007-StG
Summary This radically interdisciplinary project aims to bring a substantially new field of research – literature and mathematics studies – to prominence as a tool for investigating the culture of nineteenth-century Britain. It will result in three kinds of outcome: a monograph, two interdisciplinary and international colloquia, and a collection of essays. The project focuses on Euclidean geometry as a key element of nineteenth-century literary and scientific culture, showing that it was part of the shared knowledge flowing through elite and popular Romantic and Victorian writing, and figuring notably in the work of very many of the century’s best-known writers. Despite its traditional cultural prestige and educational centrality, geometry has been almost wholly neglected by literary history. This project shows how literature and mathematics studies can draw a new map of nineteenth-century British culture, revitalising our understanding of the Romantic and Victorian imagination through its writing about geometry.
Max ERC Funding
323 118 €
Duration
Start date: 2009-01-01, End date: 2011-10-31
Project acronym 1toStopVax
Project RNA virus attenuation by altering mutational robustness
Researcher (PI) Marco VIGNUZZI
Host Institution (HI) INSTITUT PASTEUR
Call Details Proof of Concept (PoC), ERC-2016-PoC
Summary RNA viruses have extremely high mutation frequencies. When an RNA virus replicates, nucleotide mutations are generated, resulting in a population of variants. This genetic diversity creates a cloud of mutations that are potentially beneficial to viral survival, but the majority of mutations are detrimental to the virus. Increasing the mutation rate of an RNA virus reduces viral fitness, because the virus generates more errors, and attenuates it during in vivo infection. Another feature that affects RNA virus fitness is mutational robustness: the ability to buffer the negative effects of mutation.
The attenuation of RNA viruses for vaccine production faces problems of genetic instability and reversion to a pathogenic phenotype. The conventional method for attenuation is mostly empirical and specific to the particular RNA virus species.
Hence, it cannot be universally applied across virus types. We have developed a non-empirical, rational means of attenuating RNA viruses, targeting mutational robustness as a modifiable trait.
We demonstrate that the mutational robustness of RNA viruses can be modified without changing a virus's physical and biological properties for vaccine production; yet the virus is attenuated, as it becomes a victim of its naturally high mutation rate. Specifically, the genomes of RNA viruses are modified so that a larger proportion of mutations become lethal Stop mutations. Our technology places the virus one step away from these Stop mutations (1-to-Stop). We succeeded in attenuating two RNA viruses from very different viral families, confirming the broad applicability of this approach. These viruses were attenuated in vivo, generated high levels of neutralizing antibody and protected mice from lethal challenge infection.
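As a minimal illustration of the recoding principle just described (a hypothetical Python sketch, not the project's actual recoding software), one can count how many codons of a coding sequence are already a single nucleotide substitution away from a stop codon; 1-to-Stop recoding increases that fraction by choosing synonymous codons with this property.

```python
# Illustrative sketch only: count codons that are one nucleotide substitution away
# from a stop codon ("1-to-Stop" codons) in an in-frame coding sequence.
STOP_CODONS = {"TAA", "TAG", "TGA"}

def is_one_to_stop(codon):
    """True if a single substitution can turn this (non-stop) codon into a stop codon."""
    if codon in STOP_CODONS:
        return False
    for i in range(3):
        for base in "ACGT":
            if base != codon[i] and codon[:i] + base + codon[i + 1:] in STOP_CODONS:
                return True
    return False

def count_one_to_stop(cds):
    """Count 1-to-Stop codons in a coding sequence read in frame."""
    codons = (cds[i:i + 3] for i in range(0, len(cds) - 2, 3))
    return sum(is_one_to_stop(c) for c in codons)

# Hypothetical example: the leucine codon TTA (one substitution from TAA) counts,
# whereas the synonymous leucine codon CTA does not.
print(count_one_to_stop("ATGTTACTATCATAA"))  # -> 2 (TTA and TCA)
```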
The proposal now seeks to complete proof of concept studies and develop commercialization strategies to scale up this new technology to preclinical testing with industrial partners.
Max ERC Funding
150 000 €
Duration
Start date: 2016-09-01, End date: 2018-02-28
Project acronym 2-HIT
Project Genetic interaction networks: From C. elegans to human disease
Researcher (PI) Ben Lehner
Host Institution (HI) FUNDACIO CENTRE DE REGULACIO GENOMICA
Call Details Starting Grant (StG), LS2, ERC-2007-StG
Summary Most hereditary diseases in humans are genetically complex, resulting from combinations of mutations in multiple genes. However, synthetic interactions between genes are very difficult to identify in population studies because of a lack of statistical power, and we fundamentally do not understand how mutations interact to produce phenotypes. C. elegans is a unique animal in which genetic interactions can be rapidly identified in vivo using RNA interference, and we recently used this system to construct the first genetic interaction network for any animal, focused on signal transduction genes. The first objective of this proposal is to extend this work and map a comprehensive genetic interaction network for this model metazoan. This project will provide the first insights into the global properties of animal genetic interaction networks, and a comprehensive view of the functional relationships between genes in an animal. The second objective of the proposal is to use C. elegans to develop and experimentally validate integrated gene networks that connect genes to phenotypes and predict genetic interactions on a genome-wide scale. The methods that we develop and validate in C. elegans will then be applied to predict phenotypes and interactions for human genes. The final objective is to dissect the molecular mechanisms underlying genetic interactions, and to understand how these interactions evolve. The combined aim of these three objectives is to generate a framework for understanding and predicting how mutations interact to produce phenotypes, including in human disease.
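To make the notion of a synthetic genetic interaction concrete, a commonly used convention scores the interaction as the deviation of the double-perturbation phenotype from a multiplicative expectation (a hedged sketch with hypothetical values, not the proposal's own scoring scheme):

```python
# Multiplicative-model interaction score: epsilon = w_ab - w_a * w_b, where w values are
# phenotype/fitness measurements normalized to wild type (= 1.0). Strongly negative epsilon
# suggests a synthetic (aggravating) interaction; positive epsilon suggests buffering.

def interaction_score(w_a, w_b, w_ab):
    """Deviation of the double perturbation from the multiplicative expectation."""
    return w_ab - w_a * w_b

# Hypothetical example values, for illustration only:
w_a, w_b = 0.9, 0.8   # e.g. single-gene RNAi knockdown phenotypes
w_ab = 0.3            # observed double-perturbation phenotype
eps = interaction_score(w_a, w_b, w_ab)
print(f"expected {w_a * w_b:.2f}, observed {w_ab:.2f}, epsilon {eps:.2f}")  # epsilon -0.42
```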
Max ERC Funding
1 100 000 €
Duration
Start date: 2008-09-01, End date: 2014-04-30
Project acronym 2DNANOPTICA
Project Nano-optics on flatland: from quantum nanotechnology to nano-bio-photonics
Researcher (PI) Pablo Alonso-González
Host Institution (HI) UNIVERSIDAD DE OVIEDO
Call Details Starting Grant (StG), PE3, ERC-2016-STG
Summary Ubiquitous in nature, light-matter interactions are of fundamental importance in science and in all optical technologies. Understanding and controlling them has been a long-pursued objective in modern physics. So far, however, related experiments have relied on traditional optical schemes in which, owing to the classical diffraction limit, optical fields cannot be controlled at length scales below the wavelength of light. Importantly, this limitation prevents exploiting the extraordinary fundamental and scaling potential of nanoscience and nanotechnology. One solution for concentrating optical fields into sub-diffraction volumes is the excitation of surface polaritons: coupled excitations of photons and mobile or bound charges in metals (plasmons) or polar materials (phonons). Their initial promise, however, has been hindered by strong optical losses and a lack of electrical control in metals, and by the difficulty of fabricating high-optical-quality nanostructures in polar materials.
With the advent of two-dimensional (2D) materials and their extraordinary optical properties, the last 2-3 years have seen the visualization, in the mid-infrared spectral range, of both low-loss, electrically tunable (active) plasmons in graphene and high-optical-quality phonons in monolayer and multilayer h-BN nanostructures, introducing a very encouraging arena for scientifically ground-breaking discoveries in nano-optics. Inspired by these prospects, this ERC project aims to use our knowledge and unique expertise in 2D nanoplasmonics, together with the recent advances in nanophononics, to establish a technological platform that includes coherent sources, waveguides, routers and efficient detectors and permits unprecedented active control and manipulation of light and light-matter interactions on the nanoscale at room temperature, thereby experimentally laying the foundations of a 2D nano-optics field.
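For context on the classical diffraction limit invoked above, the standard Abbe estimate (textbook optics, not a project result) bounds the smallest resolvable feature of far-field optics as

\[
d \;\approx\; \frac{\lambda}{2\,\mathrm{NA}} \;\gtrsim\; \frac{\lambda}{2},
\]

so mid-infrared light (free-space wavelengths of roughly 5-15 μm) cannot be focused to the 10-100 nm scale without being converted into surface polaritons, whose much shorter polariton wavelength enables deep-subwavelength confinement.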
Max ERC Funding
1 459 219 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym 2DNanoSpec
Project Nanoscale Vibrational Spectroscopy of Sensitive 2D Molecular Materials
Researcher (PI) Renato ZENOBI
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Advanced Grant (AdG), PE4, ERC-2016-ADG
Summary I propose to investigate the nanometer-scale organization of delicate 2-dimensional molecular materials using nanoscale vibrational spectroscopy. 2D structures are of great scientific and technological importance, for example as novel materials (graphene, MoS2, WS2, etc.), and in the form of biological membranes and synthetic 2D polymers. Powerful methods for their analysis and imaging with molecular selectivity and sufficient spatial resolution, however, are lacking. Tip-enhanced Raman spectroscopy (TERS) allows label-free spectroscopic identification of molecular species, with ≈10 nm spatial resolution and with single-molecule sensitivity for strong Raman scatterers. So far, however, TERS has not been carried out in liquids, which are the natural environment for membranes, and its application to poor Raman scatterers such as components of 2D polymers, lipids, or other membrane compounds (proteins, sugars) is difficult. TERS has the potential to overcome the restrictions of other optical/spectroscopic methods for studying 2D materials, namely (i) the insufficient spatial resolution of diffraction-limited optical methods; (ii) the need for labelling in all methods relying on fluorescence; and (iii) the inability of some methods to work in liquids. I propose to address a number of scientific questions associated with the spatial organization of, and the occurrence of defects in, sensitive 2D molecular materials. The success of these studies will also rely critically on technical innovations in TERS that notably address the problem of energy dissipation. This will for the first time allow its application to the study of complex, delicate 2D molecular systems without photochemical damage.
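As a reminder of why tip enhancement makes nanoscale Raman feasible (standard TERS/SERS reasoning, not a result of this project), the detected Raman signal scales roughly with the fourth power of the local field enhancement at the tip apex,

\[
\frac{I_{\mathrm{TERS}}}{I_{\mathrm{far\text{-}field}}} \;\sim\; \left|\frac{E_{\mathrm{tip}}}{E_0}\right|^{4},
\]

so a field enhancement of order 10-100 can compensate for the small scattering cross-sections of weak Raman scatterers such as lipids while confining the signal to the ≈10 nm apex region; the same local concentration of energy underlies the dissipation and photodamage concerns mentioned above.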
Max ERC Funding
2 311 696 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym 2DQP
Project Two-dimensional quantum photonics
Researcher (PI) Brian David GERARDOT
Host Institution (HI) HERIOT-WATT UNIVERSITY
Call Details Consolidator Grant (CoG), PE3, ERC-2016-COG
Summary Quantum optics, the study of how discrete packets of light (photons) and matter interact, has led to the development of remarkable new technologies which exploit the bizarre properties of quantum mechanics. These quantum technologies are primed to revolutionize the fields of communication, information processing, and metrology in the coming years. Similar to contemporary technologies, the future quantum machinery will likely consist of a semiconductor platform to create and process the quantum information. However, to date the demanding requirements on a quantum photonic platform have yet to be satisfied with conventional bulk (three-dimensional) semiconductors.
To surmount these well-known obstacles, a new paradigm in quantum photonics is required. Initiated by the recent discovery of single photon emitters in atomically flat (two-dimensional) semiconducting materials, 2DQP aims to be at the nucleus of a new approach by realizing quantum optics with ultra-stable (coherent) quantum states integrated into devices with electronic and photonic functionality. We will characterize, identify, engineer, and coherently manipulate localized quantum states in this two-dimensional quantum photonic platform. A vital component of 2DQP’s vision is to go beyond the fundamental science and achieve the ideal solid-state single photon device yielding perfect extraction (100% efficiency) of on-demand indistinguishable single photons. Finally, we will exploit this ideal device to implement the critical building block for a photonic quantum computer.
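The figures of merit behind an ideal source of on-demand indistinguishable single photons can be stated with standard quantum-optics definitions (background, not project-specific results): single-photon purity is quantified by the second-order correlation at zero delay and indistinguishability by the Hong-Ou-Mandel visibility,

\[
g^{(2)}(0) \;=\; \frac{\langle \hat a^{\dagger}\hat a^{\dagger}\hat a\hat a\rangle}{\langle \hat a^{\dagger}\hat a\rangle^{2}} \;\to\; 0,
\qquad V_{\mathrm{HOM}} \;\to\; 1,
\]

with the ideal device additionally delivering one collected photon per trigger pulse, i.e. unit extraction efficiency.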
Max ERC Funding
1 999 135 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym 2O2ACTIVATION
Project Development of Direct Dehydrogenative Couplings mediated by Dioxygen
Researcher (PI) Frederic William Patureau
Host Institution (HI) RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN
Call Details Starting Grant (StG), PE5, ERC-2016-STG
Summary The field of C-H bond activation has evolved at an exponential pace over the last 15 years. What appeals most in these novel synthetic techniques is clear: they bypass the pre-activation steps usually required in traditional cross-coupling chemistry by directly metalating C-H bonds. Many C-H bond functionalizations today, however, rely on oxidants with poor atom and step efficiency, leading to significant and costly chemical waste and thereby seriously undermining the overall sustainability of those methods. As sustainability regulations become more restrictive and the cost of certain chemical commodities rises, atom efficiency in organic synthesis remains a top priority for research.
The aim of 2O2ACTIVATION is to develop novel technologies utilizing O2 as the sole terminal oxidant in order to enable useful, highly sustainable, thermodynamically challenging dehydrogenative C-N and C-O bond-forming coupling reactions. However, the moderate reactivity of O2 towards many catalysts constitutes a major challenge. 2O2ACTIVATION will pioneer the design of new catalysts based on the ultra-simple propene motif, capable of directly activating O2 for C-H-activation-based cross-couplings. The project is divided into three major lines: O2 activation using propene and its analogues (propenoids) 1) without metal or halide, 2) with hypervalent halide catalysis, and 3) with metal-catalyzed C-H activation.
The philosophy of 2O2ACTIVATION is to focus C-H functionalization method development on the oxidative event.
Consequently, 2O2ACTIVATION breakthroughs will dramatically shorten synthetic routes through the use of unactivated, unprotected and readily available building blocks, and should thus be easily scalable. This will lead to a strong decrease in the costs of producing many essential chemicals while preserving the environment (water as the terminal by-product). The resulting novel coupling methods will thus have a lasting impact on the chemical industry.
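To make the atom-economy argument explicit, a generic dehydrogenative C-N coupling with O2 as the sole terminal oxidant can be written schematically (a generic stoichiometry, not a specific reaction from the proposal) as

\[
\mathrm{R{-}H} \;+\; \mathrm{H{-}NR'_2} \;+\; \tfrac{1}{2}\,\mathrm{O_2}
\;\xrightarrow{\;\text{cat.}\;}\;
\mathrm{R{-}NR'_2} \;+\; \mathrm{H_2O},
\]

so the only stoichiometric by-product is water, in contrast to couplings that consume stoichiometric metal salts or hypervalent-iodine oxidants.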
Max ERC Funding
1 489 823 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym 321
Project from Cubic To Linear complexity in computational electromagnetics
Researcher (PI) Francesco Paolo ANDRIULLI
Host Institution (HI) POLITECNICO DI TORINO
Call Details Consolidator Grant (CoG), PE7, ERC-2016-COG
Summary Computational Electromagnetics (CEM) is the scientific field at the origin of all new modeling and simulation tools required by the constantly arising design challenges of emerging and future technologies in applied electromagnetics. As in many other technological fields, however, the trend in emerging electromagnetic engineering is towards miniaturized, higher-density and multi-scale scenarios. Computationally speaking, this translates into a steep increase in the number of degrees of freedom. Given that the design cost (the cost of a multi-right-hand-side problem dominated by matrix inversion) can scale as badly as cubically with these degrees of freedom, this trend, as many have pointed out, will substantially compromise the practical impact of CEM on future and emerging technologies.
For this reason, the CEM scientific community has for years been looking for an FFT-like paradigm shift: a dynamic fast direct solver providing a design cost that scales only linearly with the degrees of freedom. Such a fast solver is today considered a Holy Grail of the discipline.
The Grand Challenge of 321 will be to tackle this Holy Grail in Computational Electromagnetics by investigating a dynamic Fast Direct Solver for Maxwell Problems that would run in a linear-instead-of-cubic complexity for an arbitrary number and configuration of degrees of freedom.
The failure of all previous attempts will be overcome by a game-changing transformation of the classical CEM problem that leverages a recent breakthrough by the PI. Starting from this, the project will investigate an entirely new paradigm of impactful algorithms to achieve this grand challenge.
The impact of the FFT’s quadratic-to-linear paradigm shift shows how reductions in computational complexity can be groundbreaking for applications. The cubic-to-linear paradigm shift that the 321 project aims for will have a similarly disruptive impact on electromagnetic science and technology.
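To fix orders of magnitude for the cubic-to-linear goal, a back-of-the-envelope comparison (generic scaling arguments, not a description of the project's algorithm) for N degrees of freedom reads

\[
C_{\text{dense direct}} = O(N^{3}), \qquad
C_{\text{fast iterative}} = O(N_{\text{rhs}}\, N_{\text{it}}\, N \log N), \qquad
C_{\text{target}} = O(N).
\]

Going from N = 10^6 to N = 10^7 unknowns thus multiplies the cubic design cost by a factor of 1000, whereas a linear-complexity fast direct solver would grow by only a factor of 10 while amortizing a single inversion over all right-hand sides.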
Max ERC Funding
2 000 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym 3D-BioMat
Project Deciphering biomineralization mechanisms through 3D explorations of mesoscale crystalline structure in calcareous biomaterials
Researcher (PI) VIRGINIE CHAMARD
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), PE3, ERC-2016-COG
Summary The fundamental 3D-BioMat project aims to provide a biomineralization model explaining the formation of the microscopic calcareous single crystals produced by living organisms. Although these crystals present a wide variety of shapes and are associated with various organic materials, the observation of a nanoscale granular structure common to almost all calcareous crystallizing organisms, together with an extended crystalline coherence, suggests a generic biomineralization and assembly process. A key to building realistic biomineralization scenarios is to reveal the crystalline architecture at the mesoscale (i.e., over a few granules), which none of the existing nano-characterization tools is able to provide.
3D-BioMat builds on the PI’s recognized expertise in synchrotron coherent x-ray diffraction microscopy. It will extend the PI’s disruptive, pioneering microscopy formalism towards an innovative high-throughput approach giving access to 3D mesoscale images of the crystalline properties (crystalline coherence, crystal plane tilts and strains) with the required flexibility, nanoscale resolution and non-invasiveness.
This achievement will be used to reveal, in a timely manner, the generic features of the mesoscale crystalline structure through pioneering explorations of a vast variety of crystalline biominerals produced by the famous Pinctada margaritifera oyster shell, and thereby to build a realistic biomineralization scenario.
The inferred biomineralization pathways, including both physico-chemical pathways and biological controls, will ultimately be validated by comparing the mesoscale structures produced by biomimetic samples with the biogenic ones. Beyond deciphering one of the most intriguing questions of material nanosciences, 3D-BioMat may contribute to new climate models, pave the way for new routes in material synthesis and supply answers to the pearl-culture calcification problems.
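The connection between the measured diffraction data and the crystalline properties targeted above can be summarized by the standard Bragg coherent diffraction imaging relation (a textbook formulation, not a project-specific result): the reconstructed complex density for a Bragg reflection at scattering vector \(\mathbf{Q}\) is

\[
\rho_{\mathbf{Q}}(\mathbf{r}) \;=\; |\rho(\mathbf{r})|\, e^{\, i\, \mathbf{Q}\cdot \mathbf{u}(\mathbf{r})},
\]

where the amplitude maps the coherently diffracting (crystalline) volume and the phase encodes the lattice displacement field \(\mathbf{u}(\mathbf{r})\), from which local strain and crystal-plane tilts at the granule scale can be derived.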
Max ERC Funding
1 966 429 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym 3D-FM
Project Taking Force Microscopy into the Third Dimension
Researcher (PI) Tjerk Hendrik Oosterkamp
Host Institution (HI) UNIVERSITEIT LEIDEN
Call Details Starting Grant (StG), PE3, ERC-2007-StG
Summary I propose to pursue two emerging Force Microscopy techniques that allow measuring structural properties below the surface of the specimen. Whereas Force Microscopy (most commonly known under the name AFM) is usually limited to measuring the surface topography and surface properties of a specimen, I will demonstrate that Force Microscopy can achieve true 3D images of the structure of the cell nucleus. In Ultrasound Force Microscopy, an ultrasound wave is launched from below towards the surface of the specimen. After the sound waves interact with structures beneath the surface of the specimen, the local variations in the amplitude and phase shift of the ultrasonic surface motion is collected by the Force Microscopy tip. Previously, measured 2D maps of the surface response have shown that the surface response is sensitive to structures below the surface. In this project I will employ miniature AFM cantilevers and nanotube tips that I have already developed in my lab. This will allow me to quickly acquire many such 2D maps at a much wider range of ultrasound frequencies and from these 2D maps calculate the full 3D structure below the surface. I expect this technique to have a resolving power better than 10 nm in three dimensions as far as 2 microns below the surface. In parallel I will introduce a major improvement to a technique based on Nuclear Magnetic Resonance (NMR). Magnetic Resonance Force Microscopy measures the interaction of a rotating nuclear spin in the field gradient of a magnetic Force Microscopy tip. However, these forces are so small that they pose an enormous challenge. Miniature cantilevers and nanotube tips, in combination with additional innovations in the detection of the cantilever motion, can overcome this problem. I expect to be able to measure the combined signal of 100 proton spins or fewer, which will allow me to measure proton densities with a resolution of 5 nm, but possibly even with atomic resolution.
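To give a sense of the forces at stake in the magnetic resonance part of this proposal, an order-of-magnitude estimate with textbook values (the field-gradient value is an assumption, not a number from the project) is

\[
F \;\approx\; N\, \mu_p\, \frac{\partial B}{\partial z}
\;\approx\; 100 \times 1.4\times 10^{-26}\,\mathrm{J\,T^{-1}} \times 10^{6}\,\mathrm{T\,m^{-1}}
\;\approx\; 1.4\times 10^{-18}\,\mathrm{N},
\]

i.e. about an attonewton for 100 proton spins in a tip field gradient of 10^6 T/m, which illustrates why miniature, ultrasoft cantilevers and improved detection of the cantilever motion are essential.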
Max ERC Funding
1 794 960 €
Duration
Start date: 2008-08-01, End date: 2013-07-31
Project acronym 3D_Tryps
Project The role of three-dimensional genome architecture in antigenic variation
Researcher (PI) Tim Nicolai SIEGEL
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Starting Grant (StG), LS6, ERC-2016-STG
Summary Antigenic variation is a widely employed strategy to evade the host immune response, and it has similar functional requirements even in evolutionarily divergent pathogens. These include the mutually exclusive expression of antigens and the periodic, nonrandom switching of expression between different antigens during the course of an infection. Despite decades of research, the mechanisms of antigenic variation are not fully understood in any organism.
The recent development of high-throughput sequencing-based assays to probe the 3D genome architecture (Hi-C) has revealed the importance of the spatial organization of DNA inside the nucleus. 3D genome architecture plays a critical role in the regulation of mutually exclusive gene expression and the frequency of translocation between different genomic loci in many eukaryotes. Thus, genome architecture may also be a key regulator of antigenic variation, yet the causal links between genome architecture and the expression of antigens have not been studied systematically. In addition, the development of CRISPR-Cas9-based approaches to perform nucleotide-specific genome editing has opened unprecedented opportunities to study the influence of DNA sequence elements on the spatial organization of DNA and how this impacts antigen expression.
I have adapted both Hi-C and CRISPR-Cas9 technology to the protozoan parasite Trypanosoma brucei, one of the most important model organisms to study antigenic variation. These techniques will enable me to bridge the field of antigenic variation research with that of genome architecture. I will perform the first systematic analysis of the role of genome architecture in the mutually exclusive and hierarchical expression of antigens in any pathogen.
The experiments outlined in this proposal will provide new insight, facilitating a new view of antigenic variation and may eventually help medical intervention in T. brucei and in other pathogens relying on antigenic variation for their survival.
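As a minimal illustration of the kind of data the Hi-C assays mentioned above produce (a generic binning sketch in Python with hypothetical coordinates, not the project's analysis pipeline), mapped read pairs are accumulated into a symmetric contact matrix:

```python
import numpy as np

# Generic Hi-C binning sketch (illustrative only): each mapped read pair gives the
# coordinates of two interacting loci; contacts are accumulated into fixed-size
# genomic bins to form a symmetric contact matrix.
def contact_matrix(pairs, chrom_length, bin_size=10_000):
    """pairs: iterable of (pos1, pos2) 0-based coordinates on one chromosome."""
    n_bins = chrom_length // bin_size + 1
    m = np.zeros((n_bins, n_bins), dtype=np.int64)
    for pos1, pos2 in pairs:
        i, j = pos1 // bin_size, pos2 // bin_size
        m[i, j] += 1
        if i != j:
            m[j, i] += 1  # keep the matrix symmetric
    return m

# Hypothetical example: three read pairs on a 50 kb region, 10 kb bins.
example_pairs = [(1_200, 41_800), (3_500, 42_900), (15_000, 16_000)]
print(contact_matrix(example_pairs, chrom_length=50_000))
```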
Max ERC Funding
1 498 175 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym 3Dmaterials4Energy
Project Hierarchical Inorganic Nanomaterials as Next Generation Catalysts and Filters
Researcher (PI) Taleb Mokari
Host Institution (HI) BEN-GURION UNIVERSITY OF THE NEGEV
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary In the coming decades, two major global grand challenges will continue to attract the attention of scientists and engineers in academia and industry: achieving clean water and clean energy. This PoC covers the development of two prototypes, a water oxidation catalyst and a water purification filter, by creating inexpensive, abundant and versatile hierarchical structures of inorganic nanomaterials (HSINs).
The formation of HSINs has been one of the major obstacles to technological progress in various applications. At present, well-defined 3D structures can be fabricated by photo/electro lithography, assembly, 3D printing or template-mediated methods. Structures with high quality and yield can be obtained through these techniques; however, they suffer from high cost and difficulty in fabricating free-standing structures, and their throughput is sometimes limited. Template-mediated approaches, on the other hand, are usually facile and low cost and give access to varied and complex structures, in particular those obtained from nature.
Our invention is based on forming HSINs using fossil templates from nature. We propose to harness the naturally designed morphologies of these fossil templates to rationally form hierarchical structures of nanomaterials. Compared to current state-of-the-art catalysts and filters, these structures have many advantages, for example high surface area, high porosity, confined space (nano-reactors) and diverse functionalities obtained by controlling the chemical composition of the inorganic material shell. Since these properties are important for achieving high performance, we propose HSINs as next-generation water oxidation electrocatalysts and water purification filters.
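For reference, the reaction that the water oxidation prototype must catalyze is the oxygen evolution half-reaction (standard electrochemistry, not a project-specific result):

\[
2\,\mathrm{H_2O} \;\longrightarrow\; \mathrm{O_2} + 4\,\mathrm{H^+} + 4\,e^-, \qquad E^{0} = 1.23\ \mathrm{V\ vs.\ RHE},
\]

whose sluggish four-electron kinetics is precisely why high-surface-area, highly porous electrocatalyst architectures such as the proposed HSINs are attractive.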
Max ERC Funding
150 000 €
Duration
Start date: 2017-03-01, End date: 2018-08-31
Project acronym 4-TOPS
Project Four experiments in Topological Superconductivity.
Researcher (PI) Laurens Molenkamp
Host Institution (HI) JULIUS-MAXIMILIANS-UNIVERSITAT WURZBURG
Call Details Advanced Grant (AdG), PE3, ERC-2016-ADG
Summary Topological materials have developed rapidly in recent years, with my previous ERC-AG project 3-TOP playing a major role in this development. While so far no bulk topological superconductor has been unambiguously demonstrated, their properties can be studied in a very flexible manner by inducing superconductivity through the proximity effect into the surface or edge states of a topological insulator. In 4-TOPS we will explore the possibilities of this approach in full, and conduct a thorough study of induced superconductivity in both two and three dimensional HgTe based topological insulators. The 4 avenues we will follow are:
-SQUID-based devices to investigate full phase-dependent spectroscopy of the gapless Andreev bound states by studying their Josephson radiation and current-phase relationships (see the note after this summary).
-Experiments aimed at providing unambiguous proof of localized Majorana states in TI junctions by studying tunnelling transport into such states.
-Attempts to induce superconductivity in Quantum Hall states with the aim of creating a chiral topological superconductor. These chiral superconductors host Majorana fermions at their edges, which, at least in the case of a single QH edge mode, follow non-Abelian statistics and are therefore promising for explorations in topological quantum computing.
-Studies of induced superconductivity in Weyl semimetals, a completely unexplored state of matter.
Taken together, these four sets of experiments will greatly enhance our understanding of topological superconductivity, which is not only a subject of great academic interest as it constitutes the study of new phases of matter, but also has potential application in the field of quantum information processing.
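For the SQUID-based experiments in the first line above, the expected signature can be stated compactly using standard results on topological Josephson junctions (background, not findings of this proposal): a conventional, 2π-periodic junction biased at voltage V emits at the Josephson frequency, whereas a gapless, 4π-periodic Andreev bound state emits at half that frequency,

\[
f_J = \frac{2eV}{h} \quad (2\pi\text{-periodic}), \qquad \frac{f_J}{2} = \frac{eV}{h} \quad (4\pi\text{-periodic, topological}),
\]

so emission at eV/h, or missing odd Shapiro steps under microwave drive, is taken as evidence for gapless Andreev bound states.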
Max ERC Funding
2 497 567 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym 4C
Project 4C technology: uncovering the multi-dimensional structure of the genome
Researcher (PI) Wouter Leonard De Laat
Host Institution (HI) KONINKLIJKE NEDERLANDSE AKADEMIE VAN WETENSCHAPPEN - KNAW
Call Details Starting Grant (StG), LS2, ERC-2007-StG
Summary The architecture of DNA in the cell nucleus is an emerging epigenetic key contributor to genome function. We recently developed 4C technology, a high-throughput technique that combines state-of-the-art 3C technology with tailored micro-arrays to uniquely allow for an unbiased genome-wide search for DNA loci that interact in the nuclear space. Based on 4C technology, we were the first to provide a comprehensive overview of long-range DNA contacts of selected loci. The data showed that active and inactive chromatin domains contact many distinct regions within and between chromosomes and genes switch long-range DNA contacts in relation to their expression status. 4C technology not only allows investigating the three-dimensional structure of DNA in the nucleus, it also accurately reconstructs at least 10 megabases of the one-dimensional chromosome sequence map around the target sequence. Changes in this physical map as a result of genomic rearrangements are therefore identified by 4C technology. We recently demonstrated that 4C detects deletions, balanced inversions and translocations in patient samples at a resolution (~7kb) that allowed immediate sequencing of the breakpoints. Excitingly, 4C technology therefore offers the first high-resolution genomic approach that can identify both balanced and unbalanced genomic rearrangements. 4C is expected to become an important tool in clinical diagnosis and prognosis. Key objectives of this proposal are: 1. Explore the functional significance of DNA folding in the nucleus by systematically applying 4C technology to differentially expressed gene loci. 2. Adapt 4C technology such that it allows for massive parallel analysis of DNA interactions between regulatory elements and gene promoters. This method would greatly facilitate the identification of functionally relevant DNA elements in the genome. 3. Develop 4C technology into a clinical diagnostic tool for the accurate detection of balanced and unbalanced rearrangements.
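As a purely hypothetical sketch of how a rearrangement could be flagged in 4C-type data (not the published 4C analysis; the function, parameters and simulated data are illustrative only): far from the viewpoint, captured signal is normally sparse, so a dense cluster of high coverage elsewhere in the genome is a candidate rearrangement partner.

```python
import numpy as np

# Hypothetical sketch: flag genomic bins whose smoothed 4C coverage greatly exceeds the
# local background, suggesting a candidate rearrangement partner region.
def candidate_rearrangement_bins(coverage, window=5, fold_cut=10.0):
    """coverage: 1D array of 4C read counts per bin on a chromosome away from the viewpoint."""
    kernel = np.ones(window) / window
    smoothed = np.convolve(coverage, kernel, mode="same")    # running mean over bins
    background = np.median(smoothed) + 1e-9                  # robust background level
    return np.flatnonzero(smoothed > fold_cut * background)  # indices of candidate bins

# Simulated example: flat background plus one region of strongly elevated captured signal.
cov = np.random.poisson(1.0, 1000)
cov[400:430] += 50   # simulated translocation partner region
print(candidate_rearrangement_bins(cov))
```

In real data, a candidate region would then be refined towards the ~7 kb fragment resolution mentioned above and confirmed by sequencing across the breakpoint.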
Max ERC Funding
1 225 000 €
Duration
Start date: 2008-09-01, End date: 2013-08-31
Project acronym 4DVIDEO
Project 4DVideo: 4D spatio-temporal modeling of real-world events from video streams
Researcher (PI) Marc Pollefeys
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), PE5, ERC-2007-StG
Summary The focus of this project is the development of algorithms that allow one to capture and analyse dynamic events taking place in the real world. For this, we intend to develop smart camera networks that can perform a multitude of observation tasks, ranging from surveillance and tracking to high-fidelity, immersive reconstructions of important dynamic events (i.e. 4D videos). There are many fundamental questions in computer vision associated with these problems. Can the geometric, topologic and photometric properties of the camera network be obtained from live images? What is changing about the environment in which the network is embedded? How much information can be obtained from dynamic events that are observed by the network? What if the camera network consists of a random collection of sensors that happened to observe a particular event (think hand-held cell phone cameras)? Do we need synchronization? Those questions become even more challenging if one considers active camera networks that can adapt to the vision task at hand. How should resources be prioritized for different tasks? Can we derive optimal strategies to control camera parameters such as pan, tilt and zoom, and to trade off resolution, frame rate and bandwidth? More fundamentally, seeing cameras as points that sample incoming light rays and camera networks as a distributed sensor, how does one decide which rays should be sampled? Many of those issues are particularly interesting when we consider time-varying events. Both spatial and temporal resolution are important, and heterogeneous frame rates and resolutions can offer advantages. Prior knowledge or information obtained from earlier samples can be used to restrict the possible range of solutions (e.g. smoothness assumptions and motion prediction). My goal is to obtain fundamental answers to many of those questions based on thorough theoretical analysis combined with practical algorithms that are proven on real applications.
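As background for the geometric questions raised above, the standard pinhole camera model (textbook computer vision, not something introduced by this proposal) maps a world point to an image point in each camera of the network as

\[
\lambda\,\tilde{\mathbf{x}} \;=\; K\,[\,R \mid \mathbf{t}\,]\,\tilde{\mathbf{X}},
\]

where K contains the intrinsic parameters (focal length, principal point), R and t the camera pose, and λ the projective depth; recovering K, R and t for every camera from live images is what obtaining the geometric properties of the camera network amounts to, while unknown time offsets between cameras add the synchronization question raised above.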
Max ERC Funding
1 757 422 €
Duration
Start date: 2008-08-01, End date: 2013-11-30
Project acronym 5D Heart Patch
Project A Functional, Mature In vivo Human Ventricular Muscle Patch for Cardiomyopathy
Researcher (PI) Kenneth Randall Chien
Host Institution (HI) KAROLINSKA INSTITUTET
Call Details Advanced Grant (AdG), LS7, ERC-2016-ADG
Summary Developing new therapeutic strategies for heart regeneration is a major goal for cardiac biology and medicine. While cardiomyocytes can be generated from human pluripotent stem cells (hPSCs) in vitro, it has proven difficult to use these cells to generate a large-scale, mature human ventricular muscle graft on the injured heart in vivo. The central objective of this proposal is to optimize the generation of a large-scale, pure, fully functional human ventricular muscle patch in vivo through the self-assembly of purified human ventricular progenitors and the localized expression of defined paracrine factors that drive their expansion, differentiation, vascularization, matrix formation, and maturation. Recently, we have found that purified hPSC-derived ventricular progenitors (HVPs) can self-assemble in vivo on the epicardial surface into a 3D vascularized and functional ventricular patch with its own extracellular matrix via a cell-autonomous pathway. A two-step protocol and FACS purification of HVP receptors can generate billions of pure HVPs. The current proposal will lead to the identification of defined paracrine pathways to enhance the survival, grafting/implantation, expansion, differentiation, matrix formation, vascularization and maturation of the graft in vivo. We will capitalize on our unique HVP system and our novel modRNA technology to deliver therapeutic strategies by using the in vivo human ventricular muscle to model arrhythmogenic cardiomyopathy in vivo, and to optimize the ability of the graft to compensate for the massive loss of functional muscle during ischemic cardiomyopathy and post-myocardial infarction. The studies will lead to new in vivo chimeric models of human cardiac disease and an experimental paradigm for organ-on-organ cardiac tissue engineering of an in vivo, functional, mature ventricular patch for cardiomyopathy.
Max ERC Funding
2 149 228 €
Duration
Start date: 2017-12-01, End date: 2022-11-30
Project acronym AAA
Project Adaptive Actin Architectures
Researcher (PI) Laurent Blanchoin
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS3, ERC-2016-ADG
Summary Although we have extensive knowledge of many important processes in cell biology, including information on many of the molecules involved and the physical interactions among them, we still do not understand most of the dynamical features that are the essence of living systems. This is particularly true for the actin cytoskeleton, a major component of the internal architecture of eukaryotic cells. In living cells, actin networks constantly assemble and disassemble filaments while maintaining an apparently stable structure, suggesting a perfect balance between the two processes. Such behaviors are called “dynamic steady states”. They confer upon actin networks a high degree of plasticity, allowing them to adapt in response to external changes and enabling cells to adjust to their environments. Despite their fundamental importance in the regulation of cell physiology, the basic mechanisms that control the coordinated dynamics of co-existing actin networks are poorly understood. In the AAA project, we will first characterize the parameters that allow the coupling among co-existing actin networks at steady state. In vitro reconstituted systems will be used to control the actin nucleation patterns, the closed volume of the reaction chamber and the physical interaction of the networks. We hope to unravel the mechanism allowing the global coherence of a dynamic actin cytoskeleton. Second, we will use our unique capacity to perform dynamic micropatterning, adding or removing actin nucleation sites in real time, in order to investigate the ability of dynamic networks to adapt to changes and the role of coupled network dynamics in this emergent property. In this part, in vitro experiments will be complemented by the analysis of actin network remodeling in living cells. In the end, our project will provide a comprehensive understanding of how the adaptive response of the cytoskeleton derives from the complex interplay between its biochemical, structural and mechanical properties.
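The notion of a dynamic steady state can be illustrated with a deliberately minimal model (not the project's reconstituted system; rates are arbitrary): a filament that elongates in proportion to the free monomer pool and depolymerises at a constant rate settles at a stable length while its subunits keep turning over:

def steady_state_length(total=1000.0, k_on=0.01, k_off=5.0, dt=0.01, steps=100000):
    length = 0.0
    for _ in range(steps):
        free = total - length                              # conserved monomer pool
        length = max(0.0, length + (k_on * free - k_off) * dt)
    return length

print(steady_state_length())  # converges near total - k_off/k_on = 500 subunits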
Max ERC Funding
2 349 898 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym AAMDDR
Project DNA damage response and genome stability: The role of ATM, ATR and the Mre11 complex
Researcher (PI) Vincenzo Costanzo
Host Institution (HI) CANCER RESEARCH UK LBG
Call Details Starting Grant (StG), LS1, ERC-2007-StG
Summary Chromosomal DNA is continuously subjected to exogenous and endogenous damaging insults. In the presence of DNA damage, cells activate a multi-faceted checkpoint response that delays cell cycle progression and promotes DNA repair. Failures in this response lead to genomic instability, the main feature of cancer cells. Several cancer-prone human syndromes, including Ataxia telangiectasia (A-T), the A-T Like Disorder (ATLD) and Seckel Syndrome, reflect defects in specific genes of the DNA damage response such as ATM, MRE11 and ATR. DNA damage response pathways are poorly understood at the biochemical level in vertebrate organisms. We have established a cell-free system based on Xenopus laevis egg extract to study the molecular events underlying the DNA damage response. This is the first in vitro system that recapitulates different aspects of the DNA damage response in vertebrates. Using this system we propose to study the biochemistry of the ATM-, ATR- and Mre11 complex-dependent DNA damage response. In particular we will: 1) Dissect the signal transduction pathway that senses DNA damage and promotes cell cycle arrest and DNA damage repair; 2) Analyze at the molecular level the role of ATM, ATR and Mre11 in chromosomal DNA replication and mitosis under normal and stressful conditions; 3) Identify substrates of the ATM- and ATR-dependent DNA damage response using an innovative screening procedure.
Max ERC Funding
1 000 000 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym AB-SWITCH
Project Evaluation of commercial potential of a low-cost kit based on DNA-nanoswitches for the single-step measurement of diagnostic antibodies
Researcher (PI) Francesco RICCI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI ROMA TOR VERGATA
Call Details Proof of Concept (PoC), ERC-2016-PoC, ERC-2016-PoC
Summary "Antibodies are among the most widely monitored class of diagnostic biomarkers. Immunoassays market now covers about 1/3 of the global market of in-vitro diagnostics (about $50 billion). However, current methods for the detection of diagnostic antibodies are either qualitative or require cumbersome, resource-intensive laboratory procedures that need hours to provide clinicians with diagnostic information. A new method for fast and low-cost detection of antibodies will have a strong economic impact in the market of in-vitro diagnostics and Immunoassays.
During our ERC Starting Grant project ""Nature Nanodevices"" we have developed a novel diagnostic technology for the detection of clinically relevant antibodies in serum and other body fluids. The platform (here named Ab-switch) supports the fluorescent detection of diagnostic antibodies (for example, HIV diagnostic antibodies) in a rapid (<3 minutes), single-step and low-cost fashion.
The goal of this Proof of Concept project is to bring our promising platform to the proof of diagnostic market and exploit its innovative features for commercial purposes. We will focus our initial efforts in the development of rapid kits for the detection of antibodies diagnostic of HIV. We will 1) Fully characterize the Ab-switch product in terms of analytical performances (i.e. sensitivity, specificity, stability etc.) with direct comparison with other commercial kits; 2) Prepare a Manufacturing Plan for producing/testing the Ab-switch; 3) Establish an IP strategy for patent filing and maintenance; 4) Determine a business and commercialization planning."
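For objective 1, the comparison against commercial kits would report figures of the following kind; this is a generic sketch with invented counts, not project data:

def sensitivity_specificity(tp, fn, tn, fp):
    sensitivity = tp / (tp + fn)   # fraction of reference-positive samples detected
    specificity = tn / (tn + fp)   # fraction of reference-negative samples rejected
    return sensitivity, specificity

print(sensitivity_specificity(tp=48, fn=2, tn=95, fp=5))  # (0.96, 0.95)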
Max ERC Funding
150 000 €
Duration
Start date: 2017-02-01, End date: 2018-07-31
Project acronym ACAP
Project Agency Costs and Asset Pricing
Researcher (PI) Thomas Mariotti
Host Institution (HI) FONDATION JEAN-JACQUES LAFFONT,TOULOUSE SCIENCES ECONOMIQUES
Call Details Starting Grant (StG), SH1, ERC-2007-StG
Summary The main objective of this research project is to contribute to bridging the gap between the two main branches of financial theory, namely corporate finance and asset pricing. It is motivated by the conviction that these two aspects of financial activity should and can be analyzed within a unified framework. This research will borrow from both approaches in order to construct theoretical models that allow one to analyze the design and issuance of financial securities, as well as the dynamics of their valuations. Unlike asset pricing, which takes as given the prices of the fundamentals, the goal is to derive security price processes from a precise description of firms' operations and internal frictions. Regarding the latter, and in line with traditional corporate finance theory, the analysis will emphasize the role of agency costs within the firm for the design of its securities. But the analysis will be pushed one step further by studying the impact of these agency costs on key financial variables such as stock and bond prices, leverage, book-to-market ratios, default risk, and the holding of liquidity by firms. One of the contributions of this research project is to show how these variables are interrelated when firms and investors agree upon optimal financial arrangements. The final objective is to derive a rich set of testable asset pricing implications that would eventually be brought to the data.
Max ERC Funding
1 000 000 €
Duration
Start date: 2008-11-01, End date: 2014-10-31
Project acronym ACCENT
Project Unravelling the architecture and the cartography of the human centriole
Researcher (PI) Paul Philippe Desiré GUICHARD
Host Institution (HI) UNIVERSITE DE GENEVE
Call Details Starting Grant (StG), LS1, ERC-2016-STG
Summary The centriole is the largest evolutionarily conserved macromolecular structure responsible for building centrosomes and cilia or flagella in many eukaryotes. Centrioles are critical for the proper execution of important biological processes ranging from cell division to cell signaling. Moreover, centriolar defects have been associated with several human pathologies, including ciliopathies and cancer. This emphasizes the importance of understanding centriole biogenesis. The study of centriole formation is a long-standing question; however, our current knowledge of its molecular organization at high resolution remains fragmented and limited. In particular, exquisite details of the overall molecular architecture of the human centriole, especially of its central core region, are lacking to understand the basis of centriole organization and function. Resolving this important question represents a challenge that needs to be undertaken and will undoubtedly lead to groundbreaking advances. Another important question to tackle next is to develop innovative methods to enable the nanometric molecular mapping of centriolar proteins within distinct architectural elements of the centriole. This missing information will be key to unravelling the molecular mechanisms behind centriolar organization.
This research proposal aims at building a cartography of the human centriole by elucidating its molecular composition and architecture. To this end, we will combine innovative and multidisciplinary techniques encompassing spatial proteomics, cryo-electron tomography, state-of-the-art microscopy and in vitro assays to achieve a comprehensive molecular and structural view of the human centriole. Altogether, we expect that these advances will help understand the basic principles underlying centriole and cilia formation and may have further relevance for human health.
Max ERC Funding
1 498 965 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym ACETOGENS
Project Acetogenic bacteria: from basic physiology via gene regulation to application in industrial biotechnology
Researcher (PI) Volker MÜLLER
Host Institution (HI) JOHANN WOLFGANG GOETHE-UNIVERSITAT FRANKFURT AM MAIN
Call Details Advanced Grant (AdG), LS9, ERC-2016-ADG
Summary Demand for biofuels and other biologically derived commodities is growing worldwide as efforts increase to reduce reliance on fossil fuels and to limit climate change. Most commercial approaches rely on fermentation of organic matter, with its inherent problems of producing CO2 and competing with the human food supply. These problems are avoided if CO2 can be used as feedstock. Autotrophic organisms can fix CO2, producing chemicals that are used as building blocks for the synthesis of cellular components (biomass). Acetate-forming bacteria (acetogens) require neither light nor oxygen for this, and they can be used in bioreactors to reduce CO2 with hydrogen gas, carbon monoxide or an organic substrate. Gas fermentation using these bacteria has already been realized on an industrial level in two pre-commercial 100,000 gal/yr demonstration facilities producing fuel ethanol from abundant waste gas resources (by LanzaTech). Acetogens can metabolise a wide variety of substrates that could be used for the production of biocommodities. However, their broad use to produce biofuels and platform chemicals from substrates other than gases, or together with gases, is hampered by our very limited knowledge of their metabolism and of their ability to use different substrates simultaneously. Nearly nothing is known about the regulatory processes involved in substrate utilization or product formation, yet such knowledge is an absolute requirement for metabolic engineering approaches. The aim of this project is to provide this basic knowledge about metabolic routes in the acetogenic model strain Acetobacterium woodii and their regulation. We will unravel the function of “organelles” found in this bacterium, explore their potential as bio-nanoreactors for the production of biocommodities, and pave the way for the industrial use of A. woodii in energy (hydrogen) storage. Thus, this project creates cutting-edge opportunities for the development of biosustainable technologies in Europe.
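For orientation, the overall stoichiometry of homoacetogenesis from gases (2 CO2 + 4 H2 -> CH3COOH + 2 H2O) sets the theoretical yields; the sketch below is a back-of-the-envelope calculation that ignores growth and losses and is not a project result:

M_CO2, M_ACETATE = 44.01, 60.05  # molar masses in g/mol

def acetate_from_h2(mol_h2):
    mol_acetate = mol_h2 / 4.0          # 4 mol H2 per mol acetate
    mol_co2_fixed = 2.0 * mol_acetate   # 2 mol CO2 fixed per mol acetate
    return mol_acetate * M_ACETATE, mol_co2_fixed * M_CO2  # grams

print(acetate_from_h2(100.0))  # ~1501 g acetate from 100 mol H2, fixing ~2200 g CO2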
Max ERC Funding
2 497 140 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ACOUSEQ
Project Acoustics for Next Generation Sequencing
Researcher (PI) Jonathan Mark Cooper
Host Institution (HI) UNIVERSITY OF GLASGOW
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary Since completion of the first human genome sequence, the demand for cheaper and faster sequencing methods has increased enormously. This need has driven the development of second-generation sequencing methods, or next-generation sequencing (also known as NGS or high-throughput sequencing). The creation of these platforms has made sequencing accessible to more laboratories, rapidly increasing the volume of research, including clinical diagnostics and its use in directing treatment (precision medicine). The applications of NGS are also allowing rapid advances in clinically related fields such as public health and epidemiology. Such developments illustrate why sequencing is now the fastest-growing area in genomics (+23% p.a.). The activity is said to be worth $2.5B this year, and poised to reach ~$9B by 2020. In any workflow, prior to the sequencing reactions, a number of pre-sequencing steps are required, including the fragmentation of the DNA into smaller sizes for processing, size selection, library preparation and target enrichment. This proposal is specifically concerned with the first of these, DNA fragmentation – now widely acknowledged across the industry as the most important technological bottleneck in the pre-sequencing workflow. Our new method for DNA fragmentation, based on surface acoustic waves, will enable sample preparation from lower sample volumes using lower powers. It also has the potential to allow the seamless integration of fragmentation into sequencing instrumentation, opening up the possibility of “sample to answer” diagnostics. In the near term this will enable the implementation of pre-sequencing sample preparation steps within NGS instruments. In the longer term, our techniques will also enable us to develop methods for field-based DNA sequencing – as may be required for determining “microbial resistance” and informing the treatment of infectious disease in the face of the emergence of drug resistance.
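The place of fragmentation and size selection in the workflow can be pictured with a toy simulation (illustrative parameters only, not instrument settings): shear a genome at random positions and keep the fragments that fall inside a size-selection window:

import random

def fragment_and_select(genome_length, n_breaks, low=300, high=600, seed=1):
    random.seed(seed)
    breaks = sorted(random.sample(range(1, genome_length), n_breaks))
    edges = [0] + breaks + [genome_length]
    fragments = [b - a for a, b in zip(edges, edges[1:])]
    return [f for f in fragments if low <= f <= high]

selected = fragment_and_select(genome_length=1_000_000, n_breaks=2500)
print(len(selected), "fragments retained in the 300-600 bp window")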
Max ERC Funding
149 995 €
Duration
Start date: 2017-05-01, End date: 2018-10-31
Project acronym ACrossWire
Project A Cross-Correlated Approach to Engineering Nitride Nanowires
Researcher (PI) Hannah Jane JOYCE
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Starting Grant (StG), PE7, ERC-2016-STG
Summary Nanowires based on group III–nitride semiconductors exhibit outstanding potential for emerging applications in energy-efficient lighting, optoelectronics and solar energy harvesting. Nitride nanowires, tailored at the nanoscale, should overcome many of the challenges facing conventional planar nitride materials, and also add extraordinary new functionality to these materials. However, progress towards III–nitride nanowire devices has been hampered by the challenges in quantifying nanowire electrical properties using conventional contact-based measurements. Without reliable electrical transport data, it is extremely difficult to optimise nanowire growth and device design. This project aims to overcome this problem through an unconventional approach: advanced contact-free electrical measurements. Contact-free measurements, growth studies, and device studies will be cross-correlated to provide unprecedented insight into the growth mechanisms that govern nanowire electronic properties and ultimately dictate device performance. A key contact-free technique at the heart of this proposal is ultrafast terahertz conductivity spectroscopy: an advanced technique ideal for probing nanowire electrical properties. We will develop new methods to enable the full suite of contact-free (including terahertz, photoluminescence and cathodoluminescence measurements) and contact-based measurements to be performed with high spatial resolution on the same nanowires. This will provide accurate, comprehensive and cross-correlated feedback to guide growth studies and expedite the targeted development of nanowires with specified functionality. We will apply this powerful approach to tailor nanowires as photoelectrodes for solar photoelectrochemical water splitting. This is an application for which nitride nanowires have outstanding, yet unfulfilled, potential. This project will thus harness the true potential of nitride nanowires and bring them to the forefront of 21st century technology.
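As an example of the kind of contact-free analysis terahertz spectroscopy enables, photoconductivity spectra are often compared against a simple Drude response; the sketch below evaluates that model with placeholder carrier density, scattering time and effective mass (nanowire data typically need a localized-plasmon correction on top of this):

import numpy as np

E_CHARGE, M_E = 1.602e-19, 9.109e-31  # C, kg

def drude_conductivity(freq_hz, n_m3, tau_s, eff_mass=0.2 * M_E):
    omega = 2 * np.pi * freq_hz
    return (n_m3 * E_CHARGE**2 * tau_s / eff_mass) / (1 - 1j * omega * tau_s)

freqs = np.linspace(0.2e12, 2.0e12, 5)          # 0.2-2 THz probe band
print(drude_conductivity(freqs, 1e24, 50e-15))  # complex conductivity in S/m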
Max ERC Funding
1 499 195 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym ACTICELL
Project Precision confiner for mechanical cell activation
Researcher (PI) Matthieu PIEL
Host Institution (HI) INSTITUT CURIE
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary In tissues, cells have their physical space constrained by neighbouring cells and the extracellular matrix. In the PROMICO ERC project, our team set out to specifically address the effect of physical confinement on normal and cancer cells that are dividing and migrating, using new pathophysiologically relevant in vitro approaches based on innovative micro-fabrication techniques. One of the devices we developed was designed to quantitatively control two key parameters of the cell environment: its geometry and its surface chemical properties. The main technical breakthrough was achieved using micro-fabricated elastomeric structures bound to a hard substrate (Le Berre, Integrative Biology, 2012). The method led to important fundamental discoveries in cell biology (Lancaster, Dev Cell 2013; Le Berre, PRL 2013; Liu, Cell 2015; Raab, Science 2016). In part based on our findings, the notion that confinement is a crucial parameter for cell physiology has spread through the cell biology community. Building on this, our idea is that cell confinement could be used as a powerful cell conditioning technology, to change the cell state and offer new opportunities for fundamental research in cell biology, but also in cell therapies and drug screening. However, our current method to confine cells is not adapted to large-scale cell conditioning applications, because the throughput and reliability of the device are still too low and because the recovery of cells after confinement remains poorly controlled. It is thus now timely to develop a robust and versatile cell confiner that can be used in any cell biology lab, in academia and in industry, with no prior experience in micro-fabrication. Achieving this goal involves a complete change of technology compared to the ‘homemade’ PDMS device we have been using so far. We will also perform proof-of-concept demonstrations of its application in cell-based therapies, such as cancer immunotherapy, by testing the possibility to mechanically activate dendritic cells.
Max ERC Funding
150 000 €
Duration
Start date: 2017-06-01, End date: 2018-11-30
Project acronym ActiveBioFluids
Project Origins of Collective Motion in Active Biofluids
Researcher (PI) Daniel TAM
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Starting Grant (StG), PE3, ERC-2016-STG
Summary The emergence of coherent behaviour is ubiquitous in the natural world and has long captivated biologists and physicists alike. One area of growing interest is the collective motion and synchronization arising within and between simple motile organisms. My goal is to develop and use a novel experimental approach to unravel the origins of spontaneous coherent motion in three model systems of biofluids: (1) the synchronization of the two flagella of the green alga Chlamydomonas reinhardtii, (2) the metachronal wave in the cilia of the protist Paramecium and (3) the collective motion of swimming microorganisms in active suspensions. Understanding the mechanisms leading to collective motion is of tremendous importance because it is crucial to many biological processes such as mechanical signal transduction, embryonic development and biofilm formation.
Until now, most of the work has been theoretical and has led to the dominant view that hydrodynamic interactions are the main driving force for synchronization and collective motion. Recent experiments have challenged this view and highlighted the importance of direct mechanical contact. New experimental studies are now crucially needed. The state of the art in experimental approaches consists of observations of unperturbed cells. The key innovation in our approach is to dynamically interact with microorganisms in real time, at the relevant time and length scales. I will investigate the origins of coherent motion by reproducing synthetically the mechanical signatures of physiological flows and direct mechanical interactions, and by tracking precisely the response of the organism to the perturbations. Our new approach will incorporate optical tweezers to interact with motile cells, and a unique μ-Tomographic PIV setup to track their 3D micron-scale motion.
This proposal tackles a timely question in biophysics and will yield new insight into the fundamental principles underlying collective motion in active biological matter.
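A minimal picture of flagellar synchronization (not the project's model; parameters are arbitrary) is the Adler equation for the phase difference between two coupled oscillators, which locks whenever the coupling strength exceeds the frequency mismatch:

import math

def phase_difference(d_omega=1.0, k=3.0, delta0=2.5, dt=1e-3, steps=20000):
    delta = delta0
    for _ in range(steps):
        delta += (d_omega - k * math.sin(delta)) * dt   # d(delta)/dt = d_omega - k sin(delta)
    return delta

print(phase_difference())  # settles near asin(d_omega/k) ~ 0.34 rad, i.e. phase-locked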
Max ERC Funding
1 500 000 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym ACTIVEPHANTOM
Project Active Organ Phantoms for Medical Robotics
Researcher (PI) Peer FISCHER
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary Robot-assisted and minimally invasive medical procedures are impacting medical care by increasing accuracy, reducing cost, and minimizing patient discomfort and recovery times after interventions. Developers of commercial robotic surgical systems and medical device manufacturers look for realistic phantoms that can be used in place of animal experiments or cadavers to test procedures and to train medical personnel. Existing phantoms are either made from hard materials or lack anatomical detail, and they are mainly passive and thus unrealistic.
Here, we use recently developed fabrication know-how and expertise within our ERC-funded research to develop the first active artificial urinary tract model that includes a kidney, a bladder, and a prostate. Rapid prototyping is combined with a fabrication step that we have developed to permit the incorporation of active elements, such as a peristaltic system and fluidic valves in the phantom. We have developed smart material composites that reproduce the mechanical and haptic properties, and that give ultrasound contrast indistinguishable from real organs, while permitting anatomical details to be reproduced with a mean error of as little as 500 microns.
Feedback from a major medical device company indicates that ours is a unique phantom with unprecedented accuracy for which there is a market. Within this POC grant we want to develop a complete prototype, and to demonstrate a series of medical interventions on the phantom, including endoscopic diagnostic procedures (cystoscopy and ureterorenoscopy) and endoscopic treatment procedures (laser lithotripsy). The grant will allow us to protect our know-how, identify further markets, and develop a commercialization strategy.
Overall, this project will generate the first active phantom system that permits the testing of surgical instruments and procedures, with a sizeable market potential.
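A mean geometric error like the ~500 micron figure quoted above could be estimated along the following lines; the sketch uses randomly generated surface points rather than real phantom scans:

import numpy as np

def mean_surface_error(printed_pts, reference_pts):
    """Both inputs: (N, 3) arrays of surface points, in millimetres."""
    diffs = printed_pts[:, None, :] - reference_pts[None, :, :]
    nearest = np.sqrt((diffs ** 2).sum(-1)).min(axis=1)  # nearest reference point per printed point
    return nearest.mean()

rng = np.random.default_rng(0)
ref = rng.uniform(0, 50, size=(400, 3))                  # placeholder reference surface
printed = ref + rng.normal(0, 0.5, size=ref.shape)       # ~0.5 mm fabrication noise
print(f"mean error: {mean_surface_error(printed, ref):.2f} mm")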
Max ERC Funding
150 000 €
Duration
Start date: 2017-03-01, End date: 2018-08-31
Project acronym AdaSmartRes
Project Adapter for a commercial grade camera or a smart phone to perform depth resolved imaging
Researcher (PI) Adrian PODOLEANU
Host Institution (HI) UNIVERSITY OF KENT
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary The proposal refers to a patented adapter that can transform a commercial-grade digital camera, or the camera in a smart phone, into a depth-resolved imaging instrument. Several adapters will be assembled, making use of optical coherence tomography (OCT) technology protected by other patents of the PI. The activity takes advantage of recent progress in commercial-grade cameras, both in their modes of operation and in the parameters of their devices, such as the sensitivity and speed of their photodetector arrays.
Three versions of low-cost functional OCT systems will be assembled as proofs of concept, responding to the needs of three possible markets that can be addressed by such an adapter: 1. an en-face, depth-resolved, high transversal resolution microscope; 2. a fast cross-sectioning imager; 3. a swept-source volumetric analyser.
Industrial input comes from a company involved in professional eye imaging systems, a company already selling adapters for smart phones to perform medical imaging, a company specialised in digital photographic equipment and a company experienced in prototyping photonics equipment and handling medical images. Clinical input is provided by two specialists in the two highest-potential medical imaging markets for the adapter, serving ophthalmology and the ear, nose and throat speciality.
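The core Fourier-domain OCT step that such an adapter relies on can be sketched as follows (synthetic signal in normalised units; not the patented design): the depth profile, or A-scan, is obtained from the Fourier transform of the spectral interferogram recorded by the camera:

import numpy as np

n_samples = 2048
k = np.linspace(-1.0, 1.0, n_samples)            # normalised wavenumber axis
depths, reflectivities = [100, 300], [1.0, 0.4]  # two reflectors, arbitrary units
interferogram = sum(r * np.cos(2 * np.pi * d * k) for d, r in zip(depths, reflectivities))

a_scan = np.abs(np.fft.rfft(interferogram * np.hanning(n_samples)))
print(a_scan.argmax())  # ~200: the stronger reflector; a weaker peak sits near bin 600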
Max ERC Funding
149 300 €
Duration
Start date: 2017-06-01, End date: 2018-11-30
Project acronym ADIPODIF
Project Adipocyte Differentiation and Metabolic Functions in Obesity and Type 2 Diabetes
Researcher (PI) Christian Wolfrum
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Starting Grant (StG), LS6, ERC-2007-StG
Summary Obesity-associated disorders such as T2D, hypertension and CVD, commonly referred to as the “metabolic syndrome”, are prevalent diseases of industrialized societies. Deranged adipose tissue proliferation and differentiation contribute significantly to the development of these metabolic disorders. Comparatively little, however, is known about how these processes influence the development of metabolic disorders. Using a multidisciplinary approach, I plan to elucidate the molecular mechanisms underlying altered adipocyte differentiation and maturation in different models of obesity-associated metabolic disorders. Special emphasis will be given to the analysis of gene expression, post-translational modifications and lipid molecular species composition. To achieve this goal, I am establishing several novel methods to isolate pure primary preadipocytes, including a new animal model that will allow me to monitor preadipocytes in vivo and track their cellular fate in the context of a complete organism. These systems will allow, for the first time, the study of preadipocyte biology in an in vivo setting. By monitoring preadipocyte differentiation in vivo, I will also be able to answer key questions regarding the development of preadipocytes and examine signals that induce or inhibit their differentiation. Using transplantation techniques, I will elucidate the genetic and environmental contributions to the progression of obesity and its associated metabolic disorders. Furthermore, these studies will integrate a lipidomics approach to systematically analyze lipid molecular species composition in different models of metabolic disorders. My studies will provide new insights into the mechanisms and dynamics underlying adipocyte differentiation and maturation, and relate them to metabolic disorders. Detailed knowledge of these mechanisms will facilitate the development of novel therapeutic approaches for the treatment of obesity and associated metabolic disorders.
Max ERC Funding
1 607 105 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym ADJUV-ANT VACCINES
Project Elucidating the Molecular Mechanisms of Synthetic Saponin Adjuvants and Development of Novel Self-Adjuvanting Vaccines
Researcher (PI) Alberto FERNANDEZ TEJADA
Host Institution (HI) ASOCIACION CENTRO DE INVESTIGACION COOPERATIVA EN BIOCIENCIAS
Call Details Starting Grant (StG), PE5, ERC-2016-STG
Summary The clinical success of anticancer and antiviral vaccines often requires the use of an adjuvant, a substance that helps stimulate the body’s immune response to the vaccine, making it work better. However, few adjuvants are sufficiently potent and non-toxic for clinical use; moreover, it is not well understood how they work. Current vaccine approaches based on weak carbohydrate and glycopeptide antigens have not proven particularly effective at inducing the human immune system to mount an effective response against cancer. Despite intensive research and several clinical trials, no such carbohydrate-based antitumor vaccine has yet been approved for public use. In this context, the proposed project has a twofold ultimate goal based on applying chemistry to address these clear gaps in the adjuvant-vaccine field. First, I will develop new, improved adjuvants and novel chemical strategies towards more effective, self-adjuvanting synthetic vaccines. Second, I will probe deeply into the molecular mechanisms of the synthetic constructs by combining extensive immunological evaluations with molecular target identification and detailed conformational studies. Thus, the singularity of this multidisciplinary proposal stems from the integration of its main objectives and approaches, connecting chemical synthesis and chemical/structural biology with cellular and molecular immunology. This ground-breaking project at the chemistry-biology frontier will allow me to establish my own independent research group and explore key unresolved mechanistic questions in the adjuvant/vaccine arena with extraordinary chemical precision. Therefore, with this transformative and timely research program I aim to (a) develop novel synthetic antitumor and antiviral vaccines with improved properties and efficacy for their prospective translation into the clinic and (b) gain new critical insights into the molecular basis and three-dimensional structure underlying the biological activity of these constructs.
Max ERC Funding
1 499 219 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym ADONIS
Project Attosecond Dynamics On Interfaces and Solids
Researcher (PI) Reinhard Kienberger
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Starting Grant (StG), PE2, ERC-2007-StG
Summary New insight into ever smaller microscopic units of matter, as well as into ever faster evolving chemical, physical or atomic processes, pushes the frontiers in many fields of science. Pump/probe experiments have turned out to be the most direct approach to time-domain investigations of fast-evolving microscopic processes. Accessing atomic and molecular inner-shell processes directly in the time domain requires a combination of short wavelengths in the few hundred eV range and sub-femtosecond pulse duration. The concept of light-field-controlled XUV photoemission employs an XUV pulse achieved by High-order Harmonic Generation (HHG) as a pump and the light pulse as a probe, or vice versa. The basic prerequisite, namely the generation and measurement of isolated sub-femtosecond XUV pulses synchronized to a strong few-cycle light pulse with attosecond precision, opens up a route to time-resolved inner-shell atomic and molecular spectroscopy with present-day sources. Studies of attosecond electronic motion (1 as = 10⁻¹⁸ s) in solids and on surfaces and interfaces have until now remained out of reach. The unprecedented time resolution of the aforementioned technique will enable for the first time monitoring of sub-fs dynamics of such systems in the time domain. These dynamics – of electronic excitation, relaxation, and wave packet motion – are of broad scientific interest and pertinent to the development of many modern technologies including semiconductor and molecular electronics, optoelectronics, information processing, photovoltaics, and optical nano-structuring. The purpose of this project is to investigate phenomena like the temporal evolution of direct photoemission, interference effects in resonant photoemission, fast adsorbate-substrate charge transfer, and electronic dynamics in supramolecular assemblies, in a series of experiments in order to overcome the temporal limits of measurements in solid state physics and to better understand processes in the microcosm.
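As generic textbook background to the pulse parameters quoted above (not taken from the proposal itself), the photon energies reachable by High-order Harmonic Generation are commonly estimated with the semiclassical cutoff law, where I_p is the ionization potential of the generation medium and U_p the ponderomotive energy of the electron in the driving laser field:
\[
E_{\mathrm{cutoff}} \approx I_p + 3.17\, U_p ,
\qquad
U_p = \frac{e^{2} E_{0}^{2}}{4\, m_{e}\, \omega^{2}} ,
\]
so that intense few-cycle drivers with sufficiently high peak field E_0 can push the harmonic cutoff toward the few-hundred-eV range mentioned in the summary.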
Max ERC Funding
1 296 000 €
Duration
Start date: 2008-10-01, End date: 2013-09-30
Project acronym ADORA
Project Asymptotic approach to spatial and dynamical organizations
Researcher (PI) Benoit PERTHAME
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE1, ERC-2016-ADG
Summary The understanding of spatial, social and dynamical organization of large numbers of agents is presently a fundamental issue in modern science. ADORA focuses on problems motivated by biology because, more than anywhere else, access to precise and many data has opened the route to novel and complex biomathematical models. The problems we address are written in terms of nonlinear partial differential equations. The flux-limited Keller-Segel system, the integrate-and-fire Fokker-Planck equation, kinetic equations with internal state, nonlocal parabolic equations and constrained Hamilton-Jacobi equations are among examples of the equations under investigation.
The role of mathematics is not only to understand the analytical structure of these new problems, but it is also to explain the qualitative behavior of solutions and to quantify their properties. The challenge arises here because these goals should be achieved through a hierarchy of scales. Indeed, the problems under consideration share the common feature that the large scale behavior cannot be understood precisely without access to a hierarchy of finer scales, down to the individual behavior and sometimes its molecular determinants.
Major difficulties arise because the numerous scales present in these equations have to be discovered and singularities appear in the asymptotic process which yields deep compactness obstructions. Our vision is that the complexity inherent to models of biology can be enlightened by mathematical analysis and a classification of the possible asymptotic regimes.
However, an enormous effort is needed to uncover the equations' intimate mathematical structures and to bring them to the level of conceptual understanding they deserve, given the applications motivating these questions, which range from medical science and neuroscience to cell biology.
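To illustrate the kind of model meant above, the classical parabolic-elliptic Keller-Segel system for a cell density rho(x,t) and a chemoattractant concentration c(x,t) can be written as follows; this is standard background, and the proposal studies a flux-limited variant whose precise form is not spelled out in this summary:
\[
\partial_t \rho = \Delta \rho - \nabla \cdot \left( \rho\, \nabla c \right),
\qquad
-\Delta c = \rho .
\]
Flux-limited versions saturate the chemotactic flux rho grad c so that it remains bounded, which changes the possible asymptotic regimes of the solutions.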
Max ERC Funding
2 192 500 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ADSNeSP
Project Active and Driven Systems: Nonequilibrium Statistical Physics
Researcher (PI) Michael Elmhirst CATES
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE3, ERC-2016-ADG
Summary Active Matter systems, such as self-propelled colloids, violate time-reversal symmetry by producing entropy locally, typically converting fuel into mechanical motion at the particle scale. Other driven systems instead produce entropy because of global forcing by external fields, or boundary conditions that impose macroscopic fluxes (such as the momentum flux across a fluid sheared between moving parallel walls).
Nonequilibrium statistical physics (NeSP) is the basic toolbox for both classes of system. In recent years, much progress in NeSP has stemmed from bottom-up work on driven systems. This has provided a number of exactly solved benchmark models, and extended approximation techniques to address driven non-ergodic systems, such as sheared glasses. Meanwhile, work on fluctuation theorems and stochastic thermodynamics has created profound, model-independent insights into dynamics far from equilibrium.
More recently, the field of Active Matter has moved forward rapidly, leaving in its wake a series of generic and profound NeSP questions that now need answers: When is time-reversal symmetry, broken at the microscale, restored by coarse-graining? If it is restored, is an effective thermodynamic description possible? How different is an active system's behaviour from a globally forced one?
ADSNeSP aims to distil from recent Active Matter research such fundamental questions; answer them first in the context of specific models and second in more general terms; and then, using the tools and insights gained, shed new light on longstanding problems in the wider class of driven systems.
I believe these new tools and insights will be substantial, because local activity takes systems far from equilibrium in a conceptually distinct direction from most types of global driving. By focusing on general principles and on simple models of activity, I seek to create a new vantage point that can inform, and potentially transform, wider areas of statistical physics.
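As a pointer to the quantities involved (standard stochastic thermodynamics, not specific to this proposal), the breaking of time-reversal symmetry is commonly quantified by the steady-state entropy production rate, obtained by comparing the probability of a trajectory x(s) with that of its time-reversed image:
\[
\sigma \;=\; \lim_{t \to \infty} \frac{k_B}{t}
\left\langle \ln \frac{\mathcal{P}[x(s)]}{\tilde{\mathcal{P}}[\tilde{x}(s)]} \right\rangle \;\ge\; 0 ,
\]
with sigma vanishing exactly when detailed balance holds, i.e. when time-reversal symmetry is restored at the chosen level of description.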
Max ERC Funding
2 043 630 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym AFDMATS
Project Anton Francesco Doni – Multimedia Archive Texts and Sources
Researcher (PI) Giovanna Rizzarelli
Host Institution (HI) SCUOLA NORMALE SUPERIORE
Call Details Starting Grant (StG), SH4, ERC-2007-StG
Summary This project aims at creating a multimedia archive of the printed works of Anton Francesco Doni, who was not only an author but also a typographer, a publisher and a member of the Giolito and Marcolini editorial staff. The analysis of Doni’s work may be a good way to investigate the appropriation, text-rewriting and image-reuse practices that are typical of several authors of the 16th century, as critics have clearly shown in recent decades. This project intends to bring to light the wide range of impulses from which Doni’s texts are generated, with a great emphasis on the figurative aspect. The encoding of these texts will be carried out using the TEI (Text Encoding Initiative) guidelines, which will enable any single text to interact with a range of intertextual references both at a local level (inside the same text) and at a macrostructural level (references to other texts by Doni or to other authors). The elements that will emerge from the textual encoding concern:
A) The use of images. Real images: the complex relation between Doni’s writing and the xylographies available in Marcolini’s printing-house or belonging to other collections. Mental images: the remarkable presence of verbal images, such as descriptions, ekphràseis, figurative visions, dreams and iconographic allusions not accompanied by illustrations, but related to a recognizable visual repertoire or to real images that will be reproduced.
B) The use of sources. A parallel archive of the texts most used by Doni will be created. Digital anastatic reproductions of the 16th-century editions known to Doni will be provided whenever available. The various forms of intertextuality will be divided into the following typologies: allusions; citations; rewritings; plagiarisms; self-quotations. Finally, the different forms of narrative (tales, short stories, anecdotes, lyrics) and the different idiomatic expressions (proverbial forms and wellerisms) will also be encoded.
Max ERC Funding
559 200 €
Duration
Start date: 2008-08-01, End date: 2012-07-31
Project acronym AGALT
Project Asymptotic Geometric Analysis and Learning Theory
Researcher (PI) Shahar Mendelson
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Starting Grant (StG), PE1, ERC-2007-StG
Summary In a typical learning problem one tries to approximate an unknown function by a function from a given class using random data, sampled according to an unknown measure. In this project we will be interested in parameters that govern the complexity of a learning problem. It turns out that this complexity is determined by the geometry of certain sets in high dimension that are connected to the given class (random coordinate projections of the class). Thus, one has to understand the structure of these sets as a function of the dimension - which is given by the cardinality of the random sample. The resulting analysis leads to many theoretical questions in Asymptotic Geometric Analysis, Probability (most notably, Empirical Processes Theory) and Combinatorics, which are of independent interest beyond the application to Learning Theory. Our main goal is to describe the role of various complexity parameters involved in a learning problem, to analyze the connections between them and to investigate the way they determine the geometry of the relevant high dimensional sets. Some of the questions we intend to tackle are well known open problems and making progress towards their solution will have a significant theoretical impact. Moreover, this project should lead to a more complete theory of learning and is likely to have some practical impact, for example, in the design of more efficient learning algorithms.
Max ERC Funding
750 000 €
Duration
Start date: 2009-03-01, End date: 2014-02-28
Project acronym AgeConsolidate
Project The Missing Link of Episodic Memory Decline in Aging: The Role of Inefficient Systems Consolidation
Researcher (PI) Anders Martin FJELL
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Consolidator Grant (CoG), SH4, ERC-2016-COG
Summary Which brain mechanisms are responsible for the fate of the memories we make with age, whether they wither or stay, and in what form? Episodic memory function does decline with age. While this decline can have multiple causes, research has focused almost entirely on encoding and retrieval processes, largely ignoring a third critical process – consolidation. The objective of AgeConsolidate is to provide this missing link, by combining novel experimental cognitive paradigms with neuroimaging in a longitudinal large-scale attempt to directly test how age-related changes in consolidation processes in the brain impact episodic memory decline. The ambitious aims of the present proposal are two-fold:
(1) Use recent advances in memory consolidation theory to achieve an elaborate model of episodic memory deficits in aging
(2) Use aging as a model to uncover how structural and functional brain changes affect episodic memory consolidation in general
The novelty of the project lies in the synthesis of recent methodological advances and theoretical models for episodic memory consolidation to explain age-related decline, by employing a unique combination of a range of different techniques and approaches. This is ground-breaking, in that it aims at taking our understanding of the brain processes underlying episodic memory decline in aging to a new level, while at the same time advancing our theoretical understanding of how episodic memories are consolidated in the human brain. To obtain this outcome, I will test the main hypothesis of the project: Brain processes of episodic memory consolidation are less effective in older adults, and this can account for a significant portion of the episodic memory decline in aging. This will be answered by six secondary hypotheses, with 1-3 experiments or tasks designated to address each hypothesis, focusing on functional and structural MRI, positron emission tomography data and sleep experiments to target consolidation from different angles.
Max ERC Funding
1 999 482 €
Duration
Start date: 2017-05-01, End date: 2022-04-30
Project acronym AGNOSTIC
Project Actively Enhanced Cognition based Framework for Design of Complex Systems
Researcher (PI) Björn Ottersten
Host Institution (HI) UNIVERSITE DU LUXEMBOURG
Call Details Advanced Grant (AdG), PE7, ERC-2016-ADG
Summary Parameterized mathematical models have been central to the understanding and design of communication, networking, and radar systems. However, they often lack the ability to model intricate interactions innate in complex systems. On the other hand, data-driven approaches do not need explicit mathematical models for data generation and have a wider applicability at the cost of flexibility. These approaches need labelled data, representing all the facets of the system interaction with the environment. With the aforementioned systems becoming increasingly complex, with intricate interactions and operating in dynamic environments, the number of system configurations can be rather large, leading to a paucity of labelled data. Thus there are emerging networks of systems of critical importance whose cognition is not effectively covered by traditional approaches. AGNOSTIC uses the process of exploration through system probing and exploitation of observed data in an iterative manner, drawing upon traditional model-based approaches and data-driven discriminative learning to enhance functionality, performance, and robustness through the notion of active cognition. AGNOSTIC clearly departs from a passive assimilation of data and aims to formalize the exploitation/exploration framework in dynamic environments. The development of this framework in three application areas is central to AGNOSTIC. The project aims to provide active cognition in radar to learn the environment and other active systems to ensure situational awareness and coexistence; to apply active probing in radio access networks to infer network behaviour towards spectrum sharing and self-configuration; and to learn and adapt to user demand for content distribution in caching networks, drastically improving network efficiency. Although these cognitive systems interact with the environment in very different ways, sufficient abstraction allows cross-fertilization of insights and approaches motivating their joint treatment.
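The exploration/exploitation loop invoked above can be pictured with a deliberately minimal sketch. The following Python fragment implements a plain epsilon-greedy strategy over a few hypothetical system configurations; the configuration names, the reward model and the epsilon value are illustrative assumptions, not elements of the AGNOSTIC framework itself.

import random

CONFIGS = ["config_a", "config_b", "config_c"]  # hypothetical system configurations
EPSILON = 0.1                                   # fraction of steps spent probing (exploration)

def observe_performance(config):
    """Stand-in for probing the real system: returns a noisy performance score."""
    base = {"config_a": 0.5, "config_b": 0.7, "config_c": 0.6}[config]
    return base + random.gauss(0.0, 0.05)

def mean_reward(totals, counts, config):
    """Average observed performance of a configuration so far (0 if never tried)."""
    return totals[config] / counts[config] if counts[config] else 0.0

def run(steps=1000):
    totals = {c: 0.0 for c in CONFIGS}
    counts = {c: 0 for c in CONFIGS}
    for _ in range(steps):
        if random.random() < EPSILON or not any(counts.values()):
            config = random.choice(CONFIGS)  # explore: probe a configuration at random
        else:
            config = max(CONFIGS, key=lambda c: mean_reward(totals, counts, c))  # exploit
        totals[config] += observe_performance(config)
        counts[config] += 1
    return max(CONFIGS, key=lambda c: mean_reward(totals, counts, c))

if __name__ == "__main__":
    print("best configuration found:", run())

AGNOSTIC of course targets far richer settings (radar, radio access networks, caching), where probing is costly and model-based structure is exploited alongside learned behaviour; the sketch only fixes the vocabulary of exploration versus exploitation.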
Max ERC Funding
2 499 595 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym AIDA
Project Architectural design In Dialogue with dis-Ability: Theoretical and methodological exploration of a multi-sensorial design approach in architecture
Researcher (PI) Ann Heylighen
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Starting Grant (StG), SH2, ERC-2007-StG
Summary This research project is based on the notion that, because of their specific interaction with space, people with particular dis-abilities are able to appreciate spatial qualities or detect misfits in the environment that most architects—or other designers—are not even aware of. This notion holds for sensory dis-abilities such as blindness or visual impairment, but also for mental dis-abilities like autism or Alzheimer’s dementia. The experiences and subsequent insights of these dis-abled people, so it is argued, represent a considerable knowledge resource that would complement and enrich the professional expertise of architects and designers in general. This argument forms the basis for a methodological and theoretical exploration of a multi-sensorial design approach in architecture. On the one hand, a series of retrospective case studies will be conducted to identify and describe the motives and elements that trigger or stimulate architects’ attention for the multi-sensorial spatial experiences of people with dis-abilities when designing spaces. On the other hand, the research project will investigate experimentally in real time to what extent design processes and products in architecture can be enriched by establishing a dialogue between the multi-sensorial ‘knowing-in-action’ of people with dis-abilities and the expertise of professional architects/designers. In this way, the research project aims to develop a more profound understanding of how the concept of Design for All can be realised in architectural practice. At least as important, however, is its contribution to innovation in architecture tout court. The research results are expected to give a powerful impulse to quality improvement of the built environment by stimulating and supporting the development of innovative design concepts.
Max ERC Funding
1 195 385 €
Duration
Start date: 2008-05-01, End date: 2013-10-31
Project acronym AIDViC
Project Antibiotic intracellular delivery via virus-like carriers
Researcher (PI) Giuseppe BATTAGLIA
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary Taking inspiration from natural carriers such as viruses, a new technology has been developed in our laboratories as part of an ongoing ERC Starting Grant project, Molecular Engineering of Virus-like Carriers (MEViC). We created synthetic viruses using polymers, and thus safer materials. They are able to deliver high payloads of specific drugs into cells with no detrimental effect. While testing anticancer therapies, we identified a synthetic virus capable of targeting almost exclusively macrophages. We performed preliminary work showing that this can be successfully applied to deliver antibiotics that rid cells of intracellular pathogens. This opens a completely new possibility whereby we can expand our technology to the treatment of several infections as well as contribute to the ongoing efforts in tackling antibiotic resistance.
Max ERC Funding
149 062 €
Duration
Start date: 2017-07-01, End date: 2018-12-31
Project acronym AIM
Project Adaptive Imaging Microscopy
Researcher (PI) Michel Verhaegen
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary The project has the goal of starting up a small business producing highly specialized, customizable microscope systems for biomedical research. Microscopic imaging is one of the major drivers of progress in the biomedical and life sciences. The development of novel concepts addressing the challenges of advanced optical microscopy represents the front line of scientific research. Modern microscopes are not purely optical devices anymore. They have developed into complex integrated systems combining optics, mechanics, electronics, feedback control systems, and image processing. Many novel concepts of modern microscopy, while very interesting for research, still have to prove their commercial profitability. Such developments can be effectively addressed by start-up companies with the goal of either custom development, production and service of these advanced systems, or development and sale of the IP to a larger player.
The major goal of this proposal is the creation of the first commercial optical microscope whose performance depends completely on adaptive optics feedback control. To prove the feasibility of this approach, we select a highly attractive technical concept, the adaptive light-sheet microscope developed in our group in the framework of the ERC project. In this respect, our development relates to an ordinary microscope system in the same way as a “fly by wire” airplane relates to an old-fashioned one.
Our contribution to the development of instrumentation for biomedical research will have a positive impact on our knowledge about nature and ourselves, and on the quality of life and life expectancy of the population. Our proposal addresses the largest societal challenge of Europe: healthcare. Our instrument will contribute to the understanding of complex diseases and help the greying population stay healthy and self-supporting for an extended period of time.
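As generic background to what adaptive optics feedback control means in practice (a textbook scheme, not the specific controller of this proposal), a common closed loop drives the actuator commands u of a deformable element from wavefront-sensor measurements s with a simple integrator law,
\[
u_{k+1} \;=\; u_k \;-\; g\, M^{+} s_k ,
\]
where M is the calibrated interaction matrix, M^{+} its (regularized) pseudo-inverse and g the loop gain; the art in instruments such as an adaptive light-sheet microscope lies in the sensing, calibration and control design hidden behind these symbols.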
Max ERC Funding
149 998 €
Duration
Start date: 2017-05-01, End date: 2018-10-31
Project acronym AlCat
Project Bond activation and catalysis with low-valent aluminium
Researcher (PI) Michael James COWLEY
Host Institution (HI) THE UNIVERSITY OF EDINBURGH
Call Details Starting Grant (StG), PE5, ERC-2016-STG
Summary This project will develop the principles required to enable bond-modifying redox catalysis based on aluminium by preparing and studying new Al(I) compounds capable of reversible oxidative addition.
Catalytic processes are involved in the synthesis of 75 % of all industrially produced chemicals, but most catalysts involved are based on precious metals such as rhodium, palladium or platinum. These metals are expensive and their supply limited and unstable; there is a significant need to develop the chemistry of non-precious metals as alternatives. On toxicity and abundance alone, aluminium is an attractive candidate. Furthermore, recent work, including in our group, has demonstrated that Al(I) compounds can perform a key step in catalytic cycles - the oxidative addition of E-H bonds.
In order to realise the significant potential of Al(I) for transition-metal style catalysis we urgently need to:
- establish the principles governing oxidative addition and reductive elimination reactivity in aluminium systems.
- know how the reactivity of Al(I) compounds can be controlled by varying properties of ligand frameworks.
- understand the onward reactivity of oxidative addition products of Al(I) to enable applications in catalysis.
In this project we will:
- Study mechanisms of oxidative addition and reductive elimination of a range of synthetically relevant bonds at Al(I) centres, establishing the principles governing this fundamental reactivity.
- Develop new ligand frameworks to support Al(I) centres and evaluate the effect of the ligand on oxidative addition/reductive elimination at Al centres.
- Investigate methods for Al-mediated functionalisation of organic compounds by exploring the reactivity of E-H oxidative addition products with unsaturated organic compounds.
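Schematically, the key elementary step referred to throughout this summary is the reversible two-electron redox couple below, where L is a generic supporting ligand and E-H a generic element-hydrogen bond; the depiction is illustrative and does not prejudge the specific ligands or substrates of the project:
\[
\mathrm{L{-}Al^{I}} \;+\; \mathrm{E{-}H}
\;\underset{\text{reductive elimination}}{\overset{\text{oxidative addition}}{\rightleftharpoons}}\;
\mathrm{L{-}Al^{III}(E)(H)} .
\]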
Max ERC Funding
1 493 679 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym AlchemEast
Project Alchemy in the Making: From ancient Babylonia via Graeco-Roman Egypt into the Byzantine, Syriac and Arabic traditions (1500 BCE - 1000 AD)
Researcher (PI) Matteo MARTELLI
Host Institution (HI) ALMA MATER STUDIORUM - UNIVERSITA DI BOLOGNA
Call Details Consolidator Grant (CoG), SH5, ERC-2016-COG
Summary The AlchemEast project is devoted to the study of alchemical theory and practice as it appeared and developed in distinct, albeit contiguous (both chronologically and geographically) areas: Graeco-Roman Egypt, Byzantium, and the Near East, from Ancient Babylonian times to the early Islamic Period. This project combines innovative textual investigations with experimental replications of ancient alchemical procedures. It uses sets of historically and philologically informed laboratory replications in order to reconstruct the actual practice of ancient alchemists, and it studies the texts and literary forms in which this practice was conceptualized and transmitted. It proposes new models for textual criticism in order to capture the fluidity of the transmission of ancient alchemical writings. AlchemEast is designed to carry out a comparative investigation of cuneiform tablets as well as a vast corpus of Greek, Syriac and Arabic writings. It will overcome the old, pejorative paradigm that dismissed ancient alchemy as a "pseudo-science", by proposing a new theoretical framework for comprehending the entirety of ancient alchemical practices and theories. Alongside established forms of scholarly output, such as critical editions of key texts, AlchemEast will provide an integrative, longue durée perspective on the many different phases of ancient alchemy. It will thus offer a radically new vision of this discipline as a dynamic and diversified art that developed across different technical and scholastic traditions. This new representation will allow us to connect ancient alchemy with medieval and early modern alchemy and thus fully reintegrate ancient alchemy in the history of pre-modern alchemy as well as in the history of ancient science more broadly.
Max ERC Funding
1 997 000 €
Duration
Start date: 2017-12-01, End date: 2022-11-30
Project acronym ALEXANDRIA
Project Large-Scale Formal Proof for the Working Mathematician
Researcher (PI) Lawrence PAULSON
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Mathematical proofs have always been prone to error. Today, proofs can be hundreds of pages long and combine results from many specialisms, making them almost impossible to check. One solution is to deploy modern verification technology. Interactive theorem provers have demonstrated their potential as vehicles for formalising mathematics through achievements such as the verification of the Kepler Conjecture. Proofs done using such tools reach a high standard of correctness.
However, existing theorem provers are unsuitable for mathematics. Their formal proofs are unreadable. They struggle to do simple tasks, such as evaluating limits. They lack much basic mathematics, and the material they do have is difficult to locate and apply.
ALEXANDRIA will create a proof development environment attractive to working mathematicians, utilising the best technology available across computer science. Its focus will be the management and use of large-scale mathematical knowledge, both theorems and algorithms. The project will employ mathematicians to investigate the formalisation of mathematics in practice. Our already substantial formalised libraries will serve as the starting point. They will be extended and annotated to support sophisticated searches. Techniques will be borrowed from machine learning, information retrieval and natural language processing. Algorithms will be treated similarly: ALEXANDRIA will help users find and invoke the proof methods and algorithms appropriate for the task.
ALEXANDRIA will provide (1) comprehensive formal mathematical libraries; (2) search within libraries, and the mining of libraries for proof patterns; (3) automated support for the construction of large formal proofs; (4) sound and practical computer algebra tools.
ALEXANDRIA will be based on legible structured proofs. Formal proofs should be not mere code, but a machine-checkable form of communication between mathematicians.
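For readers who have never seen a machine-checkable proof, the tiny example below, written in Lean 4 purely for illustration (the summary does not name the prover or libraries ALEXANDRIA builds on), is accepted by the proof checker and at the same time readable as a statement about conjunction:

-- Commutativity of conjunction, stated and proved so that a proof assistant can check it.
theorem and_swap (A B : Prop) (h : A ∧ B) : B ∧ A :=
  ⟨h.right, h.left⟩

Structured proof languages aim to scale this kind of checked-yet-legible argument to the hundreds of pages of a modern research proof.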
Max ERC Funding
2 430 140 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ALFA
Project Shaping a European Scientific Scene : Alfonsine Astronomy
Researcher (PI) Matthieu Husson
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Consolidator Grant (CoG), SH6, ERC-2016-COG
Summary Alfonsine astronomy is arguably among the first European scientific achievements. It shaped a scene for actors like Regiomontanus or Copernicus. There is however little detailed historical analysis encompassing its development in its full breadth. ALFA addresses this issue by studying tables, instruments, mathematical and theoretical texts in a methodologically innovative way relying on approaches from the history of manuscript cultures, history of mathematics, and history of astronomy.
ALFA integrates these approaches not only to benefit from different perspectives but also to build new questions from their interactions. For instance the analysis of mathematical practices in astral sciences manuscripts induces new ways to analyse the documents and to think about astronomical questions.
Relying on these approaches the main objectives of ALFA are thus to:
- Retrace the development of the corpus of Alfonsine texts from its origin in the second half of the 13th century to the end of the 15th century by following, on the manuscript level, the milieus fostering it;
- Analyse the Alfonsine astronomers’ practices, their relations to mathematics, to the natural world, to proofs and justification, their intellectual context and audiences;
- Build a meaningful narrative showing how astronomers in different milieus with diverse practices shaped, also from Arabic materials, an original scientific scene in Europe.
ALFA will shed new light on the intellectual history of the late medieval period as a whole and produce a better understanding of its relations to related scientific periods in Europe and beyond. It will also produce methodological breakthroughs impacting the way the history of knowledge is practiced outside the field of ancient and medieval sciences. Efforts will be devoted to bringing these results not only to the relevant scholarly communities but also to a wider audience, as a resource in public debates around science, knowledge and culture.
Max ERC Funding
1 871 250 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ALGOA
Project Novel algorithm for treatment planning of patients with osteoarthritis
Researcher (PI) Rami Kristian KORHONEN
Host Institution (HI) ITA-SUOMEN YLIOPISTO
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary Osteoarthritis (OA) is a common joint disease affecting over 40 million Europeans. The most common consequences of OA are pain, disability and social isolation. Alarmingly, the number of patients will increase by 50% in developed countries during the next 20 years. Moreover, the economic costs of OA are considerable, since 1) direct healthcare costs (hospital admissions, medical examinations, drug therapy, etc.) and 2) productivity costs due to reduced performance while at work and absence from work have been estimated at between 1% and 2.5% of gross domestic product (GDP) in Western countries.
We have developed an algorithm that predicts the progression of OA in overweight subjects while correctly indicating that healthy subjects do not develop OA. Once employed in clinical practice, it will allow preventive and personalised treatments to be started before clinically significant symptoms are observed. This marks a major breakthrough in improving the quality of life of OA patients and of patients prone to OA. Our discovery will directly lead to longer working careers and less absence from work, and will subsequently result in increased productivity. Moreover, patients are expected to live longer due to reduced disability and social isolation.
Moreover, the discovery provides long-term economic relief for the healthcare system, which is burdened by an increasing geriatric population and a stringent economic environment. With our tool, for example, eliminating 25% of the medical examinations performed annually due to overweight or obesity in Finland (150,000 patients) would, by our estimate, decrease annual direct costs by 140M€ and indirect costs by 185M€.
In the PoC project we will carry out a technical proof of concept and perform pre-commercialisation actions to shorten the time to market. The ultimate goal after the project is to develop our innovation into a software product that aids OA diagnostics in hospitals and has commercialisation potential among medical device companies.
Max ERC Funding
150 000 €
Duration
Start date: 2018-01-01, End date: 2019-06-30
Project acronym AlgoFinance
Project Algorithmic Finance: Inquiring into the Reshaping of Financial Markets
Researcher (PI) Christian BORCH
Host Institution (HI) COPENHAGEN BUSINESS SCHOOL
Call Details Consolidator Grant (CoG), SH3, ERC-2016-COG
Summary Present-day financial markets are turning algorithmic, as market orders are increasingly being executed by fully automated computer algorithms, without any direct human intervention. Although algorithmic finance seems to fundamentally reshape the central dynamics in financial markets, and even though it prompts core sociological questions, it has not yet received any systematic attention. In a pioneering contribution to economic sociology and social studies of finance, ALGOFINANCE aims to understand how and with what consequences the turn to algorithms is changing financial markets. The overall concept and central contributions of ALGOFINANCE are the following: (1) on an intra-firm level, the project examines how the shift to algorithmic finance reshapes the ways in which trading firms operate, and does so by systematically and empirically investigating the reconfiguration of organizational structures and employee subjectivity; (2) on an inter-algorithmic level, it offers a ground-breaking methodology (agent-based modelling informed by qualitative data) to grasp how trading algorithms interact with one another in a fully digital space; and (3) on the level of market sociality, it proposes a novel theorization of how intra-firm and inter-algorithmic dynamics can be conceived of as introducing a particular form of sociality that is characteristic to algorithmic finance: a form of sociality-as-association heuristically analyzed as imitation. None of these three levels have received systematic attention in the state-of-the-art literature. Addressing them will significantly advance the understanding of present-day algorithmic finance in economic sociology. By contributing novel empirical, methodological, and theoretical understandings of the functioning and consequences of algorithms, ALGOFINANCE will pave the way for other research into digital sociology and the broader algorithmization of society.
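As a purely illustrative sketch of the kind of agent-based modelling mentioned above, imitation among trading algorithms can be prototyped in a few lines of Python. The class name, parameters and linear price-impact rule below are assumptions for the demo, not the project's model.

import random

class ImitatingTrader:
    def __init__(self, imitation_rate: float):
        self.imitation_rate = imitation_rate  # probability of copying the crowd
        self.last_order = 0                   # +1 = buy, -1 = sell

    def decide(self, crowd_signal: int) -> int:
        # With some probability, imitate the previous round's majority; otherwise act on noise.
        if crowd_signal != 0 and random.random() < self.imitation_rate:
            self.last_order = 1 if crowd_signal > 0 else -1
        else:
            self.last_order = random.choice([1, -1])
        return self.last_order

def simulate(n_traders: int = 100, steps: int = 50, imitation_rate: float = 0.6) -> float:
    traders = [ImitatingTrader(imitation_rate) for _ in range(n_traders)]
    price, crowd_signal = 100.0, 0
    for _ in range(steps):
        orders = [t.decide(crowd_signal) for t in traders]
        net = sum(orders)
        price += 0.01 * net      # toy linear price impact of the net order flow
        crowd_signal = net       # the "association" signal the next round imitates
    return price

print(simulate())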
Max ERC Funding
1 590 036 €
Duration
Start date: 2017-05-01, End date: 2021-04-30
Project acronym AlgoRNN
Project Recurrent Neural Networks and Related Machines That Learn Algorithms
Researcher (PI) Juergen Schmidhuber
Host Institution (HI) UNIVERSITA DELLA SVIZZERA ITALIANA
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Recurrent neural networks (RNNs) are general parallel-sequential computers. Some learn their programs or weights. Our supervised Long Short-Term Memory (LSTM) RNNs were the first to win pattern recognition contests, and recently enabled the best known results in speech and handwriting recognition, machine translation, etc. They are now available to billions of users through the world's most valuable public companies, including Google and Apple. Nevertheless, in many real-world tasks RNNs do not yet live up to their full potential. Although universal in theory, in practice they fail to learn important types of algorithms. This ERC project will go far beyond today's best RNNs through novel RNN-like systems that address some of the biggest open RNN problems and hottest RNN research topics: (1) How can RNNs learn to control (through internal spotlights of attention) separate large short-term memory structures such as sub-networks with fast weights, to improve performance on many natural short-term memory-intensive tasks which are currently hard for RNNs to learn, such as answering detailed questions on recently observed videos? (2) How can such RNN-like systems metalearn entire learning algorithms that outperform the original learning algorithms? (3) How can efficient transfer learning be achieved from one RNN-learned set of problem-solving programs to new RNN programs solving new tasks? In other words, how can one RNN-like system actively learn to exploit algorithmic information contained in the programs running on another? We will test our systems on existing benchmarks and create new, more challenging multi-task benchmarks. This will be supported by a rather cheap, GPU-based mini-brain for implementing large RNNs.
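The following minimal NumPy sketch illustrates the general idea of a slow recurrent controller writing to and reading from a separate fast-weight memory. The sizes, decay factor and outer-product update rule are illustrative assumptions, not the project's architecture.

import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid = 8, 16
W_h = rng.normal(scale=0.1, size=(d_hid, d_hid))   # slow recurrent weights
W_x = rng.normal(scale=0.1, size=(d_hid, d_in))    # slow input weights
F = np.zeros((d_hid, d_hid))                       # fast weights: a separate short-term memory

h = np.zeros(d_hid)
for t in range(20):
    x = rng.normal(size=d_in)                      # a dummy input stream
    read = np.tanh(F @ h)                          # read from the fast-weight memory
    h = np.tanh(W_h @ h + W_x @ x + read)          # the slow controller updates its state
    F = 0.95 * F + 0.05 * np.outer(h, h)           # the controller writes via an outer product

print(np.round(h[:4], 3))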
Max ERC Funding
2 500 000 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ALGSTRONGCRYPTO
Project Algebraic Methods for Stronger Crypto
Researcher (PI) Ronald John Fitzgerald CRAMER
Host Institution (HI) STICHTING NEDERLANDSE WETENSCHAPPELIJK ONDERZOEK INSTITUTEN
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Our field is cryptology. Our overarching objective is to advance significantly the frontiers in design and analysis of high-security cryptography for the future generation. Particularly, we wish to enhance the efficiency, functionality, and, last-but-not-least, fundamental understanding of cryptographic security against very powerful adversaries. Our approach here is to develop completely novel methods by deepening, strengthening and broadening the algebraic foundations of the field.
Concretely, our lens builds on the arithmetic codex. This is a general, abstract cryptographic primitive whose basic theory we recently developed and whose asymptotic part, which relies on algebraic geometry, enjoys crucial applications in surprising foundational results on constant communication-rate two-party cryptography. A codex is a linear (error correcting) code that, when endowing its ambient vector space just with coordinate-wise multiplication, can be viewed as simulating, up to some degree, richer arithmetical structures such as finite fields (or products thereof), or generally, finite-dimensional algebras over finite fields. Besides this degree, coordinate-localities for which simulation holds and for which it does not at all are also captured.
Our method is based on novel perspectives on codices which significantly widen their scope and strengthen their utility. Particularly, we bring symmetries, computational- and complexity theoretic aspects, and connections with algebraic number theory, -geometry, and -combinatorics into play in novel ways. Our applications range from public-key cryptography to secure multi-party computation.
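A toy Python illustration of this multiplicative structure (not the proposal's construction; the prime, degrees and evaluation points are arbitrary choices): in a Reed-Solomon code over a small prime field, the coordinate-wise product of two codewords is again a codeword of a larger Reed-Solomon code, mirroring multiplication in the underlying algebra.

p = 97                       # a small prime field GF(97)
points = list(range(1, 11))  # 10 distinct evaluation points

def encode(coeffs):
    # Evaluate the polynomial given by its coefficients at the fixed points.
    return [sum(c * pow(x, i, p) for i, c in enumerate(coeffs)) % p for x in points]

def coordinatewise_product(c1, c2):
    return [(a * b) % p for a, b in zip(c1, c2)]

f = [3, 1, 4]                # f(x) = 3 + x + 4x^2   (degree < 3)
g = [2, 7, 1]                # g(x) = 2 + 7x + x^2   (degree < 3)

# Multiply the polynomials symbolically: fg has degree < 5.
fg = [0] * 5
for i, a in enumerate(f):
    for j, b in enumerate(g):
        fg[i + j] = (fg[i + j] + a * b) % p

# The coordinate-wise product of the two codewords encodes the product polynomial,
# i.e. it lies in the larger Reed-Solomon code of degree < 5.
assert coordinatewise_product(encode(f), encode(g)) == encode(fg)
print("coordinate-wise product of codewords encodes the product polynomial")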
Our proposal is subdivided into 3 interconnected modules:
(1) Algebraic- and Number Theoretical Cryptanalysis
(2) Construction of Algebraic Crypto Primitives
(3) Advanced Theory of Arithmetic Codices
Max ERC Funding
2 447 439 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym AlgTateGro
Project Constructing line bundles on algebraic varieties --around conjectures of Tate and Grothendieck
Researcher (PI) François CHARLES
Host Institution (HI) UNIVERSITE PARIS-SUD
Call Details Starting Grant (StG), PE1, ERC-2016-STG
Summary The goal of this project is to investigate two conjectures in arithmetic geometry pertaining to the geometry of projective varieties over finite and number fields. These two conjectures, formulated by Tate and Grothendieck in the 1960s, predict which cohomology classes are Chern classes of line bundles. They both form an arithmetic counterpart of a theorem of Lefschetz, proved in the 1940s, which itself is the only known case of the Hodge conjecture. These two long-standing conjectures are one aspect of a more general web of questions regarding the topology of algebraic varieties which have been emphasized by Grothendieck and have since had a central role in modern arithmetic geometry. Special cases of these conjectures, appearing for instance in the work of Tate, Deligne, Faltings, Schneider-Lang, and Masser-Wüstholz, have all had important consequences.
My goal is to investigate different lines of attack towards these conjectures, building on recent work by myself and Jean-Benoît Bost on related problems. The two main directions of the proposal are as follows. Over finite fields, the Tate conjecture is related to finiteness results for certain cohomological objects. I want to understand how to relate these to hidden boundedness properties of algebraic varieties that have appeared in my recent geometric proof of the Tate conjecture for K3 surfaces. The existence and relevance of a theory of Donaldson invariants for moduli spaces of twisted sheaves over finite fields seem to be a promising and novel direction. Over number fields, I want to combine the geometric insight above with algebraization techniques developed by Bost. In a joint project, we want to investigate how these can be used first to understand geometrically major results in transcendence theory and then to attack the Grothendieck period conjecture for divisors via a number-theoretic and complex-analytic understanding of universal vector extensions of abelian schemes over curves.
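For orientation only, one standard formulation of the Tate conjecture for divisors (stated here as background, not quoted from the proposal): for a smooth projective variety X over a finitely generated field k and a prime \ell invertible in k, the cycle class map
\[ \mathrm{Pic}(X) \otimes_{\mathbb{Z}} \mathbb{Q}_\ell \;\longrightarrow\; H^2_{\mathrm{\acute{e}t}}\bigl(X_{\bar{k}}, \mathbb{Q}_\ell(1)\bigr)^{\mathrm{Gal}(\bar{k}/k)} \]
is conjectured to be surjective, i.e. every Galois-invariant class in degree-2 \ell-adic cohomology (with a Tate twist) should come from a line bundle.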
Max ERC Funding
1 222 329 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym ALLERGUT
Project Mucosal Tolerance and Allergic Predisposition: Does it all start in the gut?
Researcher (PI) Caspar OHNMACHT
Host Institution (HI) HELMHOLTZ ZENTRUM MUENCHEN DEUTSCHES FORSCHUNGSZENTRUM FUER GESUNDHEIT UND UMWELT GMBH
Call Details Starting Grant (StG), LS6, ERC-2016-STG
Summary Currently, more than 30% of all Europeans suffer from one or more allergic disorders, but treatment is still mostly symptomatic due to a lack of understanding of the underlying causality. Allergies are caused by type 2 immune responses triggered by recognition of harmless antigens. Both genetic and environmental factors have been proposed to favour allergic predisposition, and both factors have a huge impact on the symbiotic microbiota and the intestinal immune system. Recently, we and others showed that the transcription factor ROR(γt) seems to play a key role in mucosal tolerance in the gut and also regulates intestinal type 2 immune responses.
Based on these results, I postulate two major events in the gut for the development of an allergy during the lifetime of an individual: First, a failure to establish mucosal tolerance or anergy is a prerequisite for the outbreak of allergic symptoms and allergic disease. Second, a certain ‘core’ microbiome or pathway of the intestinal microbiota predisposes certain individuals to the later development of allergic disorders. Therefore, I will address the following aims:
1) Influence of ROR(γt) on mucosal tolerance induction and allergic disorders
2) Elucidate the T cell receptor repertoire of intestinal Th2 cells and ROR(γt)+ Tregs and assess the role of the alternative NFκB pathway in the induction of mucosal tolerance
3) Identification of ‘core’ microbiome signatures or metabolic pathways that favour allergic predisposition
ALLERGUT will provide ground-breaking knowledge on the molecular mechanisms behind the failure of mucosal tolerance in the gut and will determine whether resident ROR(γt)+ T(reg) cells can serve as a mechanistic starting point for molecular intervention strategies against the background of the hygiene hypothesis. The vision of ALLERGUT is to diagnose mucosal imbalance and to prevent and treat allergic disorders even before their outbreak, thereby promoting a public health initiative for better living.
Max ERC Funding
1 498 175 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym ALMP_ECON
Project Effective evaluation of active labour market policies in social insurance programs - improving the interaction between econometric evaluation estimators and economic theory
Researcher (PI) Bas Van Der Klaauw
Host Institution (HI) STICHTING VU
Call Details Starting Grant (StG), SH1, ERC-2007-StG
Summary In most European countries, social insurance programs like welfare, unemployment insurance and disability insurance are characterized by low reemployment rates. Therefore, governments spend huge amounts of money on active labour market programs, which should help individuals find work. Recent surveys indicate that programs which aim at intensifying job search behaviour are much more effective than schooling programs for improving human capital. A second conclusion from these surveys is that, despite the scale of spending on these programs, evidence on their effectiveness is limited. This research proposal aims at developing an economic framework that will be used to evaluate the effectiveness of popular programs like offering reemployment bonuses, fraud detection, workfare and job search monitoring. The main innovation is that I will combine economic theory with recently developed econometric techniques and detailed administrative data sets, which have not been explored before. While most of the literature focuses only on short-term outcomes, the available data allow me to also consider the long-term effectiveness of programs. The key advantage of an economic model is that I can compare the effectiveness of the different programs, and consider modifications of programs and combinations of programs. Furthermore, using an economic model I can construct profiling measures to improve the targeting of programs to subsamples of the population. This is particularly relevant if the effectiveness of programs differs between individuals or depends on the moment in time the program is offered. Therefore, the results from this research will not only be of scientific interest, but will also be of great value to policymakers.
Max ERC Funding
550 000 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym altEJrepair
Project Characterisation of DNA Double-Strand Break Repair by Alternative End-Joining: Potential Targets for Cancer Therapy
Researcher (PI) Raphael CECCALDI
Host Institution (HI) INSTITUT CURIE
Call Details Starting Grant (StG), LS1, ERC-2016-STG
Summary DNA repair pathways evolved as an intricate network that senses DNA damage and resolves it in order to minimise genetic lesions and thus prevent tumour formation. Having gained recognition over the last few years, the alternative end-joining (alt-EJ) DNA repair pathway was recently shown to be up-regulated and required for cancer cell viability in the absence of homologous recombination-mediated repair (HR). Despite this integral role, the alt-EJ repair pathway remains poorly characterised in humans. As such, its molecular composition, regulation and crosstalk with HR and other repair pathways remain elusive. Additionally, neither the contribution of the alt-EJ pathway to tumour progression nor the identification of a mutational signature associated with the use of alt-EJ has yet been investigated. Moreover, the clinical relevance of developing small-molecule inhibitors targeting players in the alt-EJ pathway, such as the polymerase Pol Theta (Polθ), is of importance, as current anticancer drug treatments have shown limited effectiveness in achieving cancer remission in patients with HR-deficient (HRD) tumours.
Here, we propose a novel, multidisciplinary approach that aims to characterise the players and mechanisms of action involved in the utilisation of alt-EJ in cancer. This understanding will better elucidate the changing interplay between different DNA repair pathways, thus shedding light on whether and how the use of alt-EJ contributes to the pathogenic history and survival of HRD tumours, eventually paving the way for the development of novel anticancer therapeutics.
For all the abovementioned reasons, we are convinced this project will have important implications in: 1) elucidating critical interconnections between DNA repair pathways, 2) improving the basic understanding of the composition, regulation and function of the alt-EJ pathway, and 3) facilitating the development of new synthetic lethality-based chemotherapeutics for the treatment of HRD tumours.
Max ERC Funding
1 498 750 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym ALTERUMMA
Project Creating an Alternative umma: Clerical Authority and Religio-political Mobilisation in Transnational Shii Islam
Researcher (PI) Oliver Paul SCHARBRODT
Host Institution (HI) THE UNIVERSITY OF BIRMINGHAM
Call Details Consolidator Grant (CoG), SH5, ERC-2016-COG
Summary This interdisciplinary project investigates the transformation of Shii Islam in the Middle East and Europe since the 1950s. The project examines the formation of modern Shii communal identities and the role Shii clerical authorities and their transnational networks have played in their religio-political mobilisation. The volatile situation post-Arab Spring, the rise of militant movements such as ISIS and the sectarianisation of geopolitical conflicts in the Middle East have intensified efforts to forge distinct Shii communal identities and to conceive Shii Muslims as part of an alternative umma (Islamic community). The project focusses on Iran, Iraq and significant but unexplored diasporic links to Syria, Kuwait and Britain. In response to the rise of modern nation-states in the Middle East, Shii clerical authorities resorted to a wide range of activities: (a) articulating intellectual responses to the ideologies underpinning modern Middle Eastern nation-states, (b) forming political parties and other platforms of socio-political activism and (c) using various forms of cultural production by systematising and promoting Shii ritual practices and utilising visual art, poetry and new media.
The project yields a perspectival shift on the factors that led to Shii communal mobilisation by:
- Analysing unacknowledged intellectual responses of Shii clerical authorities to the secular or sectarian ideologies of post-colonial nation-states and to the current sectarianisation of geopolitics in the Middle East.
- Emphasising the central role of diasporic networks in the Middle East and Europe in mobilising Shii communities and in influencing discourses and agendas of clerical authorities based in Iraq and Iran.
- Exploring new modes of cultural production in the form of a modern Shii aesthetics articulated in ritual practices, visual art, poetry and new media and thus creating a more holistic narrative on Shii religio-political mobilisation.
Max ERC Funding
1 952 374 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym ALUFIX
Project Friction stir processing based local damage mitigation and healing in aluminium alloys
Researcher (PI) Aude SIMAR
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Starting Grant (StG), PE8, ERC-2016-STG
Summary ALUFIX proposes an original strategy for the development of aluminium-based materials involving damage mitigation and extrinsic self-healing concepts exploiting the new opportunities of the solid-state friction stir process. Friction stir processing locally extrudes and drags material from the front to the back and around the tool pin. It involves short durations at moderate temperatures (typically 80% of the melting temperature), fast cooling rates and large plastic deformations leading to far out-of-equilibrium microstructures. The idea is that commercial aluminium alloys can be locally improved and healed in regions of stress concentration where damage is likely to occur. Self-healing in metal-based materials is still in its infancy and existing strategies can hardly be extended to applications. Friction stir processing can enhance the damage and fatigue resistance of aluminium alloys by microstructure homogenisation and refinement. In parallel, friction stir processing can be used to integrate secondary phases in an aluminium matrix. In the ALUFIX project, healing phases will thus be integrated in aluminium in addition to refining and homogenising the microstructure. The “local stress management strategy” favours crack closure and crack deviation at the sub-millimetre scale thanks to a controlled residual stress field. The “transient liquid healing agent” strategy involves the in-situ generation of an out-of-equilibrium compositionally graded microstructure at the aluminium/healing agent interface capable of liquid-phase healing after a thermal treatment. Along the way, a variety of new scientific questions concerning the damage mechanisms will have to be addressed.
Max ERC Funding
1 497 447 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym AMORE
Project A distributional MOdel of Reference to Entities
Researcher (PI) Gemma BOLEDA TORRENT
Host Institution (HI) UNIVERSIDAD POMPEU FABRA
Call Details Starting Grant (StG), SH4, ERC-2016-STG
Summary "When I asked my seven-year-old daughter ""Who is the boy in your class who was also new in school last year, like you?"", she instantly replied ""Daniel"", using the descriptive content in my utterance to identify an entity in the real world and refer to it. The ability to use language to refer to reality is crucial for humans, and yet it is very difficult to model. AMORE breaks new ground in Computational Linguistics, Linguistics, and Artificial Intelligence by developing a model of linguistic reference to entities implemented as a computational system that can learn its own representations from data.
This interdisciplinary project builds on two complementary semantic traditions: 1) Formal semantics, a symbolic approach that can delimit and track linguistic referents, but does not adequately match them with the descriptive content of linguistic expressions; 2) Distributional semantics, which can handle descriptive content but does not associate it to individuated referents. AMORE synthesizes the two approaches into a unified, scalable model of reference that operates with individuated referents and links them to referential expressions characterized by rich descriptive content. The model is a distributed (neural network) version of a formal semantic framework that is furthermore able to integrate perceptual (visual) and linguistic information about entities. We test it extensively in referential tasks that require matching noun phrases (“the Medicine student”, “the white cat”) with entity representations extracted from text and images.
AMORE advances our scientific understanding of language and its computational modeling, and contributes to the far-reaching debate between symbolic and distributed approaches to cognition with an integrative proposal. I am in a privileged position to carry out this integration, since I have contributed top research in both distributional and formal semantics.
"
Max ERC Funding
1 499 805 €
Duration
Start date: 2017-02-01, End date: 2022-01-31
Project acronym AMPLIPORE
Project Understanding negative gas adsorption in highly porous networks for the design of pressure amplifying materials
Researcher (PI) Stefan Kaskel
Host Institution (HI) TECHNISCHE UNIVERSITAET DRESDEN
Call Details Advanced Grant (AdG), PE5, ERC-2016-ADG
Summary Negative gas adsorption (NGA) is a new, counterintuitive and paradoxical phenomenon, reported for the first time by my group in 2016: normal solid materials with significant outer or inner surface area always take up gas when the pressure in the surrounding reservoir is increased (adsorption). NGA networks instead react at a certain point in the opposite direction: they release gas upon an external pressure increase, leading to an overall pressure amplification in a closed system. Comparable phenomena have never been reported before. What is so exciting about NGA? We have a unique material in hand that counteracts an external force by force amplification.
So far NGA has been observed solely in one of our new coordination polymers, featuring a colossal self-compression associated with a mesopore-to-micropore transformation. Gas-pressure-amplifying materials could lead to important innovations in gas-releasing rescue systems, pneumatic control systems (production, transportation), micropumps, microfluidic devices, pneumatic actuators, and artificial lungs. A fundamental understanding of the physical mechanisms, structures, and thermodynamic boundary conditions is an essential prerequisite for any industrial application of this counterintuitive phenomenon.
Combining strong synthetic methodologies with advanced analytical techniques, AMPLIPORE will elucidate the characteristic molecular and mesoscopic materials signatures as well as the thermodynamic boundary conditions of NGA phenomena. We will elaborate a generic NGA-materials concept to tailor the pressure amplification and explore the temperature and pressure ranges in which NGA can be applied. Developing tailor-made instrumentation for kinetic investigations of NGA will give fundamental insights into the intrinsic and macroscopic dynamics of crystal-to-crystal transformations for applications in micropneumatic systems.
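As a back-of-envelope illustration of pressure amplification in a closed system (an ideal-gas estimate, not taken from the proposal): if the framework expels an additional amount of gas \Delta n into a closed reservoir of volume V at temperature T, the extra pressure rise is approximately
\[ \Delta p_{\mathrm{NGA}} \approx \frac{\Delta n \, R \, T}{V}, \]
which adds to the externally imposed pressure increase and thus amplifies it.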
Max ERC Funding
2 363 125 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym AMPLITUDES
Project Novel structures in scattering amplitudes
Researcher (PI) Johannes Martin HENN
Host Institution (HI) JOHANNES GUTENBERG-UNIVERSITAT MAINZ
Call Details Consolidator Grant (CoG), PE2, ERC-2016-COG
Summary This project focuses on developing quantum field theory methods and applying them to the phenomenology of elementary particles. At the Large Hadron Collider (LHC) our current best theoretical understanding of particle physics is being tested against experiment by measuring e.g. properties of the recently discovered Higgs boson. With run two of the LHC, currently underway, the experimental accuracy will further increase. Theoretical predictions matching the latter are urgently needed. Obtaining these requires extremely difficult calculations of scattering amplitudes and cross sections in quantum field theory, including calculations to correctly describe large contributions due to long-distance physics in the latter. Major obstacles in such computations are the large number of Feynman diagrams that are difficult to handle, even with the help of modern computers, and the computation of Feynman loop integrals. To address these issues, we will develop innovative methods that are inspired by new structures found in supersymmetric field theories. We will extend the scope of the differential equations method for computing Feynman integrals, and apply it to scattering processes that are needed for phenomenology, but too complicated to analyze using current methods. Our results will help measure fundamental parameters of Nature, such as, for example, couplings of the Higgs boson, with unprecedented precision. Moreover, by accurately predicting backgrounds from known physics, our results will also be invaluable for searches of new particles.
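For orientation, one widely used formulation of the differential equations method (schematic, not quoted from the proposal) writes a basis of Feynman integrals \vec{f}(x;\epsilon), depending on kinematic variables x and the dimensional regulator \epsilon, as the solution of a system in canonical form,
\[ \mathrm{d}\,\vec{f}(x;\epsilon) \;=\; \epsilon \,\bigl(\mathrm{d}A(x)\bigr)\, \vec{f}(x;\epsilon), \]
where A(x) is a matrix of logarithms; in this form the system can be solved order by order in \epsilon in terms of iterated integrals.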
Max ERC Funding
2 000 000 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym AN07AT
Project Understanding computational roles of new neurons generated in the adult hippocampus
Researcher (PI) Ayumu Tashiro
Host Institution (HI) NORGES TEKNISK-NATURVITENSKAPELIGE UNIVERSITET NTNU
Call Details Starting Grant (StG), LS4, ERC-2007-StG
Summary New neurons are continuously generated in certain regions of the adult mammalian brain. One of those regions is the dentate gyrus, a subregion of the hippocampus, which is essential for memory formation. Although these new neurons in the adult dentate gyrus are thought to have an important role in learning and memory, it is largely unclear how new neurons are involved in the information processing and storage underlying memory. Because new neurons constitute a minor portion of an intermingled local neuronal population, simple application of conventional techniques such as multi-unit extracellular recording and pharmacological lesions is not suitable for the functional analysis of new neurons. In this proposed research program, I will combine multi-unit recording and behavioral analysis with virus-mediated, cell-type-specific genetic manipulation of neuronal activity to investigate the computational roles of new neurons in learning and memory. Specifically, I will determine: 1) the specific memory processes that require new neurons, 2) the dynamic patterns of activity that new neurons express during memory-related behavior, and 3) the influence of new neurons on their downstream structures. Further, based on the information obtained from these three lines of study, we will establish causal relationships between specific memory-related behaviors and specific patterns of activity in new neurons. Solving these issues will collectively provide important insight into the computational roles performed by adult neurogenesis. The information on the function of new neurons in the normal brain could contribute to the future development of efficient therapeutic strategies for a variety of brain disorders.
Max ERC Funding
1 991 743 €
Duration
Start date: 2009-01-01, End date: 2013-12-31
Project acronym ANGIOPLACE
Project Expression and Methylation Status of Genes Regulating Placental Angiogenesis in Normal, Cloned, IVF and Monoparental Sheep Foetuses
Researcher (PI) Grazyna Ewa Ptak
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TERAMO
Call Details Starting Grant (StG), LS7, ERC-2007-StG
Summary Normal placental angiogenesis is critical for embryonic survival and development. Epigenetic modifications, such as methylation of CpG islands, regulate the expression and imprinting of genes. Epigenetic abnormalities have been observed in embryos from assisted reproductive technologies (ART), which could explain the poor placental vascularisation, embryonic/fetal death, and altered fetal growth in these pregnancies. Both cloned (somatic cell nuclear transfer, or SCNT) and monoparental (parthenogenotes, only maternal genes; androgenotes, only paternal genes) embryos provide important models for studying defects in the expression and methylation status/imprinting of genes regulating placental function. Our hypothesis is that placental vascular development is compromised during early pregnancy in embryos from ART, in part due to altered expression or imprinting/methylation status of specific genes regulating placental angiogenesis. We will evaluate fetal growth, placental vascular growth, and the expression and epigenetic status of genes regulating placental angiogenesis during early pregnancy in 3 Specific Aims: (1) after natural mating; (2) after transfer of biparental embryos from in vitro fertilization and SCNT; and (3) after transfer of parthenogenetic or androgenetic embryos. These studies will therefore contribute substantially to our understanding of the regulation of placental development and vascularisation during early pregnancy, and could pinpoint the mechanisms contributing to embryonic loss and developmental abnormalities in foetuses from ART. Any or all of these observations will contribute to our understanding of, and our ability to successfully employ, ART, which are becoming very widespread and important in human medicine as well as in animal production.
Max ERC Funding
363 600 €
Duration
Start date: 2008-10-01, End date: 2012-06-30
Project acronym AnonymClassic
Project The Arabic Anonymous in a World Classic
Researcher (PI) Beatrice GRUENDLER
Host Institution (HI) FREIE UNIVERSITAET BERLIN
Call Details Advanced Grant (AdG), SH5, ERC-2016-ADG
Summary AnonymClassic is the first ever comprehensive study of Kalila and Dimna (a book of wisdom in fable form), a text of premodern world literature. Its spread is comparable to that of the Bible, except that it passed from Hinduism and Buddhism via Islam to Christianity. Its Arabic version, produced in the 8th century, when this was the lingua franca of the Near East, became the source of all further translations up to the 19th century. The work’s multilingual history involving circa forty languages has never been systematically studied. The absence of available research has made world literature ignore it, while scholars of Arabic avoided it because of its widely diverging manuscripts, so that the actual shape of the Arabic key version is still in need of investigation. AnonymClassic tests a number of ‘high-risk’ propositions, including three key hypotheses: 1) The anonymous Arabic copyists of Kalila and Dimna are de facto co-authors, 2) their agency is comparable to that of the named medieval translators, and 3) the fluctuation of the Arabic versions is conditioned by the work’s fictional status. AnonymClassic’s methodology relies on a cross-lingual narratological analysis of the Arabic versions and all medieval translations (supported by a synoptic digital edition), which takes precisely the interventions at each stage of transmission (redaction, translation) as its subject. Considering the work’s paths of dissemination from India to Europe, AnonymClassic will challenge the prevalent Western theoretical lens on world literature conceived ‘from above’ with the view ‘from below,’ based on the attested cross-cultural network constituted by its versions. AnonymClassic will introduce a new paradigm of an East-Western literary continuum with Arabic as a cultural bridge. Against the current background of Europe’s diversifying and multicultural society, AnonymClassic purposes to integrate pre-modern Near Eastern literature and culture into our understanding of Global Culture.
Max ERC Funding
2 435 113 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym ANTS
Project A new technology of microthermal sensing for application in microcalorimetry
Researcher (PI) Rivadulla Fernandez Jose Francisco
Host Institution (HI) UNIVERSIDAD DE SANTIAGO DE COMPOSTELA
Call Details Proof of Concept (PoC), ERC-2016-PoC, ERC-2016-PoC
Summary ANTS aims to prove the viability of a novel thermal microsensor, with highly improved thermal, temporal and spatial resolution, as the basis of a breakthrough micro/nano-calorimeter. The resulting device shall quantify binding rates and enthalpy/entropy changes in interactions of biological interest in a much more accurate and straightforward manner than available techniques. Consequently, the ANTS microcalorimeter will greatly facilitate drug discovery and the development of biomedical products and technologies. We propose to exploit the large Nernst effect in ferromagnetic conductors for electrical sensing of temperature gradients with exceptional sensitivity. The active sensing elements are composed of a single material, thus offering important advantages for miniaturization over conventional micro-calorimetry based on diverse Peltier modules, while being easy to fabricate by standard, scalable deposition and photolithographic methods. The standard microcalorimeter configuration can also be maintained in the novel device, which is convenient for ensuring compatibility and fostering adoption by users and manufacturers.
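For orientation, a schematic form of the (anomalous) Nernst effect that the sensing scheme above relies on (a generic textbook relation, with symbols used here only for illustration): in a magnetized conductor a temperature gradient \(\nabla T\) generates a transverse electric field
\[
\mathbf{E}_{\mathrm{ANE}} \;=\; S_{\mathrm{ANE}}\,\hat{\mathbf{m}} \times \nabla T \quad (\text{up to sign convention}),
\]
where \(\hat{\mathbf{m}}\) is the magnetization direction and \(S_{\mathrm{ANE}}\) the anomalous Nernst coefficient; reading out this transverse voltage is what allows a single ferromagnetic film to sense temperature gradients electrically.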
Max ERC Funding
149 250 €
Duration
Start date: 2017-01-01, End date: 2018-06-30
Project acronym ANXIETY & COGNITION
Project How anxiety transforms human cognition: an Affective Neuroscience perspective
Researcher (PI) Gilles Roger Charles Pourtois
Host Institution (HI) UNIVERSITEIT GENT
Call Details Starting Grant (StG), SH3, ERC-2007-StG
Summary Anxiety, a state of apprehension or fear, may provoke cognitive or behavioural disorders and eventually lead to serious medical illnesses. The high prevalence of anxiety disorders in our society sharply contrasts with the lack of clear factual knowledge about the corresponding brain mechanisms at the origin of this profound change in the appraisal of the environment. Little is known about how the psychopathological state of anxiety ultimately turns into a medical condition. The core of this proposal is to gain insight into the neural underpinnings of anxiety and anxiety-related disorders using modern human brain imaging such as scalp EEG and fMRI. I propose to elucidate how anxiety transforms and shapes human cognition, and what the neural correlates and time-course of this modulatory effect are. The primary innovation of this project is the systematic use of scalp EEG and fMRI in human participants to better understand the neural mechanisms by which anxiety profoundly influences specific cognitive functions, in particular selective attention and decision-making. The goal of this proposal is to precisely determine the exact timing (using scalp EEG) and the location, size and extent (using fMRI) of anxiety-related modulations of selective attention and decision-making in the human brain. Here I propose to focus on these two specific processes because they are likely to reveal selective effects of anxiety on human cognition and can thus serve as powerful models to better characterize how anxiety operates in the human brain. Another important aspect of this project is that I envision helping to bridge the gap in Health Psychology between fundamental research and clinical practice by proposing alternative rehabilitation strategies for adult subjects affected by anxiety-related disorders, which could directly exploit the neuroscientific discoveries generated in this scientific project.
Max ERC Funding
812 986 €
Duration
Start date: 2008-11-01, End date: 2013-10-31
Project acronym AORVM
Project The Effects of Aging on Object Representation in Visual Working Memory
Researcher (PI) James Robert Brockmole
Host Institution (HI) THE UNIVERSITY OF EDINBURGH
Call Details Starting Grant (StG), SH3, ERC-2007-StG
Summary One’s ability to remember visual material such as objects, faces, and spatial locations over a short period of time declines with age. The proposed research will examine whether these deficits are explained by a reduction in visual working memory (VWM) capacity, or by an impairment in one’s ability to maintain, or ‘bind’, appropriate associations among pieces of related information. In this project successful binding is operationally defined as the proper recall or recognition of objects that are defined by the conjunction of multiple visual features. While tests of long-term memory have demonstrated that, despite preserved memory for isolated features, older adults have more difficulty remembering conjunctions of features, no research has yet investigated analogous age-related binding deficits in VWM. This is a critical oversight because, given the current state of the science, it is unknown whether these deficits are specific to the long-term memory system or whether they originate in VWM. The project interweaves three strands of research that each investigate whether older adults have more difficulty creating, maintaining, and updating bound multi-feature object representations than younger adults. This theoretical program of enquiry will provide insight into the cognitive architecture of VWM and how this system changes with age, and its outcomes will have wide-ranging multi-disciplinary applications in applied theory and intervention techniques that may reduce the adverse consequences of aging on memory.
Max ERC Funding
500 000 €
Duration
Start date: 2008-09-01, End date: 2011-08-31
Project acronym APACHE
Project Atmospheric Pressure plAsma meets biomaterials for bone Cancer HEaling
Researcher (PI) Cristina CANAL BARNILS
Host Institution (HI) UNIVERSITAT POLITECNICA DE CATALUNYA
Call Details Starting Grant (StG), PE8, ERC-2016-STG
Summary Cold atmospheric pressure plasmas (APP) have been reported to selectively kill cancer cells without damaging the surrounding tissues. Studies have been conducted on a variety of cancer types but, to the best of our knowledge, not on any kind of bone cancer. Treatment options for bone cancer include surgery, chemotherapy, etc., and may involve the use of bone-grafting biomaterials to replace the surgically removed bone.
APACHE brings a totally different and ground-breaking approach to the design of a novel therapy for bone cancer by taking advantage of the active species generated by APP in combination with biomaterials to deliver those species locally at the diseased site. The feasibility of this approach is rooted in the evidence that the cellular effects of APP appear to strongly involve the suite of reactive species created by plasmas, which can be derived from a) direct treatment of the malignant cells by APP or b) indirect treatment of the liquid media by APP, which is then put in contact with the cancer cells.
In APACHE we aim to investigate the fundamentals involved in the lethal effects of cold plasmas on bone cancer cells, and to develop improved bone cancer therapies. To achieve this we will take advantage of the highly reactive species generated by APP in liquid media, which we will use in an incremental strategy: i) to investigate the effects of APP-treated liquid on bone cancer cells, ii) to evaluate the potential of combining APP-treated liquid in a hydrogel vehicle with or without CaP biomaterials, and iii) to ascertain the potential three-way interactions between APP reactive species in liquid medium, biomaterials and chemotherapeutic drugs.
The methodological approach will involve an interdisciplinary team dealing with plasma diagnostics in gas and liquid media, with cell biology and the effects of APP-treated liquids on bone tumor cells, and with their combination with biomaterials and/or anticancer drugs.
Max ERC Funding
1 499 887 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym AppSAM
Project A Flexible Platform for the Application of SAM-dependent enzymes
Researcher (PI) Jennifer Nina ANDEXER
Host Institution (HI) ALBERT-LUDWIGS-UNIVERSITAET FREIBURG
Call Details Starting Grant (StG), LS9, ERC-2016-STG
Summary AppSAM will unlock the synthetic capability of S-adenosylmethionine (SAM)-dependent methyltransferases and radical SAM enzymes for application in environmentally friendly and fully sustainable reactions. The biotechnological application of these enzymes will provide access to chemo-, regio- and stereoselective methylations and alkylations, as well as to a wide range of complex rearrangement reactions that are currently not possible through traditional approaches. Methylation reactions are of particular interest due to their importance in epigenetics, cancer metabolism and the development of novel pharmaceuticals. As chemical methylation methods often involve toxic compounds and rarely exhibit the desired selectivity and specificity, there is an urgent need for new, environmentally friendly methodologies.
The proposed project will meet these demands by the provision of modular in vitro and in vivo systems that can be tailored to specific applications. In the first phase of AppSAM, efficient in vitro SAM-regeneration systems will be developed for use with methyltransferases as well as radical SAM enzymes. To achieve this aim, enzymes from different biosynthetic pathways will be combined in multi-enzyme cascades; methods from enzyme and reaction engineering will be used for optimisation. The second phase of AppSAM will address the application on a preparative scale. This will include the isolation of pure product from the in vitro systems, reactions using immobilised enzymes, and extracts from in vivo productions. In addition to E. coli, the methylotrophic bacterium Methylobacterium extorquens AM1 will be used as a host for the in vivo systems. M. extorquens can use C1 building blocks such as methanol as the sole carbon source, thereby initiating the biotechnological methylation process from a green source material and making the process fully sustainable, as well as being compatible with an envisaged “methanol economy”.
Max ERC Funding
1 499 219 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym APROCS
Project Automated Linear Parameter-Varying Modeling and Control Synthesis for Nonlinear Complex Systems
Researcher (PI) Roland TOTH
Host Institution (HI) TECHNISCHE UNIVERSITEIT EINDHOVEN
Call Details Starting Grant (StG), PE7, ERC-2016-STG
Summary Linear Parameter-Varying (LPV) systems are flexible mathematical models capable of representing Nonlinear (NL)/Time-Varying (TV) dynamical behaviors of complex physical systems (e.g., wafer scanners, car engines, chemical reactors), often encountered in engineering, via a linear structure. The LPV framework provides computationally efficient and robust approaches to synthesize digital controllers that can ensure desired operation of such systems, making it attractive to (i) high-tech mechatronic, (ii) automotive and (iii) chemical-process applications. Such a framework is important to meet the increasing operational demands of systems in these industrial sectors and to realize future technological targets. However, recent studies have shown that, to fully exploit the potential of the LPV framework, a number of limiting factors of the underlying theory call for serious innovation, as it is currently not understood how to (1) automate exact and low-complexity LPV modeling of real-world applications and refine uncertain aspects of these models efficiently with the help of measured data, (2) incorporate control objectives directly into modeling and develop model reduction approaches for control, and (3) treat modeling & control synthesis as a unified, closed-loop system synthesis approach directly oriented towards the underlying NL/TV system. Furthermore, due to the increasingly cyber-physical nature of applications, (4) control synthesis is needed in a plug & play fashion, where, if sub-systems are modified or exchanged, the control design and the model of the whole system are only incrementally updated. This project aims to surmount Challenges (1)-(4) by establishing an innovative revolution of the LPV framework supported by a software suite and extensive empirical studies on real-world industrial applications, with the potential to ensure a leading role for technological innovation of the EU in the high-impact industrial sectors (i)-(iii).
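For orientation, the linear structure referred to above can be illustrated by the standard LPV state-space form (generic notation, not a model taken from the project):
\[
\dot{x}(t) = A(\rho(t))\,x(t) + B(\rho(t))\,u(t), \qquad
y(t) = C(\rho(t))\,x(t) + D(\rho(t))\,u(t),
\]
where \(\rho(t)\) is a measurable scheduling signal: for each frozen value of \(\rho\) the dynamics are linear, while the dependence of the system matrices on \(\rho(t)\) captures the nonlinear/time-varying behaviour of the underlying plant.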
Max ERC Funding
1 493 561 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym AQSuS
Project Analog Quantum Simulation using Superconducting Qubits
Researcher (PI) Gerhard KIRCHMAIR
Host Institution (HI) UNIVERSITAET INNSBRUCK
Call Details Starting Grant (StG), PE3, ERC-2016-STG
Summary AQSuS aims at experimentally implementing analogue quantum simulation of interacting spin models in two-dimensional geometries. The proposed experimental approach paves the way to investigate a broad range of currently inaccessible quantum phenomena, for which existing analytical and numerical methods reach their limitations. Developing precisely controlled interacting quantum systems in 2D is an important current goal well beyond the field of quantum simulation and has applications in e.g. solid state physics, computing and metrology.
To access these models, I propose to develop a novel circuit quantum-electrodynamics (cQED) platform based on the 3D transmon qubit architecture. This platform utilizes the highly engineerable properties and long coherence times of these qubits. A central novel idea behind AQSuS is to exploit the spatial dependence of the naturally occurring dipolar interactions between the qubits to engineer the desired spin-spin interactions. This approach avoids the complicated wiring typical of other cQED experiments and reduces the complexity of the experimental setup. The scheme is therefore directly scalable to larger systems. The experimental goals are:
1) Demonstrate analogue quantum simulation of an interacting spin system in 1D & 2D.
2) Establish methods to precisely initialize the state of the system, control the interactions, and read out single-qubit states and multi-qubit correlations.
3) Investigate unobserved quantum phenomena on 2D geometries e.g. kagome and triangular lattices.
4) Study open system dynamics with interacting spin systems.
AQSuS builds on my background in both superconducting qubits and quantum simulation with trapped ions. With theory collaborators, my young research group and I have recently published an article in PRB [9] describing and analysing the proposed platform. The ERC Starting Grant would allow me to open up a major new research direction and capitalize on the foundations established over the last two years.
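For orientation, a schematic example of the kind of dipolar spin model referred to above (an assumed illustrative Hamiltonian, not one taken from the proposal): qubits at positions \(\mathbf{r}_i\) coupled by dipolar interactions can realize, e.g., an XY-type model
\[
H \;=\; \sum_{i<j} J_{ij}\left(\sigma_i^{+}\sigma_j^{-} + \sigma_i^{-}\sigma_j^{+}\right),
\qquad J_{ij} \;\propto\; \frac{1}{|\mathbf{r}_i-\mathbf{r}_j|^{3}},
\]
where the \(1/r^{3}\) decay reflects the spatial dependence of the dipolar coupling that is exploited to engineer the desired spin-spin interactions.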
Max ERC Funding
1 498 515 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym AQUARAMAN
Project Pipet Based Scanning Probe Microscopy Tip-Enhanced Raman Spectroscopy: A Novel Approach for TERS in Liquids
Researcher (PI) Aleix Garcia Guell
Host Institution (HI) ECOLE POLYTECHNIQUE
Call Details Starting Grant (StG), PE4, ERC-2016-STG
Summary Tip-enhanced Raman spectroscopy (TERS) is often described as the most powerful tool for the optical characterization of surfaces and their proximities. It combines the intrinsic spatial resolution of scanning probe techniques (AFM or STM) with the chemical information content of vibrational Raman spectroscopy. Capable of revealing surface heterogeneity at the nanoscale, TERS is currently playing a fundamental role in the understanding of interfacial physicochemical processes in key areas of science and technology such as chemistry, biology and materials science.
Unfortunately, the undeniable potential of TERS as a label-free tool for nanoscale chemical and structural characterization is nowadays limited to air and vacuum environments, as it fails to operate in a reliable and systematic manner in liquid. The reasons are more technical than fundamental: what hinders the application of TERS in water is, among other issues, the low stability of the probes and their limited consistency. Fields of science and technology where the presence of water/electrolyte is unavoidable, such as biology and electrochemistry, remain unexplored with this powerful technique.
We propose a revolutionary approach for TERS in liquids founded on the employment of pipet-based scanning probe microscopy (pb-SPM) techniques as an alternative to AFM and STM. The use of recent but well-established pb-SPM brings the opportunity to develop unprecedented pipet-based TERS probes (beyond the classic and limited metallized solid probes from AFM and STM), together with the implementation of ingenious and innovative measures to enhance tip stability, sensitivity and reliability, unattainable with current techniques.
We will be in possession of a unique nano-spectroscopy platform capable of experiments in liquids and of following dynamic processes in situ, addressing fundamental questions and bringing insight into interfacial phenomena spanning materials science, physics, chemistry and biology.
Max ERC Funding
1 528 442 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym ArcheoDyn
Project Globular clusters as living fossils of the past of galaxies
Researcher (PI) Petrus VAN DE VEN
Host Institution (HI) UNIVERSITAT WIEN
Call Details Consolidator Grant (CoG), PE9, ERC-2016-COG
Summary Globular clusters (GCs) are enigmatic objects that hide a wealth of information. They are the living fossils of the history of their native galaxies and the record keepers of the violent events that made them change their domicile. This proposal aims to mine GCs as living fossils of galaxy evolution to address fundamental questions in astrophysics: (1) Do satellite galaxies merge as predicted by the hierarchical build-up of galaxies? (2) Which are the seeds of supermassive black holes in the centres of galaxies? (3) How did star formation originate in the earliest phases of galaxy formation? To answer these questions, novel population-dependent dynamical modelling techniques are required, whose development the PI has led over the past years. This uniquely positions him to take full advantage of the emerging wealth of chemical and kinematical data on GCs.
Following the tidal disruption of satellite galaxies, their dense GCs, and maybe even their nuclei, are left as the most visible remnants in the main galaxy. The hierarchical build-up of their new host galaxy can thus be unearthed by recovering the GCs’ orbits. However, currently it is unclear which of the GCs are accretion survivors. Actually, the existence of a central intermediate mass black hole (IMBH) or of multiple stellar populations in GCs might tell which ones are accreted. At the same time, detection of IMBHs is important as they are predicted seeds for supermassive black holes in galaxies; while the multiple stellar populations in GCs are vital witnesses to the extreme modes of star formation in the early Universe. However, for every putative dynamical IMBH detection so far there is a corresponding non-detection; also the origin of multiple stellar populations in GCs still lacks any uncontrived explanation. The synergy of novel techniques and exquisite data proposed here promises a breakthrough in this emerging field of dynamical archeology with GCs as living fossils of the past of galaxies.
Max ERC Funding
1 999 250 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ARCTIC CULT
Project ARCTIC CULTURES: SITES OF COLLECTION IN THE FORMATION OF THE EUROPEAN AND AMERICAN NORTHLANDS
Researcher (PI) Richard Charles POWELL
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Consolidator Grant (CoG), SH5, ERC-2016-COG
Summary The Arctic has risen to global attention in recent years, as it has been reconfigured through debates about global environmental change, resource extraction and disputes over sovereign rights. Within these discourses, little attention has been paid to the cultures of the Arctic. Indeed, it often seems as if the Circumpolar Arctic in global public understanding remains framed as a 'natural region' - that is, a place where the environment dominates the creation of culture. This framing has consequences for the region, because through this the Arctic becomes constructed as a space where people are absent. This proposal aims to discover how and why this might be so.
The proposal argues that this construction of the Arctic emerged from the exploration of the region by Europeans and North Americans and their contacts with indigenous people from the middle of the eighteenth century. Particular texts, cartographic representations and objects were collected and returned to sites like London, Copenhagen, Berlin and Philadelphia. The construction of the Arctic thereby became entwined within the growth of colonial museum cultures and, indeed, western modernity. This project aims to delineate the networks and collecting cultures involved in this creation of Arctic Cultures. It will bring repositories in colonial metropoles into dialogue with sites of collection in the Arctic by tracing the contexts of discovery and memorialisation. In doing so, it aspires to a new understanding of the consequences of certain forms of colonial representation for debates about the Circumpolar Arctic today.
The project involves research by the Principal Investigator and four Post Doctoral Researchers at museums, archives, libraries and repositories across Europe and North America, as well as in Greenland and the Canadian Arctic. A Project Assistant based in Oxford will help facilitate the completion of the research.
Max ERC Funding
1 996 250 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ARENA
Project Aligned Roll-to-Roll Shear Coating of Nanotubes
Researcher (PI) Michael DE VOLDER
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary Carbon Nanotubes (CNTs) are considered to be one of the 21st century’s most promising materials, and over the past decade tremendous scientific advances have been achieved in the synthesis and processing of these materials. However, the uptake of CNTs by high-tech industry is hampered by a lack of high-throughput processes to structure CNTs into aligned and densely packed assemblies. This is key to fabricating next-generation CNT devices, and to date the CNT community is still struggling to achieve this, especially over large areas.
As part of the ERC Starting Grant HiENA, we are pioneering a potentially disruptive strategy to control the packing of CNTs and to fabricate large-area films of aligned CNTs. In this process, we start from newly developed ultra-high-density dispersions of CNTs which can form liquid crystal domains. These domains are aligned by controlling shear in a custom-designed coating head, which then continuously dispenses the CNTs on a roll-to-roll coater recently purchased by the host group. To quantify the performance of the proposed technology, the parameter space of the coating process will be mapped out in terms of throughput, film thickness, uniformity, and conductivity.
Finally, we have devised a two-step commercialisation plan which targets progressively more demanding markets, from thin-film heaters and ultra-lightweight electromagnetic shields to interconnects and sensors for flexible electronics. We believe this project is timely because of, on the one hand, the technology push of improved CNT processing and, on the other hand, the pull from several new markets, including flexible electronics and the rise of the Internet of Things, which will require a drastic increase in low-cost electronic manufacturing technologies. The ERC Proof of Concept grant ARENA aspires to contribute to this need by taking a leap forward in the large-scale processing of next-generation CNT devices.
Max ERC Funding
149 963 €
Duration
Start date: 2017-07-01, End date: 2018-12-31
Project acronym ARS
Project Autonomous Robotic Surgery
Researcher (PI) Paolo FIORINI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI VERONA
Call Details Advanced Grant (AdG), PE7, ERC-2016-ADG
Summary The goal of the ARS project is the derivation of a unified framework for the autonomous execution of robotic tasks in challenging environments in which accurate performance and safety are of paramount importance. We have chosen surgery as the research scenario because of its importance, its intrinsic challenges, and the presence of three factors that make this project feasible and timely. In fact, we have recently concluded the I-SUR project demonstrating the feasibility of autonomous surgical actions, we have access to the first big data set of clinical robotic surgeries made available to researchers, and we will be able to demonstrate the project results on the high-performance surgical robot “da Vinci Research Kit”. The impact of autonomous robots on the workforce is a current subject of discussion, but surgical autonomy will be welcomed by medical personnel, e.g. to carry out simple intervention steps, react faster to unexpected events, or monitor the onset of fatigue. The framework for autonomous robotic surgery will include five main research objectives. The first will address the analysis of robotic surgery data sets to extract action and knowledge models of the intervention. The second objective will focus on planning, which will consist of instantiating the intervention models to a patient-specific anatomy. The third objective will address the design of the hybrid controllers for the discrete and continuous parts of the intervention. The fourth research objective will focus on real-time reasoning to assess the intervention state and the overall surgical situation. Finally, the last research objective will address the verification, validation and benchmarking of the autonomous surgical robotic capabilities. The research results to be achieved by ARS will contribute to paving the way towards enhancing the autonomy and operational capabilities of service robots, with the ambitious goal of bridging the gap between robotic and human task execution capability.
Max ERC Funding
2 750 000 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ARTEFACT
Project The Global as Artefact: Understanding the Patterns of Global Political History Through an Anthropology of Knowledge -- The Case of Agriculture in Four Global Systems from the Neolithic to the Present
Researcher (PI) INANNA HAMATI-ATAYA
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARS OF THE UNIVERSITY OF CAMBRIDGE
Call Details Consolidator Grant (CoG), SH2, ERC-2016-COG
Summary Knowledge is an anthropological constant that is indissociable from the birth and interactions of human societies, but is at best a secondary concern for scholars of international relations and globalization. Contemporary global studies are thus unable to account for the co-constitution of knowledge and politics at a macro-scale, and remain especially blind to the historical patterns of epistemic development that operate at the level of the species as a whole and have shaped its global political history in specific, path-dependent ways up to now.
ARTEFACT is the first project to pursue a knowledge-centered investigation of global politics. It is uniquely grounded in an anthropological approach that treats globalization and human knowledges beyond their modern manifestations, from the longue-durée perspective of our species’ social history. 'The global as artefact' is more than a metaphor. It reflects the premise that human collectives 'make' the political world not merely through ideas, language, or norms, but primordially through the material infrastructures, solutions, objects, practices, and skills they develop in response to evolving structural challenges.
ARTEFACT takes agriculture as an exemplary and especially timely case-study to illuminate the entangled global histories of knowledge and politics, analyzing and comparing four increasingly inclusive 'global political systems' of the Ancient, Medieval, Modern, and Contemporary eras and their associated agrarian socio-epistemic revolutions.
ARTEFACT ultimately aims to 1) develop an original theory of the global, 2) launch Global Knowledge Studies as a new cross-disciplinary domain of systematic empirical and theoretical study, and 3) push the respective boundaries of the anthropology of knowledge, global history, and international theory beyond the state-of-the-art and toward a holistic understanding that can illuminate how past trends of socio-epistemic evolution might shape future paths of global life.
Max ERC Funding
1 428 165 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ARTHUS
Project Advances in Research on Theories of the Dark Universe - Inhomogeneity Effects in Relativistic Cosmology
Researcher (PI) Thomas BUCHERT
Host Institution (HI) UNIVERSITE LYON 1 CLAUDE BERNARD
Call Details Advanced Grant (AdG), PE9, ERC-2016-ADG
Summary The project ARTHUS aims at determining the physical origin of Dark Energy: in addition to the energy sources of the standard model of cosmology, effective terms arise through spatially averaging inhomogeneous cosmological models in General Relativity. It has been demonstrated that these additional terms can play the role of Dark Energy on large scales (but they can also mimic Dark Matter on scales of mass accumulations). The underlying rationale is that fluctuations in the Universe generically couple to spatially averaged intrinsic properties of space, such as its averaged scalar curvature, thus changing the global evolution of the effective (spatially averaged) cosmological model. At present, we understand these so-called backreaction effects only qualitatively. The project ARTHUS is directed towards a conclusive quantitative evaluation of these effects by developing generic and non-perturbative relativistic models of structure formation, by statistically measuring the key variables of the models in observations and in simulation data, and by reinterpreting observational results in light of the new models. It is to be emphasized that there is no doubt about the existence of backreaction effects; the question is whether they are even capable of getting rid of the dark sources (as some models discussed in the literature suggest), or whether their impact is substantially smaller. The project thus addresses an essential issue of current cosmological research: to find pertinent answers concerning the quantitative impact of inhomogeneity effects, a necessary and widely recognized step toward high-precision cosmology. If the project objectives are attained, the results will have a far-reaching impact on theoretical and observational cosmology, on the interpretation of astronomical experiments such as Planck and Euclid, as well as on a wide spectrum of particle physics theories and experiments.
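For background (standard results from the literature on averaged cosmologies, not findings of ARTHUS), the spatially averaged equations for an irrotational dust model make the backreaction term explicit. Averaging over a spatial domain D, the volume scale factor a_D obeys, in LaTeX notation,
\[
3\,\frac{\ddot a_{\mathcal D}}{a_{\mathcal D}} = -4\pi G\,\langle\varrho\rangle_{\mathcal D} + \mathcal Q_{\mathcal D},
\qquad
3\left(\frac{\dot a_{\mathcal D}}{a_{\mathcal D}}\right)^{2} = 8\pi G\,\langle\varrho\rangle_{\mathcal D} - \frac{1}{2}\,\langle\mathcal R\rangle_{\mathcal D} - \frac{1}{2}\,\mathcal Q_{\mathcal D},
\]
with the kinematical backreaction
\[
\mathcal Q_{\mathcal D} = \frac{2}{3}\left(\langle\theta^{2}\rangle_{\mathcal D} - \langle\theta\rangle_{\mathcal D}^{2}\right) - 2\,\langle\sigma^{2}\rangle_{\mathcal D},
\]
where \(\theta\) is the expansion rate, \(\sigma\) the shear scalar and \(\langle\mathcal R\rangle_{\mathcal D}\) the averaged spatial scalar curvature; a sufficiently large positive \(\mathcal Q_{\mathcal D}\) acts like an effective Dark Energy source in the averaged model.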
Max ERC Funding
2 091 000 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ARTIV1
Project An Artificial Visual Cortex for Image Processing
Researcher (PI) Ugo Vittorio BOSCAIN
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Proof of Concept (PoC), ERC-2016-PoC, ERC-2016-PoC
Summary The ERC starting grant GECOMETHODS, on which this POC is based, tackled problems of diffusion equations via geometric control methods. One of the most striking achievements of the project has been the development of an image reconstruction algorithm based mainly on non-isotropic diffusion. This algorithm is bio-mimetic in the sense that it replicates the way in which the primary visual cortex V1 of mammals processes the signals arriving from the eyes, and its performance is at the state of the art in image processing. These results, together with others obtained in the ERC project, show that image processing algorithms based on the functional architecture of V1 can go very far. However, the exceptional performance of the primary visual cortex V1 relies not only on the particular algorithm used, but also on the fact that the algorithm “runs” on dedicated hardware with the following features: 1. an exceptional level of parallelism; 2. connections that are well adapted to transmit information in a non-isotropic way, as required by the algorithms of image reconstruction and recognition.
The idea of this POC is to create dedicated hardware (called ARTIV1) emulating the functional architecture of V1, hence having on the one hand a huge degree of parallelism and on the other hand connections among the CPUs that reflect the non-isotropic structure of the visual cortex V1. Such hardware, which we plan to build as an integrated circuit with an industrial partner, will be a veritable artificial visual cortex. It will be fully programmable and able to perform many biomimetic image processing tasks that we expect to be exceptionally performant.
ARTIV1 will come to the market accompanied by dedicated software for image reconstruction and image recognition. However, we expect that other applications will be developed by customers, for instance software for optical flow estimation or for sound processing.
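To give a flavour of the class of methods involved (this is a generic Perona-Malik-style non-isotropic diffusion step, not the V1-based algorithm developed in GECOMETHODS, and the parameters are arbitrary), a minimal Python sketch is:

import numpy as np

# Illustrative only: a generic edge-preserving, non-isotropic diffusion step.
# It conveys the idea of diffusing less across strong gradients; it is not
# the V1-inspired algorithm of GECOMETHODS/ARTIV1.

def anisotropic_diffusion(image, n_iter=20, kappa=0.1, dt=0.2):
    u = image.astype(float).copy()
    for _ in range(n_iter):
        # differences towards the four neighbours (periodic boundaries for brevity)
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # conductivities shrink where the local gradient is large
        cn, cs = np.exp(-(dn / kappa) ** 2), np.exp(-(ds / kappa) ** 2)
        ce, cw = np.exp(-(de / kappa) ** 2), np.exp(-(dw / kappa) ** 2)
        u += dt * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

if __name__ == "__main__":
    noisy = np.random.rand(64, 64)
    print(anisotropic_diffusion(noisy).shape)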
Max ERC Funding
149 937 €
Duration
Start date: 2017-04-01, End date: 2018-09-30
Project acronym ASSHURED
Project Analysing South-South Humanitarian Responses to Displacement from Syria: Views from Lebanon, Jordan and Turkey
Researcher (PI) Elena FIDDIAN-QASMIYEH
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Starting Grant (StG), SH3, ERC-2016-STG
Summary Since 2012, over 4 million people have fled Syria in ‘the most dramatic humanitarian crisis that we have ever faced’ (UNHCR). By November 2015 there were 1,078,338 refugees from Syria in Lebanon, 630,776 in Jordan and 2,181,293 in Turkey. Humanitarian agencies and donor states from both the global North and the global South have funded and implemented aid programmes, and yet commentators have argued that civil society groups from the global South are the most significant actors supporting refugees in Lebanon, Jordan and Turkey. While these responses are highly significant, major gaps in knowledge remain regarding the motivations, nature and implications of Southern-led responses to conflict-induced displacement. This project draws on multi-sited ethnographic and participatory research with refugees from Syria and their aid providers in Lebanon, Jordan and Turkey to critically examine why, how and with what effect actors from the South have responded to the displacement of refugees from Syria. The main research aims are:
1. identifying diverse models of Southern-led responses to conflict-induced displacement,
2. examining the (un)official motivations, nature and implications of Southern-led responses,
3. examining refugees’ experiences and perceptions of Southern-led responses,
4. exploring diverse Southern and Northern actors’ perceptions of Southern-led responses,
5. tracing the implications of Southern-led initiatives for humanitarian theory and practice.
Based on a critical theoretical framework inspired by post-colonial and feminist approaches, the project contributes to theories of humanitarianism and debates regarding donor-recipient relations and refugees’ agency in displacement situations. It will also inform the development of policies to most appropriately address refugees’ needs and rights. This highly topical and innovative project thus has far-reaching implications for refugees and local communities, academics, policy-makers and practitioners.
Max ERC Funding
1 498 069 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym ATM-GTP
Project Atmospheric Gas-to-Particle conversion
Researcher (PI) Markku KULMALA
Host Institution (HI) HELSINGIN YLIOPISTO
Call Details Advanced Grant (AdG), PE10, ERC-2016-ADG
Summary Atmospheric Gas-to-Particle conversion (ATM-GTP) is a 5-year project focusing on one of the most critical atmospheric processes relevant to global climate and air quality: the first steps of atmospheric aerosol particle formation and growth. The project will concentrate on the currently lacking environmentally-specific knowledge about the interacting, non-linear, physical and chemical atmospheric processes associated with nano-scale gas-to-particle conversion (GTP). The main scientific objective of ATM-GTP is to create a deep understanding of atmospheric GTP taking place in the sub-5 nm size range, particularly in heavily polluted Chinese megacities like Beijing and in pristine environments like Siberia and the Nordic high-latitude regions. We also aim to find out how nano-GTP is associated with air quality-climate interactions and feedbacks. We are interested in quantifying the effect of nano-GTP on the COBACC (Continental Biosphere-Aerosol-Cloud-Climate) feedback loop that is important in Arctic and boreal regions. Our approach will enable us to identify mechanisms capable of reducing secondary air pollution by a factor of 5-10 and to make reliable estimates of the global and regional aerosol loads, including anthropogenic and biogenic contributions to these loads. We can estimate the future role of the Northern Hemispheric biosphere in reducing global radiative forcing via the quantified feedbacks. The project is carried out by a world-leading scientist in atmospheric aerosol science, who is also one of the founders of terrestrial ecosystem meteorology, together with his research team. The project uses novel infrastructures including SMEAR (Stations Measuring Ecosystem Atmospheric Relations) stations, related modelling platforms and regional data from Russia and China. The work will be carried out in synergy with several national, Nordic and EU research-innovation projects: Finnish Center of Excellence-ATM, Nordic CoE-CRAICC and EU-FP7-BACCHUS.
Max ERC Funding
2 500 000 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym AtoFun
Project Atomic Scale Defects: Structure and Function
Researcher (PI) Felix HOFMANN
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Starting Grant (StG), PE5, ERC-2016-STG
Summary Atomic scale defects play a key role in determining the behaviour of all crystalline materials, profoundly modifying mechanical, thermal and electrical properties. Many current technological applications make do with phenomenological descriptions of these effects; yet myriad intriguing questions about the fundamental link between defect structure and material function remain.
Transmission electron microscopy revolutionised the study of atomic scale defects by enabling their direct imaging. The novel coherent X-ray diffraction techniques developed in this project promise a similar advancement, making it possible to probe the strain fields that govern defect interactions in 3D with high spatial resolution (<10 nm). They will allow us to clarify the effect of impurities and retained gas on dislocation strain fields, shedding light on opportunities to engineer dislocation properties. The exceptional strain sensitivity of coherent diffraction will enable us to explore the fundamental mechanisms governing the behaviour of ion-implantation-induced point defects that are invisible to TEM. While we concentrate on dislocations and point defects, the new techniques will apply to all crystalline materials where defects are important. Our characterisation of defect structure will be combined with laser transient grating measurements of thermal transport changes due to specific defect populations. This unique multifaceted perspective of defect behaviour will transform our ability to devise modelling approaches linking defect structure to material function.
Our proof-of-concept results highlight the feasibility of this ambitious research project. It opens up a vast range of exciting possibilities to gain a deep, fundamental understanding of atomic scale defects and their effect on material function. This is an essential prerequisite for exploiting and engineering defects to enhance material properties.
Max ERC Funding
1 610 231 €
Duration
Start date: 2017-03-01, End date: 2022-02-28
Project acronym ATOM
Project Advanced Holographic Tomographies for Nanoscale Materials: Revealing Electromagnetic and Deformation Fields, Chemical Composition and Quantum States at Atomic Resolution.
Researcher (PI) Axel LUBK
Host Institution (HI) LEIBNIZ-INSTITUT FUER FESTKOERPER- UND WERKSTOFFFORSCHUNG DRESDEN E.V.
Call Details Starting Grant (StG), PE3, ERC-2016-STG
Summary The ongoing miniaturization in nanotechnology and functional materials puts an ever-increasing focus on the development of three-dimensional (3D) nanostructures, such as quantum dot arrays, structured nanowires, or non-trivial topological magnetic textures such as skyrmions, which permit better performance of logic or memory devices in terms of speed and energy efficiency. To develop and advance such technologies and to improve the understanding of the underlying fundamental solid-state physics effects, the nondestructive and quantitative 3D characterization of physical fields, e.g., electric or magnetic, down to atomic resolution is indispensable. Current nanoscale metrology methods only inadequately convey this information, e.g., because they probe surfaces, record projections, or lack resolution. AToM will provide a ground-breaking tomographic methodology for current nanotechnology by mapping electric and magnetic fields as well as crucial properties of the underlying atomic structure in solids, such as the chemical composition, mechanical strain or spin configuration, in 3D down to atomic resolution. To achieve that goal, advanced holographic and tomographic setups in the Transmission Electron Microscope (TEM) are combined with novel computational methods, e.g., taking into account the ramifications of electron diffraction. Moreover, fundamental application limits are overcome (A) by extending the holographic principle, which requires coherent electron beams, to quantum state reconstructions applicable to electrons of any (in)coherence; and (B) by adapting a unique in-situ TEM with a very large sample chamber to facilitate holographic field sensing down to very low temperatures (6 K) under application of external, e.g., electric, stimuli. The joint development of AToM in response to current problems of nanotechnology, including those mentioned above, is anticipated to immediately and sustainably advance nanotechnology in its various aspects.
Max ERC Funding
1 499 602 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym ATOMKI-PPROCESS
Project Nuclear reaction studies relevant to the astrophysical p-process nucleosynthesis
Researcher (PI) György Gyürky
Host Institution (HI) Magyar Tudomanyos Akademia Atommagkutato Intezete
Call Details Starting Grant (StG), PE2, ERC-2007-StG
Summary The astrophysical p-process, the stellar production mechanism of the heavy, proton-rich isotopes (p-isotopes), is one of the least studied processes in nucleosynthesis. The astrophysical site(s) for the p-process could not yet be clearly identified. In order to reproduce the natural abundances of the p-isotopes, the p-process models must take into account a huge nuclear reaction network. A precise knowledge of the rates of the nuclear reactions in this network is essential for a reliable abundance calculation and for a clear assignment of the astrophysical site(s). For lack of experimental data, the nuclear physics inputs for the reaction networks are based on statistical model calculations. These calculations are largely untested in the mass and energy range relevant to the p-process, and the uncertainties in the reaction rate values result in a correspondingly uncertain prediction of the p-isotope abundances. Therefore, experiments aiming at the determination of reaction rates for the p-process are of great importance. In this project, nuclear reaction cross section measurements will be carried out in the mass and energy range of the p-process to check the reliability of the statistical model calculations and to put the p-process models on a more reliable basis. The accelerators of the Institute of Nuclear Research in Debrecen, Hungary, provide the necessary basis for such studies. The p-process model calculations are especially sensitive to the rates of reactions involving alpha particles and heavy nuclei. Because of technical difficulties, so far there are practically no experimental data available on such reactions, and the uncertainty in these reaction rates is presently one of the biggest contributions to the uncertainty of p-isotope abundance calculations. With the help of the ERC grant, the alpha-induced reaction cross sections can be measured on heavy isotopes for the first time, which could contribute to a better understanding of the astrophysical p-process.
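For context (a textbook relation, not a result of this project), the stellar reaction rate entering such networks is obtained by folding the measured cross section with a Maxwell-Boltzmann energy distribution; in LaTeX notation,
\[
\langle\sigma v\rangle \;=\; \sqrt{\frac{8}{\pi\mu}}\;\frac{1}{(kT)^{3/2}}\int_{0}^{\infty}\sigma(E)\,E\,e^{-E/kT}\,\mathrm{d}E ,
\]
where \(\sigma(E)\) is the measured cross section, \(\mu\) the reduced mass of the interacting nuclei, and \(T\) the stellar temperature. This is why laboratory cross sections measured in the relevant low-energy window translate directly into the reaction rates used by p-process network calculations.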
Max ERC Funding
750 000 €
Duration
Start date: 2008-07-01, End date: 2013-06-30
Project acronym ATOMPHOTONLOQIP
Project Experimental Linear Optics Quantum Information Processing with Atoms and Photons
Researcher (PI) Jian-Wei Pan
Host Institution (HI) RUPRECHT-KARLS-UNIVERSITAET HEIDELBERG
Call Details Starting Grant (StG), PE2, ERC-2007-StG
Summary Quantum information science and atom optics are among the most active fields in modern physics. In recent years, many theoretical efforts have been made to combine these two fields. Recent experimental progress has shown the in-principle possibility to perform scalable quantum information processing (QIP) with linear optics and atomic ensembles. The main purpose of the present project is to use atomic qubits as quantum memory and exploit photonic qubits for information transfer and processing to achieve efficient linear optics QIP. On the one hand, utilizing the interaction between laser pulses and atomic ensembles, we will experimentally investigate the potential of atomic ensembles in the gas phase to build quantum repeaters for long-distance quantum communication, that is, to develop a new technological solution for quantum repeaters making use of the effective qubit-type entanglement of two cold atomic ensembles by a projective measurement of individual photons by spontaneous Raman processes. On this basis, we will further investigate the advantages of cold atoms in an optical trap to enhance the coherence time of atomic qubits beyond the threshold for scalable realization of quantum repeaters. Moreover, building on our long experience in research on multi-photon entanglement, we also plan to perform a number of significant experiments in the field of QIP with particular emphasis on fault-tolerant quantum computation, photon-loss-tolerant quantum computation and cluster-state based quantum simulation. Finally, by combining the techniques developed in the above quantum memory and multi-photon interference experiments, we will further experimentally investigate the possibility to achieve quantum teleportation between photonic and atomic qubits, quantum teleportation between remote atomic qubits and efficient entanglement generation via classical feed-forward. The techniques that will be developed in the present project will lay the basis for future large scale
Max ERC Funding
1 435 000 €
Duration
Start date: 2008-07-01, End date: 2013-12-31
Project acronym AutoCode
Project Programming with Big Code
Researcher (PI) Eran Yahav
Host Institution (HI) TECHNION - ISRAEL INSTITUTE OF TECHNOLOGY
Call Details Proof of Concept (PoC), ERC-2016-PoC, ERC-2016-PoC
Summary Software synthesis aims to automate the creation of software by generating parts of software from a higher-level description. Until recently it was believed to be impossible to practically synthesize software beyond very small fragments. However, synthesis based on learning from existing large code-bases (“Big Code”) is making synthesis into a practical reality. The purpose of this PoC is to develop a platform that would lead to commercialization of our technology to improve programming productivity and code quality. We target two closely related applications: (1) Providing automatic assistance in programming tasks by learning from existing code, and (2) Providing on-line assessment of code quality as it is being developed using learned models. These applications have the potential to dramatically reduce time-to-market of new software, and improve its quality and security.
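As a toy illustration of learning from existing code (this sketch is not the AutoCode technology; the corpus and the token-level bigram model are invented purely for demonstration), the following Python fragment suggests a likely next token from statistics gathered over a tiny corpus:

from collections import Counter, defaultdict

# Illustrative only: a toy bigram model over code tokens, conveying the idea of
# statistical completion learned from existing code. The corpus is invented and
# this is not the approach commercialized by the AutoCode PoC.

CORPUS = [
    "for i in range ( n ) :",
    "for item in items :",
    "if x in items :",
]

def train_bigrams(corpus):
    counts = defaultdict(Counter)
    for line in corpus:
        tokens = line.split()
        for prev, nxt in zip(tokens, tokens[1:]):
            counts[prev][nxt] += 1
    return counts

def suggest_next(model, token):
    """Return the continuation most often seen after `token`, or None."""
    if token not in model:
        return None
    return model[token].most_common(1)[0][0]

if __name__ == "__main__":
    model = train_bigrams(CORPUS)
    print(suggest_next(model, "in"))  # prints 'items' for this toy corpus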
Max ERC Funding
150 000 €
Duration
Start date: 2017-05-01, End date: 2018-10-31
Project acronym AUTOCOMPLEMENT
Project The role of complement in the induction of autoimmunity against post-translationally modified proteins
Researcher (PI) Leendert TROUW
Host Institution (HI) ACADEMISCH ZIEKENHUIS LEIDEN
Call Details Consolidator Grant (CoG), LS7, ERC-2016-COG
Summary In many prevalent autoimmune diseases such as rheumatoid arthritis (RA) and systemic lupus erythematosus (SLE), autoantibodies are used as diagnostic and prognostic tools. Several of these autoantibodies target proteins that have been post-translationally modified (PTM). Examples of such modifications are citrullination and carbamylation. The success of B cell-targeted therapies in many autoantibody-positive diseases suggests that B cell-mediated autoimmunity plays a direct pathogenic role. Despite the wealth of information on the clinical associations of these anti-PTM protein antibodies as biomarkers, we currently have no insight into why these antibodies are formed.
Immunization studies reveal that PTM proteins can induce antibody responses even in the absence of exogenous adjuvant. The reason why these PTM proteins have ‘autoadjuvant’ properties that lead to a breach of tolerance is currently unknown. In this proposal, I hypothesise that the breach of tolerance towards PTM proteins is mediated by complement factors that bind directly to these PTMs. Our preliminary data indeed reveal that several complement factors bind specifically to PTM proteins. Complement could be involved in the autoadjuvant property of PTM proteins because, in addition to killing pathogens, complement can also boost adaptive immune responses. I plan to unravel the importance of the complement–PTM protein interaction by answering these questions:
1) What is the physiological function of complement binding to PTM proteins?
2) Is the breach of tolerance towards PTM proteins influenced by complement?
3) Can the adjuvant function of PTM be used to increase vaccine efficacy and/or decrease autoreactivity?
With AUTOCOMPLEMENT I will elucidate how PTM-reactive B cells receive ‘autoadjuvant’ signals. This insight will impact on patient care as we can now design strategies to either block unwanted ‘autoadjuvant’ signals to inhibit autoimmunity or to utilize ‘autoadjuvant’ signals to potentiate vaccination.
Max ERC Funding
1 999 803 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym AutoLiqHand
Project A Compact and Automated Liquid Handling Platform for Biomedical Assays
Researcher (PI) Andreas Richard Dr. Bausch
Host Institution (HI) TECHNISCHE UNIVERSITAET MUENCHEN
Call Details Proof of Concept (PoC), PC1, ERC-2016-PoC
Summary Liquid handling is an integral part of biological and medical assays. For many applications the method of choice is manual pipetting, which has significant limitations. Chiefly, it is user-time intensive and prone to sample handling issues (poor time accuracy and pipetting errors). While liquid handling robots do exist, they are expensive instruments, shared by many users and targeted at high-throughput applications. This leaves, in practice, most research and diagnostics without an automated solution. Here, we present an innovative solution to this problem: a compact and mobile automated liquid handling (AutoLiqHand) device for the individual user. This platform will make it possible to automate biomedical experiments and diagnostic routines that are currently done by hand. Moreover, its unique design enables it to run in a much larger range of biomedical settings than currently offered by existing solutions, and at a fraction of the cost. This platform was developed as part of the ERC-funded project SelfOrg and is routinely used in our lab for a variety of specialized and standard routines (e.g. drug treatment and immunostaining). The AutoLiqHand system mimics the main advantages of manual pipetting, namely simplicity and versatility, through a unique design of a fully integrated and microfluidic-based platform. In addition, when interfaced with well-established biomedical equipment such as ELISA readers or PCR machines, our platform can form a fully automated lab at significantly lower cost than commercially available devices. Thus, it has the potential to become a standard tool for researchers both in basic and early pharmaceutical/clinical research as well as for clinicians in point-of-care diagnostics. The aim of this Proof-of-Concept proposal is to adapt the AutoLiqHand platform to market needs and optimize it for production in order to make it available to the market.
Max ERC Funding
149 750 €
Duration
Start date: 2017-01-01, End date: 2018-06-30
Project acronym AveTransRisk
Project Average - Transaction Costs and Risk Management during the First Globalization (Sixteenth-Eighteenth Centuries)
Researcher (PI) Maria FUSARO
Host Institution (HI) THE UNIVERSITY OF EXETER
Call Details Consolidator Grant (CoG), SH6, ERC-2016-COG
Summary This project focuses on the historical analysis of institutions and their impact on economic development through the investigation of a legal instrument – general average (GA) – which underpins maritime trade by redistributing damages’ costs across all interested parties. This will be pursued through the comparative investigation of GA in those European countries where substantial data exist: Italy, Spain, England, France and the Low Countries (1500-1800). Average and insurance were both created in the Middle Ages to facilitate trade through the redistribution of risk. Insurance has been widely studied; average – the expenses which can befall ships and cargoes from the time of their loading aboard until their unloading (due to accidents, jettison, and unexpected costs) – has been neglected. GA still plays an essential role in the redistribution of transaction costs and, being a form of strictly mutual self-protection, never evolved into a speculative financial instrument as insurance did; it therefore represents an excellent case of the long-term effectiveness of a non-market economic phenomenon. Although the principle behind GA was very similar across Europe, in practice there were substantial differences in declaring and adjudicating claims. GA reports provide unparalleled evidence on maritime trade which, analysed qualitatively and quantitatively through a novel interdisciplinary approach, will contribute to the reassessment of the role played by the maritime sector in fostering economic growth during the early modern first globalization, when GA was the object of fierce debates on state jurisdiction and standardization of practice. Today GA claims are regulated by the York-Antwerp Rules (YAR), currently under revision. This timely conjuncture provides plenty of opportunities for active engagement with practitioners, thereby fostering a creative dialogue on the historical study of GA and its future development to better face the challenges of mature globalization.
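To illustrate the redistributive principle described above (the figures, party names and function are invented for demonstration and do not come from the project's sources), a minimal Python sketch apportioning a general average loss in proportion to the values saved is:

# Illustrative only: pro-rata apportionment of a general average (GA) loss.
# Party names and values are hypothetical, not drawn from historical records.

def general_average_contributions(saved_values, ga_loss):
    """Share ga_loss among the parties in proportion to the value each had at risk."""
    total = sum(saved_values.values())
    return {party: ga_loss * value / total for party, value in saved_values.items()}

if __name__ == "__main__":
    interests = {"ship": 4000.0, "cargo_A": 3000.0, "cargo_B": 3000.0}  # hypothetical values
    for party, share in general_average_contributions(interests, ga_loss=1000.0).items():
        print(f"{party}: {share:.2f}")  # ship 400.00, cargo_A 300.00, cargo_B 300.00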
Max ERC Funding
1 854 256 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym AVIANEGG
Project Evolutionary genetics in a ‘classical’ avian study system by high throughput transcriptome sequencing and SNP genotyping
Researcher (PI) Jon Slate
Host Institution (HI) THE UNIVERSITY OF SHEFFIELD
Call Details Starting Grant (StG), LS5, ERC-2007-StG
Summary Long-term studies of free-living vertebrate populations have proved a rich resource for understanding evolutionary and ecological processes, because individuals’ life histories can be measured by tracking them from birth/hatching through to death. In recent years the ‘animal model’ has been applied to pedigreed long-term study populations with great success, dramatically advancing our understanding of quantitative genetic parameters such as heritabilities, genetic correlations and plasticities of traits that are relevant to microevolutionary responses to environmental change. Unfortunately, quantitative genetic approaches have one major drawback – they cannot identify the actual genes responsible for genetic variation. Therefore, it is impossible to link evolutionary responses to a changing environment to molecular genetic variation, making our picture of the process incomplete. Many of the best long-term studies have been conducted in passerine birds. Unfortunately, genomic resources are only available for two model avian species, and are absent for bird species that are studied in the wild. I will fill this gap by exploiting recent advances in genomics technology to sequence the entire transcriptome of the longest running study of wild birds – the great tit population in Wytham Woods, Oxford. Having identified most of the sequence variation in the great tit transcriptome, I will then genotype all birds for whom phenotype records and blood samples are available. This will be, by far, the largest phenotype-genotype dataset of any free-living vertebrate population. I will then use gene mapping techniques to identify genes and genomic regions responsible for variation in a number of key traits such as lifetime recruitment, clutch size and breeding/laying date. This will result in a greater understanding, at the molecular level, of how microevolutionary change can arise (or be constrained).
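For context, the ‘animal model’ referred to here is the standard quantitative-genetic mixed model (quoted as textbook background, not as a project result). In LaTeX notation,
\[
\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \mathbf{Z}\mathbf{a} + \mathbf{e},
\qquad
\mathbf{a} \sim N(\mathbf{0}, \mathbf{A}\sigma^{2}_{a}),
\qquad
\mathbf{e} \sim N(\mathbf{0}, \mathbf{I}\sigma^{2}_{e}),
\]
where \(\mathbf{y}\) contains the phenotypes, \(\boldsymbol{\beta}\) the fixed effects, \(\mathbf{a}\) the additive genetic (breeding) values, \(\mathbf{A}\) the pedigree-derived additive relationship matrix, and the narrow-sense heritability is estimated as \(h^{2} = \sigma^{2}_{a}/(\sigma^{2}_{a} + \sigma^{2}_{e})\). Gene mapping then seeks the loci underlying \(\sigma^{2}_{a}\), which the purely pedigree-based model cannot identify.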
Max ERC Funding
1 560 770 €
Duration
Start date: 2008-10-01, End date: 2014-06-30
Project acronym AXION
Project Axions: From Heaven to Earth
Researcher (PI) Frank Wilczek
Host Institution (HI) STOCKHOLMS UNIVERSITET
Call Details Advanced Grant (AdG), PE2, ERC-2016-ADG
Summary Axions are hypothetical particles whose existence would solve two major problems: the strong P, T problem (a major blemish on the standard model); and the dark matter problem. It is a most important goal to either observe or rule out the existence of a cosmic axion background. It appears that decisive observations may be possible, but only after orchestrating insight from specialities ranging from quantum field theory and astrophysical modeling to ultra-low noise quantum measurement theory. Detailed predictions for the magnitude and structure of the cosmic axion background depend on cosmological and astrophysical modeling, which can be constrained by theoretical insight and numerical simulation. In parallel, we must optimize strategies for extracting accessible signals from that very weakly interacting source.
While the existence of axions as fundamental particles remains hypothetical, the equations governing how axions interact with electromagnetic fields also govern (with different parameters) how certain materials interact with electromagnetic fields. Thus those materials embody “emergent” axions. The equations have remarkable properties, which one can test in these materials, and possibly put to practical use.
Closely related to axions, mathematically, are anyons. Anyons are particle-like excitations that elude the familiar classification into bosons and fermions. Theoretical and numerical studies indicate that they are common emergent features of highly entangled states of matter in two dimensions. Recent work suggests the existence of states of matter, both natural and engineered, in which anyon dynamics is both important and experimentally accessible. Since the equations for anyons and axions are remarkably similar, and both have common, deep roots in symmetry and topology, it will be fruitful to consider them together.
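For reference, the equations governing how axions interact with electromagnetic fields are those of axion electrodynamics; in a common sign convention (a textbook sketch, not taken from the proposal), the axion field a couples to photons through
\mathcal{L}_{a\gamma} = -\tfrac{1}{4}\, g_{a\gamma\gamma}\, a\, F_{\mu\nu}\tilde{F}^{\mu\nu} = g_{a\gamma\gamma}\, a\, \mathbf{E}\cdot\mathbf{B},
which modifies Maxwell's equations to
\nabla\cdot\mathbf{E} = \rho - g_{a\gamma\gamma}\, \nabla a\cdot\mathbf{B}, \qquad \nabla\times\mathbf{B} - \partial_t\mathbf{E} = \mathbf{J} + g_{a\gamma\gamma}\,\big(\mathbf{B}\,\partial_t a + \nabla a\times\mathbf{E}\big).
The same structure, with material-dependent coefficients, governs the electromagnetic response of the “emergent axion” materials mentioned above.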
Max ERC Funding
2 324 391 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym B-PhosphoChem
Project Exploration of the 2D-Chemistry of Black Phosphorous
Researcher (PI) Andreas Hirsch
Host Institution (HI) FRIEDRICH-ALEXANDER-UNIVERSITAET ERLANGEN NUERNBERG
Call Details Advanced Grant (AdG), PE5, ERC-2016-ADG
Summary We propose the development of the chemistry of black phosphorus (BP). B-PhosphoChem will constitute a new textbook chapter in the realm of synthetic chemistry located at the interface of inorganic, organic, and materials chemistry as well as solid-state physics. B-PhosphoChem will provide the basis for exciting and so far elusive applications such as ion batteries and stable high-performance devices. Thin sheets of BP represent a new class of 2D materials and have recently attracted tremendous interest in the scientific community. Outstanding physical properties such as high charge-carrier mobility, combined with transparency and the persistence of a band gap, have been discovered. However, the chemistry of BP remains largely unexplored. B-PhosphoChem will close this gap and will a) provide the opportunity to modulate and fine-tune the physical properties, b) allow for considerably improving the processability and increasing the solubility, c) establish concepts for the desired chemical stabilization, d) give access to the combination of BP properties with those of other compound classes, e) reveal the fundamental chemical properties and reactivity principles, and f) provide methods for establishing practical applications. Five work packages will be addressed: 1) Production of Thin Layer BP, 2) Supramolecular Chemistry of BP, 3) Intercalation Compounds of BP, 4) Covalent Chemistry of BP, and 5) BP-Based Materials and Devices. The work packages will be supported by systematic calculations. For our group, whose core competence is synthetic organic and supramolecular chemistry, the orientation towards inorganic phosphorus chemistry constitutes a major step in a completely new direction. However, we are convinced that we are exceptionally well placed to meet this challenge successfully, because of our leadership and well-documented interdisciplinary experience in synthesizing and characterizing 0D, 1D, and 2D nanostructures.
Max ERC Funding
2 491 250 €
Duration
Start date: 2017-08-01, End date: 2022-07-31
Project acronym Baby DCs
Project Age-dependent Regulation of Dendritic Cell Development and Function
Researcher (PI) Barbara Ursula SCHRAML
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Starting Grant (StG), LS6, ERC-2016-STG
Summary Early life immune balance is essential for survival and the establishment of healthy immunity in later life. We aim to define how age-dependent regulation of dendritic cell (DC) development contributes to this crucial immune balance. DCs are versatile controllers of immunity that in neonates are qualitatively distinct from those in adults. Why such age-dependent differences exist is unclear, but newborn DCs are considered underdeveloped and functionally immature.
Using ontogenetic tracing of conventional DC precursors, I have found a previously unappreciated developmental heterogeneity of DCs that is particularly prominent in young mice. Preliminary data indicate that distinct waves of DC poiesis contribute to the functional differences between neonatal and adult DCs. I hypothesize that the neonatal DC compartment is not immature but rather that DC poiesis is developmentally regulated to create essential age-dependent immune balance. Further, I have identified a unique situation in early life to address a fundamental biological question, namely to what extent cellular function is pre-programmed by developmental origin (nature) versus environmental factors (nurture).
In this proposal, we will first use novel models to fate map the origin of the DC compartment with age. We will then define to what extent cellular origin determines age-dependent functions of DCs in immunity. Using innovative comparative gene expression profiling and integrative epigenomic analysis, we will characterize the cell-intrinsic mechanisms regulating the age-dependent functions of DCs. Because environmental factors in utero and after birth critically influence immune balance, we will finally define the impact of maternal infection and metabolic disease, as well as early microbial encounter, on DC poiesis. Characterizing how developmentally regulated DC poiesis shapes the unique features of early life immunity will provide novel insights into immune development that are vital to advance vaccine strategies.
Max ERC Funding
1 500 000 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym BabyVir
Project The role of the virome in shaping the gut ecosystem during the first year of life
Researcher (PI) Alexandra Petrovna ZHERNAKOVA
Host Institution (HI) ACADEMISCH ZIEKENHUIS GRONINGEN
Call Details Starting Grant (StG), LS8, ERC-2016-STG
Summary The role of intestinal bacteria in human health and disease has been intensively studied; however, the viral composition of the microbiome, the virome, remains largely unknown. As many of these viruses are bacteriophages, they are expected to be a major factor shaping the human microbiome. The dynamics of the virome during early life, and its interaction with host and environmental factors, are likely to have profound effects on human physiology. It is therefore extremely timely to study the virome in depth and on a wide scale.
This ERC project aims at understanding how the gut virome develops during the first year of life and how that relates to the composition of the bacterial microbiome. In particular, we will determine which intrinsic and environmental factors, including genetics and the mother’s microbiome and diet, interact with the virome in shaping the early gut microbiome ecosystem. In a longitudinal study of 1,000 newborns followed at 7 time points from birth until age 12 months, I will investigate: (1) the composition and evolution of the virome and bacterial microbiome in the first year of life; (2) the role of factors coming from the mother and from the host genome in virome and bacterial microbiome development and their co-evolution; and (3) the role of environmental factors, such as infectious diseases, vaccinations and dietary habits, in establishing the virome and overall microbiome composition during the first year of life.
This project will provide crucial knowledge about the composition and maturation of the virome during the first year of life, and its symbiotic relationship with the bacterial microbiome. This longitudinal dataset will be instrumental for the identification of microbiome markers of disease and for the follow-up analysis of the long-term effects of microbiota maturation later in life. Knowledge of the role of viruses in shaping the microbiota may suggest future directions for manipulating the human gut microbiota in health and disease.
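One basic analysis implied by such a longitudinal design is tracking how virome composition changes across the seven sampling time points. The sketch below computes a simple Shannon diversity index per sample from a table of relative abundances; the toy data and the infant, time-point and virus counts are assumptions for illustration, not the project's analysis plan.

import numpy as np

rng = np.random.default_rng(1)
n_infants, n_timepoints, n_viruses = 20, 7, 50   # hypothetical dimensions

# Toy relative-abundance table: abundances[i, t, v] = fraction of virus v
# in infant i at time point t (each profile sums to 1).
raw = rng.gamma(shape=0.3, size=(n_infants, n_timepoints, n_viruses))
abundances = raw / raw.sum(axis=2, keepdims=True)

def shannon(p, eps=1e-12):
    """Shannon diversity of one relative-abundance vector."""
    p = p[p > eps]
    return float(-(p * np.log(p)).sum())

# Mean diversity per time point: a crude summary of how the virome matures.
for t in range(n_timepoints):
    mean_div = np.mean([shannon(abundances[i, t]) for i in range(n_infants)])
    print(f"time point {t + 1}: mean Shannon diversity = {mean_div:.2f}")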
Max ERC Funding
1 499 881 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym BACCO
Project Bias and Clustering Calculations Optimised: Maximising discovery with galaxy surveys
Researcher (PI) Raúl Esteban ANGULO de la Fuente
Host Institution (HI) FUNDACION CENTRO DE ESTUDIOS DE FISICA DEL COSMOS DE ARAGON
Call Details Starting Grant (StG), PE9, ERC-2016-STG
Summary A new generation of galaxy surveys will soon start measuring the spatial distribution of millions of galaxies over a broad range of redshifts, offering an imminent opportunity to discover new physics. A detailed comparison of these measurements with theoretical models of galaxy clustering may reveal a new fundamental particle, a breakdown of General Relativity, or a hint on the nature of cosmic acceleration. Despite considerable progress in the analytic treatment of structure formation in recent years, traditional clustering models still suffer from large uncertainties. This limits cosmological analyses to a very restricted range of scales and statistics, which will be one of the main obstacles to a comprehensive exploitation of future surveys.
Here I propose to develop a novel simulation-based approach to predict galaxy clustering. Combining recent advances in computational cosmology, from cosmological N-body calculations to physically motivated galaxy formation models, I will develop a unified framework to directly predict the position and velocity of individual dark matter structures and galaxies as a function of cosmological and astrophysical parameters. In this formulation, galaxy clustering will be a prediction of a set of physical assumptions in a given cosmological setting. The new theoretical framework will be flexible, accurate and fast: it will provide predictions for any clustering statistic, down to scales 100 times smaller than in state-of-the-art perturbation-theory-based models, and in less than 1 minute of CPU time. These advances will enable major improvements in future cosmological constraints, which will significantly increase the overall power of future surveys, maximising our potential to discover new physics.
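The proposed framework is, in effect, an emulator: expensive simulation-based predictions are computed at a limited set of parameter values and then interpolated so that any new parameter combination can be evaluated in well under a minute. The sketch below shows that general idea with a Gaussian-process emulator over two toy parameters predicting a single clustering amplitude; the parameter names, the fake "simulation" function and the use of scikit-learn are illustrative assumptions, not the BACCO implementation.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

def fake_simulation(omega_m, sigma8):
    # Stand-in for an expensive N-body + galaxy-formation run returning one
    # clustering statistic (e.g. an amplitude); purely illustrative.
    return sigma8**2 * (omega_m / 0.3) ** 0.5

# Training set: a small random sample of the two-dimensional parameter space.
X_train = np.column_stack([rng.uniform(0.25, 0.35, 40), rng.uniform(0.7, 0.9, 40)])
y_train = np.array([fake_simulation(om, s8) for om, s8 in X_train])

emulator = GaussianProcessRegressor(kernel=RBF(length_scale=[0.05, 0.1]), normalize_y=True)
emulator.fit(X_train, y_train)

# Evaluating the emulator at a new cosmology is essentially instantaneous.
X_new = np.array([[0.31, 0.81]])
print("emulated:", emulator.predict(X_new)[0], " truth:", fake_simulation(0.31, 0.81))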
Max ERC Funding
1 484 240 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BACTERIAL SPORES
Project Investigating the Nature of Bacterial Spores
Researcher (PI) Sigal Ben-Yehuda
Host Institution (HI) THE HEBREW UNIVERSITY OF JERUSALEM
Call Details Starting Grant (StG), LS3, ERC-2007-StG
Summary When triggered by nutrient limitation, the Gram-positive bacterium Bacillus subtilis and its relatives enter a pathway of cellular differentiation culminating in the formation of a dormant cell type called a spore, the most resilient cell type known. Bacterial spores can survive for long periods of time and are able to endure extremes of heat, radiation and chemical assault. Remarkably, dormant spores can rapidly convert back to actively growing cells by a process called germination. Consequently, spore-forming bacteria, including dangerous pathogens (such as C. botulinum and B. anthracis), are highly resistant to antibacterial treatments and difficult to eradicate. Despite significant advances in our understanding of the process of spore formation, little is known about the nature of the mature spore. It remains unknown how dormancy is maintained within the spore and how it is broken, as the organization and dynamics of the spore macromolecules remain obscure. The unusual biochemical and biophysical characteristics of the dormant spore make it a challenging biological system to investigate using conventional methods, and thus create the need to develop innovative approaches to study spore biology. We propose to explore the nature of spores using B. subtilis as a primary experimental system. We intend to: (1) define the architecture of the spore chromosome, (2) track the complexity and fate of mRNA and protein molecules during sporulation, dormancy and germination, (3) revisit the basic notion of spore dormancy (is it metabolically inert?), (4) compare the characteristics of bacilli spores from diverse ecophysiological groups, (5) investigate the features of spores belonging to distant bacterial genera, and (6) generate an integrative database that categorizes the molecular features of spores. Our study will provide original insights and introduce novel concepts to the field of spore biology and may help devise innovative ways to combat spore-forming pathogens.
Max ERC Funding
1 630 000 €
Duration
Start date: 2008-10-01, End date: 2013-09-30
Project acronym BAM
Project Becoming A Minority
Researcher (PI) Maurice CRUL
Host Institution (HI) STICHTING VU
Call Details Advanced Grant (AdG), SH3, ERC-2016-ADG
Summary In the last forty years, researchers in the field of Migration and Ethnic Studies have looked at the integration of migrants and their descendants. Concepts, methodological tools and theoretical frameworks have been developed to measure and predict integration outcomes, both across different ethnic groups and in comparison with people of native descent. But are we also looking into the actual integration of the receiving group of native ‘white’ descent in city contexts where they have become a numerical minority themselves? In cities like Amsterdam, now only one in three youngsters under age fifteen is of native descent. This situation, referred to as a majority-minority context, is a new phenomenon in Western Europe and presents itself as one of the most important societal and psychological transformations of our time. I argue that the field of migration and ethnic studies is stagnating because of the one-sided focus on migrants and their children. This is even more urgent given the increased anti-immigrant vote. These pressing scientific and societal reasons pushed me to develop the project BAM (Becoming A Minority). The project will be executed in three harbor cities, Rotterdam, Antwerp and Malmö, and three service-sector cities, Amsterdam, Frankfurt and Vienna. BAM consists of 5 subprojects: (1) A meta-analysis of secondary data on people of native ‘white’ descent in the six research sites; (2) A newly developed survey for the target group; (3) An analysis of critical circumstances of encounter that trigger either positive or negative responses to increased ethnic diversity; (4) Experimental diversity labs to test under which circumstances people will change their attitudes or their actions towards increased ethnic diversity; and (5) The formulation of a new theory of integration that includes the changed position of the group of native ‘white’ descent as an important actor.
Max ERC Funding
2 499 714 €
Duration
Start date: 2017-11-01, End date: 2022-10-31
Project acronym BantuFirst
Project The First Bantu Speakers South of the Rainforest: A Cross-Disciplinary Approach to Human Migration, Language Spread, Climate Change and Early Farming in Late Holocene Central Africa
Researcher (PI) Koen André G. BOSTOEN
Host Institution (HI) UNIVERSITEIT GENT
Call Details Consolidator Grant (CoG), SH6, ERC-2016-COG
Summary The Bantu Expansion is not only the main linguistic, cultural and demographic process in Late Holocene Africa. It is also one of the most controversial issues in African History that still has political repercussions today. It has sparked debate across the disciplines and far beyond Africanist circles in an attempt to understand how the young Bantu language family (ca. 5000 years) could spread over large parts of Central, Eastern and Southern Africa. This massive dispersal is commonly seen as the result of a single migratory macro-event driven by agriculture, but many questions about the movement and subsistence of ancestral Bantu speakers are still open. They can only be answered through real interdisciplinary collaboration. This project will unite researchers with outstanding expertise in African archaeology, archaeobotany and historical linguistics to form a unique cross-disciplinary team that will shed new light on the first Bantu-speaking village communities south of the rainforest. Fieldwork is planned in parts of the Democratic Republic of Congo, the Republic of Congo and Angola that are terra incognita for archaeologists to determine the timing, location and archaeological signature of the earliest villagers and to establish how they interacted with autochthonous hunter-gatherers. Special attention will be paid to archaeobotanical and palaeoenvironmental data to get an idea of their subsistence, diet and habitat. Historical linguistics will be pushed beyond the boundaries of vocabulary-based phylogenetics and open new pathways in lexical reconstruction, especially regarding subsistence and land use of early Bantu speakers. Through interuniversity collaboration archaeozoological, palaeoenvironmental and genetic data and phylogenetic modelling will be brought into the cross-disciplinary approach to acquire a new holistic view on the interconnections between human migration, language spread, climate change and early farming in Late Holocene Central Africa.
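Vocabulary-based phylogenetics, mentioned above as the current state of the art, typically starts from a binary matrix recording which languages share cognates for a list of basic-vocabulary meanings; distances between languages are then fed into tree-building methods. The toy sketch below illustrates that first step with invented data and a simple distance-based clustering; the language names, cognate codings and the use of average-linkage clustering (rather than the Bayesian methods usually applied) are assumptions for illustration only.

import numpy as np
from scipy.spatial.distance import pdist, squareform
from scipy.cluster.hierarchy import linkage, to_tree

# Invented binary cognate matrix: rows = languages, columns = meanings,
# 1 = the language has the reference cognate class for that meaning.
languages = ["LangA", "LangB", "LangC", "LangD"]
cognates = np.array([
    [1, 1, 0, 1, 0, 1],
    [1, 1, 0, 1, 1, 1],
    [0, 1, 1, 0, 1, 0],
    [0, 0, 1, 0, 1, 0],
])

# Hamming distance = share of meanings on which two languages disagree.
dist = pdist(cognates, metric="hamming")
print(squareform(dist).round(2))

# Average-linkage tree as a crude stand-in for a proper phylogenetic method.
tree = to_tree(linkage(dist, method="average"))

def newick(node):
    if node.is_leaf():
        return languages[node.id]
    return f"({newick(node.left)},{newick(node.right)}):{node.dist:.2f}"

print(newick(tree) + ";")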
Max ERC Funding
1 997 500 €
Duration
Start date: 2018-01-01, End date: 2022-12-31
Project acronym BAPS
Project Bayesian Agent-based Population Studies: Transforming Simulation Models of Human Migration
Researcher (PI) Jakub KAZIMIERZ BIJAK
Host Institution (HI) UNIVERSITY OF SOUTHAMPTON
Call Details Consolidator Grant (CoG), SH3, ERC-2016-COG
Summary The aim of BAPS is to develop a ground-breaking simulation model of international migration, based on a population of intelligent, cognitive agents, their social networks and institutions, all interacting with one another. The project will transform the study of migration – one of the most uncertain population processes and a top-priority EU policy area – by offering a step change in the way it can be understood, predicted and managed. In this way, BAPS will effectively integrate behavioural and social theory with modelling.
To develop micro-foundations for migration studies, model design will follow cutting-edge developments in demography, statistics, cognitive psychology and computer science. BAPS will also offer a pioneering environment for applying the findings in practice through a bespoke modelling language. Bayesian statistical principles will be used to design innovative computer experiments, and learn about modelling the simulated individuals and the way they make decisions.
In BAPS, we will collate available information for migration models; build and test the simulations by applying experimental design principles to enhance our knowledge of migration processes; collect information on the underpinning decision-making mechanisms through psychological experiments; and design software for implementing Bayesian agent-based models in practice. The project will use various information sources to build models bottom-up, filling an important epistemological gap in demography.
BAPS will be carried out by the Allianz European Demographer 2015, recognised as a leader in the field for methodological innovation, directing an interdisciplinary team with expertise in demography, agent-based models, statistical analysis of uncertainty, meta-cognition, and computer simulations. The project will open up exciting research possibilities beyond demography, and will generate both academic and practical impact, offering methodological advice for policy-relevant simulations.
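As a flavour of what an agent-based migration model looks like in code, the minimal sketch below simulates agents who decide to move when the expected gain at a destination, raised by how many of their network contacts have already moved, exceeds a threshold. All numbers, the decision rule and the parameter names are illustrative assumptions; the actual BAPS models, their cognitive detail and their Bayesian calibration are far richer.

import random

random.seed(3)

N_AGENTS = 200
STEPS = 20
WAGE_GAP = 0.4        # hypothetical expected gain from migrating
NETWORK_WEIGHT = 0.5  # how strongly contacts who moved raise the utility
THRESHOLD = 0.6       # migration happens when utility exceeds this

# Each agent gets a few random network contacts and a 'moved' flag.
contacts = {i: random.sample(range(N_AGENTS), 5) for i in range(N_AGENTS)}
moved = {i: False for i in range(N_AGENTS)}

for step in range(STEPS):
    for i in range(N_AGENTS):
        if moved[i]:
            continue
        share_moved = sum(moved[c] for c in contacts[i]) / len(contacts[i])
        utility = WAGE_GAP + NETWORK_WEIGHT * share_moved + random.gauss(0, 0.2)
        if utility > THRESHOLD:
            moved[i] = True
    print(f"step {step + 1:2d}: cumulative migrants = {sum(moved.values())}")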
Max ERC Funding
1 455 590 €
Duration
Start date: 2017-06-01, End date: 2021-05-31
Project acronym BARCODED-CELLTRACING
Project Endogenous barcoding for in vivo fate mapping of lineage development in the blood and immune system
Researcher (PI) Hans-Reimer RODEWALD
Host Institution (HI) DEUTSCHES KREBSFORSCHUNGSZENTRUM HEIDELBERG
Call Details Advanced Grant (AdG), LS6, ERC-2016-ADG
Summary The immune system is a complex ensemble of diverse lineages. Studies on in vivo hematopoiesis have until now largely rested on transplantation. More physiological experiments have been limited by the inability to analyze hematopoietic stem cells (HSCs) and progenitor cells in situ without cell isolation and other disruptive manipulations. We have developed mouse mutants in which a fluorescent marker can be switched on in HSCs in situ (inducible fate mapping), and traced HSC lineage output under unperturbed conditions in vivo. These experiments uncovered marked differences between in situ and post-transplantation hematopoiesis. These new developments raise several important questions, notably on the developmental fates HSCs realize in vivo (as opposed to their experimental potential), and on the structure (routes and nodes) of hematopoiesis from HSCs to peripheral blood and immune lineages. Answers to these questions (and in fact the deconvolution of any tissue) require the development of non-invasive, high-resolution barcoding systems. We have now designed, built and tested a DNA-based barcoding system, termed Polylox, that is based on an artificial recombination locus in which Cre recombinase can generate several hundred thousand genetic tags in mice. We chose the Cre-loxP system to link high-resolution barcoding (i.e. the ability to barcode single cells and to fate map their progeny) to the zoo of tissue- or stage-specific, inducible Cre-driver mice. Here, I will present the principles of this endogenous barcoding system, demonstrate its experimental and analytical feasibility, and show its power to resolve complex lineages. The work program addresses in a comprehensive manner major open questions on the structure of the hematopoietic system that builds and maintains the immune system. This project ultimately aims at an in-depth dissection of unique or common lineage pathways emerging from HSCs, and at resolving relationships within cell lineages of the immune system.
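To give a feel for how Cre recombination on a multi-segment cassette can generate a very large tag space, the sketch below simulates random excisions and inversions on an array of labelled segments and counts the distinct outcomes. The segment count, the number of recombination events per cell and the simplified recombination rules are assumptions chosen only to illustrate the combinatorial principle; they are not the actual Polylox design.

import random

random.seed(4)

N_SEGMENTS = 9         # assumed number of barcode segments between loxP sites
EVENTS_PER_CELL = 3    # assumed number of Cre recombination events per cell
N_CELLS = 200_000

def recombine(cassette):
    """Apply one simplified Cre event: excise or invert a random stretch."""
    if len(cassette) < 2:
        return cassette
    i, j = sorted(random.sample(range(len(cassette) + 1), 2))
    if random.random() < 0.5:
        return cassette[:i] + cassette[j:]                          # excision
    middle = [(seg, -ori) for seg, ori in reversed(cassette[i:j])]  # inversion
    return cassette[:i] + middle + cassette[j:]

barcodes = set()
for _ in range(N_CELLS):
    cassette = [(k, +1) for k in range(N_SEGMENTS)]
    for _ in range(EVENTS_PER_CELL):
        cassette = recombine(cassette)
    barcodes.add(tuple(cassette))

print(f"{len(barcodes)} distinct barcodes from {N_CELLS} simulated cells")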
Max ERC Funding
2 500 000 €
Duration
Start date: 2017-07-01, End date: 2022-06-30
Project acronym BaSaR
Project Beyond the Silk Road: Economic Development, Frontier Zones and Inter-Imperiality in the Afro-Eurasian World Region, 300 BCE to 300 CE
Researcher (PI) Sitta Valerie Ilse Alberta VON REDEN
Host Institution (HI) ALBERT-LUDWIGS-UNIVERSITAET FREIBURG
Call Details Advanced Grant (AdG), SH6, ERC-2016-ADG
Summary This interdisciplinary project will show that inter-imperial zones and small to mid-size regional networks of exchange were crucial for ancient Transeurasian exchange connections. It will demonstrate the significance of exchange in imperial frontier zones emerging from political, economic, infrastructural, institutional and technological development within empires. This will lead to a new conceptual frame for analyzing inter-imperiality and the morphology of exchange networks within and across imperial zones.
The centuries from 300 BCE to 300 CE were a period of accelerated empire transformation that also involved new regions of the Afro-Eurasian world. Consumption centres shifted, affecting production, settlement, and regional exchange networks. They changed the dynamics of exchange, created new geographies, and brought about greater cultural convergence between imperial spheres of influence. The development of imperial frontier zones of intense exchange and mobility (e.g. Northern China, Bactria, Gandhara, Syria, and the Red Sea/Gulf/Indian Ocean coasts) was related to imperial hinterlands, their fiscal-military-administrative regimes, the development of media of exchange and infrastructures, settlement, urban growth, and so on. It was also related to new forms and levels of consumption in imperial centres. In order to understand Transeurasian connectivity, the interdependence of frontier-zone and inner-imperial development is crucial. We will reveal that competitions for social power within empires mobilized and concentrated resources reclaimed from natural landscapes and subsistence economies. Greater mobility of resources, both human and material, endowed competitions for power with economic force, feeding into inter-imperial prestige economies and trade. This new model of Afro-Eurasian connectivity will abandon some problematic assumptions of Silk Road trade, while maintaining the Afro-Eurasian macro-region as a meaningful unit for cultural and economic analysis.
Max ERC Funding
2 498 750 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym BCLYM
Project Molecular mechanisms of mature B cell lymphomagenesis
Researcher (PI) Almudena Ramiro
Host Institution (HI) CENTRO NACIONAL DE INVESTIGACIONESCARDIOVASCULARES CARLOS III (F.S.P.)
Call Details Starting Grant (StG), LS3, ERC-2007-StG
Summary Most of the lymphomas diagnosed in the Western world originate from mature B cells. The hallmark of these malignancies is the presence of recurrent chromosome translocations that usually involve the immunoglobulin loci and a proto-oncogene. As a result of the translocation event, the proto-oncogene becomes deregulated under the influence of immunoglobulin cis sequences, thus playing an important role in the etiology of the disease. Upon antigen encounter, mature B cells engage in the germinal center reaction, a complex differentiation program of critical importance to the development of the secondary immune response. The germinal center reaction entails the somatic remodelling of immunoglobulin genes by the somatic hypermutation and class switch recombination reactions, both of which are triggered by Activation-Induced Deaminase (AID). We have previously shown that AID also initiates lymphoma-associated c-myc/IgH chromosome translocations. In addition, the germinal center reaction involves a fine-tuned balance between intense B cell proliferation and programmed cell death. This environment seems to render B cells particularly vulnerable to malignant transformation. We aim to study the molecular events responsible for B cell susceptibility to lymphomagenesis from two perspectives. First, we will address the role of AID in the generation of lymphomagenic lesions in the context of AID specificity and transcriptional activation. Second, we will address the regulatory function of microRNAs in AID-dependent germinal center events. The proposal aims at the molecular understanding of a process that lies at the interface of immune regulation and oncogenic transformation, and the results will therefore have profound implications for both the basic and clinical understanding of lymphomagenesis.
Max ERC Funding
1 596 000 €
Duration
Start date: 2008-12-01, End date: 2014-11-30
Project acronym BEAM-EDM
Project Unique Method for a Neutron Electric Dipole Moment Search using a Pulsed Beam
Researcher (PI) Florian Michael PIEGSA
Host Institution (HI) UNIVERSITAET BERN
Call Details Starting Grant (StG), PE2, ERC-2016-STG
Summary My research encompasses the application of novel methods and strategies in the field of low energy particle physics. The goal of the presented program is to lead an independent and highly competitive experiment to search for a CP violating neutron electric dipole moment (nEDM), as well as for new exotic interactions using highly sensitive neutron and proton spin resonance techniques.
The measurement of the nEDM is considered to be one of the most important fundamental physics experiments at low energy. It represents a promising route for finding new physics beyond the standard model (SM) and describes an important search for new sources of CP violation in order to understand the observed large baryon asymmetry in our universe. The main project will follow a novel concept based on my original idea, which plans to employ a pulsed neutron beam at high intensity instead of the established use of storable ultracold neutrons. This complementary and potentially ground-breaking method provides the possibility to distinguish between the signal due to a nEDM and previously limiting systematic effects, and should lead to an improved result compared to the present best nEDM beam experiment. The findings of these investigations will be of paramount importance and will form the cornerstone for the success of the full-scale experiment intended for the European Spallation Source. A second scientific question will be addressed by performing spin precession experiments searching for exotic short-range interactions and associated light bosons. This is a vibrant field of research motivated by various extensions to the SM. The goal of these measurements, using neutrons and protons, is to search for additional interactions that such new bosons would mediate between ordinary particles.
Both topics describe ambitious and unique efforts. They use related techniques, address important questions in fundamental physics, and have the potential of substantial scientific implications and high-impact results.
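For orientation, the standard relations behind such a spin-precession EDM search (textbook form, not specific to this proposal): with the electric field E applied parallel or antiparallel to the magnetic field B, the precession frequencies are
h\nu_{\uparrow\uparrow} = |2\mu_n B + 2 d_n E|, \qquad h\nu_{\uparrow\downarrow} = |2\mu_n B - 2 d_n E|,
so that
d_n = \frac{h\,(\nu_{\uparrow\uparrow} - \nu_{\uparrow\downarrow})}{4E}.
The statistical sensitivity of a Ramsey-type measurement scales roughly as \sigma(d_n) \approx \hbar / (2\alpha E T \sqrt{N}), with spin-analysis visibility \alpha, interaction time T and neutron count N, which is why a high-intensity pulsed beam, with large N and T defined by the time of flight, is attractive.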
Max ERC Funding
1 404 062 €
Duration
Start date: 2017-04-01, End date: 2022-03-31
Project acronym BEAT
Project The functional interaction of EGFR and beta-catenin signalling in colorectal cancer: Genetics, mechanisms, and therapeutic potential.
Researcher (PI) Andrea BERTOTTI
Host Institution (HI) UNIVERSITA DEGLI STUDI DI TORINO
Call Details Consolidator Grant (CoG), LS7, ERC-2016-COG
Summary Monoclonal antibodies against the EGF receptor (EGFR) provide substantive benefit to colorectal cancer (CRC) patients. However, no genetic lesions that robustly predict ‘addiction’ to the EGFR pathway have yet been identified. Further, even in tumours that regress after EGFR blockade, subsets of drug-tolerant cells often linger and foster ‘minimal residual disease’ (MRD), which portends tumour relapse.
Our preliminary evidence suggests that reliance on EGFR activity, as opposed to MRD persistence, could be assisted by genetically-based variations in transcription factor partnerships and activities, gene expression outputs, and biological fates controlled by the WNT/beta-catenin pathway. On such premises, BEAT (Beta-catenin and EGFR Abrogation Therapy) will elucidate the mechanisms of EGFR dependency, and escape from it, with the goal to identify biomarkers for more efficient clinical management of CRC and develop new therapies for MRD eradication.
A multidisciplinary approach will be pursued spanning from integrative gene regulation analyses to functional genomics in vitro, pharmacological experiments in vivo, and clinical investigation, to address whether: (i) specific genetic alterations of the WNT pathway affect anti-EGFR sensitivity; (ii) combined neutralisation of EGFR and WNT signals fuels MRD deterioration; (iii) data from analysis of this synergy can lead to the discovery of clinically meaningful biomarkers with predictive and prognostic significance.
This proposal capitalises on a unique proprietary platform for high-content studies based on a large biobank of viable CRC samples, which ensures strong analytical power and unprecedented biological flexibility. By providing fresh insight into the mechanisms whereby WNT/beta-catenin signalling differentially sustains EGFR dependency or drug tolerance, the project is expected to put forward an innovative reinterpretation of CRC molecular bases and advance the rational application of more effective therapies.
Max ERC Funding
1 793 421 €
Duration
Start date: 2017-10-01, End date: 2022-09-30