Project acronym 0MSPIN
Project Spintronics based on relativistic phenomena in systems with zero magnetic moment
Researcher (PI) Tomáš Jungwirth
Host Institution (HI) FYZIKALNI USTAV AV CR V.V.I
Call Details Advanced Grant (AdG), PE3, ERC-2010-AdG_20100224
Summary The 0MSPIN project consists of an extensive integrated theoretical, experimental and device development programme of research opening a radical new approach to spintronics. Spintronics has the potential to supersede existing storage and memory applications, and to provide alternatives to current CMOS technology. The ferromagnetic metals used in all current spintronics applications may make it impractical to realise the full potential of spintronics. Metals are unsuitable for transistor and information processing applications, for opto-electronics, or for high-density integration. The 0MSPIN project aims to remove the major road-block holding back the development of spintronics in a radical way: removing the ferromagnetic component from key active parts of, or from the whole of, the spintronic devices. This approach is based on exploiting the combination of exchange and spin-orbit coupling phenomena in material systems with zero macroscopic moment. The goal of 0MSPIN is to provide a new paradigm by which spintronics can enter the realms of conventional semiconductors in both fundamental condensed matter research and in information technologies. In the central part of the proposal, the research towards this goal is embedded within a materials science project whose aim is to introduce into physics and microelectronics an entirely new class of semiconductors. 0MSPIN seeks to exploit three classes of material systems: (1) Antiferromagnetic bi-metallic 3d-5d alloys (e.g. Mn2Au). (2) Antiferromagnetic I-II-V semiconductors (e.g. LiMnAs). (3) Non-magnetic spin-orbit coupled semiconductors with injected spin-polarized currents (e.g. 2D III-V structures). Proof-of-concept devices operating at high temperatures will be fabricated to showcase new functionalities offered by zero-moment systems for sensing and memory applications, information processing, and opto-electronics technologies.
Max ERC Funding
1 938 000 €
Duration
Start date: 2011-06-01, End date: 2016-05-31
Project acronym 100 Archaic Genomes
Project Genome sequences from extinct hominins
Researcher (PI) Svante PÄÄBO
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Advanced Grant (AdG), LS2, ERC-2015-AdG
Summary Neandertals and Denisovans, an Asian group distantly related to Neandertals, are the closest evolutionary relatives of present-day humans. They are thus of direct relevance for understanding the origin of modern humans and how modern humans differ from their closest relatives. We will generate genome-wide data from a large number of Neandertal and Denisovan individuals from across their geographical and temporal range, as well as from other extinct hominin groups that we may discover. This will be made possible by automating the highly sensitive approaches to ancient DNA extraction and DNA library construction that we have developed, so that they can be applied to many specimens from many sites in order to identify those that contain retrievable DNA. Whenever possible we will sequence whole genomes, and in other cases use DNA capture methods to generate high-quality data from representative parts of the genome. This will allow us to study the population history of Neandertals and Denisovans, elucidate how many times and where these extinct hominins contributed genes to present-day people, and determine the extent to which modern humans and archaic groups contributed genetically to Neandertals and Denisovans. By retrieving DNA from specimens that go back to the Middle Pleistocene we will furthermore shed light on the early history and origins of Neandertals and Denisovans.
Max ERC Funding
2 350 000 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym 14Constraint
Project Radiocarbon constraints for models of C cycling in terrestrial ecosystems: from process understanding to global benchmarking
Researcher (PI) Susan Trumbore
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Advanced Grant (AdG), PE10, ERC-2015-AdG
Summary The overall goal of 14Constraint is to enhance the availability and use of radiocarbon data as constraints for process-based understanding of the age distribution of carbon in and respired by soils and ecosystems. Carbon enters ecosystems by a single process, photosynthesis. It returns by a range of processes that depend on plant allocation and turnover, the efficiency and rate of litter decomposition and the mechanisms stabilizing C in soils. Thus the age distribution of respired CO2 and the age of C residing in plants, litter and soils are diagnostic properties of ecosystems that provide key constraints for testing carbon cycle models. Radiocarbon, especially the transit of ‘bomb’ 14C created in the 1960s, is a powerful tool for tracing C exchange on decadal to centennial timescales. 14Constraint will assemble a global database of existing radiocarbon data (WP1) and demonstrate how they can constrain and test ecosystem carbon cycle models. WP2 will fill data gaps and add new data from sites in key biomes that have ancillary data sufficient to construct belowground C and 14C budgets. These detailed investigations will focus on the role of time lags introduced by necromass and fine roots, as well as the dynamics of deep soil C. Spatial extrapolation beyond the WP2 sites will require sampling along global gradients designed to explore the relative roles of mineralogy, vegetation and climate on the age of C in and respired from soil (WP3). Products of 14Constraint will include the first publicly available global synthesis of terrestrial 14C data, and will add over 5000 new measurements. This project is urgently needed before atmospheric 14C levels decline to below 1950 levels, as expected in the next decade.
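As background to the bomb-14C tracer invoked above, the standard convention relating a sample's measured 14C content to a conventional radiocarbon age (a textbook relation, not stated in the summary itself) is:

```latex
% Conventional radiocarbon age (Stuiver–Polach convention).
% F = fraction modern: the sample's normalized 14C/12C ratio
% relative to the 1950 atmospheric standard.
% 8033 yr is the Libby mean life, 5568/\ln 2.
t \;=\; -8033\,\ln F
```

Post-bomb samples have F > 1, so t is formally negative; it is the steady decline of atmospheric F from its 1960s peak that lets bomb 14C trace carbon turnover on decadal timescales, and its approach back toward 1950 levels that gives the project its urgency.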
Max ERC Funding
2 283 747 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym 1stProposal
Project An alternative development of analytic number theory and applications
Researcher (PI) ANDREW Granville
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Advanced Grant (AdG), PE1, ERC-2014-ADG
Summary The traditional (Riemann) approach to analytic number theory uses the zeros of zeta functions. This requires the associated multiplicative function, say f(n), to have special enough properties that the associated Dirichlet series may be analytically continued. In this proposal we continue to develop an approach which requires less of the multiplicative function, linking the original question with the mean value of f. Such techniques have been around for a long time but have generally been regarded as “ad hoc”. In this project we aim to show that one can develop a coherent approach to the whole subject, not only reproving all of the old results, but also many new ones that appear inaccessible to traditional methods.
Our first goal is to complete a monograph yielding a reworking of all the classical theory using these new methods and then to push forward in new directions. The most important is to extend these techniques to GL(n) L-functions, which we hope will now be feasible having found the correct framework in which to proceed. Since we rarely know how to analytically continue such L-functions this could be of great benefit to the subject.
We are developing the large sieve so that it can be used for individual moduli, and will determine a strong form of it. We will also develop a new method to give asymptotics for mean values when they are not too small.
We wish to incorporate techniques of analytic number theory into our theory, for example recent advances on mean values of Dirichlet polynomials. The recent breakthroughs in sieve theory also suggest strong links that need further exploration.
Additive combinatorics yields important results in many areas. There are strong analogies between its results, and those for multiplicative functions, especially in large value spectrum theory, and its applications. We hope to develop these further.
Much of this is joint work with K Soundararajan of Stanford University.
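For readers unfamiliar with the notation, the "mean value of f" that the proposal links to the original questions is the standard average of a multiplicative function (a standard definition, supplied here for context rather than taken from the proposal):

```latex
% Mean value of a multiplicative function f up to x:
M_f(x) \;=\; \frac{1}{x}\sum_{n \le x} f(n)
% Example: for the Möbius function \mu, the Prime Number Theorem
% is equivalent to M_\mu(x) \to 0 as x \to \infty,
% illustrating how mean values encode deep arithmetic information
% without analytic continuation of the associated Dirichlet series.
```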
Max ERC Funding
2 011 742 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym 2DHIBSA
Project Nanoscopic and Hierarchical Materials via Living Crystallization-Driven Self-Assembly
Researcher (PI) Ian MANNERS
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Advanced Grant (AdG), PE5, ERC-2017-ADG
Summary A key synthetic challenge of widespread interest in chemical science involves the creation of well-defined 2D functional materials that exist on a length-scale of nanometers to microns. In this ambitious 5-year proposal we aim to tackle this issue by exploiting the unique opportunities made possible by recent developments with the living crystallization-driven self-assembly (CDSA) platform. Using this solution processing approach, amphiphilic block copolymers (BCPs) with crystallizable blocks, related amphiphiles, and polymers with charged end groups will be used to predictably construct monodisperse samples of tailored, functional soft matter-based 2D nanostructures with controlled shape, size, and spatially-defined chemistries. Many of the resulting nanostructures will also offer unprecedented opportunities as precursors to materials with hierarchical structures through further solution-based “bottom-up” assembly methods. In addition to fundamental studies, the proposed work also aims to make an important impact in the cutting-edge fields of liquid crystals, interface stabilization, catalysis, supramolecular polymers, and hierarchical materials.
Max ERC Funding
2 499 597 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym 2DNanoSpec
Project Nanoscale Vibrational Spectroscopy of Sensitive 2D Molecular Materials
Researcher (PI) Renato ZENOBI
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Advanced Grant (AdG), PE4, ERC-2016-ADG
Summary I propose to investigate the nanometer-scale organization of delicate 2-dimensional molecular materials using nanoscale vibrational spectroscopy. 2D structures are of great scientific and technological importance, for example as novel materials (graphene, MoS2, WS2, etc.), and in the form of biological membranes and synthetic 2D-polymers. Powerful methods for their analysis and imaging with molecular selectivity and sufficient spatial resolution, however, are lacking. Tip-enhanced Raman spectroscopy (TERS) allows label-free spectroscopic identification of molecular species, with ≈10 nm spatial resolution, and with single molecule sensitivity for strong Raman scatterers. So far, however, TERS has not been carried out in liquids, which is the natural environment for membranes, and its application to poor Raman scatterers such as components of 2D polymers, lipids, or other membrane compounds (proteins, sugars) is difficult. TERS has the potential to overcome the restrictions of other optical/spectroscopic methods to study 2D materials, namely (i) insufficient spatial resolution of diffraction-limited optical methods; (ii) the need for labelling for all methods relying on fluorescence; and (iii) the inability of some methods to work in liquids. I propose to address a number of scientific questions associated with the spatial organization, and the occurrence of defects, in sensitive 2D molecular materials. The success of these studies will also rely critically on technical innovations of TERS that notably address the problem of energy dissipation. This will for the first time allow its application to the study of complex, delicate 2D molecular systems without photochemical damage.
Max ERC Funding
2 311 696 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym 2G-CSAFE
Project Combustion of Sustainable Alternative Fuels for Engines used in aeronautics and automotives
Researcher (PI) Philippe Dagaut
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE8, ERC-2011-ADG_20110209
Summary This project aims at promoting sustainable combustion technologies for transport via validation of advanced combustion kinetic models obtained using sophisticated new laboratory experiments, engines, and theoretical computations, breaking through the current frontier of knowledge. It will focus on the unexplored kinetics of ignition and combustion of 2nd generation (2G) biofuels and blends with conventional fuels, which should provide energy safety and sustainability to Europe. The motivation is that no accurate kinetic models are available for the ignition, oxidation and combustion of 2G-biofuels, and improved ignition control is needed for new compression ignition engines. Crucial information is missing: data from well characterised experiments on combustion-generated pollutants and data on key-intermediates for fuels ignition in new engines.
To provide that knowledge, new well-instrumented complementary experiments and kinetic modelling will be used. Measurements of key intermediates, stable species, and pollutants will be performed. New ignition control strategies will be designed, opening new technological horizons. Kinetic modelling will be used for rationalising the results. Due to the complexity of 2G-biofuels and their unusual composition, innovative surrogates will be designed. Kinetic models for surrogate fuels will be generalised for extension to other compounds. The experimental results, together with ab-initio and detailed modelling, will serve to characterise the kinetics of ignition, combustion, and pollutant formation of fuels including 2G biofuels, and provide relevant data and models.
This research is risky because this is (i) the 1st effort to measure radicals by reactor/CRDS coupling, (ii) the 1st effort to use a μ-channel reactor to build ignition databases for conventional and bio-fuels, and (iii) the 1st effort to design and use controlled generation and injection of reactive species to control ignition/combustion in compression ignition engines.
Max ERC Funding
2 498 450 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym 3-TOP
Project Exploring the physics of 3-dimensional topological insulators
Researcher (PI) Laurens Wigbolt Molenkamp
Host Institution (HI) JULIUS-MAXIMILIANS-UNIVERSITAT WURZBURG
Call Details Advanced Grant (AdG), PE3, ERC-2010-AdG_20100224
Summary Topological insulators constitute a novel class of materials where the topological details of the bulk band structure induce a robust surface state on the edges of the material. While transport data for 2-dimensional topological insulators have recently become available, experiments on their 3-dimensional counterparts are mainly limited to photoelectron spectroscopy. At the same time, a plethora of interesting novel physical phenomena have been predicted to occur in such systems.
In this proposal, we sketch an approach to tackle the transport and magnetic properties of the surface states in these materials. This starts with high quality layer growth, using molecular beam epitaxy, of bulk layers of HgTe, Bi2Se3 and Bi2Te3, which are the prime candidates to show the novel physics expected in this field. The existence of the relevant surface states will be assessed spectroscopically, but from there on research will focus on fabricating and characterizing nanostructures designed to elucidate the transport and magnetic properties of the topological surfaces using electrical, optical and scanning probe techniques. Apart from a general characterization of the Dirac band structure of the surface states, research will focus on the predicted magnetic monopole-like response of the system to an electrical test charge. In addition, much effort will be devoted to contacting the surface state with superconducting and magnetic top layers, with the final aim of demonstrating Majorana fermion behavior. As a final benefit, growth of high quality thin Bi2Se3 or Bi2Te3 layers could allow for a demonstration of the (2-dimensional) quantum spin Hall effect at room temperature - offering a road map to dissipation-less transport for the semiconductor industry.
Max ERC Funding
2 419 590 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym 3D-E
Project 3D Engineered Environments for Regenerative Medicine
Researcher (PI) Ruth Elizabeth Cameron
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARSOF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE8, ERC-2012-ADG_20120216
Summary "This proposal develops a unified, underpinning technology to create novel, complex and biomimetic 3D environments for the control of tissue growth. As director of Cambridge Centre for Medical Materials, I have recently been approached by medical colleagues to help to solve important problems in the separate therapeutic areas of breast cancer, cardiac disease and blood disorders. In each case, the solution lies in complex 3D engineered environments for cell culture. These colleagues make it clear that existing 3D scaffolds fail to provide the required complex orientational and spatial anisotropy, and are limited in their ability to impart appropriate biochemical and mechanical cues.
I have a strong track record in this area. A particular success has been the use of a freeze drying technology to make collagen based porous implants for the cartilage-bone interface in the knee, which has now been commercialised. The novelty of this proposal lies in the broadening of the established scientific base of this technology to enable biomacromolecular structures with:
(A) controlled and complex pore orientation to mimic many normal multi-oriented tissue structures
(B) compositional and positional control to match varying local biochemical environments,
(C) the attachment of novel peptides designed to control cell behaviour, and
(D) mechanical control at both a local and macroscopic level to provide mechanical cues for cells.
These will be complemented by the development of
(E) robust characterisation methodologies for the structures created.
These advances will then be employed in each of the medical areas above.
This approach is highly interdisciplinary. Existing working relationships with experts in each medical field will guarantee expertise and licensed facilities in the required biological disciplines. Funds for this proposal would therefore establish a rich hub of mutually beneficial research and opportunities for cross-disciplinary sharing of expertise."
Max ERC Funding
2 486 267 €
Duration
Start date: 2013-04-01, End date: 2018-03-31
Project acronym 3DBrainStrom
Project Brain metastases: Deciphering tumor-stroma interactions in three dimensions for the rational design of nanomedicines
Researcher (PI) Ronit Satchi Fainaro
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Advanced Grant (AdG), LS7, ERC-2018-ADG
Summary Brain metastases represent a major therapeutic challenge. Despite significant breakthroughs in targeted therapies, survival rates of patients with brain metastases remain poor. Nowadays, discovery, development and evaluation of new therapies are performed on human cancer cells grown in 2D on rigid plastic plates followed by in vivo testing in immunodeficient mice. These experimental settings are inadequate and constitute a fundamental hurdle for the translation of preclinical discoveries into clinical practice. We propose to establish 3D-printed models of brain metastases (Aim 1), which include brain extracellular matrix, stroma and serum containing immune cells flowing in functional tumor vessels. Our unique models better capture the clinical physio-mechanical tissue properties, signaling pathways, hemodynamics and drug responsiveness. Using our 3D-printed models, we aim to develop two new fronts for identifying novel clinically-relevant molecular drivers (Aim 2) followed by the development of precision nanomedicines (Aim 3). We will exploit our vast experience in anticancer nanomedicines to design three therapeutic approaches that target various cellular compartments involved in brain metastases: 1) Prevention of brain metastatic colonization using targeted nano-vaccines, which elicit an antitumor immune response; 2) Intervention in tumor-brain stroma cell crosstalk when brain micrometastases become established; 3) Regression of macrometastatic disease by selectively targeting tumor cells. These approaches will materialize using our libraries of polymeric nanocarriers that selectively accumulate in tumors.
This project will result in a paradigm shift by generating new preclinical cancer models that will bridge the translational gap in cancer therapeutics. The insights and tumor-stroma-targeted nanomedicines developed here will pave the way for prediction of patient outcome, revolutionizing our perception of tumor modelling and consequently the way we prevent and treat cancer.
Max ERC Funding
2 353 125 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym 3DEpi
Project Transgenerational epigenetic inheritance of chromatin states: the role of Polycomb and 3D chromosome architecture
Researcher (PI) Giacomo CAVALLI
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS2, ERC-2017-ADG
Summary Epigenetic inheritance entails transmission of phenotypic traits not encoded in the DNA sequence and, in the most extreme case, Transgenerational Epigenetic Inheritance (TEI) involves transmission of memory through multiple generations. Very little is known about the mechanisms governing TEI, and this is the subject of the present proposal. By transiently enhancing long-range chromatin interactions, we recently established isogenic Drosophila epilines that carry stable alternative epialleles, defined by differential levels of the Polycomb-dependent H3K27me3 mark. Furthermore, we extended our paradigm to natural phenotypes. These are ideal systems to study the role of Polycomb group (PcG) proteins and other components in regulating nuclear organization and epigenetic inheritance of chromatin states. The present project combines genetics, epigenomics, imaging and molecular biology to reach three critical aims.
Aim 1: Analysis of the molecular mechanisms regulating Polycomb-mediated TEI. We will identify the DNA, protein and RNA components that trigger and maintain transgenerational chromatin inheritance as well as their mechanisms of action.
Aim 2: Role of 3D genome organization in the regulation of TEI. We will analyze the developmental dynamics of TEI-inducing long-range chromatin interactions, identify chromatin components mediating 3D chromatin contacts and characterize their function in the TEI process.
Aim 3: Identification of a broader role of TEI during development. TEI might reflect a normal role of PcG components in the transmission of parental chromatin onto the next embryonic generation. We will explore this possibility by establishing other TEI paradigms and by relating TEI to the normal PcG function in these systems and in normal development.
This research program will unravel the biological significance and the molecular underpinnings of TEI and lead the way towards establishing this area of research into a consolidated scientific discipline.
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym 3DIMAGE
Project 3D Imaging Across Lengthscales: From Atoms to Grains
Researcher (PI) Paul Anthony Midgley
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARSOF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE4, ERC-2011-ADG_20110209
Summary "Understanding structure-property relationships across lengthscales is key to the design of functional and structural materials and devices. Moreover, the complexity of modern devices extends to three dimensions, and as such 3D characterization is required across those lengthscales to provide a complete understanding and enable improvement in a material’s physical and chemical behaviour. 3D imaging and analysis from the atomic scale through to granular microstructure is proposed through the development of electron tomography using (S)TEM and ‘dual beam’ SEM-FIB, techniques offering complementary approaches to 3D imaging across lengthscales stretching over 5 orders of magnitude.
We propose to extend tomography to include novel methods to determine atom positions in 3D with approaches incorporating new reconstruction algorithms, image processing and complementary nano-diffraction techniques. At the nanoscale, true 3D nano-metrology of morphology and composition is a key objective of the project, minimizing reconstruction and visualization artefacts. Mapping strain and optical properties in 3D are ambitious and exciting challenges that will yield new information at the nanoscale. Using the SEM-FIB, 3D ‘mesoscale’ structures will be revealed: morphology, crystallography and composition can be mapped simultaneously, with ~5nm resolution and over volumes too large to tackle by (S)TEM and too small for most x-ray techniques. In parallel, we will apply 3D imaging to a wide variety of key materials including heterogeneous catalysts, aerospace alloys, biomaterials, photovoltaic materials, and novel semiconductors.
We will collaborate with many departments in Cambridge and institutes worldwide. The personnel on the proposal will cover all aspects of the tomography proposed using high-end TEMs, including an aberration-corrected Titan, and a Helios dual beam. Importantly, a postdoc is dedicated to developing new algorithms for reconstruction, image and spectral processing."
Max ERC Funding
2 337 330 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym 3DNANOMECH
Project Three-dimensional molecular resolution mapping of soft matter-liquid interfaces
Researcher (PI) Ricardo Garcia
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Advanced Grant (AdG), PE4, ERC-2013-ADG
Summary Optical, electron and probe microscopes are enabling tools for discoveries and knowledge generation in nanoscale science and technology. High-resolution (nanoscale or molecular), noninvasive and label-free imaging of three-dimensional soft matter-liquid interfaces has not been achieved by any microscopy method.
Force microscopy (AFM) is considered the second most relevant advance in materials science since 1960. Despite its impressive range of applications, the technique has some key limitations: it lacks three-dimensional depth, so what lies above or in the subsurface is not readily characterized.
3DNanoMech proposes to design, build and operate a high-speed force-based method for the three-dimensional characterization of soft matter-liquid interfaces (3D AFM). The microscope will combine a detection method based on force perturbations, adaptive algorithms, high-speed piezo actuators and quantitative-oriented multifrequency approaches. The development of the microscope cannot be separated from its applications: imaging error-free DNA repair and understanding the relationship between the nanomechanical properties and the malignancy of cancer cells. These problems encompass the instrument's different spatial (molecular, nano and mesoscopic) and time (millisecond to second) scales.
In short, 3DNanoMech aims to image, map and measure soft matter surfaces and interfaces in liquid with picoNewton, millisecond and angstrom resolution. The long-term vision of 3DNanoMech is to replace models or computer animations of biomolecular-liquid interfaces with real-time, molecular-resolution maps of properties and processes.
Max ERC Funding
2 499 928 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym 3SPIN
Project Three Dimensional Spintronics
Researcher (PI) Russell Paul Cowburn
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARSOF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE3, ERC-2009-AdG
Summary Spintronics, in which both the spin and the charge of the electron are used, is one of the most exciting new disciplines to emerge from nanoscience. The 3SPIN project seeks to open a new research front within spintronics: namely 3-dimensional spintronics, in which magnetic nanostructures are formed into a 3-dimensional interacting network of unrivalled density and hence technological benefit. 3SPIN will explore early-stage science that could underpin 3-dimensional metallic spintronics. The thesis of the project is that, by careful control of the constituent nanostructure properties, a 3-dimensional medium can be created in which a large number of topological solitons can exist. Although hardly studied at all to date, these solitons should be stable at room temperature, extremely compact and easy to manipulate and propagate. This makes them potentially ideal candidates to form the basis of a new spintronics in which the soliton is the basic transport vector instead of electrical current. €3.5M of funding is requested to form a new team of 5 researchers who, over a period of 60 months, will perform computer simulations and experimental studies of solitons in 3-dimensional networks of magnetic nanostructures and develop a laboratory demonstrator 3-dimensional memory device using solitons to represent and store data. A high-performance electron beam lithography system (cost €1M) will be purchased to allow state-of-the-art magnetic nanostructures to be fabricated with perfect control over their magnetic properties, thus allowing the ideal conditions for solitons to be created and controllably manipulated. Outputs from the project will be a complete understanding of the properties of these new objects and a road map charting the next steps for research in the field.
Max ERC Funding
2 799 996 €
Duration
Start date: 2010-03-01, End date: 2016-02-29
Project acronym 4-TOPS
Project Four experiments in Topological Superconductivity.
Researcher (PI) Laurens Molenkamp
Host Institution (HI) JULIUS-MAXIMILIANS-UNIVERSITAT WURZBURG
Call Details Advanced Grant (AdG), PE3, ERC-2016-ADG
Summary Topological materials have developed rapidly in recent years, with my previous ERC-AG project 3-TOP playing a major role in this development. While so far no bulk topological superconductor has been unambiguously demonstrated, their properties can be studied in a very flexible manner by inducing superconductivity through the proximity effect into the surface or edge states of a topological insulator. In 4-TOPS we will explore the possibilities of this approach in full, and conduct a thorough study of induced superconductivity in both two and three dimensional HgTe based topological insulators. The 4 avenues we will follow are:
-SQUID based devices to investigate full phase dependent spectroscopy of the gapless Andreev bound state by studying their Josephson radiation and current-phase relationships.
-Experiments aimed at providing unambiguous proof of localized Majorana states in TI junctions by studying tunnelling transport into such states.
-Attempts to induce superconductivity in Quantum Hall states with the aim of creating a chiral topological superconductor. These chiral superconductors host Majorana fermions at their edges, which, at least in the case of a single QH edge mode, follow non-Abelian statistics and are therefore promising for explorations in topological quantum computing.
-Studies of induced superconductivity in Weyl semimetals, a completely unexplored state of matter.
Taken together, these four sets of experiments will greatly enhance our understanding of topological superconductivity, which is not only a subject of great academic interest as it constitutes the study of new phases of matter, but also has potential application in the field of quantum information processing.
Max ERC Funding
2 497 567 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym 4D IMAGING
Project Towards 4D Imaging of Fundamental Processes on the Atomic and Sub-Atomic Scale
Researcher (PI) Ferenc Krausz
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Advanced Grant (AdG), PE2, ERC-2009-AdG
Summary State-of-the-art microscopy and diffraction imaging provides insight into the atomic and sub-atomic structure of matter. They permit determination of the positions of atoms in a crystal lattice or in a molecule as well as the distribution of electrons inside atoms. State-of-the-art time-resolved spectroscopy with femtosecond and attosecond resolution provides access to dynamic changes in the atomic and electronic structure of matter. Our proposal aims at combining these two frontier techniques of XXI century science to make a long-standing dream of scientists come true: the direct observation of atoms and electrons in their natural state: in motion. Shifts in the atoms' positions by tens to hundreds of picometers can make chemical bonds break apart or newly form, changing the structure and/or chemical composition of matter. Electronic motion on similar scales may result in the emission of light, or the initiation of processes that lead to a change in physical or chemical properties, or biological function. These motions happen within femtoseconds and attoseconds, respectively. To make them observable, we need a 4-dimensional (4D) imaging technique capable of recording freeze-frame snapshots of microscopic systems with picometer spatial resolution and femtosecond to attosecond exposure time. The motion can then be visualized by slow-motion replay of the freeze-frame shots. The goal of this project is to develop a 4D imaging technique that will ultimately offer picometer resolution in space and attosecond resolution in time.
Max ERC Funding
2 500 000 €
Duration
Start date: 2010-03-01, End date: 2015-02-28
Project acronym 4D-EEG
Project 4D-EEG: A new tool to investigate the spatial and temporal activity patterns in the brain
Researcher (PI) Franciscus C.T. Van Der Helm
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Advanced Grant (AdG), PE7, ERC-2011-ADG_20110209
Summary Our first goal is to develop a new tool to determine brain activity with a high temporal (< 1 msec) and spatial (about 2 mm) resolution with the focus on motor control. High-density EEG (up to 256 electrodes) will be used for EEG source localization. Advanced force-controlled robot manipulators will be used to impose continuous force perturbations to the joints. Advanced closed-loop system identification algorithms will identify the dynamic EEG response of multiple brain areas to the perturbation, leading to a functional interpretation of EEG. The propagation of the signal in time and 3D space through the cortex can be monitored: 4D-EEG. Preliminary experiments with EEG localization have shown that the continuous force perturbations resulted in a better signal-to-noise ratio and coherence than the current method using transient perturbations.
4D-EEG will be a direct measure of the neural activity in the brain with an excellent temporal response and easy to use in combination with motor control tasks. The new 4D-EEG method is expected to provide a breakthrough in comparison to functional MRI (fMRI) when elucidating the meaning of cortical map plasticity in motor learning.
Our second goal is to generate and validate new hypotheses about the longitudinal relationship between motor learning and cortical map plasticity by clinically using 4D-EEG in an intensive, repeated measurement design in patients suffering from a stroke. The application of 4D-EEG combined with haptic robots will allow us to discover how dynamics in cortical map plasticity are related to upper limb recovery after stroke, in terms of neural repair and the use of behavioral compensation strategies, while patients perform meaningful motor tasks. The non-invasive 4D-EEG technique combined with haptic robots will open a window on what and how patients (re)learn when showing motor recovery after stroke, allowing us to develop more effective patient-tailored therapies in neuro-rehabilitation.
Max ERC Funding
3 477 202 €
Duration
Start date: 2012-06-01, End date: 2017-05-31
Project acronym 4D-PET
Project Innovative PET scanner for dynamic imaging
Researcher (PI) José María BENLLOCH BAVIERA
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Advanced Grant (AdG), LS7, ERC-2015-AdG
Summary The main objective of 4D-PET is to develop an innovative whole-body PET scanner based on a new detector concept that stores the 3D position and time of every single gamma interaction with unprecedented resolution. The combination of the scanner's geometrical design and high timing resolution will enable reconstructing the full sequence of gamma-ray interactions inside the scanner, including Compton interactions, like a 3D movie. 4D-PET fully exploits Time-Of-Flight (TOF) information to obtain better image quality and to increase scanner sensitivity, through the inclusion in image formation of all Compton events occurring inside the detector, which are always rejected in state-of-the-art PET scanners. The new PET design will radically improve on state-of-the-art PET performance, overcoming limitations of current PET technology and opening up new diagnostic avenues and very valuable physiological information.
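The value of high timing resolution can be illustrated with a back-of-the-envelope calculation (illustrative only, not the 4D-PET reconstruction itself): in TOF PET, the arrival-time difference of the two 511 keV annihilation photons localizes the event along the line of response.

```python
# Offset of the annihilation point from the midpoint of the line of
# response (LOR): x = c * dt / 2, where dt is the photon arrival-time
# difference measured by the two detectors.
C = 299_792_458.0  # speed of light, m/s

def tof_offset_m(dt_seconds: float) -> float:
    """Offset of the annihilation point from the LOR midpoint, in metres."""
    return C * dt_seconds / 2.0

# A 200 ps coincidence timing resolution corresponds to ~3 cm localization:
print(tof_offset_m(200e-12))
```

This is why sub-nanosecond timing directly tightens where along the LOR each event can have occurred, improving effective sensitivity.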
Max ERC Funding
2 048 386 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym 4DBIOSERS
Project Four-Dimensional Monitoring of Tumour Growth by Surface Enhanced Raman Scattering
Researcher (PI) Luis LIZ-MARZAN
Host Institution (HI) ASOCIACION CENTRO DE INVESTIGACION COOPERATIVA EN BIOMATERIALES- CIC biomaGUNE
Call Details Advanced Grant (AdG), PE5, ERC-2017-ADG
Summary Optical bioimaging is limited by the penetration depth of visible light and the stability of fluorescent dyes over extended periods of time. Surface-enhanced Raman scattering (SERS) offers the possibility of overcoming these drawbacks through SERS-encoded nanoparticle tags, which can be excited with near-IR light (within the biological transparency window), providing high-intensity, stable, multiplexed signals. SERS can also be used to monitor relevant bioanalytes within cells and tissues during the development of diseases such as tumours. In 4DBIOSERS we shall combine both capabilities of SERS, to go well beyond the current state of the art, by building three-dimensional scaffolds that support tissue (tumour) growth within a controlled environment, so that not only can the fate of each (SERS-labelled) cell within the tumour be monitored in real time (thus adding a fourth dimension to SERS bioimaging), but the release of tumour metabolites and other indicators of cellular activity can also be recorded. Although 4DBIOSERS can be applied to a variety of diseases, we shall focus on cancer, in particular melanoma and breast cancer, as these are readily accessible by optical methods. We aim at acquiring a better understanding of tumour growth and dynamics while avoiding animal experimentation. 3D printing will be used to generate hybrid scaffolds where tumour and healthy cells will be co-incubated to simulate a more realistic environment, thus going well beyond the potential of 2D cell cultures. Each cell type will be encoded with ultra-bright SERS tags, so that real-time monitoring can be achieved by confocal SERS microscopy. Tumour development will be correlated with simultaneous detection of various cancer biomarkers, under standard conditions and upon addition of selected drugs. The scope of 4DBIOSERS is multidisciplinary, as it involves the design of high-end nanocomposites, the development of 3D cell culture models, and the optimization of emerging SERS tomography methods.
Max ERC Funding
2 410 771 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym 4PI-SKY
Project 4 pi sky: Extreme Astrophysics with Revolutionary Radio Telescopes
Researcher (PI) Robert Philip Fender
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE9, ERC-2010-AdG_20100224
Summary Extreme astrophysical events such as relativistic flows, cataclysmic explosions and black hole accretion are among the key areas for astrophysics in the 21st century. The extremes of physics experienced in these environments are beyond anything achievable in any laboratory on Earth, and provide a unique glimpse of the laws of physics operating in extraordinary regimes. All of these events are associated with transient radio emission, a tracer both of the acceleration of particles to relativistic energies and of coherent emitting regions with huge effective temperatures. By studying radio bursts from these phenomena we can pinpoint the sources of explosive events, understand the budget of kinetic feedback by explosive events into the ambient medium, and probe the physical state of the universe back to the epoch of reionisation, less than a billion years after the big bang. In seeking to push back the frontiers of extreme astrophysics, I will use a trio of revolutionary new radio telescopes, LOFAR, ASKAP and MeerKAT, pathfinders for the Square Kilometre Array, and all facilities in which I have a major role in the search for transients. I will build an infrastructure which transforms their combined operations for the discovery, classification and reporting of transient astrophysical events over the whole sky, making them much more than the sum of their parts. This will include the development of environments for the coordinated handling of extreme astrophysical events, in real time, via automated systems, as well as novel techniques for the detection of these events in a sea of noise. I will furthermore augment this program by buying in as a major partner to a rapid-response robotic optical telescope, and by cementing my relationship with an orbiting X-ray facility. This multiwavelength dimension will secure the astrophysical interpretation of our observational results and help to revolutionise high-energy astrophysics via a strong scientific exploitation program.
Max ERC Funding
2 999 847 €
Duration
Start date: 2011-07-01, End date: 2017-06-30
Project acronym 5COFM
Project Five Centuries of Marriages
Researcher (PI) Anna Cabré
Host Institution (HI) UNIVERSIDAD AUTONOMA DE BARCELONA
Call Details Advanced Grant (AdG), SH6, ERC-2010-AdG_20100407
Summary This long-term research project is based on data-mining of the Llibres d'Esposalles conserved at the Archives of the Barcelona Cathedral, an extraordinary source comprising 244 books of marriage license records. It covers about 550,000 unions from over 250 parishes of the Diocese between 1451 and 1905. Its impeccable conservation is a miracle in a region where parish archives have undergone massive destruction. The books include data on the tax imposed on each couple depending on their social class, on an eight-tiered scale. These data allow for research on multiple aspects of demography, especially over the very long run, such as population estimates, marriage dynamics and cycles, and indirect estimations of fertility, migration and survival, as well as socio-economic studies of social homogamy, social mobility, and the transmission of social and occupational position. Being continuous over five centuries, the source constitutes a unique instrument to study the dynamics of population distribution, the expansion of the city of Barcelona and the constitution of its metropolitan area, as well as the chronology and geography of the formation of new social classes.
To this end, a digital library and a database, the Barcelona Historical Marriages Database (BHiMaD), are to be created and completed. An ERC Advanced Grant will support this work while the research analysis of the database is undertaken in parallel.
The research team, at the U. Autònoma de Barcelona, involves researchers from the Center for Demographic Studies and the Computer Vision Center, experts in historical databases and computer-aided recognition of ancient manuscripts. 5CofM will serve the preservation of the original “Llibres d’Esposalles” and unlock the full potential embedded in the collection.
Max ERC Funding
1 847 400 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym 5D Heart Patch
Project A Functional, Mature In vivo Human Ventricular Muscle Patch for Cardiomyopathy
Researcher (PI) Kenneth Randall Chien
Host Institution (HI) KAROLINSKA INSTITUTET
Call Details Advanced Grant (AdG), LS7, ERC-2016-ADG
Summary Developing new therapeutic strategies for heart regeneration is a major goal for cardiac biology and medicine. While cardiomyocytes can be generated from human pluripotent stem cells (hPSCs) in vitro, it has proven difficult to use these cells to generate a large-scale, mature human ventricular muscle graft on the injured heart in vivo. The central objective of this proposal is to optimize the generation of a large-scale, pure, fully functional human ventricular muscle patch in vivo through the self-assembly of purified human ventricular progenitors and the localized expression of defined paracrine factors that drive their expansion, differentiation, vascularization, matrix formation, and maturation. Recently, we have found that purified hPSC-derived ventricular progenitors (HVPs) can self-assemble in vivo on the epicardial surface into a 3D vascularized and functional ventricular patch with its own extracellular matrix via a cell-autonomous pathway. A two-step protocol and FACS purification of HVP receptors can generate billions of pure HVPs. The current proposal will lead to the identification of defined paracrine pathways to enhance the survival, grafting/implantation, expansion, differentiation, matrix formation, vascularization and maturation of the graft in vivo. We will capitalize on our unique HVP system and our novel modRNA technology to deliver therapeutic strategies by using the in vivo human ventricular muscle to model arrhythmogenic cardiomyopathy, and to optimize the ability of the graft to compensate for the massive loss of functional muscle during ischemic cardiomyopathy and post-myocardial infarction. The studies will lead to new in vivo chimeric models of human cardiac disease and an experimental paradigm to optimize organ-on-organ cardiac tissue engineering of an in vivo, functional, mature ventricular patch for cardiomyopathy.
Max ERC Funding
2 149 228 €
Duration
Start date: 2017-12-01, End date: 2022-11-30
Project acronym 5HT-OPTOGENETICS
Project Optogenetic Analysis of Serotonin Function in the Mammalian Brain
Researcher (PI) Zachary Mainen
Host Institution (HI) FUNDACAO D. ANNA SOMMER CHAMPALIMAUD E DR. CARLOS MONTEZ CHAMPALIMAUD
Call Details Advanced Grant (AdG), LS5, ERC-2009-AdG
Summary Serotonin (5-HT) is implicated in a wide spectrum of brain functions and disorders. However, its functions remain controversial and enigmatic. We suggest that past work on the 5-HT system has been significantly hampered by technical limitations in the selectivity and temporal resolution of the conventional pharmacological and electrophysiological methods that have been applied. We therefore propose to apply novel optogenetic methods that will allow us to overcome these limitations and thereby gain new insight into the biological functions of this important molecule. In preliminary studies, we have demonstrated that we can deliver exogenous proteins specifically to 5-HT neurons using viral vectors. Our objectives are to (1) record, (2) stimulate and (3) silence the activity of 5-HT neurons with high molecular selectivity and temporal precision by using genetically-encoded sensors, activators and inhibitors of neural function. These tools will allow us to monitor and control the 5-HT system in real-time in freely-behaving animals and thereby to establish causal links between information processing in 5-HT neurons and specific behaviors. In combination with quantitative behavioral assays, we will use this approach to define the role of 5-HT in sensory, motor and cognitive functions. The significance of the work is three-fold. First, we will establish a new arsenal of tools for probing the physiological and behavioral functions of 5-HT neurons. Second, we will make definitive tests of major hypotheses of 5-HT function. Third, we will have possible therapeutic applications. In this way, the proposed work has the potential for a major impact in research on the role of 5-HT in brain function and dysfunction.
Max ERC Funding
2 318 636 €
Duration
Start date: 2010-07-01, End date: 2015-12-31
Project acronym 5HTCircuits
Project Modulation of cortical circuits and predictive neural coding by serotonin
Researcher (PI) Zachary Mainen
Host Institution (HI) FUNDACAO D. ANNA SOMMER CHAMPALIMAUD E DR. CARLOS MONTEZ CHAMPALIMAUD
Call Details Advanced Grant (AdG), LS5, ERC-2014-ADG
Summary Serotonin (5-HT) is a central neuromodulator and a major target of therapeutic psychoactive drugs, but relatively little is known about how it modulates information processing in neural circuits. The theory of predictive coding postulates that the brain combines raw bottom-up sensory information with top-down information from internal models to make perceptual inferences about the world. We hypothesize, based on preliminary data and prior literature, that a role of 5-HT in this process is to report prediction errors and promote the suppression and weakening of erroneous internal models. We propose that it does this by inhibiting top-down relative to bottom-up cortical information flow. To test this hypothesis, we propose a set of experiments in mice performing olfactory perceptual tasks. Our specific aims are: (1) We will test whether 5-HT neurons encode sensory prediction errors. (2) We will test their causal role in using predictive cues to guide perceptual decisions. (3) We will characterize how 5-HT influences the encoding of sensory information by neuronal populations in the olfactory cortex and identify the underlying circuitry. (4) Finally, we will map the effects of 5-HT across the whole brain and use this information to target further causal manipulations to specific 5-HT projections. We will accomplish these aims using state-of-the-art optogenetic, electrophysiological and imaging techniques (including 9.4T small-animal functional magnetic resonance imaging) as well as psychophysical tasks amenable to quantitative analysis and computational theory. Together, these experiments will tackle multiple facets of an important general computational question, bringing to bear an array of cutting-edge technologies to address with unprecedented mechanistic detail how 5-HT impacts neural coding and perceptual decision-making.
Max ERC Funding
2 486 074 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym A-DATADRIVE-B
Project Advanced Data-Driven Black-box modelling
Researcher (PI) Johan Adelia K Suykens
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE7, ERC-2011-ADG_20110209
Summary Making accurate predictions is a crucial factor in many systems (such as in modelling energy consumption, power load forecasting, traffic networks, process industry, environmental modelling, biomedicine, brain-machine interfaces) for cost savings, efficiency, health, safety and organizational purposes. In this proposal we aim at realizing a new generation of more advanced black-box modelling techniques for estimating predictive models from measured data. We will study different optimization modelling frameworks in order to obtain improved black-box modelling approaches. This will be done by specifying models through constrained optimization problems by studying different candidate core models (parametric models, support vector machines and kernel methods) together with additional sets of constraints and regularization mechanisms. Different candidate mathematical frameworks will be considered with models that possess primal and (Lagrange) dual model representations, functional analysis in reproducing kernel Hilbert spaces, operator splitting and optimization in Banach spaces. Several aspects that are relevant to black-box models will be studied including incorporation of prior knowledge, structured dynamical systems, tensorial data representations, interpretability and sparsity, and general purpose optimization algorithms. The methods should be suitable for handling larger data sets and high dimensional input spaces. The final goal is also to realize a next generation software tool (including symbolic generation of models and handling different supervised and unsupervised learning tasks, static and dynamic systems) that can be generically applied to data from different application areas. The proposal A-DATADRIVE-B aims at getting end-users connected to the more advanced methods through a user-friendly data-driven black-box modelling tool. The methods and tool will be tested in connection to several real-life applications.
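As a minimal illustration of the kind of kernel-based black-box model with a dual representation mentioned above, the sketch below fits a simplified kernel ridge regressor (a close relative of the least-squares support vector machine, but without its bias term). Data, kernel width, and regularization are illustrative choices, not the project's software.

```python
import numpy as np

# Toy black-box regression: learn y = sin(x) from noisy samples.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(80)

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row-sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma**2))

gamma = 100.0                  # regularization trade-off (assumed value)
K = rbf_kernel(X, X)
# Dual representation: the Lagrange multipliers alpha solve the linear
# system (K + I/gamma) alpha = y; the model is f(x) = sum_i alpha_i k(x, x_i).
alpha = np.linalg.solve(K + np.eye(len(X)) / gamma, y)

def predict(Xnew):
    return rbf_kernel(Xnew, X) @ alpha

Xt = np.linspace(-3, 3, 50)[:, None]
err = np.max(np.abs(predict(Xt) - np.sin(Xt[:, 0])))
print(err)   # small approximation error on this toy problem
```

The dual formulation is what makes such models "black-box" yet tractable: all modelling choices enter through the kernel and the constraints of the underlying optimization problem.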
Max ERC Funding
2 485 800 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym A2C2
Project Atmospheric flow Analogues and Climate Change
Researcher (PI) Pascal Yiou
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary "The A2C2 project treats two major challenges in climate and atmospheric research: the time-dependent response of the climate attractor to external forcings (solar, volcanic and anthropogenic), and the attribution of extreme climate events occurring in the northern extra-tropics. The main difficulties are the limited climate information, the computational cost of model simulations, and mathematical assumptions that are seldom verified and often overlooked in the literature.
A2C2 proposes a practical framework to overcome those three difficulties, linking the theory of dynamical systems and statistics. We will generalize the methodology of flow analogues to multiple databases in order to obtain probabilistic descriptions of analogue decompositions.
The project is divided into three workpackages (WP). WP1 embeds the analogue method in the theory of dynamical systems in order to provide a metric of attractor deformation in time. The important methodological step is to detect trends or persisting outliers in the dates and scores of analogues when the system is subject to time-varying forcings. This is done with idealized models and full-size climate models in which the forcings (anthropogenic and natural) are known.
A2C2 creates an open source toolkit to compute flow analogues from a wide array of databases (WP2). WP3 treats the two scientific challenges with the analogue method and multiple model ensembles, hence allowing uncertainty estimates under realistic mathematical hypotheses. The flow analogue methodology allows a systematic and quasi real-time analysis of extreme events, which is currently out of the reach of conventional climate modeling approaches.
The major breakthrough of A2C2 is to bridge the gap between operational needs (the immediate analysis of climate events) and the understanding of long-term climate change. A2C2 opens new research horizons for the exploitation of ensembles of simulations and reliable estimates of uncertainty."
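The analogue step at the heart of the method can be sketched as a nearest-neighbour search over a library of flow maps, returning the dates and scores the summary refers to. The toy 10x10 "flow maps", the date labels, and the Euclidean distance choice below are illustrative assumptions, not A2C2's actual implementation.

```python
import numpy as np

def best_analogues(target, library, dates, k=5):
    """Rank library flow maps (flattened fields) by Euclidean distance
    to the target map; return the k best dates and their scores."""
    scores = np.sqrt(((library - target[None, :]) ** 2).sum(axis=1))
    order = np.argsort(scores)[:k]
    return [dates[i] for i in order], scores[order]

# Toy library: 365 daily "flow maps" on a 10x10 grid, flattened to length 100.
rng = np.random.default_rng(1)
library = rng.standard_normal((365, 100))
dates = [f"2000-d{i:03d}" for i in range(365)]

# A target flow close to day 42 should find that day as its best analogue.
target = library[42] + 0.05 * rng.standard_normal(100)
best_dates, best_scores = best_analogues(target, library, dates, k=3)
print(best_dates[0])
```

Trends in these dates and scores over a sliding time window are what WP1 would monitor to detect deformation of the attractor under time-varying forcings.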
Max ERC Funding
1 491 457 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym A2F2
Project Beyond Biopolymers: Protein-Sized Aromatic Amide Functional Foldamers
Researcher (PI) Ivan Huc
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Advanced Grant (AdG), PE5, ERC-2012-ADG_20120216
Summary Nature has evolved ultimate chemical functions based on controlling and altering conformation of its molecular machinery. Prominent examples include enzyme catalysis and information storage/duplication in nucleic acids. These achievements are based on large and complex yet remarkably defined structures obtained through folding of polymeric chains and a subtle interplay of non-covalent forces. Nature uses a limited set of building blocks – e.g. twenty amino acids and four nucleobases – with specific abilities to impart well-defined folds. In the last decade, chemists have discovered foldamers: non-natural oligomers and polymers also prone to adopt folded structures. The emergence of foldamers has far-reaching implications. A major new long-term prospect is open to chemistry: the de novo synthesis of artificial objects resembling biopolymers in terms of their size, complexity, and efficiency at achieving defined functions, yet having chemical structures beyond the reach of biopolymers and amenable to new properties and functions. The PI of this project has shown internationally recognized leadership in the development of a class of foldamers, aromatic oligoamides, whose features arguably make them the most suitable candidates to systematically explore what folded structures beyond biopolymers give access to. This project aims at developing methods to allow the routine fabrication of 20-40 units long aromatic oligoamide foldamers (6-15 kDa) designed to fold into artificial molecular containers having engineerable cavities and surfaces for molecular recognition of organic substrates, in particular large peptides and saccharides, polymers, and proteins. The methodology rests on modelling-based design, multistep organic synthesis of heterocyclic monomers and their assembly into long sequences, structural elucidation using, among other techniques, X-ray crystallography, and the physico-chemical characterization of molecular recognition events.
Max ERC Funding
2 496 216 €
Duration
Start date: 2013-06-01, End date: 2018-05-31
Project acronym AAA
Project Adaptive Actin Architectures
Researcher (PI) Laurent Blanchoin
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS3, ERC-2016-ADG
Summary Although we have extensive knowledge of many important processes in cell biology, including information on many of the molecules involved and the physical interactions among them, we still do not understand most of the dynamical features that are the essence of living systems. This is particularly true for the actin cytoskeleton, a major component of the internal architecture of eukaryotic cells. In living cells, actin networks constantly assemble and disassemble filaments while maintaining an apparently stable structure, suggesting a perfect balance between the two processes. Such behaviors are called “dynamic steady states”. They confer upon actin networks a high degree of plasticity allowing them to adapt in response to external changes and enable cells to adjust to their environments. Despite their fundamental importance in the regulation of cell physiology, the basic mechanisms that control the coordinated dynamics of co-existing actin networks are poorly understood. In the AAA project, first, we will characterize the parameters that allow the coupling among co-existing actin networks at steady state. In vitro reconstituted systems will be used to control the actin nucleation patterns, the closed volume of the reaction chamber and the physical interaction of the networks. We hope to unravel the mechanism allowing the global coherence of a dynamic actin cytoskeleton. Second, we will use our unique capacity to perform dynamic micropatterning, to add or remove actin nucleation sites in real time, in order to investigate the ability of dynamic networks to adapt to changes and the role of coupled network dynamics in this emergent property. In this part, in vitro experiments will be complemented by the analysis of actin network remodeling in living cells.
In the end, our project will provide a comprehensive understanding of how the adaptive response of the cytoskeleton derives from the complex interplay between its biochemical, structural and mechanical properties.
Max ERC Funding
2 349 898 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym AAMOT
Project Arithmetic of automorphic motives
Researcher (PI) Michael Harris
Host Institution (HI) INSTITUT DES HAUTES ETUDES SCIENTIFIQUES
Call Details Advanced Grant (AdG), PE1, ERC-2011-ADG_20110209
Summary The primary purpose of this project is to build on recent spectacular progress in the Langlands program to study the arithmetic properties of automorphic motives constructed in the cohomology of Shimura varieties. Because automorphic methods are available to study the L-functions of these motives, which include elliptic curves and certain families of Calabi-Yau varieties over totally real fields (possibly after base change), they represent the most accessible class of varieties for which one can hope to verify fundamental conjectures on special values of L-functions, including Deligne's conjecture and the Main Conjecture of Iwasawa theory. Immediate goals include the proof of irreducibility of automorphic Galois representations; the establishment of period relations for automorphic and potentially automorphic realizations of motives in the cohomology of distinct Shimura varieties; the construction of p-adic L-functions for these and related motives, notably adjoint and tensor product L-functions in p-adic families; and the geometrization of the p-adic and mod p Langlands program. All four goals, as well as the others mentioned in the body of the proposal, are interconnected; the final goal provides a bridge to related work in geometric representation theory, algebraic geometry, and mathematical physics.
Max ERC Funding
1 491 348 €
Duration
Start date: 2012-06-01, End date: 2018-05-31
Project acronym AARTFAAC
Project Amsterdam-ASTRON Radio Transient Facility And Analysis Centre: Probing the Extremes of Astrophysics
Researcher (PI) Ralph Antoine Marie Joseph Wijers
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Advanced Grant (AdG), PE9, ERC-2009-AdG
Summary Some of the most extreme tests of physical law come from its manifestations in the behaviour of black holes and neutron stars, and as such these objects should be used as fundamental physics labs. Due to advances in both theoretical work and observational techniques, I have a major opportunity now to significantly push this agenda forward and get better answers to questions like: How are black holes born? How can energy be extracted from black holes? What is the origin of magnetic fields and cosmic rays in jets and shocks? Is their primary energy stream hadronic or magnetic? I propose to do this by exploiting the advent of wide-field radio astronomy: extreme objects are very rare and usually transient, so not only must one survey large areas of sky, but one must also do this often. I propose to form and shape a group that will use the LOFAR wide-field radio telescope to hunt for these extreme transients and systematically collect enough well-documented examples of the behaviour of each type of transient. Furthermore, I propose to expand LOFAR with a true 24/7 all-sky monitor to catch and study even the rarest of events. Next, I will use my experience in gamma-ray burst follow-up to conduct a vigorous multi-wavelength programme of study of these objects, to constrain their physics from as many angles as possible. This will eventually include results from multi-messenger astrophysics, in which we use neutrinos, gravitational waves, and other non-electromagnetic messengers as extra diagnostics of the physics of these sources. Finally, I will build on my experience in modelling accretion phenomena and relativistic explosions to develop a theoretical framework for these phenomena and constrain the resulting models with the rich data sets we obtain.
Max ERC Funding
3 499 128 €
Duration
Start date: 2010-10-01, End date: 2016-09-30
Project acronym ABACUS
Project Ab-initio adiabatic-connection curves for density-functional analysis and construction
Researcher (PI) Trygve Ulf Helgaker
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods and the inexpensive but not systematically refinable Kohn–Sham method of density-functional theory (DFT). The accuracy of Kohn–Sham calculations is determined by the quality of the exchange–correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange–correlation functional is unknown—instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed: the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, being directly guided by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab-initio adiabatic-connection curves, studying the exchange–correlation functional for a fixed density as the electronic interactions are turned on from zero to one. Pilot calculations have indicated the feasibility of this approach in simple cases—here, a comprehensive set of adiabatic-connection curves will be generated and utilized for calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn–Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available—for example, for understanding and modelling the behaviour of the exchange–correlation functional in electromagnetic fields.
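For concreteness, the curves in question trace the standard density-fixed adiabatic connection. In the usual notation (an assumption of this sketch, not text from the proposal), the exchange–correlation energy is recovered by integrating over the coupling strength λ from the noninteracting (λ = 0) to the fully interacting (λ = 1) system:

```latex
E_{\mathrm{xc}}[\rho] = \int_{0}^{1} W_{\lambda}[\rho]\,\mathrm{d}\lambda ,
\qquad
W_{\lambda}[\rho] = \langle \Psi_{\lambda}[\rho]\,|\,\hat{W}\,|\,\Psi_{\lambda}[\rho]\rangle - J[\rho],
```

where Ψ_λ[ρ] minimizes ⟨T̂ + λŴ⟩ among wave functions with density ρ, Ŵ is the electron–electron repulsion, and J[ρ] is the Hartree energy. The Lieb variation principle enters by supplying the λ-dependent universal functional as a convex maximization over potentials, F_λ[ρ] = sup_v { E_λ[v] − ∫ v ρ }, whose λ-derivative gives ⟨Ψ_λ|Ŵ|Ψ_λ⟩ and hence the integrand W_λ of the curve.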
Max ERC Funding
2 017 932 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym ABCvolume
Project The ABC of Cell Volume Regulation
Researcher (PI) Berend Poolman
Host Institution (HI) RIJKSUNIVERSITEIT GRONINGEN
Call Details Advanced Grant (AdG), LS1, ERC-2014-ADG
Summary Cell volume regulation is crucial for any living cell because changes in volume determine the metabolic activity through e.g. changes in ionic strength, pH, macromolecular crowding and membrane tension. These physicochemical parameters influence interaction rates and affinities of biomolecules, folding rates, and fold stabilities in vivo. Understanding of the underlying volume regulatory mechanisms has immediate application in biotechnology and health, yet these factors are generally ignored in systems analyses of cellular functions.
My team has uncovered a number of mechanisms and insights into cell volume regulation. The next step forward is to elucidate how the components of a cell volume regulatory circuit work together and control the physicochemical conditions of the cell.
I propose construction of a synthetic cell in which an osmoregulatory transporter and mechanosensitive channel form a minimal volume regulatory network. My group has developed the technology to reconstitute membrane proteins into lipid vesicles (synthetic cells). One of the challenges is to incorporate into the vesicles an efficient pathway for ATP production and maintain energy homeostasis while the load on the system varies. We aim to control the transmembrane flux of osmolytes, which requires elucidation of the molecular mechanism of gating of the osmoregulatory transporter. We will focus on the glycine betaine ABC importer, which is one of the most complex transporters known to date with ten distinct protein domains, transiently interacting with each other.
The proposed synthetic metabolic circuit constitutes a fascinating out-of-equilibrium system, allowing us to understand cell volume regulatory mechanisms in a context and at a level of complexity minimally needed for life. Analysis of this circuit will address many outstanding questions and eventually allow us to design more sophisticated vesicular systems with applications, for example as compartmentalized reaction networks.
Max ERC Funding
2 247 231 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym ABEL
Project "Alpha-helical Barrels: Exploring, Understanding and Exploiting a New Class of Protein Structure"
Researcher (PI) Derek Neil Woolfson
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Advanced Grant (AdG), LS9, ERC-2013-ADG
Summary "Recently through de novo peptide design, we have discovered and presented a new protein structure. This is an all-parallel, 6-helix bundle with a continuous central channel of 0.5 – 0.6 nm diameter. We posit that this is one of a broader class of protein structures that we call the alpha-helical barrels. Here, in three Work Packages, we propose to explore these structures and to develop protein functions within them. First, through a combination of computer-aided design, peptide synthesis and thorough biophysical characterization, we will examine the extents and limits of the alpha-helical-barrel structures. Whilst this is curiosity-driven research, it also has practical consequences for the studies that will follow; that is, alpha-helical barrels made from increasing numbers of helices have channels or pores that increase in a predictable way. Second, we will use rational and empirical design approaches to engineer a range of functions within these cavities, including binding capabilities and enzyme-like activities. Finally, and taking the programme into another ambitious area, we will use the alpha-helical barrels to template other folds that are otherwise difficult to design and engineer, notably beta-barrels that insert into membranes to render ion-channel and sensor functions."
Max ERC Funding
2 467 844 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ABEP
Project Asset Bubbles and Economic Policy
Researcher (PI) Jaume Ventura Fontanet
Host Institution (HI) Centre de Recerca en Economia Internacional (CREI)
Call Details Advanced Grant (AdG), SH1, ERC-2009-AdG
Summary Advanced capitalist economies experience large and persistent movements in asset prices that are difficult to justify with economic fundamentals. The internet bubble of the 1990s and the real estate market bubble of the 2000s are two recent examples. The predominant view is that these bubbles are a market failure, and are caused by some form of individual irrationality on the part of market participants. This project is based instead on the view that market participants are individually rational, although this does not preclude sometimes collectively sub-optimal outcomes. Bubbles are thus not a source of market failure by themselves but instead arise as a result of a pre-existing market failure, namely, the existence of pockets of dynamically inefficient investments. Under some conditions, bubbles partly solve this problem, increasing market efficiency and welfare. It is also possible however that bubbles do not solve the underlying problem and, in addition, create negative side-effects. The main objective of this project is to develop this view of asset bubbles, and produce an empirically-relevant macroeconomic framework that allows us to address the following questions: (i) What is the relationship between bubbles and financial market frictions? Special emphasis is given to how the globalization of financial markets and the development of new financial products affect the size and effects of bubbles. (ii) What is the relationship between bubbles, economic growth and unemployment? The theory suggests the presence of virtuous and vicious cycles, as economic growth creates the conditions for bubbles to pop up, while bubbles create incentives for economic growth to happen. (iii) What is the optimal policy to manage bubbles? We need to develop the tools that allow policy makers to sustain those bubbles that have positive effects and burst those that have negative effects.
Max ERC Funding
1 000 000 €
Duration
Start date: 2010-04-01, End date: 2015-03-31
Project acronym ABYSS
Project ABYSS - Assessment of bacterial life and matter cycling in deep-sea surface sediments
Researcher (PI) Antje Boetius
Host Institution (HI) ALFRED-WEGENER-INSTITUT HELMHOLTZ-ZENTRUM FUR POLAR- UND MEERESFORSCHUNG
Call Details Advanced Grant (AdG), LS8, ERC-2011-ADG_20110310
Summary The deep-sea floor hosts a distinct microbial biome covering 67% of the Earth’s surface, characterized by cold temperatures, permanent darkness, high pressure and food limitation. The surface sediments are dominated by bacteria, with on average a billion cells per ml. Benthic bacteria are highly relevant to the Earth’s element cycles as they remineralize most of the organic matter sinking from the productive surface ocean, and return nutrients, thereby promoting ocean primary production. What passes the bacterial filter is a relevant sink for carbon on geological time scales, influencing global oxygen and carbon budgets, and fueling the deep subsurface biosphere. Despite the relevance of deep-sea sediment bacteria to climate, geochemical cycles and ecology of the seafloor, their genetic and functional diversity, niche differentiation and biological interactions remain unknown. Our preliminary work in a global survey of deep-sea sediments now enables us to target specific genes for the quantification of abyssal bacteria. We can trace isotope-labeled elements into communities and single cells, and analyze the molecular alteration of organic matter during microbial degradation, all in the context of environmental dynamics recorded at the only long-term deep-sea ecosystem observatory in the Arctic that we maintain. I propose to bridge biogeochemistry, ecology, microbiology and marine biology to develop a systematic understanding of abyssal sediment bacterial community distribution, diversity, function and interactions, by combining in situ flux studies and different visualization techniques with a wide range of molecular tools. Substantial progress is expected in understanding I) identity and function of the dominant types of indigenous benthic bacteria, II) dynamics in bacterial activity and diversity caused by variations in particle flux, III) interactions with different types and ages of organic matter, and other biological factors.
Max ERC Funding
3 375 693 €
Duration
Start date: 2012-06-01, End date: 2018-05-31
Project acronym ACB
Project The Analytic Conformal Bootstrap
Researcher (PI) Luis Fernando ALDAY
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE2, ERC-2017-ADG
Summary The aim of the present proposal is to establish a research team developing and exploiting innovative techniques to study conformal field theories (CFT) analytically. Our approach does not rely on a Lagrangian description but on symmetries and consistency conditions. As such it applies to any CFT, offering a unified framework to study generic CFTs analytically. The initial implementation of this program has already led to striking new results and insights for both Lagrangian and non-Lagrangian CFTs.
The overarching aims of my team will be: to develop an analytic bootstrap program for CFTs in general dimensions; to complement these techniques with more traditional methods and develop systematic machinery to obtain analytic results for generic CFTs; and to use these results to gain new insights into the mathematical structure of the space of quantum field theories.
The proposal will bring together researchers from different areas. The objectives in brief are:
1) Develop an alternative to Feynman diagram computations for Lagrangian CFTs.
2) Develop machinery to compute loops for QFT on AdS, with and without gravity.
3) Develop an analytic approach to non-perturbative N=4 SYM and other CFTs.
4) Determine the space of all CFTs.
5) Gain new insights into the mathematical structure of the space of quantum field theories.
The outputs of this proposal will include a new way of doing perturbative computations based on symmetries; a constructive derivation of the AdS/CFT duality; new analytic techniques to attack strongly coupled systems and invaluable new lessons about the space of CFTs and QFTs.
Success in this research will lead to a completely new, unified way to view and solve CFTs, with a huge impact on several branches of physics and mathematics.
Max ERC Funding
2 171 483 €
Duration
Start date: 2018-12-01, End date: 2023-11-30
Project acronym ACCELERATES
Project Acceleration in Extreme Shocks: from the microphysics to laboratory and astrophysics scenarios
Researcher (PI) Luis Miguel De Oliveira E Silva
Host Institution (HI) INSTITUTO SUPERIOR TECNICO
Call Details Advanced Grant (AdG), PE2, ERC-2010-AdG_20100224
Summary How cosmic rays originate, what the dominant acceleration mechanisms in relativistic shocks are, how cosmic rays self-consistently influence the shock dynamics, and how relativistic collisionless shocks are formed are longstanding scientific questions, closely tied to extreme plasma physics processes in which a close interplay between the micro-instabilities and the global dynamics is critical.
Relativistic shocks are closely connected with the propagation of intense streams of particles pervasive in many astrophysical scenarios. It will also soon be possible to excite shocks in the laboratory with multi-PW lasers or intense relativistic particle beams.
Computational modeling is now established as a prominent research tool, by enabling the fully kinetic modeling of these systems for the first time. With the fast paced developments in high performance computing, the time is ripe for a focused research programme on simulation-based studies of relativistic shocks. This proposal therefore focuses on using self-consistent ab initio massively parallel simulations to study the physics of relativistic shocks, bridging the gap between the multidimensional microphysics of shock onset, formation, and propagation and the global system dynamics. Particular focus will be given to the shock acceleration mechanisms and the radiation signatures of the various physical processes, with the goal of solving some of the central questions in plasma/relativistic phenomena in astrophysics and in the laboratory, and opening new avenues between theoretical/massive computational studies, laboratory experiments and astrophysical observations.
Max ERC Funding
1 588 800 €
Duration
Start date: 2011-06-01, End date: 2016-07-31
Project acronym ACCI
Project Atmospheric Chemistry-Climate Interactions
Researcher (PI) John Adrian Pyle
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARSOF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE10, ERC-2010-AdG_20100224
Summary Global change involves a large number of complex interactions between various earth system processes. In the atmosphere, one component of the earth system, there are crucial feedbacks between physical, chemical and biological processes. Thus many of the drivers of climate change depend on chemical processes in the atmosphere including, in addition to ozone and water vapour, methane, nitrous oxide, the halocarbons as well as a range of inorganic and organic aerosols. The link between chemistry and climate is two-way and changes in climate can influence atmospheric chemistry processes in a variety of ways.
Previous studies have looked at these interactions in isolation but the time is now right for more comprehensive studies. The crucial contribution that will be made here is in improving our understanding of the processes within this complex system. Process understanding has been the hallmark of my previous work. The earth system scope here will be ambitiously wide but with a similar drive to understand fundamental processes.
The ambitious programme of research is built around four interrelated questions using new state-of-the-art modelling tools: How will the composition of the stratosphere change in the future, given changes in the concentrations of ozone depleting substances and greenhouse gases? How will these changes in the stratosphere affect tropospheric composition and climate? How will the composition of the troposphere change in the future, given changes in the emissions of ozone precursors and greenhouse gases? How will these changes in the troposphere affect the troposphere-stratosphere climate system?
ACCI will break new ground in bringing all of these questions into a single modelling and diagnostic framework, enabling interrelated questions to be answered which should radically improve our overall projections for global change.
Max ERC Funding
2 496 926 €
Duration
Start date: 2011-05-01, End date: 2017-04-30
Project acronym ACCLIMATE
Project Elucidating the Causes and Effects of Atlantic Circulation Changes through Model-Data Integration
Researcher (PI) Claire Waelbroeck
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary Rapid changes in ocean circulation and climate have been observed in marine sediment and ice cores, notably over the last 60 thousand years (ky), highlighting the non-linear character of the climate system and underlining the possibility of rapid climate shifts in response to anthropogenic greenhouse gas forcing.
To date, these rapid changes in climate and ocean circulation are still not fully explained. Two main obstacles prevent going beyond the current state of knowledge:
- Paleoclimatic proxy data are in essence only indirect indicators of the climatic variables, and thus cannot be directly compared with model outputs;
- A 4-D (latitude, longitude, water depth, time) reconstruction of Atlantic water masses over the past 40 ky is lacking: previous studies have generated isolated records with disparate timescales which do not allow the causes of circulation changes to be identified.
Overcoming these two major limitations will lead to major breakthroughs in climate research. Concretely, I will create the first database of Atlantic deep-sea records over the last 40 ky, and extract full climatic information from these records through an innovative model-data integration scheme using an isotopic proxy forward modeling approach. The novelty and exceptional potential of this scheme is twofold: (i) it avoids hypotheses on proxy interpretation and hence suppresses or strongly reduces the errors of interpretation of paleoclimatic records; (ii) it produces states of the climate system that best explain the observations over the last 40 ky, while being consistent with the model physics.
Expected results include:
• The elucidation of the mechanisms explaining rapid changes in ocean circulation and climate over the last 40 ky,
• Improved climate model physics and parameterizations,
• The first projections of future climate changes obtained with a model able to reproduce the highly non linear behavior of the climate system observed over the last 40 ky.
Max ERC Funding
3 000 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACCOMPLI
Project Assembly and maintenance of a co-regulated chromosomal compartment
Researcher (PI) Peter Burkhard Becker
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Advanced Grant (AdG), LS2, ERC-2011-ADG_20110310
Summary "Eukaryotic nuclei are organised into functional compartments – local microenvironments that are enriched in certain molecules or biochemical activities and therefore specify localised functional outputs. Our study seeks to unveil fundamental principles of co-regulation of genes in a chromosomal compartment and the preconditions for homeostasis of such a compartment in the dynamic nuclear environment.
The dosage-compensated X chromosome of male Drosophila flies satisfies the criteria for a functional compartment. It is rendered structurally distinct from all other chromosomes by association of a regulatory ribonucleoprotein ‘Dosage Compensation Complex’ (DCC), enrichment of histone modifications and global decondensation. As a result, most genes on the X chromosome are co-ordinately activated. Autosomal genes inserted into the X acquire X-chromosomal features and are subject to the X-specific regulation.
We seek to uncover the molecular principles that initiate, establish and maintain the dosage-compensated chromosome. We will follow the kinetics of DCC assembly and the timing of association with different types of chromosomal targets in nuclei with high spatial resolution afforded by sub-wavelength microscopy and deep sequencing of DNA binding sites. We will characterise DCC sub-complexes with respect to their roles as kinetic assembly intermediates or as representations of local, functional heterogeneity. We will evaluate the roles of a novel DCC ubiquitin ligase activity for homeostasis.
Crucial to the recruitment of the DCC and its distribution to target genes are non-coding roX RNAs that are transcribed from the X. We will determine the secondary structure ‘signatures’ of roX RNAs in vitro and determine the binding sites of the protein subunits in vivo. By biochemical and cellular reconstitution we will test the hypothesis that roX-encoded RNA aptamers orchestrate the assembly of the DCC and contribute to the exquisite targeting of the complex."
Max ERC Funding
2 482 770 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ACCOPT
Project ACelerated COnvex OPTimization
Researcher (PI) Yurii NESTEROV
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE1, ERC-2017-ADG
Summary The amazing rate of progress in the computer technologies and telecommunications presents many new challenges for Optimization Theory. New problems are usually very big in size, very special in structure and possibly have a distributed data support. This makes them unsolvable by the standard optimization methods. In these situations, old theoretical models, based on the hidden Black-Box information, cannot work. New theoretical and algorithmic solutions are urgently needed. In this project we will concentrate on development of fast optimization methods for problems of big and very big size. All the new methods will be endowed with provable efficiency guarantees for large classes of optimization problems, arising in practical applications. Our main tool is the acceleration technique developed for the standard Black-Box methods as applied to smooth convex functions. However, we will have to adapt it to deal with different situations.
The first line of development will be based on the smoothing technique as applied to non-smooth functions. We propose to substantially extend this approach to generate approximate solutions in relative scale. The second line of research will be related to applying acceleration techniques to the second-order methods minimizing functions with sparse Hessians. Finally, we aim to develop fast gradient methods for huge-scale problems. The size of these problems is so big that even the usual vector operations are extremely expensive. Thus, we propose to develop new methods with sublinear iteration costs. In our approach, the main source for achieving improvements will be the proper use of problem structure.
Our overall aim is to be able to solve in a routine way many important problems, which currently look unsolvable. Moreover, the theoretical development of Convex Optimization will reach the state, when there is no gap between theory and practice: the theoretically most efficient methods will definitely outperform any homebred heuristics.
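The acceleration technique the summary builds on is the classical accelerated gradient method for smooth convex functions: a plain gradient step taken from an extrapolated point, followed by a momentum update. A minimal sketch on a hypothetical quadratic test problem (the function, problem data, and iteration count below are illustrative, not from the proposal):

```python
import numpy as np

def accelerated_gradient(grad, x0, L, iters=500):
    """Accelerated gradient descent for a convex function whose
    gradient is L-Lipschitz. grad: callable returning the gradient."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(iters):
        x_next = y - grad(y) / L                          # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0  # momentum coefficient update
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)   # extrapolate using the last two iterates
        x, t = x_next, t_next
    return x

# Illustrative problem: minimize f(x) = 0.5 x^T A x - b^T x, i.e. solve A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
grad = lambda x: A @ x - b
L_const = np.linalg.eigvalsh(A).max()   # Lipschitz constant of the gradient
x_star = accelerated_gradient(grad, np.zeros(2), L_const)
```

This scheme attains the O(1/k^2) rate on smooth convex functions, versus O(1/k) for plain gradient descent; adapting it to non-smooth, second-order, and huge-scale settings is what the lines of research above propose.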
Max ERC Funding
2 090 038 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ACCRETE
Project Accretion and Early Differentiation of the Earth and Terrestrial Planets
Researcher (PI) David Crowhurst Rubie
Host Institution (HI) UNIVERSITAT BAYREUTH
Call Details Advanced Grant (AdG), PE10, ERC-2011-ADG_20110209
Summary Formation of the Earth and the other terrestrial planets of our Solar System (Mercury, Venus and Mars) commenced 4.568 billion years ago and occurred on a time scale of about 100 million years. These planets grew by the process of accretion, which involved numerous collisions with smaller (Moon- to Mars-size) bodies. Impacts with such bodies released sufficient energy to cause large-scale melting and the formation of deep “magma oceans”. Such magma oceans enabled liquid metal to separate from liquid silicate, sink and accumulate to form the metallic cores of the planets. Thus core formation in terrestrial planets was a multistage process, intimately related to the major impacts during accretion, that determined the chemistry of planetary mantles. However, until now, accretion, as modelled by astrophysicists, and core formation, as modelled by geochemists, have been treated as completely independent processes. The fundamental and crucial aim of this ambitious interdisciplinary proposal is to integrate astrophysical models of planetary accretion with geochemical models of planetary differentiation together with cosmochemical constraints obtained from meteorites. The research will involve integrating new models of planetary accretion with core formation models based on the partitioning of a large number of elements between liquid metal and liquid silicate that we will determine experimentally at pressures up to about 100 gigapascals (equivalent to 2400 km deep in the Earth). By comparing our results with the known physical and chemical characteristics of the terrestrial planets, we will obtain a comprehensive understanding of how these planets formed, grew and evolved, both physically and chemically, with time. The integration of chemistry and planetary differentiation with accretion models is a new ground-breaking concept that will lead, through synergies and feedback, to major new advances in the Earth and planetary sciences.
Max ERC Funding
1 826 200 €
Duration
Start date: 2012-05-01, End date: 2018-04-30
Project acronym ACCUPOL
Project Unlimited Growth? A Comparative Analysis of Causes and Consequences of Policy Accumulation
Researcher (PI) Christoph KNILL
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Advanced Grant (AdG), SH2, ERC-2017-ADG
Summary ACCUPOL systematically analyzes an intuitively well-known, but curiously under-researched phenomenon: policy accumulation. Societal modernization and progress bring about a continuously growing pile of policies in most political systems. At the same time, however, the administrative capacities for implementation are largely stagnant. While societally desirable in principle, ever more policies may therefore yield less in terms of actual policy achievements. Whether or not policy accumulation remains at a ‘sustainable’ rate thus crucially affects the long-term output legitimacy of modern democracies.
Given this development, the central focus of ACCUPOL lies on three questions: Do accumulation rates vary across countries and policy sectors? Which factors mitigate policy accumulation? And to what extent is policy accumulation really associated with an increasing prevalence of implementation deficits? In answering these questions, ACCUPOL radically departs from established research traditions in public policy.
First, the project develops new analytical concepts: Rather than relying on individual policy change as the unit of analysis, we consider policy accumulation to assess the growth of policy portfolios over time. In terms of implementation, ACCUPOL takes into account the overall prevalence of implementation deficits in a given sector instead of analyzing the effectiveness of individual implementation processes.
Second, this analytical innovation also implies a paradigmatic theoretical shift. Because existing theories focus on the analysis of individual policies, they are of limited help to understand causes and consequences of policy accumulation. ACCUPOL develops a novel theoretical approach to fill this theoretical gap.
Third, the project provides new empirical evidence on the prevalence of policy accumulation and implementation deficits focusing on 25 OECD countries and two key policy areas (social and environmental policy).
Max ERC Funding
2 359 000 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym ACETOGENS
Project Acetogenic bacteria: from basic physiology via gene regulation to application in industrial biotechnology
Researcher (PI) Volker MÜLLER
Host Institution (HI) JOHANN WOLFGANG GOETHE-UNIVERSITATFRANKFURT AM MAIN
Call Details Advanced Grant (AdG), LS9, ERC-2016-ADG
Summary Demand for biofuels and other biologically derived commodities is growing worldwide as efforts increase to reduce reliance on fossil fuels and to limit climate change. Most commercial approaches rely on fermentation of organic matter, with the inherent problems of producing CO2 and competing with the human food supply. These problems are avoided if CO2 can be used as feedstock. Autotrophic organisms can fix CO2 by producing chemicals that are used as building blocks for the synthesis of cellular components (biomass). Acetate-forming bacteria (acetogens) require neither light nor oxygen for this, and they can be used in bioreactors to reduce CO2 with hydrogen gas, carbon monoxide or an organic substrate. Gas fermentation using these bacteria has already been realized on an industrial level in two pre-commercial 100,000 gal/yr demonstration facilities to produce fuel ethanol from abundant waste gas resources (by LanzaTech). Acetogens can metabolise a wide variety of substrates that could be used for the production of biocommodities. However, their broad use to produce biofuels and platform chemicals from substrates other than gases, or together with gases, is hampered by our very limited knowledge of their metabolism and of their ability to use different substrates simultaneously. Nearly nothing is known about the regulatory processes involved in substrate utilization or product formation, yet this knowledge is an absolute requirement for metabolic engineering approaches. The aim of this project is to provide this basic knowledge about metabolic routes in the acetogenic model strain Acetobacterium woodii and their regulation. We will unravel the function of “organelles” found in this bacterium, explore their potential as bio-nanoreactors for the production of biocommodities, and pave the road for the industrial use of A. woodii in energy (hydrogen) storage.
Thus, this project creates cutting-edge opportunities for the development of biosustainable technologies in Europe.
Max ERC Funding
2 497 140 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ACMO
Project Systematic dissection of molecular machines and neural circuits coordinating C. elegans aggregation behaviour
Researcher (PI) Mario De Bono
Host Institution (HI) MEDICAL RESEARCH COUNCIL
Call Details Advanced Grant (AdG), LS5, ERC-2010-AdG_20100317
Summary Elucidating how neural circuits coordinate behaviour, and how molecules underpin the properties of individual neurons are major goals of neuroscience. Optogenetics and neural imaging combined with the powerful genetics and well-described nervous system of C. elegans offer special opportunities to address these questions. Previously, we identified a series of sensory neurons that modulate aggregation of C. elegans. These include neurons that respond to O2, CO2, noxious cues, satiety state, and pheromones. We propose to take our analysis to the next level by dissecting how, in mechanistic molecular terms, these distributed inputs modify the activity of populations of interneurons and motoneurons to coordinate group formation. Our strategy is to develop new, highly parallel approaches to replace the traditional piecemeal analysis.
We propose to:
1) Harness next-generation sequencing (NGS) for forward genetics, to rapidly identify a molecular “parts list” for aggregation. Much of the genetics has been done: we have identified almost 200 mutations that inhibit or enhance aggregation but otherwise show no overt phenotype. A pilot study of 50 of these mutations suggests they identify dozens of genes not previously implicated in aggregation. NGS will allow us to molecularly identify these genes in a few months, providing multiple entry points to study molecular and circuitry mechanisms for behaviour.
2) Develop new methods to image the activity of populations of neurons in immobilized and freely moving animals, using genetically encoded indicators such as the calcium sensor cameleon and the voltage indicator mermaid.
This will be the first time a complex behaviour has been dissected in this way. We expect to identify novel conserved molecular and circuitry mechanisms.
Max ERC Funding
2 439 996 €
Duration
Start date: 2011-04-01, End date: 2017-03-31
Project acronym ACRCC
Project Understanding the atmospheric circulation response to climate change
Researcher (PI) Theodore Shepherd
Host Institution (HI) THE UNIVERSITY OF READING
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary Computer models based on known physical laws are our primary tool for predicting climate change. Yet the state-of-the-art models exhibit a disturbingly wide range of predictions of future climate change, especially when examined at the regional scale, which has not decreased as the models have become more comprehensive. The reasons for this are not understood. This represents a basic challenge to our fundamental understanding of climate.
The divergence of model projections is presumably related to systematic model errors in the large-scale fluxes of heat, moisture and momentum that control regional aspects of climate. That these errors stubbornly persist in spite of increases in the spatial resolution of the models suggests that they are associated with errors in the representation of unresolved processes, whose effects must be parameterised.
Most attention in climate science has hitherto focused on the thermodynamic aspects of climate. Dynamical aspects, which involve the atmospheric circulation, have received much less attention. However, regional climate, including persistent climate regimes and extremes, is strongly controlled by atmospheric circulation patterns, which exhibit chaotic variability and whose representation in climate models depends sensitively on parameterised processes. Moreover, the dynamical aspects of model projections are much less robust than the thermodynamic ones. There are good reasons to believe that model bias, the divergence of model projections, and chaotic variability are somehow related, although the relationships are not well understood. This calls for studying them together.
My proposed research will focus on this problem, addressing these three aspects of the atmospheric circulation response to climate change in parallel: (i) diagnosing the sources of model error; (ii) elucidating the relationship between model error and the spread in model projections; (iii) understanding the physical mechanisms of atmospheric variability.
Max ERC Funding
2 489 151 €
Duration
Start date: 2014-03-01, End date: 2020-02-29
Project acronym ACROSS
Project 3D Reconstruction and Modeling across Different Levels of Abstraction
Researcher (PI) Leif Kobbelt
Host Institution (HI) RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Digital 3D models are gaining more and more importance in diverse application fields ranging from computer graphics, multimedia and simulation sciences to engineering, architecture, and medicine. Powerful technologies to digitize the 3D shape of real objects and scenes are becoming available even to consumers. However, the raw geometric data emerging from, e.g., 3D scanning or multi-view stereo often lacks a consistent structure and meta-information which are necessary for the effective deployment of such models in sophisticated down-stream applications like animation, simulation, or CAD/CAM that go beyond mere visualization. Our goal is to develop new fundamental algorithms which transform raw geometric input data into augmented 3D models that are equipped with structural meta information such as feature aligned meshes, patch segmentations, local and global geometric constraints, statistical shape variation data, or even procedural descriptions. Our methodological approach is inspired by the human perceptual system that integrates bottom-up (data-driven) and top-down (model-driven) mechanisms in its hierarchical processing. Similarly we combine algorithms operating on different levels of abstraction into reconstruction and modeling networks. Instead of developing an individual solution for each specific application scenario, we create an eco-system of algorithms for automatic processing and interactive design of highly complex 3D models. A key concept is the information flow across all levels of abstraction in a bottom-up as well as top-down fashion. We not only aim at optimizing geometric representations but in fact at bridging the gap between reconstruction and recognition of geometric objects. The results from this project will make it possible to bring 3D models of real world objects into many highly relevant applications in science, industry, and entertainment, greatly reducing the excessive manual effort that is still necessary today."
Max ERC Funding
2 482 000 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym Actanthrope
Project Computational Foundations of Anthropomorphic Action
Researcher (PI) Jean Paul Laumond
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE7, ERC-2013-ADG
Summary Actanthrope intends to promote a neuro-robotics perspective to explore original models of anthropomorphic action. The project targets contributions to humanoid robot autonomy (for rescue and service robotics), to advanced human body simulation (for applications in ergonomics), and to a new theory of embodied intelligence (by promoting a motion-based semiotics of the human action).
Actions take place in the physical space while they originate in the (robot or human) sensory-motor space. Geometry is the core abstraction that links these spaces. Considering that the structure of actions inherits from that of the body, the underlying intuition is that actions can be segmented within discrete sub-spaces lying in the entire continuous posture space. Such sub-spaces are viewed as symbols bridging deliberative reasoning and reactive control. Actanthrope argues for geometric approaches to motion segmentation and generation as promising and innovative routes to explore embodied intelligence:
- Motion segmentation: what are the sub-manifolds that define the structure of a given action?
- Motion generation: among all the solution paths within a given sub-manifold, what is the underlying law that makes the selection?
In Robotics these questions are related to the competition between abstract symbol manipulation and physical signal processing. In Computational Neuroscience the questions refer to the quest of motion invariants. The ambition of the project is to promote a dual perspective: exploring the computational foundations of human action to make better robots, while simultaneously doing better robotics to better understand human action.
A unique “Anthropomorphic Action Factory” supports the methodology. It aims at attracting to a single lab researchers with complementary know-how and solid mathematical backgrounds. All of them will benefit from unique equipment, while being stimulated by four challenges dealing with locomotion and manipulation actions.
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym ACTINONSRF
Project MAL: an actin-regulated SRF transcriptional coactivator
Researcher (PI) Richard Treisman
Host Institution (HI) THE FRANCIS CRICK INSTITUTE LIMITED
Call Details Advanced Grant (AdG), LS1, ERC-2010-AdG_20100317
Summary MAL: an actin-regulated SRF transcriptional coactivator
Recent years have seen a revitalised interest in the role of actin in nuclear processes, but the molecular mechanisms involved remain largely unexplored. We will elucidate the molecular basis for the actin-based control of the SRF transcriptional coactivator, MAL. SRF controls transcription through two families of coactivators, the actin-binding MRTFs (MAL, Mkl2), which couple its activity to cytoskeletal dynamics, and the ERK-regulated TCFs (Elk-1, SAP-1, Net). MAL subcellular localisation and transcriptional activity respond to signal-induced changes in G-actin concentration, which are sensed by its actin-binding N-terminal RPEL domain. Members of a second family of RPEL proteins, the Phactrs, also exhibit actin-regulated nucleocytoplasmic shuttling. The proposal addresses the following novel features of actin biology:
- Actin as a transcriptional regulator
- Actin as a signalling molecule
- Actin-binding proteins as targets for regulation by actin, rather than regulators of actin function
We will analyse the sequences and proteins involved in actin-regulated nucleocytoplasmic shuttling, using structural biology and biochemistry to analyse its control by changes in actin-RPEL domain interactions. We will characterise the dynamics of shuttling, and develop reporters for changes in actin-MAL interaction for analysis of pathway activation in vivo. We will identify genes controlling MAL itself, and the balance between the nuclear and cytoplasmic actin pools. The mechanism by which actin represses transcriptional activation by MAL in the nucleus, and its relation to MAL phosphorylation, will be elucidated. Finally, we will map MRTF and TCF cofactor recruitment to SRF targets on a genome-wide scale, and identify the steps in transcription controlled by actin-MAL interaction.
Max ERC Funding
1 889 995 €
Duration
Start date: 2011-10-01, End date: 2017-09-30
Project acronym ActiveCortex
Project Active dendrites and cortical associations
Researcher (PI) Matthew Larkum
Host Institution (HI) HUMBOLDT-UNIVERSITAET ZU BERLIN
Call Details Advanced Grant (AdG), LS5, ERC-2014-ADG
Summary Converging studies from psychophysics in humans to single-cell recordings in monkeys and rodents indicate that most important cognitive processes depend on both feed-forward and feedback information interacting in the brain. Intriguingly, feedback to early cortical processing stages appears to play a causal role in these processes. Despite the central nature of this fact to understanding brain cognition, there is still no mechanistic explanation as to how this information could be so pivotal and what events take place that might be decisive. In this research program, we will test the hypothesis that the extraordinary performance of the cortex derives from an associative mechanism built into the basic neuronal unit: the pyramidal cell. The hypothesis is based on two important facts: (1) feedback information is conveyed predominantly to layer 1 and (2) the apical tuft dendrites that are the major recipient of this feedback information are highly electrogenic.
The research program is divided into several workpackages to systematically investigate the hypothesis at every level. As a whole, we will investigate the causal link between intrinsic cellular activity and behaviour. To do this we will use electrophysiological and optical techniques to record and influence the intrinsic properties of cells (in particular dendritic activity) in vivo and in vitro in rodents. In vivo experiments will have a specific focus on context-driven behaviour, and in vitro experiments on the impact of long-range (feedback-carrying) fibers on cell activity. The study will also focus on synaptic plasticity at the interface of feedback information and dendritic electrogenesis, namely synapses onto the tuft dendrites of pyramidal neurons. The proposed program will not only address a long-standing and important hypothesis but also provide a transformational contribution towards understanding the operation of the cerebral cortex.
Max ERC Funding
2 386 304 €
Duration
Start date: 2016-01-01, End date: 2020-12-31