Project acronym 0MSPIN
Project Spintronics based on relativistic phenomena in systems with zero magnetic moment
Researcher (PI) Tomáš Jungwirth
Host Institution (HI) FYZIKALNI USTAV AV CR V.V.I
Call Details Advanced Grant (AdG), PE3, ERC-2010-AdG_20100224
Summary The 0MSPIN project consists of an extensive integrated theoretical, experimental and device development programme of research opening a radical new approach to spintronics. Spintronics has the potential to supersede existing storage and memory applications, and to provide alternatives to current CMOS technology. The ferromagnetic metals used in all current spintronics applications may make it impractical to realise the full potential of spintronics. Metals are unsuitable for transistor and information processing applications, for opto-electronics, or for high-density integration. The 0MSPIN project aims to remove the major road-block holding back the development of spintronics in a radical way: removing the ferromagnetic component from key active parts or from the whole of the spintronic devices. This approach is based on exploiting the combination of exchange and spin-orbit coupling phenomena and material systems with zero macroscopic moment. The goal of 0MSPIN is to provide a new paradigm by which spintronics can enter the realms of conventional semiconductors in both fundamental condensed matter research and in information technologies. In the central part of the proposal, the research towards this goal is embedded within a materials science project whose aim is to introduce into physics and microelectronics an entirely new class of semiconductors. 0MSPIN seeks to exploit three classes of material systems: (1) Antiferromagnetic bi-metallic 3d-5d alloys (e.g. Mn2Au). (2) Antiferromagnetic I-II-V semiconductors (e.g. LiMnAs). (3) Non-magnetic spin-orbit coupled semiconductors with injected spin-polarized currents (e.g. 2D III-V structures). Proof of concept devices operating at high temperatures will be fabricated to showcase new functionalities offered by zero-moment systems for sensing and memory applications, information processing, and opto-electronics technologies.
Max ERC Funding
1 938 000 €
Duration
Start date: 2011-06-01, End date: 2016-05-31
Project acronym 100 Archaic Genomes
Project Genome sequences from extinct hominins
Researcher (PI) Svante PÄÄBO
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Advanced Grant (AdG), LS2, ERC-2015-AdG
Summary Neandertals and Denisovans, an Asian group distantly related to Neandertals, are the closest evolutionary relatives of present-day humans. They are thus of direct relevance for understanding the origin of modern humans and how modern humans differ from their closest relatives. We will generate genome-wide data from a large number of Neandertal and Denisovan individuals from across their geographical and temporal range, as well as from other extinct hominin groups which we may discover. This will be possible by automating the highly sensitive approaches to ancient DNA extraction and DNA library construction that we have developed, so that they can be applied to many specimens from many sites in order to identify those that contain retrievable DNA. Whenever possible we will sequence whole genomes and in other cases use DNA capture methods to generate high-quality data from representative parts of the genome. This will allow us to study the population history of Neandertals and Denisovans, elucidate how many times and where these extinct hominins contributed genes to present-day people, and the extent to which modern humans and archaic groups contributed genetically to Neandertals and Denisovans. By retrieving DNA from specimens that go back to the Middle Pleistocene we will furthermore shed light on the early history and origins of Neandertals and Denisovans.
Max ERC Funding
2 350 000 €
Duration
Start date: 2016-11-01, End date: 2021-10-31
Project acronym 14Constraint
Project Radiocarbon constraints for models of C cycling in terrestrial ecosystems: from process understanding to global benchmarking
Researcher (PI) Susan Trumbore
Host Institution (HI) MAX-PLANCK-GESELLSCHAFT ZUR FORDERUNG DER WISSENSCHAFTEN EV
Call Details Advanced Grant (AdG), PE10, ERC-2015-AdG
Summary The overall goal of 14Constraint is to enhance the availability and use of radiocarbon data as constraints for process-based understanding of the age distribution of carbon in and respired by soils and ecosystems. Carbon enters ecosystems by a single process, photosynthesis. It returns by a range of processes that depend on plant allocation and turnover, the efficiency and rate of litter decomposition and the mechanisms stabilizing C in soils. Thus the age distribution of respired CO2 and the age of C residing in plants, litter and soils are diagnostic properties of ecosystems that provide key constraints for testing carbon cycle models. Radiocarbon, especially the transit of ‘bomb’ 14C created in the 1960s, is a powerful tool for tracing C exchange on decadal to centennial timescales. 14Constraint will assemble a global database of existing radiocarbon data (WP1) and demonstrate how they can constrain and test ecosystem carbon cycle models. WP2 will fill data gaps and add new data from sites in key biomes that have ancillary data sufficient to construct belowground C and 14C budgets. These detailed investigations will focus on the role of time lags caused by necromass and fine roots, as well as the dynamics of deep soil C. Spatial extrapolation beyond the WP2 sites will require sampling along global gradients designed to explore the relative roles of mineralogy, vegetation and climate on the age of C in and respired from soil (WP3). Products of 14Constraint will include the first publicly available global synthesis of terrestrial 14C data, and will add over 5000 new measurements. This project is urgently needed before atmospheric 14C levels decline to below 1950 levels, as expected in the next decade.
Max ERC Funding
2 283 747 €
Duration
Start date: 2016-12-01, End date: 2021-11-30
Project acronym 1stProposal
Project An alternative development of analytic number theory and applications
Researcher (PI) ANDREW Granville
Host Institution (HI) UNIVERSITY COLLEGE LONDON
Call Details Advanced Grant (AdG), PE1, ERC-2014-ADG
Summary The traditional (Riemann) approach to analytic number theory uses the zeros of zeta functions. This requires the associated multiplicative function, say f(n), to have special enough properties that the associated Dirichlet series may be analytically continued. In this proposal we continue to develop an approach which requires less of the multiplicative function, linking the original question with the mean value of f. Such techniques have been around for a long time but have generally been regarded as “ad hoc”. In this project we aim to show that one can develop a coherent approach to the whole subject, not only reproving all of the old results, but also many new ones that appear inaccessible to traditional methods.
Our first goal is to complete a monograph yielding a reworking of all the classical theory using these new methods and then to push forward in new directions. The most important is to extend these techniques to GL(n) L-functions, which we hope will now be feasible having found the correct framework in which to proceed. Since we rarely know how to analytically continue such L-functions this could be of great benefit to the subject.
We are developing the large sieve so that it can be used for individual moduli, and will determine a strong form of it. We are also developing a new method to give asymptotics for mean values when they are not too small.
We wish to incorporate techniques of analytic number theory into our theory, for example recent advances on mean values of Dirichlet polynomials. The recent breakthroughs in sieve theory also suggest strong links that need further exploration.
Additive combinatorics yields important results in many areas. There are strong analogies between its results, and those for multiplicative functions, especially in large value spectrum theory, and its applications. We hope to develop these further.
Much of this is joint work with K Soundararajan of Stanford University.
Max ERC Funding
2 011 742 €
Duration
Start date: 2015-08-01, End date: 2020-07-31
Project acronym 2DHIBSA
Project Nanoscopic and Hierarchical Materials via Living Crystallization-Driven Self-Assembly
Researcher (PI) Ian MANNERS
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Advanced Grant (AdG), PE5, ERC-2017-ADG
Summary A key synthetic challenge of widespread interest in chemical science involves the creation of well-defined 2D functional materials that exist on a length-scale of nanometers to microns. In this ambitious 5 year proposal we aim to tackle this issue by exploiting the unique opportunities made possible by recent developments with the living crystallization-driven self-assembly (CDSA) platform. Using this solution processing approach, amphiphilic block copolymers (BCPs) with crystallizable blocks, related amphiphiles, and polymers with charged end groups will be used to predictably construct monodisperse samples of tailored, functional soft matter-based 2D nanostructures with controlled shape, size, and spatially-defined chemistries. Many of the resulting nanostructures will also offer unprecedented opportunities as precursors to materials with hierarchical structures through further solution-based “bottom-up” assembly methods. In addition to fundamental studies, the proposed work also aims to make an important impact in the cutting-edge fields of liquid crystals, interface stabilization, catalysis, supramolecular polymers, and hierarchical materials.
Max ERC Funding
2 499 597 €
Duration
Start date: 2018-05-01, End date: 2023-04-30
Project acronym 2DNanoSpec
Project Nanoscale Vibrational Spectroscopy of Sensitive 2D Molecular Materials
Researcher (PI) Renato ZENOBI
Host Institution (HI) EIDGENOESSISCHE TECHNISCHE HOCHSCHULE ZUERICH
Call Details Advanced Grant (AdG), PE4, ERC-2016-ADG
Summary I propose to investigate the nanometer scale organization of delicate 2-dimensional molecular materials using nanoscale vibrational spectroscopy. 2D structures are of great scientific and technological importance, for example as novel materials (graphene, MoS2, WS2, etc.), and in the form of biological membranes and synthetic 2D-polymers. Powerful methods for their analysis and imaging with molecular selectivity and sufficient spatial resolution, however, are lacking. Tip-enhanced Raman spectroscopy (TERS) allows label-free spectroscopic identification of molecular species, with ≈10 nm spatial resolution, and with single molecule sensitivity for strong Raman scatterers. So far, however, TERS has not been carried out in liquids, which is the natural environment for membranes, and its application to poor Raman scatterers such as components of 2D polymers, lipids, or other membrane compounds (proteins, sugars) is difficult. TERS has the potential to overcome the restrictions of other optical/spectroscopic methods to study 2D materials, namely (i) insufficient spatial resolution of diffraction-limited optical methods; (ii) the need for labelling for all methods relying on fluorescence; and (iii) the inability of some methods to work in liquids. I propose to address a number of scientific questions associated with the spatial organization, and the occurrence of defects, in sensitive 2D molecular materials. The success of these studies will also rely critically on technical innovations of TERS that notably address the problem of energy dissipation. This will for the first time allow its application to the study of complex, delicate 2D molecular systems without photochemical damage.
Max ERC Funding
2 311 696 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym 2G-CSAFE
Project Combustion of Sustainable Alternative Fuels for Engines used in aeronautics and automotives
Researcher (PI) Philippe Dagaut
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE8, ERC-2011-ADG_20110209
Summary This project aims at promoting sustainable combustion technologies for transport via validation of advanced combustion kinetic models obtained using sophisticated new laboratory experiments, engines, and theoretical computations, breaking through the current frontier of knowledge. It will focus on the unexplored kinetics of ignition and combustion of 2nd generation (2G) biofuels and blends with conventional fuels, which should provide energy safety and sustainability to Europe. The motivation is that no accurate kinetic models are available for the ignition, oxidation and combustion of 2G-biofuels, and improved ignition control is needed for new compression ignition engines. Crucial information is missing: data from well characterised experiments on combustion-generated pollutants and data on key-intermediates for fuels ignition in new engines.
To provide that knowledge, new well-instrumented complementary experiments and kinetic modelling will be used. Measurements of key-intermediates, stable species, and pollutants will be performed. New ignition control strategies will be designed, opening new technological horizons. Kinetic modelling will be used for rationalising the results. Due to the complexity of 2G-biofuels and their unusual composition, innovative surrogates will be designed. Kinetic models for surrogate fuels will be generalised for extension to other compounds. The experimental results, together with ab-initio and detailed modelling, will serve to characterise the kinetics of ignition, combustion, and pollutants formation of fuels including 2G biofuels, and provide relevant data and models.
This research is risky because this is (i) the 1st effort to measure radicals by reactor/CRDS coupling, (ii) the 1st effort to use a μ-channel reactor to build ignition databases for conventional fuels and biofuels, and (iii) the 1st effort to design and use controlled generation and injection of reactive species to control ignition/combustion in compression ignition engines.
Max ERC Funding
2 498 450 €
Duration
Start date: 2011-12-01, End date: 2016-11-30
Project acronym 3-TOP
Project Exploring the physics of 3-dimensional topological insulators
Researcher (PI) Laurens Wigbolt Molenkamp
Host Institution (HI) JULIUS-MAXIMILIANS-UNIVERSITAT WURZBURG
Call Details Advanced Grant (AdG), PE3, ERC-2010-AdG_20100224
Summary Topological insulators constitute a novel class of materials where the topological details of the bulk band structure induce a robust surface state on the edges of the material. While transport data for 2-dimensional topological insulators have recently become available, experiments on their 3-dimensional counterparts are mainly limited to photoelectron spectroscopy. At the same time, a plethora of interesting novel physical phenomena have been predicted to occur in such systems.
In this proposal, we sketch an approach to tackle the transport and magnetic properties of the surface states in these materials. This starts with high quality layer growth, using molecular beam epitaxy, of bulk layers of HgTe, Bi2Se3 and Bi2Te3, which are the prime candidates to show the novel physics expected in this field. The existence of the relevant surface states will be assessed spectroscopically, but from there on research will focus on fabricating and characterizing nanostructures designed to elucidate the transport and magnetic properties of the topological surfaces using electrical, optical and scanning probe techniques. Apart from a general characterization of the Dirac band structure of the surface states, research will focus on the predicted magnetic monopole-like response of the system to an electrical test charge. In addition, much effort will be devoted to contacting the surface state with superconducting and magnetic top layers, with the final aim of demonstrating Majorana fermion behavior. As a final benefit, growth of high quality thin Bi2Se3 or Bi2Te3 layers could allow for a demonstration of the (2-dimensional) quantum spin Hall effect at room temperature - offering a road map to dissipation-less transport for the semiconductor industry.
Max ERC Funding
2 419 590 €
Duration
Start date: 2011-04-01, End date: 2016-03-31
Project acronym 3D-E
Project 3D Engineered Environments for Regenerative Medicine
Researcher (PI) Ruth Elizabeth Cameron
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARSOF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE8, ERC-2012-ADG_20120216
Summary This proposal develops a unified, underpinning technology to create novel, complex and biomimetic 3D environments for the control of tissue growth. As director of Cambridge Centre for Medical Materials, I have recently been approached by medical colleagues to help to solve important problems in the separate therapeutic areas of breast cancer, cardiac disease and blood disorders. In each case, the solution lies in complex 3D engineered environments for cell culture. These colleagues make it clear that existing 3D scaffolds fail to provide the required complex orientational and spatial anisotropy, and are limited in their ability to impart appropriate biochemical and mechanical cues.
I have a strong track record in this area. A particular success has been the use of a freeze drying technology to make collagen based porous implants for the cartilage-bone interface in the knee, which has now been commercialised. The novelty of this proposal lies in the broadening of the established scientific base of this technology to enable biomacromolecular structures with:
(A) controlled and complex pore orientation to mimic many normal multi-oriented tissue structures,
(B) compositional and positional control to match varying local biochemical environments,
(C) the attachment of novel peptides designed to control cell behaviour, and
(D) mechanical control at both a local and macroscopic level to provide mechanical cues for cells.
These will be complemented by the development of
(E) robust characterisation methodologies for the structures created.
These advances will then be employed in each of the medical areas above.
This approach is highly interdisciplinary. Existing working relationships with experts in each medical field will guarantee expertise and licensed facilities in the required biological disciplines. Funds for this proposal would therefore establish a rich hub of mutually beneficial research and opportunities for cross-disciplinary sharing of expertise.
Max ERC Funding
2 486 267 €
Duration
Start date: 2013-04-01, End date: 2018-03-31
Project acronym 3DBrainStrom
Project Brain metastases: Deciphering tumor-stroma interactions in three dimensions for the rational design of nanomedicines
Researcher (PI) Ronit Satchi Fainaro
Host Institution (HI) TEL AVIV UNIVERSITY
Call Details Advanced Grant (AdG), LS7, ERC-2018-ADG
Summary Brain metastases represent a major therapeutic challenge. Despite significant breakthroughs in targeted therapies, survival rates of patients with brain metastases remain poor. Nowadays, discovery, development and evaluation of new therapies are performed on human cancer cells grown in 2D on rigid plastic plates followed by in vivo testing in immunodeficient mice. These experimental settings are lacking and constitute a fundamental hurdle for the translation of preclinical discoveries into clinical practice. We propose to establish 3D-printed models of brain metastases (Aim 1), which include brain extracellular matrix, stroma and serum containing immune cells flowing in functional tumor vessels. Our unique models better capture the clinical physio-mechanical tissue properties, signaling pathways, hemodynamics and drug responsiveness. Using our 3D-printed models, we aim to develop two new fronts for identifying novel clinically-relevant molecular drivers (Aim 2) followed by the development of precision nanomedicines (Aim 3). We will exploit our vast experience in anticancer nanomedicines to design three therapeutic approaches that target various cellular compartments involved in brain metastases: 1) Prevention of brain metastatic colonization using targeted nano-vaccines, which elicit antitumor immune response; 2) Intervention of tumor-brain stroma cells crosstalk when brain micrometastases establish; 3) Regression of macrometastatic disease by selectively targeting tumor cells. These approaches will materialize using our libraries of polymeric nanocarriers that selectively accumulate in tumors.
This project will result in a paradigm shift by generating new preclinical cancer models that will bridge the translational gap in cancer therapeutics. The insights and tumor-stroma-targeted nanomedicines developed here will pave the way for prediction of patient outcome, revolutionizing our perception of tumor modelling and consequently the way we prevent and treat cancer.
Max ERC Funding
2 353 125 €
Duration
Start date: 2019-04-01, End date: 2024-03-31
Project acronym 3DEpi
Project Transgenerational epigenetic inheritance of chromatin states : the role of Polycomb and 3D chromosome architecture
Researcher (PI) Giacomo CAVALLI
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS2, ERC-2017-ADG
Summary Epigenetic inheritance entails transmission of phenotypic traits not encoded in the DNA sequence and, in the most extreme case, Transgenerational Epigenetic Inheritance (TEI) involves transmission of memory through multiple generations. Very little is known on the mechanisms governing TEI and this is the subject of the present proposal. By transiently enhancing long-range chromatin interactions, we recently established isogenic Drosophila epilines that carry stable alternative epialleles, defined by differential levels of the Polycomb-dependent H3K27me3 mark. Furthermore, we extended our paradigm to natural phenotypes. These are ideal systems to study the role of Polycomb group (PcG) proteins and other components in regulating nuclear organization and epigenetic inheritance of chromatin states. The present project conjugates genetics, epigenomics, imaging and molecular biology to reach three critical aims.
Aim 1: Analysis of the molecular mechanisms regulating Polycomb-mediated TEI. We will identify the DNA, protein and RNA components that trigger and maintain transgenerational chromatin inheritance as well as their mechanisms of action.
Aim 2: Role of 3D genome organization in the regulation of TEI. We will analyze the developmental dynamics of TEI-inducing long-range chromatin interactions, identify chromatin components mediating 3D chromatin contacts and characterize their function in the TEI process.
Aim 3: Identification of a broader role of TEI during development. TEI might reflect a normal role of PcG components in the transmission of parental chromatin onto the next embryonic generation. We will explore this possibility by establishing other TEI paradigms and by relating TEI to the normal PcG function in these systems and in normal development.
This research program will unravel the biological significance and the molecular underpinnings of TEI and lead the way towards establishing this area of research into a consolidated scientific discipline.
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-11-01, End date: 2023-10-31
Project acronym 3DIMAGE
Project 3D Imaging Across Lengthscales: From Atoms to Grains
Researcher (PI) Paul Anthony Midgley
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARSOF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE4, ERC-2011-ADG_20110209
Summary "Understanding structure-property relationships across lengthscales is key to the design of functional and structural materials and devices. Moreover, the complexity of modern devices extends to three dimensions and as such 3D characterization is required across those lengthscales to provide a complete understanding and enable improvement in the material’s physical and chemical behaviour. 3D imaging and analysis from the atomic scale through to granular microstructure is proposed through the development of electron tomography using (S)TEM, and ‘dual beam’ SEM-FIB, techniques offering complementary approaches to 3D imaging across lengthscales stretching over 5 orders of magnitude.
We propose to extend tomography to include novel methods to determine atom positions in 3D with approaches incorporating new reconstruction algorithms, image processing and complementary nano-diffraction techniques. At the nanoscale, true 3D nano-metrology of morphology and composition is a key objective of the project, minimizing reconstruction and visualization artefacts. Mapping strain and optical properties in 3D are ambitious and exciting challenges that will yield new information at the nanoscale. Using the SEM-FIB, 3D ‘mesoscale’ structures will be revealed: morphology, crystallography and composition can be mapped simultaneously, with ~5nm resolution and over volumes too large to tackle by (S)TEM and too small for most x-ray techniques. In parallel, we will apply 3D imaging to a wide variety of key materials including heterogeneous catalysts, aerospace alloys, biomaterials, photovoltaic materials, and novel semiconductors.
We will collaborate with many departments in Cambridge and institutes worldwide. The personnel on the proposal will cover all aspects of the tomography proposed using high-end TEMs, including an aberration-corrected Titan, and a Helios dual beam. Importantly, a postdoc is dedicated to developing new algorithms for reconstruction, image and spectral processing."
Max ERC Funding
2 337 330 €
Duration
Start date: 2012-01-01, End date: 2017-12-31
Project acronym 3DNANOMECH
Project Three-dimensional molecular resolution mapping of soft matter-liquid interfaces
Researcher (PI) Ricardo Garcia
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Advanced Grant (AdG), PE4, ERC-2013-ADG
Summary Optical, electron and probe microscopes are enabling tools for discoveries and knowledge generation in nanoscale science and technology. High-resolution (nanoscale or molecular), noninvasive and label-free imaging of three-dimensional soft matter–liquid interfaces has not been achieved by any microscopy method.
Force microscopy (AFM) is considered the second most relevant advance in materials science since 1960. Despite its impressive range of applications, the technique has some key limitations. Force microscopy lacks three-dimensional depth: what lies above or in the subsurface is not readily characterized.
3DNanoMech proposes to design, build and operate a high-speed force-based method for the three-dimensional characterization of soft matter–liquid interfaces (3D AFM). The microscope will combine a detection method based on force perturbations, adaptive algorithms, high-speed piezo actuators and quantitative-oriented multifrequency approaches. The development of the microscope cannot be separated from its applications: imaging error-free DNA repair and understanding the relationship between the nanomechanical properties and the malignancy of cancer cells. Those problems encompass the different spatial (molecular to mesoscopic) and time (millisecond to second) scales of the instrument.
In short, 3DNanoMech aims to image, map and measure soft matter surfaces and interfaces in liquid with piconewton, millisecond and angstrom resolution. The long-term vision of 3DNanoMech is to replace models or computer animations of biomolecular-liquid interfaces by real-time, molecular-resolution maps of properties and processes.
Max ERC Funding
2 499 928 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym 3SPIN
Project Three Dimensional Spintronics
Researcher (PI) Russell Paul Cowburn
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARSOF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE3, ERC-2009-AdG
Summary Spintronics, in which both the spin and the charge of the electron are used, is one of the most exciting new disciplines to emerge from nanoscience. The 3SPIN project seeks to open a new research front within spintronics: namely 3-dimensional spintronics, in which magnetic nanostructures are formed into a 3-dimensional interacting network of unrivalled density and hence technological benefit. 3SPIN will explore early-stage science that could underpin 3-dimensional metallic spintronics. The thesis of the project is that by careful control of the constituent nanostructure properties, a 3-dimensional medium can be created in which a large number of topological solitons can exist. Although hardly studied at all to date, these solitons should be stable at room temperature, extremely compact and easy to manipulate and propagate. This makes them potentially ideal candidates to form the basis of a new spintronics in which the soliton is the basic transport vector instead of electrical current. €3.5M of funding is requested to form a new team of 5 researchers who, over a period of 60 months, will perform computer simulations and experimental studies of solitons in 3-dimensional networks of magnetic nanostructures and develop a laboratory demonstrator 3-dimensional memory device using solitons to represent and store data. A high-performance electron beam lithography system (cost €1M) will be purchased to allow state-of-the-art magnetic nanostructures to be fabricated with perfect control over their magnetic properties, thus allowing the ideal conditions for solitons to be created and controllably manipulated. Outputs from the project will be a complete understanding of the properties of these new objects and a road map charting the next steps for research in the field.
Max ERC Funding
2 799 996 €
Duration
Start date: 2010-03-01, End date: 2016-02-29
Project acronym 4-TOPS
Project Four experiments in Topological Superconductivity.
Researcher (PI) Laurens Molenkamp
Host Institution (HI) JULIUS-MAXIMILIANS-UNIVERSITAT WURZBURG
Call Details Advanced Grant (AdG), PE3, ERC-2016-ADG
Summary Topological materials have developed rapidly in recent years, with my previous ERC-AG project 3-TOP playing a major role in this development. While so far no bulk topological superconductor has been unambiguously demonstrated, their properties can be studied in a very flexible manner by inducing superconductivity through the proximity effect into the surface or edge states of a topological insulator. In 4-TOPS we will explore the possibilities of this approach in full, and conduct a thorough study of induced superconductivity in both two and three dimensional HgTe based topological insulators. The 4 avenues we will follow are:
-SQUID-based devices to investigate full phase-dependent spectroscopy of the gapless Andreev bound states by studying their Josephson radiation and current-phase relationships.
-Experiments aimed at providing unambiguous proof of localized Majorana states in TI junctions by studying tunnelling transport into such states.
-Attempts to induce superconductivity in Quantum Hall states with the aim of creating a chiral topological superconductor. These chiral superconductors host Majorana fermions at their edges, which, at least in the case of a single QH edge mode, follow non-Abelian statistics and are therefore promising for explorations in topological quantum computing.
-Studies of induced superconductivity in Weyl semimetals, a completely unexplored state of matter.
Taken together, these four sets of experiments will greatly enhance our understanding of topological superconductivity, which is not only a subject of great academic interest as it constitutes the study of new phases of matter, but also has potential application in the field of quantum information processing.
Max ERC Funding
2 497 567 €
Duration
Start date: 2017-06-01, End date: 2022-05-31
Project acronym 4D IMAGING
Project Towards 4D Imaging of Fundamental Processes on the Atomic and Sub-Atomic Scale
Researcher (PI) Ferenc Krausz
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Advanced Grant (AdG), PE2, ERC-2009-AdG
Summary State-of-the-art microscopy and diffraction imaging provides insight into the atomic and sub-atomic structure of matter. They permit determination of the positions of atoms in a crystal lattice or in a molecule as well as the distribution of electrons inside atoms. State-of-the-art time-resolved spectroscopy with femtosecond and attosecond resolution provides access to dynamic changes in the atomic and electronic structure of matter. Our proposal aims at combining these two frontier techniques of XXI century science to make a long-standing dream of scientists come true: the direct observation of atoms and electrons in their natural state: in motion. Shifts in the atoms' positions by tens to hundreds of picometers can make chemical bonds break apart or newly form, changing the structure and/or chemical composition of matter. Electronic motion on similar scales may result in the emission of light, or the initiation of processes that lead to a change in physical or chemical properties, or biological function. These motions happen within femtoseconds and attoseconds, respectively. To make them observable, we need a 4-dimensional (4D) imaging technique capable of recording freeze-frame snapshots of microscopic systems with picometer spatial resolution and femtosecond to attosecond exposure time. The motion can then be visualized by slow-motion replay of the freeze-frame shots. The goal of this project is to develop a 4D imaging technique that will ultimately offer picometer resolution in space and attosecond resolution in time.
Max ERC Funding
2 500 000 €
Duration
Start date: 2010-03-01, End date: 2015-02-28
Project acronym 4D-EEG
Project 4D-EEG: A new tool to investigate the spatial and temporal activity patterns in the brain
Researcher (PI) Franciscus C.T. Van Der Helm
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Advanced Grant (AdG), PE7, ERC-2011-ADG_20110209
Summary Our first goal is to develop a new tool to determine brain activity with a high temporal (< 1 msec) and spatial (about 2 mm) resolution with the focus on motor control. High-density EEG (up to 256 electrodes) will be used for EEG source localization. Advanced force-controlled robot manipulators will be used to impose continuous force perturbations to the joints. Advanced closed-loop system identification algorithms will identify the dynamic EEG response of multiple brain areas to the perturbation, leading to a functional interpretation of EEG. The propagation of the signal in time and 3D space through the cortex can be monitored: 4D-EEG. Preliminary experiments with EEG localization have shown that the continuous force perturbations resulted in a better signal-to-noise ratio and coherence than the current method using transient perturbations.
4D-EEG will be a direct measure of the neural activity in the brain with an excellent temporal response and easy to use in combination with motor control tasks. The new 4D-EEG method is expected to provide a breakthrough in comparison to functional MRI (fMRI) when elucidating the meaning of cortical map plasticity in motor learning.
Our second goal is to generate and validate new hypotheses about the longitudinal relationship between motor learning and cortical map plasticity by clinically using 4D-EEG in an intensive, repeated measurement design in patients suffering from a stroke. The application of 4D-EEG combined with haptic robots will allow us to discover how dynamics in cortical map plasticity are related to upper limb recovery after stroke, in terms of neural repair and the use of behavioral compensation strategies, while patients perform meaningful motor tasks. The non-invasive 4D-EEG technique combined with haptic robots will open a window on what and how patients (re)learn when showing motor recovery after stroke, allowing us to develop more effective patient-tailored therapies in neuro-rehabilitation.
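At its core, the closed-loop identification step estimates how reliably the EEG follows the imposed perturbation at each frequency. The sketch below is purely illustrative (it is not the project's pipeline; the multisine design, filter, delay and noise level are invented for the example) and computes the coherence between a continuous force perturbation and a simulated response channel:

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
fs = 1000.0                      # sampling rate, Hz
t = np.arange(0, 60, 1 / fs)     # 60 s of data

# Continuous multisine force perturbation (persistent excitation)
freqs = np.arange(1, 20, 2)      # excited frequencies, Hz
force = sum(np.sin(2 * np.pi * f * t + rng.uniform(0, 2 * np.pi)) for f in freqs)

# Toy "EEG" channel: low-pass filtered, delayed copy of the force + noise
b, a = signal.butter(2, 30 / (fs / 2))
eeg = signal.lfilter(b, a, np.roll(force, 25)) + 2.0 * rng.standard_normal(t.size)

# Coherence between perturbation and response: values near 1 at the
# excited frequencies indicate a reliable linear response
f_axis, coh = signal.coherence(force, eeg, fs=fs, nperseg=4096)
mask = f_axis < 25
print(f"peak coherence below 25 Hz: {coh[mask].max():.2f}")
```

With a continuous (rather than transient) perturbation, energy is injected at the excited frequencies throughout the recording, which is what drives the improved signal-to-noise ratio and coherence reported above.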
Max ERC Funding
3 477 202 €
Duration
Start date: 2012-06-01, End date: 2017-05-31
Project acronym 4D-PET
Project Innovative PET scanner for dynamic imaging
Researcher (PI) José María BENLLOCH BAVIERA
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Advanced Grant (AdG), LS7, ERC-2015-AdG
Summary The main objective of 4D-PET is to develop an innovative whole-body PET scanner based on a new detector concept that records the 3D position and time of every single gamma interaction with unprecedented resolution. The combination of scanner geometrical design and high timing resolution will enable reconstructing the full sequence of all gamma-ray interactions inside the scanner, including Compton interactions, like in a 3D movie. 4D-PET fully exploits Time-Of-Flight (TOF) information to obtain better image quality and to increase scanner sensitivity, through the inclusion in the image formation of all Compton events occurring inside the detector, which are always rejected in state-of-the-art PET scanners. The new PET design will radically improve on state-of-the-art PET performance, overcoming limitations of current PET technology, opening up new diagnostic avenues and yielding very valuable physiological information.
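The value of high timing resolution can be made concrete with the basic TOF relation: an arrival-time difference Δt between the two coincidence photons places the annihilation point at c·Δt/2 from the midpoint of the line of response. A minimal illustration (the 200 ps figure is a typical modern timing resolution used here for scale, not a 4D-PET specification):

```python
# Time-of-flight localization along a PET line of response (LOR):
# a photon pair detected with time difference dt places the annihilation
# point at d = c * dt / 2 from the midpoint of the LOR.
C = 299_792_458.0            # speed of light, m/s

def tof_offset_mm(dt_ps: float) -> float:
    """Offset from the LOR midpoint (mm) for a time difference dt (ps)."""
    return C * dt_ps * 1e-12 / 2 * 1e3

# A 200 ps coincidence timing resolution confines the event to ~3 cm
print(f"{tof_offset_mm(200.0):.1f} mm")
```

The tighter this localization, the less each event's probability is smeared along its LOR during reconstruction, which is the mechanism behind the image-quality and effective-sensitivity gains described above.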
Max ERC Funding
2 048 386 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym 4DBIOSERS
Project Four-Dimensional Monitoring of Tumour Growth by Surface Enhanced Raman Scattering
Researcher (PI) Luis LIZ-MARZAN
Host Institution (HI) ASOCIACION CENTRO DE INVESTIGACION COOPERATIVA EN BIOMATERIALES- CIC biomaGUNE
Call Details Advanced Grant (AdG), PE5, ERC-2017-ADG
Summary Optical bioimaging is limited by the penetration depth of visible light and the stability of fluorescent dyes over extended periods of time. Surface-enhanced Raman scattering (SERS) offers the possibility to overcome these drawbacks through SERS-encoded nanoparticle tags, which can be excited with near-IR light (within the biological transparency window), providing high-intensity, stable, multiplexed signals. SERS can also be used to monitor relevant bioanalytes within cells and tissues during the development of diseases such as tumours. In 4DBIOSERS we shall combine both capabilities of SERS, to go well beyond the current state of the art, by building three-dimensional scaffolds that support tissue (tumour) growth within a controlled environment, so that not only can the fate of each (SERS-labelled) cell within the tumour be monitored in real time (thus adding a fourth dimension to SERS bioimaging), but the release of tumour metabolites and other indicators of cellular activity can also be recorded. Although 4DBIOSERS can be applied to a variety of diseases, we shall focus on cancer, in particular melanoma and breast cancer, as these are readily accessible by optical methods. We aim at acquiring a better understanding of tumour growth and dynamics, while avoiding animal experimentation. 3D printing will be used to generate hybrid scaffolds where tumour and healthy cells will be co-incubated to simulate a more realistic environment, thus going well beyond the potential of 2D cell cultures. Each cell type will be encoded with ultra-bright SERS tags, so that real-time monitoring can be achieved by confocal SERS microscopy. Tumour development will be correlated with simultaneous detection of various cancer biomarkers, under standard conditions and upon addition of selected drugs. The scope of 4DBIOSERS is multidisciplinary, as it involves the design of high-end nanocomposites, development of 3D cell culture models and optimization of emerging SERS tomography methods.
Max ERC Funding
2 410 771 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym 4PI-SKY
Project 4 pi sky: Extreme Astrophysics with Revolutionary Radio Telescopes
Researcher (PI) Robert Philip Fender
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE9, ERC-2010-AdG_20100224
Summary Extreme astrophysical events such as relativistic flows, cataclysmic explosions and black hole accretion are among the key areas for astrophysics in the 21st century. The extremes of physics experienced in these environments are beyond anything achievable in any laboratory on Earth, and provide a unique glimpse of the laws of physics operating in extraordinary regimes. All of these events are associated with transient radio emission, a tracer both of the acceleration of particles to relativistic energies and of coherent emitting regions with huge effective temperatures. By studying radio bursts from these phenomena we can pinpoint the sources of explosive events, understand the budget of kinetic feedback by explosive events in the ambient medium, and probe the physical state of the universe back to the epoch of reionisation, less than a billion years after the big bang. In seeking to push back the frontiers of extreme astrophysics, I will use a trio of revolutionary new radio telescopes, LOFAR, ASKAP and MeerKAT, pathfinders for the Square Kilometre Array, and all facilities in which I have a major role in the search for transients. I will build an infrastructure which transforms their combined operations for the discovery, classification and reporting of transient astrophysical events, over the whole sky, making them much more than the sum of their parts. This will include development of environments for the coordinated handling of extreme astrophysical events, in real time, via automated systems, as well as novel techniques for the detection of these events in a sea of noise. I will furthermore augment this program by buying in as a major partner to a rapid-response robotic optical telescope, and by cementing my relationship with an orbiting X-ray facility. This multiwavelength dimension will secure the astrophysical interpretation of our observational results and help to revolutionise high-energy astrophysics via a strong scientific exploitation program.
Max ERC Funding
2 999 847 €
Duration
Start date: 2011-07-01, End date: 2017-06-30
Project acronym 5COFM
Project Five Centuries of Marriages
Researcher (PI) Anna Cabré
Host Institution (HI) UNIVERSIDAD AUTONOMA DE BARCELONA
Call Details Advanced Grant (AdG), SH6, ERC-2010-AdG_20100407
Summary This long-term research project is based on data-mining of the Llibres d'Esposalles conserved at the Archives of the Barcelona Cathedral, an extraordinary data source comprising 244 books of marriage license records. It covers about 550,000 unions from over 250 parishes of the Diocese between 1451 and 1905. Its impeccable conservation is a miracle in a region where parish archives have undergone massive destruction. The books include data on the tax imposed on each couple depending on their social class, on an eight-tiered scale. These data allow for research on multiple aspects of demography, especially over the very long run, such as: population estimates; marriage dynamics and cycles; indirect estimations of fertility, migration and survival; as well as socio-economic studies of social homogamy, social mobility, and the transmission of social and occupational position. Being continuous over five centuries, the source constitutes a unique instrument to study the dynamics of population distribution, the expansion of the city of Barcelona and the constitution of its metropolitan area, as well as the chronology and geography of the formation of new social classes.
To this end, a digital library and a database, the Barcelona Historical Marriages Database (BHiMaD), are to be created and completed. An ERC Advanced Grant will support this effort while the research analysis of the database is undertaken in parallel.
The research team, at the U. Autònoma de Barcelona, involves researchers from the Center for Demographic Studies and the Computer Vision Center, experts in historical databases and computer-aided recognition of ancient manuscripts. 5CofM will serve the preservation of the original “Llibres d’Esposalles” and unlock the full potential embedded in the collection.
Max ERC Funding
1 847 400 €
Duration
Start date: 2011-05-01, End date: 2016-04-30
Project acronym 5D Heart Patch
Project A Functional, Mature In vivo Human Ventricular Muscle Patch for Cardiomyopathy
Researcher (PI) Kenneth Randall Chien
Host Institution (HI) KAROLINSKA INSTITUTET
Call Details Advanced Grant (AdG), LS7, ERC-2016-ADG
Summary Developing new therapeutic strategies for heart regeneration is a major goal for cardiac biology and medicine. While cardiomyocytes can be generated from human pluripotent stem cells (hPSCs) in vitro, it has proven difficult to use these cells to generate a large-scale, mature human ventricular muscle graft on the injured heart in vivo. The central objective of this proposal is to optimize the generation of a large-scale, pure, fully functional human ventricular muscle patch in vivo through the self-assembly of purified human ventricular progenitors and the localized expression of defined paracrine factors that drive their expansion, differentiation, vascularization, matrix formation, and maturation. Recently, we have found that purified hPSC-derived ventricular progenitors (HVPs) can self-assemble in vivo on the epicardial surface into a 3D vascularized and functional ventricular patch with its own extracellular matrix via a cell-autonomous pathway. A two-step protocol and FACS purification of HVP receptors can generate billions of pure HVPs. The current proposal will lead to the identification of defined paracrine pathways to enhance the survival, grafting/implantation, expansion, differentiation, matrix formation, vascularization and maturation of the graft in vivo. We will capitalize on our unique HVP system and our novel modRNA technology to deliver therapeutic strategies, using the in vivo human ventricular muscle to model arrhythmogenic cardiomyopathy and to optimize the ability of the graft to compensate for the massive loss of functional muscle during ischemic cardiomyopathy and post-myocardial infarction. The studies will lead to new in vivo chimeric models of human cardiac disease and an experimental paradigm for organ-on-organ cardiac tissue engineering of an in vivo, functional, mature ventricular patch for cardiomyopathy.
Max ERC Funding
2 149 228 €
Duration
Start date: 2017-12-01, End date: 2022-11-30
Project acronym 5HT-OPTOGENETICS
Project Optogenetic Analysis of Serotonin Function in the Mammalian Brain
Researcher (PI) Zachary Mainen
Host Institution (HI) FUNDACAO D. ANNA SOMMER CHAMPALIMAUD E DR. CARLOS MONTEZ CHAMPALIMAUD
Call Details Advanced Grant (AdG), LS5, ERC-2009-AdG
Summary Serotonin (5-HT) is implicated in a wide spectrum of brain functions and disorders. However, its functions remain controversial and enigmatic. We suggest that past work on the 5-HT system has been significantly hampered by technical limitations in the selectivity and temporal resolution of the conventional pharmacological and electrophysiological methods that have been applied. We therefore propose to apply novel optogenetic methods that will allow us to overcome these limitations and thereby gain new insight into the biological functions of this important molecule. In preliminary studies, we have demonstrated that we can deliver exogenous proteins specifically to 5-HT neurons using viral vectors. Our objectives are to (1) record, (2) stimulate and (3) silence the activity of 5-HT neurons with high molecular selectivity and temporal precision by using genetically-encoded sensors, activators and inhibitors of neural function. These tools will allow us to monitor and control the 5-HT system in real time in freely-behaving animals and thereby to establish causal links between information processing in 5-HT neurons and specific behaviors. In combination with quantitative behavioral assays, we will use this approach to define the role of 5-HT in sensory, motor and cognitive functions. The significance of the work is three-fold. First, we will establish a new arsenal of tools for probing the physiological and behavioral functions of 5-HT neurons. Second, we will make definitive tests of major hypotheses of 5-HT function. Third, we will have possible therapeutic applications. In this way, the proposed work has the potential for a major impact in research on the role of 5-HT in brain function and dysfunction.
Max ERC Funding
2 318 636 €
Duration
Start date: 2010-07-01, End date: 2015-12-31
Project acronym 5HTCircuits
Project Modulation of cortical circuits and predictive neural coding by serotonin
Researcher (PI) Zachary Mainen
Host Institution (HI) FUNDACAO D. ANNA SOMMER CHAMPALIMAUD E DR. CARLOS MONTEZ CHAMPALIMAUD
Call Details Advanced Grant (AdG), LS5, ERC-2014-ADG
Summary Serotonin (5-HT) is a central neuromodulator and a major target of therapeutic psychoactive drugs, but relatively little is known about how it modulates information processing in neural circuits. The theory of predictive coding postulates that the brain combines raw bottom-up sensory information with top-down information from internal models to make perceptual inferences about the world. We hypothesize, based on preliminary data and prior literature, that a role of 5-HT in this process is to report prediction errors and promote the suppression and weakening of erroneous internal models. We propose that it does this by inhibiting top-down relative to bottom-up cortical information flow. To test this hypothesis, we propose a set of experiments in mice performing olfactory perceptual tasks. Our specific aims are: (1) We will test whether 5-HT neurons encode sensory prediction errors. (2) We will test their causal role in using predictive cues to guide perceptual decisions. (3) We will characterize how 5-HT influences the encoding of sensory information by neuronal populations in the olfactory cortex and identify the underlying circuitry. (4) Finally, we will map the effects of 5-HT across the whole brain and use this information to target further causal manipulations to specific 5-HT projections. We accomplish these aims using state-of-the-art optogenetic, electrophysiological and imaging techniques (including 9.4T small-animal functional magnetic resonance imaging) as well as psychophysical tasks amenable to quantitative analysis and computational theory. Together, these experiments will tackle multiple facets of an important general computational question, bringing to bear an array of cutting-edge technologies to address with unprecedented mechanistic detail how 5-HT impacts neural coding and perceptual decision-making.
Max ERC Funding
2 486 074 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym A-DATADRIVE-B
Project Advanced Data-Driven Black-box modelling
Researcher (PI) Johan Adelia K Suykens
Host Institution (HI) KATHOLIEKE UNIVERSITEIT LEUVEN
Call Details Advanced Grant (AdG), PE7, ERC-2011-ADG_20110209
Summary Making accurate predictions is a crucial factor in many systems (such as in modelling energy consumption, power load forecasting, traffic networks, process industry, environmental modelling, biomedicine, brain-machine interfaces) for cost savings, efficiency, health, safety and organizational purposes. In this proposal we aim at realizing a new generation of more advanced black-box modelling techniques for estimating predictive models from measured data. We will study different optimization modelling frameworks in order to obtain improved black-box modelling approaches. This will be done by specifying models through constrained optimization problems by studying different candidate core models (parametric models, support vector machines and kernel methods) together with additional sets of constraints and regularization mechanisms. Different candidate mathematical frameworks will be considered with models that possess primal and (Lagrange) dual model representations, functional analysis in reproducing kernel Hilbert spaces, operator splitting and optimization in Banach spaces. Several aspects that are relevant to black-box models will be studied including incorporation of prior knowledge, structured dynamical systems, tensorial data representations, interpretability and sparsity, and general purpose optimization algorithms. The methods should be suitable for handling larger data sets and high dimensional input spaces. The final goal is also to realize a next generation software tool (including symbolic generation of models and handling different supervised and unsupervised learning tasks, static and dynamic systems) that can be generically applied to data from different application areas. The proposal A-DATADRIVE-B aims at getting end-users connected to the more advanced methods through a user-friendly data-driven black-box modelling tool. The methods and tool will be tested in connection to several real-life applications.
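The primal-dual kernel models central to this proposal can be illustrated with the dual form of least-squares support vector regression, in which training reduces to solving a single linear system in the kernel matrix. A minimal sketch (the bias term of the full LS-SVM formulation is omitted, and the data, kernel width and regularization constant are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def rbf_kernel(A, B, gamma=10.0):
    """Gaussian (RBF) kernel matrix between row-sample sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# Toy regression data: y = sin(4x) + noise
X = rng.uniform(0, 1, (100, 1))
y = np.sin(4 * X[:, 0]) + 0.1 * rng.standard_normal(100)

# Dual solution of regularized kernel regression: (K + I/C) alpha = y
C_reg = 100.0
K = rbf_kernel(X, X)
alpha = np.linalg.solve(K + np.eye(len(X)) / C_reg, y)

# Predictions use the dual representation f(x) = sum_i alpha_i k(x_i, x)
Xt = np.linspace(0, 1, 50)[:, None]
y_hat = rbf_kernel(Xt, X) @ alpha
print(f"train RMSE: {np.sqrt(np.mean((K @ alpha - y) ** 2)):.3f}")
```

The dual variables `alpha` play the role of the Lagrange multipliers mentioned in the summary; the constraints and regularization mechanisms the proposal studies would enter as extra terms in this optimization problem.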
Max ERC Funding
2 485 800 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym A2C2
Project Atmospheric flow Analogues and Climate Change
Researcher (PI) Pascal Yiou
Host Institution (HI) COMMISSARIAT A L ENERGIE ATOMIQUE ET AUX ENERGIES ALTERNATIVES
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary "The A2C2 project treats two major challenges in climate and atmospheric research: the time-dependent response of the climate attractor to external forcings (solar, volcanic eruptions and anthropogenic), and the attribution of extreme climate events occurring in the northern extra-tropics. The main difficulties are the limited climate information, the computational cost of model simulations, and mathematical assumptions that are rarely verified and often overlooked in the literature.
A2C2 proposes a practical framework to overcome those three difficulties, linking the theory of dynamical systems and statistics. We will generalize the methodology of flow analogues to multiple databases in order to obtain probabilistic descriptions of analogue decompositions.
The project is divided into three workpackages (WP). WP1 embeds the analogue method in the theory of dynamical systems in order to provide a metric of an attractor deformation in time. The important methodological step is to detect trends or persisting outliers in the dates and scores of analogues when the system yields time-varying forcings. This is done from idealized models and full size climate models in which the forcings (anthropogenic and natural) are known.
A2C2 creates an open source toolkit to compute flow analogues from a wide array of databases (WP2). WP3 treats the two scientific challenges with the analogue method and multiple model ensembles, hence allowing uncertainty estimates under realistic mathematical hypotheses. The flow analogue methodology allows a systematic and quasi real-time analysis of extreme events, which is currently out of the reach of conventional climate modeling approaches.
The major breakthrough of A2C2 is to bridge the gap between operational needs (the immediate analysis of climate events) and the understanding of long-term climate changes. A2C2 opens new research horizons for the exploitation of ensembles of simulations and reliable estimates of uncertainty."
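The flow-analogue method described above can be sketched generically (this is background illustration, not the project's toolkit): candidate analogues of a target circulation field are found by ranking past fields by spatial correlation and returning the best-matching dates and their scores. The grid size, date labels, and `flow_analogues` helper are hypothetical.

```python
import numpy as np

def flow_analogues(fields, dates, target, k=5):
    # Rank archived circulation fields by spatial (Pearson) correlation
    # with a target field; return the k best analogue dates and scores.
    F = fields.reshape(len(fields), -1)          # flatten each field
    t = target.ravel()
    Fc = F - F.mean(axis=1, keepdims=True)       # center each field
    tc = t - t.mean()
    scores = (Fc @ tc) / (np.linalg.norm(Fc, axis=1) * np.linalg.norm(tc))
    best = np.argsort(scores)[::-1][:k]          # highest correlation first
    return [dates[i] for i in best], scores[best]

# Toy demo: one year of synthetic daily fields on a 10x10 grid.
rng = np.random.default_rng(1)
fields = rng.normal(size=(365, 10, 10))
dates = [f"day{i:03d}" for i in range(365)]
# Target: a lightly perturbed copy of day 200's field.
target = fields[200] + 0.1 * rng.normal(size=(10, 10))
best_dates, best_scores = flow_analogues(fields, dates, target)
```

The dates and scores of such analogues are exactly the quantities in which WP1 proposes to detect trends under time-varying forcings.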
Max ERC Funding
1 491 457 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym A2F2
Project Beyond Biopolymers: Protein-Sized Aromatic Amide Functional Foldamers
Researcher (PI) Ivan Huc
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Advanced Grant (AdG), PE5, ERC-2012-ADG_20120216
Summary Nature has evolved ultimate chemical functions based on controlling and altering conformation of its molecular machinery. Prominent examples include enzyme catalysis and information storage/duplication in nucleic acids. These achievements are based on large and complex yet remarkably defined structures obtained through folding of polymeric chains and a subtle interplay of non-covalent forces. Nature uses a limited set of building blocks – e.g. twenty amino-acids and four nucleobases – with specific abilities to impart well-defined folds. In the last decade, chemists have discovered foldamers: non-natural oligomers and polymers also prone to adopt folded structures. The emergence of foldamers has far reaching implications. A new major long term prospect is open to chemistry: the de novo synthesis of artificial objects resembling biopolymers in terms of their size, complexity, and efficiency at achieving defined functions, yet having chemical structures beyond the reach of biopolymers amenable to new properties and functions. The PI of this project has shown internationally recognized leadership in the development of a class of foldamers, aromatic oligoamides, whose features arguably make them the most suitable candidates to systematically explore what folded structures beyond biopolymers give access to. This project aims at developing methods to allow the routine fabrication of 20-40 units long aromatic oligoamide foldamers (6-15 kDa) designed to fold into artificial molecular containers having engineerable cavities and surfaces for molecular recognition of organic substrates, in particular large peptides and saccharides, polymers, and proteins. The methodology rests on modelling based design, multistep organic synthesis of heterocyclic monomers and their assembly into long sequences, structural elucidation using, among other techniques, x-ray crystallography, and the physico-chemical characterization of molecular recognition events.
Max ERC Funding
2 496 216 €
Duration
Start date: 2013-06-01, End date: 2018-05-31
Project acronym AAA
Project Adaptive Actin Architectures
Researcher (PI) Laurent Blanchoin
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS3, ERC-2016-ADG
Summary Although we have extensive knowledge of many important processes in cell biology, including information on many of the molecules involved and the physical interactions among them, we still do not understand most of the dynamical features that are the essence of living systems. This is particularly true for the actin cytoskeleton, a major component of the internal architecture of eukaryotic cells. In living cells, actin networks constantly assemble and disassemble filaments while maintaining an apparently stable structure, suggesting a perfect balance between the two processes. Such behaviors are called “dynamic steady states”. They confer upon actin networks a high degree of plasticity allowing them to adapt in response to external changes and enable cells to adjust to their environments. Despite their fundamental importance in the regulation of cell physiology, the basic mechanisms that control the coordinated dynamics of co-existing actin networks are poorly understood. In the AAA project, first, we will characterize the parameters that allow the coupling among co-existing actin networks at steady state. In vitro reconstituted systems will be used to control the actin nucleation patterns, the closed volume of the reaction chamber and the physical interaction of the networks. We hope to unravel the mechanism allowing the global coherence of a dynamic actin cytoskeleton. Second, we will use our unique capacity to perform dynamic micropatterning, to add or remove actin nucleation sites in real time, in order to investigate the ability of dynamic networks to adapt to changes and the role of coupled network dynamics in this emergent property. In this part, in vitro experiments will be complemented by the analysis of actin network remodeling in living cells.
In the end, our project will provide a comprehensive understanding of how the adaptive response of the cytoskeleton derives from the complex interplay between its biochemical, structural and mechanical properties.
Max ERC Funding
2 349 898 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym AAMOT
Project Arithmetic of automorphic motives
Researcher (PI) Michael Harris
Host Institution (HI) INSTITUT DES HAUTES ETUDES SCIENTIFIQUES
Call Details Advanced Grant (AdG), PE1, ERC-2011-ADG_20110209
Summary The primary purpose of this project is to build on recent spectacular progress in the Langlands program to study the arithmetic properties of automorphic motives constructed in the cohomology of Shimura varieties. Because automorphic methods are available to study the L-functions of these motives, which include elliptic curves and certain families of Calabi-Yau varieties over totally real fields (possibly after base change), they represent the most accessible class of varieties for which one can hope to verify fundamental conjectures on special values of L-functions, including Deligne's conjecture and the Main Conjecture of Iwasawa theory. Immediate goals include the proof of irreducibility of automorphic Galois representations; the establishment of period relations for automorphic and potentially automorphic realizations of motives in the cohomology of distinct Shimura varieties; the construction of p-adic L-functions for these and related motives, notably adjoint and tensor product L-functions in p-adic families; and the geometrization of the p-adic and mod p Langlands program. All four goals, as well as the others mentioned in the body of the proposal, are interconnected; the final goal provides a bridge to related work in geometric representation theory, algebraic geometry, and mathematical physics.
Max ERC Funding
1 491 348 €
Duration
Start date: 2012-06-01, End date: 2018-05-31
Project acronym AARTFAAC
Project Amsterdam-ASTRON Radio Transient Facility And Analysis Centre: Probing the Extremes of Astrophysics
Researcher (PI) Ralph Antoine Marie Joseph Wijers
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Advanced Grant (AdG), PE9, ERC-2009-AdG
Summary Some of the most extreme tests of physical law come from its manifestations in the behaviour of black holes and neutron stars, and as such these objects should be used as fundamental physics labs. Due to advances in both theoretical work and observational techniques, I have a major opportunity now to significantly push this agenda forward and get better answers to questions like: How are black holes born? How can energy be extracted from black holes? What is the origin of magnetic fields and cosmic rays in jets and shocks? Is their primary energy stream hadronic or magnetic? I propose to do this by exploiting the advent of wide-field radio astronomy: extreme objects are very rare and usually transient, so not only must one survey large areas of sky, but also must one do this often. I propose to form and shape a group that will use the LOFAR wide-field radio telescope to hunt for these extreme transients and systematically collect enough well-documented examples of the behaviour of each type of transient. Furthermore, I propose to expand LOFAR with a true 24/7 all-sky monitor to catch and study even the rarest of events. Next, I will use my experience in gamma-ray burst follow-up to conduct a vigorous multi-wavelength programme of study of these objects, to constrain their physics from as many angles as possible. This will eventually include results from multi-messenger astrophysics, in which we use neutrinos, gravitational waves, and other non-electromagnetic messengers as extra diagnostics of the physics of these sources. Finally, I will build on my experience in modelling accretion phenomena and relativistic explosions to develop a theoretical framework for these phenomena and constrain the resulting models with the rich data sets we obtain.
Max ERC Funding
3 499 128 €
Duration
Start date: 2010-10-01, End date: 2016-09-30
Project acronym ABACUS
Project Ab-initio adiabatic-connection curves for density-functional analysis and construction
Researcher (PI) Trygve Ulf Helgaker
Host Institution (HI) UNIVERSITETET I OSLO
Call Details Advanced Grant (AdG), PE4, ERC-2010-AdG_20100224
Summary Quantum chemistry provides two approaches to molecular electronic-structure calculations: the systematically refinable but expensive many-body wave-function methods and the inexpensive but not systematically refinable Kohn–Sham method of density-functional theory (DFT). The accuracy of Kohn–Sham calculations is determined by the quality of the exchange–correlation functional, from which the effects of exchange and correlation among the electrons are extracted using the density rather than the wave function. However, the exact exchange–correlation functional is unknown—instead, many approximate forms have been developed, by fitting to experimental data or by satisfying exact relations. Here, a new approach to density-functional analysis and construction is proposed: the Lieb variation principle, usually regarded as conceptually important but impracticable. By invoking the Lieb principle, it becomes possible to approach the development of approximate functionals in a novel manner, being directly guided by the behaviour of the exact functional, accurately calculated for a wide variety of chemical systems. In particular, this principle will be used to calculate ab-initio adiabatic connection curves, studying the exchange–correlation functional for a fixed density as the electronic interactions are turned on from zero to one. Pilot calculations have indicated the feasibility of this approach in simple cases—here, a comprehensive set of adiabatic-connection curves will be generated and utilized for calibration, construction, and analysis of density functionals, the objective being to produce improved functionals for Kohn–Sham calculations by modelling or fitting such curves. The ABACUS approach will be particularly important in cases where little experimental information is available—for example, for understanding and modelling the behaviour of the exchange–correlation functional in electromagnetic fields.
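For reference, the adiabatic connection invoked above can be written in its standard textbook form (general DFT background, not a formula quoted from the proposal): the electron-electron interaction is scaled by a coupling strength λ while the density is held fixed,

```latex
E_{xc}[\rho] = \int_{0}^{1} W_{\lambda}[\rho]\,\mathrm{d}\lambda,
\qquad
W_{\lambda}[\rho] = \langle \Psi_{\lambda} | \hat{V}_{ee} | \Psi_{\lambda} \rangle - J[\rho],
```

where Ψ_λ minimizes ⟨Ψ|T̂ + λV̂_ee|Ψ⟩ over wave functions yielding the density ρ, and J[ρ] is the Hartree energy. Setting λ = 0 recovers the Kohn–Sham system and λ = 1 the physical one, so the curve W_λ as a function of λ is precisely the adiabatic-connection curve the project proposes to compute ab initio.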
Max ERC Funding
2 017 932 €
Duration
Start date: 2011-03-01, End date: 2016-02-29
Project acronym ABCvolume
Project The ABC of Cell Volume Regulation
Researcher (PI) Berend Poolman
Host Institution (HI) RIJKSUNIVERSITEIT GRONINGEN
Call Details Advanced Grant (AdG), LS1, ERC-2014-ADG
Summary Cell volume regulation is crucial for any living cell because changes in volume determine the metabolic activity through e.g. changes in ionic strength, pH, macromolecular crowding and membrane tension. These physicochemical parameters influence interaction rates and affinities of biomolecules, folding rates, and fold stabilities in vivo. Understanding of the underlying volume regulatory mechanisms has immediate application in biotechnology and health, yet these factors are generally ignored in systems analyses of cellular functions.
My team has uncovered a number of mechanisms and insights of cell volume regulation. The next step forward is to elucidate how the components of a cell volume regulatory circuit work together and control the physicochemical conditions of the cell.
I propose construction of a synthetic cell in which an osmoregulatory transporter and mechanosensitive channel form a minimal volume regulatory network. My group has developed the technology to reconstitute membrane proteins into lipid vesicles (synthetic cells). One of the challenges is to incorporate into the vesicles an efficient pathway for ATP production and maintain energy homeostasis while the load on the system varies. We aim to control the transmembrane flux of osmolytes, which requires elucidation of the molecular mechanism of gating of the osmoregulatory transporter. We will focus on the glycine betaine ABC importer, which is one of the most complex transporters known to date with ten distinct protein domains, transiently interacting with each other.
The proposed synthetic metabolic circuit constitutes a fascinating out-of-equilibrium system, allowing us to understand cell volume regulatory mechanisms in a context and at a level of complexity minimally needed for life. Analysis of this circuit will address many outstanding questions and eventually allow us to design more sophisticated vesicular systems with applications, for example as compartmentalized reaction networks.
Max ERC Funding
2 247 231 €
Duration
Start date: 2015-07-01, End date: 2020-06-30
Project acronym ABEL
Project "Alpha-helical Barrels: Exploring, Understanding and Exploiting a New Class of Protein Structure"
Researcher (PI) Derek Neil Woolfson
Host Institution (HI) UNIVERSITY OF BRISTOL
Call Details Advanced Grant (AdG), LS9, ERC-2013-ADG
Summary "Recently through de novo peptide design, we have discovered and presented a new protein structure. This is an all-parallel, 6-helix bundle with a continuous central channel of 0.5 – 0.6 nm diameter. We posit that this is one of a broader class of protein structures that we call the alpha-helical barrels. Here, in three Work Packages, we propose to explore these structures and to develop protein functions within them. First, through a combination of computer-aided design, peptide synthesis and thorough biophysical characterization, we will examine the extents and limits of the alpha-helical-barrel structures. Whilst this is curiosity driven research, it also has practical consequences for the studies that will follow; that is, alpha-helical barrels made from increasing numbers of helices have channels or pores that increase in a predictable way. Second, we will use rational and empirical design approaches to engineer a range of functions within these cavities, including binding capabilities and enzyme-like activities. Finally, and taking the programme into another ambitious area, we will use the alpha-helical barrels to template other folds that are otherwise difficult to design and engineer, notably beta-barrels that insert into membranes to render ion-channel and sensor functions."
Max ERC Funding
2 467 844 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ABEP
Project Asset Bubbles and Economic Policy
Researcher (PI) Jaume Ventura Fontanet
Host Institution (HI) Centre de Recerca en Economia Internacional (CREI)
Call Details Advanced Grant (AdG), SH1, ERC-2009-AdG
Summary Advanced capitalist economies experience large and persistent movements in asset prices that are difficult to justify with economic fundamentals. The internet bubble of the 1990s and the real estate market bubble of the 2000s are two recent examples. The predominant view is that these bubbles are a market failure, and are caused by some form of individual irrationality on the part of market participants. This project is based instead on the view that market participants are individually rational, although this does not preclude sometimes collectively sub-optimal outcomes. Bubbles are thus not a source of market failure by themselves but instead arise as a result of a pre-existing market failure, namely, the existence of pockets of dynamically inefficient investments. Under some conditions, bubbles partly solve this problem, increasing market efficiency and welfare. It is also possible however that bubbles do not solve the underlying problem and, in addition, create negative side-effects. The main objective of this project is to develop this view of asset bubbles, and produce an empirically-relevant macroeconomic framework that allows us to address the following questions: (i) What is the relationship between bubbles and financial market frictions? Special emphasis is given to how the globalization of financial markets and the development of new financial products affect the size and effects of bubbles. (ii) What is the relationship between bubbles, economic growth and unemployment? The theory suggests the presence of virtuous and vicious cycles, as economic growth creates the conditions for bubbles to pop up, while bubbles create incentives for economic growth to happen. (iii) What is the optimal policy to manage bubbles? We need to develop the tools that allow policy makers to sustain those bubbles that have positive effects and burst those that have negative effects.
Max ERC Funding
1 000 000 €
Duration
Start date: 2010-04-01, End date: 2015-03-31
Project acronym ABYSS
Project ABYSS - Assessment of bacterial life and matter cycling in deep-sea surface sediments
Researcher (PI) Antje Boetius
Host Institution (HI) ALFRED-WEGENER-INSTITUT HELMHOLTZ-ZENTRUM FUR POLAR- UND MEERESFORSCHUNG
Call Details Advanced Grant (AdG), LS8, ERC-2011-ADG_20110310
Summary The deep-sea floor hosts a distinct microbial biome covering 67% of the Earth’s surface, characterized by cold temperatures, permanent darkness, high pressure and food limitation. The surface sediments are dominated by bacteria, with on average a billion cells per ml. Benthic bacteria are highly relevant to the Earth’s element cycles as they remineralize most of the organic matter sinking from the productive surface ocean, and return nutrients, thereby promoting ocean primary production. What passes the bacterial filter is a relevant sink for carbon on geological time scales, influencing global oxygen and carbon budgets, and fueling the deep subsurface biosphere. Despite the relevance of deep-sea sediment bacteria to climate, geochemical cycles and ecology of the seafloor, their genetic and functional diversity, niche differentiation and biological interactions remain unknown. Our preliminary work in a global survey of deep-sea sediments now enables us to target specific genes for the quantification of abyssal bacteria. We can trace isotope-labeled elements into communities and single cells, and analyze the molecular alteration of organic matter during microbial degradation, all in the context of environmental dynamics recorded at the only long-term deep-sea ecosystem observatory in the Arctic that we maintain. I propose to bridge biogeochemistry, ecology, microbiology and marine biology to develop a systematic understanding of abyssal sediment bacterial community distribution, diversity, function and interactions, by combining in situ flux studies and different visualization techniques with a wide range of molecular tools. Substantial progress is expected in understanding I) identity and function of the dominant types of indigenous benthic bacteria, II) dynamics in bacterial activity and diversity caused by variations in particle flux, III) interactions with different types and ages of organic matter, and other biological factors.
Max ERC Funding
3 375 693 €
Duration
Start date: 2012-06-01, End date: 2018-05-31
Project acronym ACB
Project The Analytic Conformal Bootstrap
Researcher (PI) Luis Fernando ALDAY
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE2, ERC-2017-ADG
Summary The aim of the present proposal is to establish a research team developing and exploiting innovative techniques to study conformal field theories (CFT) analytically. Our approach does not rely on a Lagrangian description but on symmetries and consistency conditions. As such it applies to any CFT, offering a unified framework to study generic CFTs analytically. The initial implementation of this program has already led to striking new results and insights for both Lagrangian and non-Lagrangian CFTs.
The overarching aims of my team will be: To develop an analytic bootstrap program for CFTs in general dimensions; to complement these techniques with more traditional methods and develop a systematic machinery to obtain analytic results for generic CFTs; and to use these results to gain new insights into the mathematical structure of the space of quantum field theories.
The proposal will bring together researchers from different areas. The objectives in brief are:
1) Develop an alternative to Feynman diagram computations for Lagrangian CFTs.
2) Develop a machinery to compute loops for QFT on AdS, with and without gravity.
3) Develop an analytic approach to non-perturbative N=4 SYM and other CFTs.
4) Determine the space of all CFTs.
5) Gain new insights into the mathematical structure of the space of quantum field theories.
The outputs of this proposal will include a new way of doing perturbative computations based on symmetries; a constructive derivation of the AdS/CFT duality; new analytic techniques to attack strongly coupled systems and invaluable new lessons about the space of CFTs and QFTs.
Success in this research will lead to a completely new, unified way to view and solve CFTs, with a huge impact on several branches of physics and mathematics.
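For context, the “symmetries and consistency conditions” this proposal relies on are encapsulated in the crossing equation. For a four-point function of identical scalars of dimension \(\Delta_\phi\), equality of the two conformal block expansions reads (standard bootstrap notation, not specific to this project):

```latex
\sum_{\mathcal{O}} \lambda_{\mathcal{O}}^{2}\, v^{\Delta_\phi}\,
  g_{\Delta_{\mathcal{O}},\,\ell_{\mathcal{O}}}(u,v)
\;=\;
\sum_{\mathcal{O}} \lambda_{\mathcal{O}}^{2}\, u^{\Delta_\phi}\,
  g_{\Delta_{\mathcal{O}},\,\ell_{\mathcal{O}}}(v,u)
```

Here \(u,v\) are conformal cross-ratios, \(\lambda_{\mathcal{O}}\) are OPE coefficients, and \(g_{\Delta,\ell}\) are conformal blocks; the analytic bootstrap extracts CFT data from the behaviour of this equation in lightcone and other limits, without reference to a Lagrangian.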
Max ERC Funding
2 171 483 €
Duration
Start date: 2018-12-01, End date: 2023-11-30
Project acronym ACCELERATES
Project Acceleration in Extreme Shocks: from the microphysics to laboratory and astrophysics scenarios
Researcher (PI) Luis Miguel De Oliveira E Silva
Host Institution (HI) INSTITUTO SUPERIOR TECNICO
Call Details Advanced Grant (AdG), PE2, ERC-2010-AdG_20100224
Summary What is the origin of cosmic rays? What are the dominant acceleration mechanisms in relativistic shocks? How do cosmic rays self-consistently influence the shock dynamics? How are relativistic collisionless shocks formed? These are longstanding scientific questions, closely tied to extreme plasma physics processes, in which a close interplay between the micro-instabilities and the global dynamics is critical.
Relativistic shocks are closely connected with the propagation of intense streams of particles pervasive in many astrophysical scenarios. The possibility of exciting shocks in the laboratory will also be available very soon with multi-PW lasers or intense relativistic particle beams.
Computational modeling is now established as a prominent research tool, by enabling the fully kinetic modeling of these systems for the first time. With the fast paced developments in high performance computing, the time is ripe for a focused research programme on simulation-based studies of relativistic shocks. This proposal therefore focuses on using self-consistent ab initio massively parallel simulations to study the physics of relativistic shocks, bridging the gap between the multidimensional microphysics of shock onset, formation, and propagation and the global system dynamics. Particular focus will be given to the shock acceleration mechanisms and the radiation signatures of the various physical processes, with the goal of solving some of the central questions in plasma/relativistic phenomena in astrophysics and in the laboratory, and opening new avenues between theoretical/massive computational studies, laboratory experiments and astrophysical observations.
Max ERC Funding
1 588 800 €
Duration
Start date: 2011-06-01, End date: 2016-07-31
Project acronym ACCI
Project Atmospheric Chemistry-Climate Interactions
Researcher (PI) John Adrian Pyle
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARSOF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE10, ERC-2010-AdG_20100224
Summary Global change involves a large number of complex interactions between various earth system processes. In the atmosphere, one component of the earth system, there are crucial feedbacks between physical, chemical and biological processes. Thus many of the drivers of climate change depend on chemical processes in the atmosphere including, in addition to ozone and water vapour, methane, nitrous oxide, the halocarbons as well as a range of inorganic and organic aerosols. The link between chemistry and climate is two-way and changes in climate can influence atmospheric chemistry processes in a variety of ways.
Previous studies have looked at these interactions in isolation but the time is now right for more comprehensive studies. The crucial contribution that will be made here is in improving our understanding of the processes within this complex system. Process understanding has been the hallmark of my previous work. The earth system scope here will be ambitiously wide but with a similar drive to understand fundamental processes.
The ambitious programme of research is built around four interrelated questions using new state-of-the-art modelling tools: How will the composition of the stratosphere change in the future, given changes in the concentrations of ozone depleting substances and greenhouse gases? How will these changes in the stratosphere affect tropospheric composition and climate? How will the composition of the troposphere change in the future, given changes in the emissions of ozone precursors and greenhouse gases? How will these changes in the troposphere affect the troposphere-stratosphere climate system?
ACCI will break new ground in bringing all of these questions into a single modelling and diagnostic framework, enabling interrelated questions to be answered which should radically improve our overall projections for global change.
Max ERC Funding
2 496 926 €
Duration
Start date: 2011-05-01, End date: 2017-04-30
Project acronym ACCLIMATE
Project Elucidating the Causes and Effects of Atlantic Circulation Changes through Model-Data Integration
Researcher (PI) Claire Waelbroeck
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary Rapid changes in ocean circulation and climate have been observed in marine sediment and ice cores, notably over the last 60 thousand years (ky), highlighting the non-linear character of the climate system and underlining the possibility of rapid climate shifts in response to anthropogenic greenhouse gas forcing.
To date, these rapid changes in climate and ocean circulation are still not fully explained. Two main obstacles prevent going beyond the current state of knowledge:
- Paleoclimatic proxy data are, in essence, only indirect indicators of the climatic variables, and thus cannot be directly compared with model outputs;
- A 4-D (latitude, longitude, water depth, time) reconstruction of Atlantic water masses over the past 40 ky is lacking: previous studies have generated isolated records with disparate timescales which do not allow the causes of circulation changes to be identified.
Overcoming these two major limitations will lead to major breakthroughs in climate research. Concretely, I will create the first database of Atlantic deep-sea records over the last 40 ky, and extract full climatic information from these records through an innovative model-data integration scheme using an isotopic proxy forward modeling approach. The novelty and exceptional potential of this scheme is twofold: (i) it avoids hypotheses on proxy interpretation and hence suppresses or strongly reduces the errors of interpretation of paleoclimatic records; (ii) it produces states of the climate system that best explain the observations over the last 40 ky, while being consistent with the model physics.
Expected results include:
• The elucidation of the mechanisms explaining rapid changes in ocean circulation and climate over the last 40 ky,
• Improved climate model physics and parameterizations,
• The first projections of future climate changes obtained with a model able to reproduce the highly non linear behavior of the climate system observed over the last 40 ky.
Max ERC Funding
3 000 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ACCOMPLI
Project Assembly and maintenance of a co-regulated chromosomal compartment
Researcher (PI) Peter Burkhard Becker
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Advanced Grant (AdG), LS2, ERC-2011-ADG_20110310
Summary "Eukaryotic nuclei are organised into functional compartments – local microenvironments that are enriched in certain molecules or biochemical activities and therefore specify localised functional outputs. Our study seeks to unveil fundamental principles of co-regulation of genes in a chromosomal compartment and the preconditions for homeostasis of such a compartment in the dynamic nuclear environment.
The dosage-compensated X chromosome of male Drosophila flies satisfies the criteria for a functional compartment. It is rendered structurally distinct from all other chromosomes by association of a regulatory ribonucleoprotein ‘Dosage Compensation Complex’ (DCC), enrichment of histone modifications and global decondensation. As a result, most genes on the X chromosome are co-ordinately activated. Autosomal genes inserted into the X acquire X-chromosomal features and are subject to the X-specific regulation.
We seek to uncover the molecular principles that initiate, establish and maintain the dosage-compensated chromosome. We will follow the kinetics of DCC assembly and the timing of association with different types of chromosomal targets in nuclei with high spatial resolution afforded by sub-wavelength microscopy and deep sequencing of DNA binding sites. We will characterise DCC sub-complexes with respect to their roles as kinetic assembly intermediates or as representations of local, functional heterogeneity. We will evaluate the role of a novel DCC-associated ubiquitin ligase activity in homeostasis.
Crucial to the recruitment of the DCC and its distribution to target genes are non-coding roX RNAs that are transcribed from the X. We will determine the secondary structure ‘signatures’ of roX RNAs in vitro and determine the binding sites of the protein subunits in vivo. By biochemical and cellular reconstitution we will test the hypothesis that roX-encoded RNA aptamers orchestrate the assembly of the DCC and contribute to the exquisite targeting of the complex."
Max ERC Funding
2 482 770 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ACCOPT
Project ACelerated COnvex OPTimization
Researcher (PI) Yurii NESTEROV
Host Institution (HI) UNIVERSITE CATHOLIQUE DE LOUVAIN
Call Details Advanced Grant (AdG), PE1, ERC-2017-ADG
Summary The amazing rate of progress in computer technologies and telecommunications presents many new challenges for Optimization Theory. New problems are usually very large, have very special structure, and possibly have distributed data support. This makes them unsolvable by the standard optimization methods. In these situations, old theoretical models, based on the hidden Black-Box information, cannot work. New theoretical and algorithmic solutions are urgently needed. In this project we will concentrate on the development of fast optimization methods for problems of big and very big size. All the new methods will be endowed with provable efficiency guarantees for large classes of optimization problems arising in practical applications. Our main tool is the acceleration technique developed for the standard Black-Box methods as applied to smooth convex functions. However, we will have to adapt it to deal with different situations.
The first line of development will be based on the smoothing technique as applied to non-smooth functions. We propose to substantially extend this approach to generate approximate solutions in relative scale. The second line of research will be related to applying acceleration techniques to second-order methods minimizing functions with sparse Hessians. Finally, we aim to develop fast gradient methods for huge-scale problems. The size of these problems is so big that even the usual vector operations are extremely expensive. Thus, we propose to develop new methods with sublinear iteration costs. In our approach, the main source for achieving improvements will be the proper use of problem structure.
Our overall aim is to be able to solve in a routine way many important problems, which currently look unsolvable. Moreover, the theoretical development of Convex Optimization will reach the state, when there is no gap between theory and practice: the theoretically most efficient methods will definitely outperform any homebred heuristics.
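As background, the “acceleration technique … for smooth convex functions” mentioned above refers to Nesterov-style momentum, which improves the gradient method’s O(1/k) rate in function value to O(1/k²). A minimal sketch on a toy least-squares problem (illustrative only, not the project’s code; function and variable names are my own):

```python
import numpy as np

def nesterov_agd(grad, x0, L, steps=500):
    """Nesterov's accelerated gradient method for an L-smooth convex function.

    Plain gradient descent converges at rate O(1/k) in function value;
    the momentum ("extrapolation") step below improves this to O(1/k^2).
    """
    x = np.asarray(x0, dtype=float)
    y = x.copy()
    t = 1.0
    for _ in range(steps):
        x_next = y - grad(y) / L                      # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)  # momentum extrapolation
        x, t = x_next, t_next
    return x

# Toy problem: minimize f(x) = 0.5 * ||A x - b||^2
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 0.0])
grad = lambda x: A.T @ (A @ x - b)
L = np.linalg.norm(A.T @ A, 2)   # largest eigenvalue of A^T A bounds the smoothness
x_star = nesterov_agd(grad, np.zeros(2), L)
```

The proposal’s point is that this Black-Box scheme must be adapted: smoothed for non-smooth objectives, combined with second-order information for sparse Hessians, and given sublinear-cost iterations for huge-scale problems.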
Max ERC Funding
2 090 038 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym ACCRETE
Project Accretion and Early Differentiation of the Earth and Terrestrial Planets
Researcher (PI) David Crowhurst Rubie
Host Institution (HI) UNIVERSITAT BAYREUTH
Call Details Advanced Grant (AdG), PE10, ERC-2011-ADG_20110209
Summary Formation of the Earth and the other terrestrial planets of our Solar System (Mercury, Venus and Mars) commenced 4.568 billion years ago and occurred on a time scale of about 100 million years. These planets grew by the process of accretion, which involved numerous collisions with smaller (Moon- to Mars-size) bodies. Impacts with such bodies released sufficient energy to cause large-scale melting and the formation of deep “magma oceans”. Such magma oceans enabled liquid metal to separate from liquid silicate, sink and accumulate to form the metallic cores of the planets. Thus core formation in terrestrial planets was a multistage process, intimately related to the major impacts during accretion, that determined the chemistry of planetary mantles. However, until now, accretion, as modelled by astrophysicists, and core formation, as modelled by geochemists, have been treated as completely independent processes. The fundamental and crucial aim of this ambitious interdisciplinary proposal is to integrate astrophysical models of planetary accretion with geochemical models of planetary differentiation together with cosmochemical constraints obtained from meteorites. The research will involve integrating new models of planetary accretion with core formation models based on the partitioning of a large number of elements between liquid metal and liquid silicate that we will determine experimentally at pressures up to about 100 gigapascals (equivalent to 2400 km deep in the Earth). By comparing our results with the known physical and chemical characteristics of the terrestrial planets, we will obtain a comprehensive understanding of how these planets formed, grew and evolved, both physically and chemically, with time. The integration of chemistry and planetary differentiation with accretion models is a new ground-breaking concept that will lead, through synergies and feedback, to major new advances in the Earth and planetary sciences.
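The stated equivalence between about 100 gigapascals and a depth of 2400 km can be sanity-checked with a simple hydrostatic estimate, P ≈ ρgh. This is a rough illustrative sketch, not the radially varying density profile such experiments would actually be calibrated against; the constant mean mantle density (4500 kg/m³) and constant g are assumptions chosen for illustration.

```python
# Rough hydrostatic check of the stated equivalence: ~100 GPa at ~2400 km depth.
# Uses P = rho * g * h with an assumed constant mean density and constant g,
# which is only a back-of-the-envelope approximation.
MEAN_MANTLE_DENSITY = 4500.0  # kg/m^3, assumed average over the upper ~2400 km
G = 9.8                       # m/s^2, treated as constant with depth

def pressure_at_depth(depth_km: float) -> float:
    """Approximate lithostatic pressure (in GPa) at a given depth in km."""
    depth_m = depth_km * 1e3
    return MEAN_MANTLE_DENSITY * G * depth_m / 1e9  # convert Pa -> GPa

print(round(pressure_at_depth(2400.0)))  # on the order of 100 GPa
```

The estimate lands near 106 GPa, consistent with the abstract's figure of "about 100 gigapascals" at 2400 km.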
Max ERC Funding
1 826 200 €
Duration
Start date: 2012-05-01, End date: 2018-04-30
Project acronym ACCUPOL
Project Unlimited Growth? A Comparative Analysis of Causes and Consequences of Policy Accumulation
Researcher (PI) Christoph KNILL
Host Institution (HI) LUDWIG-MAXIMILIANS-UNIVERSITAET MUENCHEN
Call Details Advanced Grant (AdG), SH2, ERC-2017-ADG
Summary ACCUPOL systematically analyzes an intuitively well-known, but curiously under-researched phenomenon: policy accumulation. Societal modernization and progress bring about a continuously growing pile of policies in most political systems. At the same time, however, the administrative capacities for implementation are largely stagnant. While societally desirable in principle, ever more policies may hence deliver less in terms of actual policy achievements. Whether or not policy accumulation remains at a ‘sustainable’ rate thus crucially affects the long-term output legitimacy of modern democracies.
Given this development, the central focus of ACCUPOL lies on three questions: Do accumulation rates vary across countries and policy sectors? Which factors mitigate policy accumulation? And to what extent is policy accumulation really associated with an increasing prevalence of implementation deficits? In answering these questions, ACCUPOL radically departs from established research traditions in public policy.
First, the project develops new analytical concepts: Rather than relying on individual policy change as the unit of analysis, we consider policy accumulation to assess the growth of policy portfolios over time. In terms of implementation, ACCUPOL takes into account the overall prevalence of implementation deficits in a given sector instead of analyzing the effectiveness of individual implementation processes.
Second, this analytical innovation also implies a paradigmatic theoretical shift. Because existing theories focus on the analysis of individual policies, they are of limited help to understand causes and consequences of policy accumulation. ACCUPOL develops a novel theoretical approach to fill this theoretical gap.
Third, the project provides new empirical evidence on the prevalence of policy accumulation and implementation deficits focusing on 25 OECD countries and two key policy areas (social and environmental policy).
Max ERC Funding
2 359 000 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym ACETOGENS
Project Acetogenic bacteria: from basic physiology via gene regulation to application in industrial biotechnology
Researcher (PI) Volker MÜLLER
Host Institution (HI) JOHANN WOLFGANG GOETHE-UNIVERSITATFRANKFURT AM MAIN
Call Details Advanced Grant (AdG), LS9, ERC-2016-ADG
Summary Demand for biofuels and other biologically derived commodities is growing worldwide as efforts increase to reduce reliance on fossil fuels and to limit climate change. Most commercial approaches rely on fermentation of organic matter, with its inherent problems of producing CO2 and competing with the human food supply. These problems are avoided if CO2 can be used as the feedstock. Autotrophic organisms can fix CO2 into chemicals that are used as building blocks for the synthesis of cellular components (biomass). Acetate-forming bacteria (acetogens) require neither light nor oxygen for this, and they can be used in bioreactors to reduce CO2 with hydrogen gas, carbon monoxide or an organic substrate. Gas fermentation using these bacteria has already been realized on an industrial level in two pre-commercial 100,000 gal/yr demonstration facilities to produce fuel ethanol from abundant waste gas resources (by LanzaTech). Acetogens can metabolise a wide variety of substrates that could be used for the production of biocommodities. However, their broad use to produce biofuels and platform chemicals from substrates other than gases or together with gases is hampered by our very limited knowledge about their metabolism and ability to use different substrates simultaneously. Nearly nothing is known about the regulatory processes involved in substrate utilization or product formation, but this is an absolute requirement for metabolic engineering approaches. The aim of this project is to provide this basic knowledge about metabolic routes in the acetogenic model strain Acetobacterium woodii and their regulation. We will unravel the function of “organelles” found in this bacterium and explore their potential as bio-nanoreactors for the production of biocommodities and pave the road for the industrial use of A. woodii in energy (hydrogen) storage.
Thus, this project creates cutting-edge opportunities for the development of biosustainable technologies in Europe.
Max ERC Funding
2 497 140 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ACMO
Project Systematic dissection of molecular machines and neural circuits coordinating C. elegans aggregation behaviour
Researcher (PI) Mario De Bono
Host Institution (HI) MEDICAL RESEARCH COUNCIL
Call Details Advanced Grant (AdG), LS5, ERC-2010-AdG_20100317
Summary Elucidating how neural circuits coordinate behaviour, and how molecules underpin the properties of individual neurons are major goals of neuroscience. Optogenetics and neural imaging combined with the powerful genetics and well-described nervous system of C. elegans offer special opportunities to address these questions. Previously, we identified a series of sensory neurons that modulate aggregation of C. elegans. These include neurons that respond to O2, CO2, noxious cues, satiety state, and pheromones. We propose to take our analysis to the next level by dissecting how, in mechanistic molecular terms, these distributed inputs modify the activity of populations of interneurons and motoneurons to coordinate group formation. Our strategy is to develop new, highly parallel approaches to replace the traditional piecemeal analysis.
We propose to:
1) Harness next-generation sequencing (NGS) for forward genetics, to rapidly identify a molecular “parts list” for aggregation. Much of the genetics has been done: we have identified almost 200 mutations that inhibit or enhance aggregation but otherwise show no overt phenotype. A pilot study of 50 of these mutations suggests they identify dozens of genes not previously implicated in aggregation. NGS will allow us to molecularly identify these genes in a few months, providing multiple entry points to study molecular and circuitry mechanisms for behaviour.
2) Develop new methods to image the activity of populations of neurons in immobilized and freely moving animals, using genetically encoded indicators such as the calcium sensor cameleon and the voltage indicator mermaid.
This will be the first time a complex behaviour has been dissected in this way. We expect to identify novel conserved molecular and circuitry mechanisms.
Max ERC Funding
2 439 996 €
Duration
Start date: 2011-04-01, End date: 2017-03-31
Project acronym ACRCC
Project Understanding the atmospheric circulation response to climate change
Researcher (PI) Theodore Shepherd
Host Institution (HI) THE UNIVERSITY OF READING
Call Details Advanced Grant (AdG), PE10, ERC-2013-ADG
Summary Computer models based on known physical laws are our primary tool for predicting climate change. Yet the state-of-the-art models exhibit a disturbingly wide range of predictions of future climate change, especially when examined at the regional scale, which has not decreased as the models have become more comprehensive. The reasons for this are not understood. This represents a basic challenge to our fundamental understanding of climate.
The divergence of model projections is presumably related to systematic model errors in the large-scale fluxes of heat, moisture and momentum that control regional aspects of climate. That these errors stubbornly persist in spite of increases in the spatial resolution of the models suggests that they are associated with errors in the representation of unresolved processes, whose effects must be parameterised.
Most attention in climate science has hitherto focused on the thermodynamic aspects of climate. Dynamical aspects, which involve the atmospheric circulation, have received much less attention. However, regional climate, including persistent climate regimes and extremes, is strongly controlled by atmospheric circulation patterns, which exhibit chaotic variability and whose representation in climate models depends sensitively on parameterised processes. Moreover, the dynamical aspects of model projections are much less robust than the thermodynamic ones. There are good reasons to believe that model bias, the divergence of model projections, and chaotic variability are somehow related, although the relationships are not well understood. This calls for studying them together.
My proposed research will focus on this problem, addressing these three aspects of the atmospheric circulation response to climate change in parallel: (i) diagnosing the sources of model error; (ii) elucidating the relationship between model error and the spread in model projections; (iii) understanding the physical mechanisms of atmospheric variability.
Max ERC Funding
2 489 151 €
Duration
Start date: 2014-03-01, End date: 2020-02-29
Project acronym ACROSS
Project 3D Reconstruction and Modeling across Different Levels of Abstraction
Researcher (PI) Leif Kobbelt
Host Institution (HI) RHEINISCH-WESTFAELISCHE TECHNISCHE HOCHSCHULE AACHEN
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Digital 3D models are gaining more and more importance in diverse application fields ranging from computer graphics, multimedia and simulation sciences to engineering, architecture, and medicine. Powerful technologies to digitize the 3D shape of real objects and scenes are becoming available even to consumers. However, the raw geometric data emerging from, e.g., 3D scanning or multi-view stereo often lacks a consistent structure and meta-information which are necessary for the effective deployment of such models in sophisticated down-stream applications like animation, simulation, or CAD/CAM that go beyond mere visualization. Our goal is to develop new fundamental algorithms which transform raw geometric input data into augmented 3D models that are equipped with structural meta information such as feature aligned meshes, patch segmentations, local and global geometric constraints, statistical shape variation data, or even procedural descriptions. Our methodological approach is inspired by the human perceptual system that integrates bottom-up (data-driven) and top-down (model-driven) mechanisms in its hierarchical processing. Similarly we combine algorithms operating on different levels of abstraction into reconstruction and modeling networks. Instead of developing an individual solution for each specific application scenario, we create an eco-system of algorithms for automatic processing and interactive design of highly complex 3D models. A key concept is the information flow across all levels of abstraction in a bottom-up as well as top-down fashion. We not only aim at optimizing geometric representations but in fact at bridging the gap between reconstruction and recognition of geometric objects. The results from this project will make it possible to bring 3D models of real world objects into many highly relevant applications in science, industry, and entertainment, greatly reducing the excessive manual effort that is still necessary today."
Max ERC Funding
2 482 000 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym Actanthrope
Project Computational Foundations of Anthropomorphic Action
Researcher (PI) Jean Paul Laumond
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE7, ERC-2013-ADG
Summary Actanthrope intends to promote a neuro-robotics perspective to explore original models of anthropomorphic action. The project targets contributions to humanoid robot autonomy (for rescue and service robotics), to advanced human body simulation (for applications in ergonomics), and to a new theory of embodied intelligence (by promoting a motion-based semiotics of the human action).
Actions take place in the physical space while they originate in the (robot or human) sensory-motor space. Geometry is the core abstraction that makes the link between these spaces. Considering that the structure of actions inherits from that of the body, the underlying intuition is that actions can be segmented within discrete sub-spaces lying in the entire continuous posture space. Such sub-spaces are viewed as symbols bridging deliberative reasoning and reactive control. Actanthrope argues that geometric approaches to motion segmentation and generation are promising and innovative routes to explore embodied intelligence:
- Motion segmentation: what are the sub-manifolds that define the structure of a given action?
- Motion generation: among all the solution paths within a given sub-manifold, what is the underlying law that makes the selection?
In Robotics these questions are related to the competition between abstract symbol manipulation and physical signal processing. In Computational Neuroscience the questions refer to the quest of motion invariants. The ambition of the project is to promote a dual perspective: exploring the computational foundations of human action to make better robots, while simultaneously doing better robotics to better understand human action.
A unique “Anthropomorphic Action Factory” supports the methodology. It aims at attracting to a single lab researchers with complementary know-how and solid mathematical backgrounds. All of them will benefit from unique equipment, while being stimulated by four challenges dealing with locomotion and manipulation actions.
Max ERC Funding
2 500 000 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym ACTINONSRF
Project MAL: an actin-regulated SRF transcriptional coactivator
Researcher (PI) Richard Treisman
Host Institution (HI) THE FRANCIS CRICK INSTITUTE LIMITED
Call Details Advanced Grant (AdG), LS1, ERC-2010-AdG_20100317
Summary MAL: an actin-regulated SRF transcriptional coactivator
Recent years have seen a revitalised interest in the role of actin in nuclear processes, but the molecular mechanisms involved remain largely unexplored. We will elucidate the molecular basis for the actin-based control of the SRF transcriptional coactivator, MAL. SRF controls transcription through two families of coactivators, the actin-binding MRTFs (MAL, Mkl2), which couple its activity to cytoskeletal dynamics, and the ERK-regulated TCFs (Elk-1, SAP-1, Net). MAL subcellular localisation and transcriptional activity respond to signal-induced changes in G-actin concentration, which are sensed by its actin-binding N-terminal RPEL domain. Members of a second family of RPEL proteins, the Phactrs, also exhibit actin-regulated nucleocytoplasmic shuttling. The proposal addresses the following novel features of actin biology:
- Actin as a transcriptional regulator
- Actin as a signalling molecule
- Actin-binding proteins as targets for regulation by actin, rather than regulators of actin function
We will analyse the sequences and proteins involved in actin-regulated nucleocytoplasmic shuttling, using structural biology and biochemistry to analyse its control by changes in actin-RPEL domain interactions. We will characterise the dynamics of shuttling, and develop reporters for changes in actin-MAL interaction for analysis of pathway activation in vivo. We will identify genes controlling MAL itself, and the balance between the nuclear and cytoplasmic actin pools. The mechanism by which actin represses transcriptional activation by MAL in the nucleus, and its relation to MAL phosphorylation, will be elucidated. Finally, we will map MRTF and TCF cofactor recruitment to SRF targets on a genome-wide scale, and identify the steps in transcription controlled by actin-MAL interaction.
Max ERC Funding
1 889 995 €
Duration
Start date: 2011-10-01, End date: 2017-09-30
Project acronym ActiveCortex
Project Active dendrites and cortical associations
Researcher (PI) Matthew Larkum
Host Institution (HI) HUMBOLDT-UNIVERSITAET ZU BERLIN
Call Details Advanced Grant (AdG), LS5, ERC-2014-ADG
Summary Converging studies from psychophysics in humans to single-cell recordings in monkeys and rodents indicate that most important cognitive processes depend on both feed-forward and feedback information interacting in the brain. Intriguingly, feedback to early cortical processing stages appears to play a causal role in these processes. Despite the central nature of this fact to understanding brain cognition, there is still no mechanistic explanation as to how this information could be so pivotal and what events take place that might be decisive. In this research program, we will test the hypothesis that the extraordinary performance of the cortex derives from an associative mechanism built into the basic neuronal unit: the pyramidal cell. The hypothesis is based on two important facts: (1) feedback information is conveyed predominantly to layer 1 and (2) the apical tuft dendrites that are the major recipient of this feedback information are highly electrogenic.
The research program is divided into several work packages to systematically investigate the hypothesis at every level. As a whole, we will investigate the causal link between intrinsic cellular activity and behaviour. To do this we will use electrophysiological and optical techniques to record and influence the intrinsic properties of cells (in particular dendritic activity) in vivo and in vitro in rodents. In vivo experiments will have a specific focus on context-driven behaviour, and in vitro experiments on the impact of long-range (feedback-carrying) fibers on cell activity. The study will also focus on synaptic plasticity at the interface of feedback information and dendritic electrogenesis, namely synapses onto the tuft dendrites of pyramidal neurons. The proposed program will not only address a long-standing and important hypothesis but also provide a transformational contribution towards understanding the operation of the cerebral cortex.
Max ERC Funding
2 386 304 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym ACTOMYOSIN RING
Project Understanding Cytokinetic Actomyosin Ring Assembly Through Genetic Code Expansion, Click Chemistry, DNA origami, and in vitro Reconstitution
Researcher (PI) Mohan Balasubramanian
Host Institution (HI) THE UNIVERSITY OF WARWICK
Call Details Advanced Grant (AdG), LS3, ERC-2014-ADG
Summary The mechanism of cell division is conserved in many eukaryotes, from yeast to man. A contractile ring of filamentous actin and myosin II motors generates the force to bisect a mother cell into two daughters. The actomyosin ring is among the most complex cellular machines, comprising over 150 proteins. Understanding how these proteins organize themselves into a functional ring with appropriate contractile properties remains one of the great challenges in cell biology. Efforts to generate a comprehensive understanding of the mechanism of actomyosin ring assembly have been hampered by the lack of structural information on the arrangement of actin, myosin II, and actin modulators in the ring in its native state. Fundamental questions such as how actin filaments are assembled and organized into a ring remain actively debated. This project will investigate key issues pertaining to cytokinesis in the fission yeast Schizosaccharomyces pombe, which divides employing an actomyosin-based contractile ring, using the methods of genetics, biochemistry, cellular imaging, DNA origami, genetic code expansion, and click chemistry. Specifically, we will (1) attempt to visualize actin filament assembly in live cells expressing fluorescent actin generated through synthetic biological approaches, including genetic code expansion and click chemistry; (2) decipher actin filament polarity in the actomyosin ring using total internal reflection fluorescence microscopy of labelled dimeric and multimeric myosins V and VI generated through DNA origami approaches; (3) address when, where, and how actin filaments for cytokinesis are assembled and organized into a ring; and (4) reconstitute actin filament and functional actomyosin ring assembly in permeabilized spheroplasts and in supported bilayers. Success in the project will provide major insight into the mechanism of actomyosin ring assembly and illuminate principles behind cytoskeletal self-organization.
Max ERC Funding
2 863 705 €
Duration
Start date: 2015-11-01, End date: 2020-10-31
Project acronym ADAM
Project The Adaptive Auditory Mind
Researcher (PI) Shihab Shamma
Host Institution (HI) ECOLE NORMALE SUPERIEURE
Call Details Advanced Grant (AdG), SH4, ERC-2011-ADG_20110406
Summary Listening in realistic situations is an active process that engages perceptual and cognitive faculties, endowing speech with meaning, music with joy, and environmental sounds with emotion. Through hearing, humans and other animals navigate complex acoustic scenes, separate sound mixtures, and assess their behavioral relevance. These remarkable feats are currently beyond our understanding and exceed the capabilities of the most sophisticated audio engineering systems. The goal of the proposed research is to investigate experimentally a novel view of hearing, where active hearing emerges from a deep interplay between adaptive sensory processes and goal-directed cognition. Specifically, we shall explore the postulate that versatile perception is mediated by rapid plasticity at the neuronal level. At the conjunction of sensory and cognitive processing, rapid plasticity pervades all levels of the auditory system, from the cochlea up to the auditory and prefrontal cortices. Exploiting fundamental statistical regularities of acoustics, it is what allows humans and other animals to deal so successfully with natural acoustic scenes where artificial systems fail. The project builds on the internationally recognized leadership of the PI in the fields of physiology and computational modeling, combined with the expertise of the Co-Investigator in psychophysics. Building on these highly complementary fields and several technical innovations, we hope to promote a novel view of auditory perception and cognition. We also aim to contribute significantly to translational research in the domain of signal processing for clinical hearing aids, given that many current limitations are not technological but rather conceptual. Finally, the project will result in the creation of laboratory facilities and an intellectual network unique in France and rare in all of Europe, combining cognitive, neural, and computational approaches to auditory neuroscience.
Max ERC Funding
3 199 078 €
Duration
Start date: 2012-10-01, End date: 2018-09-30
Project acronym ADAPT
Project The Adoption of New Technological Arrays in the Production of Broadcast Television
Researcher (PI) John Cyril Paget Ellis
Host Institution (HI) ROYAL HOLLOWAY AND BEDFORD NEW COLLEGE
Call Details Advanced Grant (AdG), SH5, ERC-2012-ADG_20120411
Summary "Since 1960, the television industry has undergone successive waves of technological change. Both the methods of programme making and the programmes themselves have changed substantially. The current opening of TV’s vast archives to public and academic use has emphasised the need to explain old programming to new users. Why particular programmes are like they are is not obvious to the contemporary viewer: the prevailing technologies imposed limits and enabled forms that have fallen into disuse. The project will examine the processes of change which gave rise to the particular dominant configurations of technologies for sound and image capture and processing, and some idea of the national and regional variants that existed. It will emphasise the capabilities of the machines in use rather than the process of their invention. The project therefore studies how the technologies of film and tape were implemented; how both broadcasters and individual filmers coped with the conflicting demands of the different machines at their disposal; how new ‘standard ways of doing things’ gradually emerged; and how all of this enabled desired changes in the resultant programmes. The project will produce an overall written account of the principal changes in the technologies in use in broadcast TV since 1960 to the near present. It will offer a theory of technological innovation, and a major case study in the adoption of digital workflow management in production for broadcasting: the so-called ‘tapeless environment’ which is currently being implemented in major organisations. It will offer two historical case studies: a longditudinal study of the evolution of tape-based sound recording and one of the rapid change from 16mm film cutting to digital editing, a process that took less than five years. Reconstructions of the process of working with particular technological arrays will be filmed and will be made available as explanatory material for any online archive of TV material ."
Max ERC Funding
1 680 121 €
Duration
Start date: 2013-08-01, End date: 2018-07-31
Project acronym ADAPT
Project Life in a cold climate: the adaptation of cereals to new environments and the establishment of agriculture in Europe
Researcher (PI) Terence Austen Brown
Host Institution (HI) THE UNIVERSITY OF MANCHESTER
Call Details Advanced Grant (AdG), SH6, ERC-2013-ADG
Summary "This project explores the concept of agricultural spread as analogous to enforced climate change and asks how cereals adapted to new environments when agriculture was introduced into Europe. Archaeologists have long recognized that the ecological pressures placed on crops would have had an impact on the spread and subsequent development of agriculture, but previously there has been no means of directly assessing the scale and nature of this impact. Recent work that I have directed has shown how such a study could be carried out, and the purpose of this project is to exploit these breakthroughs with the goal of assessing the influence of environmental adaptation on the spread of agriculture, its adoption as the primary subsistence strategy, and the subsequent establishment of farming in different parts of Europe. This will correct the current imbalance between our understanding of the human and environmental dimensions to the domestication of Europe. I will use methods from population genomics to identify loci within the barley and wheat genomes that have undergone selection since the beginning of cereal cultivation in Europe. I will then use ecological modelling to identify those loci whose patterns of selection are associated with ecogeographical variables and hence represent adaptations to local environmental conditions. I will assign dates to the periods when adaptations occurred by sequencing ancient DNA from archaeobotanical assemblages and by computer methods that enable the temporal order of adaptations to be deduced. 
I will then synthesise the information on environmental adaptations with dating evidence for the spread of agriculture in Europe, which reveals pauses that might be linked to environmental adaptation, with demographic data that indicate regions where Neolithic populations declined, possibly due to inadequate crop productivity, and with an archaeobotanical database showing changes in the prevalence of individual cereals in different regions."
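The population-genomics approach described above rests on locus-by-locus differentiation statistics. As a purely illustrative sketch (the project's actual pipeline is not specified here), Wright's Fst for a single biallelic locus can be computed from the allele frequencies of two populations:

```python
def fst(p1, p2):
    """Wright's Fst for one biallelic locus, from the frequencies p1 and p2
    of the same allele in two populations (illustrative simplification)."""
    p_bar = (p1 + p2) / 2                                # pooled allele frequency
    h_t = 2 * p_bar * (1 - p_bar)                        # total expected heterozygosity
    h_s = (2 * p1 * (1 - p1) + 2 * p2 * (1 - p2)) / 2    # mean within-population
    return 0.0 if h_t == 0 else (h_t - h_s) / h_t

# Identical populations give Fst = 0; alternative alleles fixed in the two
# populations give Fst = 1. Loci with outlying Fst are candidate selection targets.
```

In a real scan, such a statistic would be computed genome-wide and outlier loci tested against ecogeographical variables, as the summary describes.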
Max ERC Funding
2 492 964 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym AdaptiveResponse
Project The evolution of adaptive response mechanisms
Researcher (PI) Franz WEISSING
Host Institution (HI) RIJKSUNIVERSITEIT GRONINGEN
Call Details Advanced Grant (AdG), LS8, ERC-2017-ADG
Summary In an era of rapid climate change there is a pressing need to understand whether and how organisms are able to adapt to novel environments. Such understanding is hampered by a major divide in the life sciences. Disciplines like systems biology or neurobiology make rapid progress in unravelling the mechanisms underlying the responses of organisms to their environment, but this knowledge is insufficiently integrated in eco-evolutionary theory. Current eco-evolutionary models focus on the response patterns themselves, largely neglecting the structures and mechanisms producing these patterns. Here I propose a new, mechanism-oriented framework that views the architecture of adaptation, rather than the resulting responses, as the primary target of natural selection. I am convinced that this change in perspective will yield fundamentally new insights, necessitating the re-evaluation of many seemingly well-established eco-evolutionary principles.
My aim is to develop a comprehensive theory of the eco-evolutionary causes and consequences of the architecture underlying adaptive responses. In three parallel lines of investigation, I will study how architecture is shaped by selection, how evolved response strategies reflect the underlying architecture, and how these responses affect the eco-evolutionary dynamics and the capacity to adapt to novel conditions. All three lines have the potential of making ground-breaking contributions to eco-evolutionary theory, including: the specification of evolutionary tipping points; resolving the puzzle that real organisms evolve much faster than predicted by current theory; a new and general explanation for the evolutionary emergence of individual variation; and a framework for studying the evolution of learning and other general-purpose mechanisms. By making use of concepts from information theory and artificial intelligence, the project will also introduce various methodological innovations.
Max ERC Funding
2 500 000 €
Duration
Start date: 2018-12-01, End date: 2023-11-30
Project acronym ADDECCO
Project Adaptive Schemes for Deterministic and Stochastic Flow Problems
Researcher (PI) Remi Abgrall
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE ENINFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE1, ERC-2008-AdG
Summary The numerical simulation of complex compressible flow problems is still a challenge nowadays, even for simple models. In our opinion, the most important open problem that currently needs to be tackled is how to obtain stable, scalable, very accurate schemes that are easy to code and maintain on complex geometries. The method should easily handle mesh refinement, even near the boundary where the most interesting engineering quantities have to be evaluated. Unsteady uncertainties in the model, for example in the geometry or the boundary conditions, should be represented efficiently. The goal of this proposal is to design, develop and evaluate solutions to each of the above problems. Our work program will lead to significant breakthroughs for flow simulations. More specifically, we propose to work on 3 connected problems: (1) A class of very high order numerical schemes able to easily deal with the geometry of boundaries and still solve steep problems. The geometry is generally defined by CAD tools. The output is used to generate a mesh which is then used by the scheme. Hence, any mesh refinement process is disconnected from the CAD, a situation that prevents the spread of mesh adaptation techniques in industry! (2) A class of very high order numerical schemes which can utilize possibly solution-dependent basis functions in order to lower the number of degrees of freedom, for example to compute boundary layers accurately at low resolutions. (3) A general non-intrusive technique for handling uncertainties, in order to deal with irregular probability density functions (pdfs) and also to handle pdfs that may evolve in time, for example through an optimisation loop. The curse of dimensionality will be dealt with using Harten's multiresolution method combined with sparse-grid methods. Currently, to our knowledge, no scheme has all of these properties. This research program will have an impact on numerical schemes and industrial applications.
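The non-intrusive uncertainty handling of point (3) treats the flow solver as a black box evaluated at quadrature nodes of the uncertain input. A minimal one-dimensional sketch of this idea, illustrative only (the project targets far harder, high-dimensional cases via multiresolution and sparse grids):

```python
import numpy as np

def collocation_stats(model, n_nodes=8):
    """Mean and variance of model(X), with X uniform on [-1, 1], by
    non-intrusive stochastic collocation: the model is only *evaluated*
    at Gauss-Legendre nodes, never modified internally."""
    nodes, weights = np.polynomial.legendre.leggauss(n_nodes)
    vals = np.array([model(x) for x in nodes])
    # Gauss-Legendre weights sum to 2 on [-1, 1]; the factor 1/2 is the
    # uniform probability density.
    mean = 0.5 * np.sum(weights * vals)
    var = 0.5 * np.sum(weights * (vals - mean) ** 2)
    return mean, var

# Exact for polynomial responses up to degree 2*n_nodes - 1.
mean, var = collocation_stats(lambda x: x**2)
```

The same pattern extends to several uncertain parameters by tensorizing the nodes, which is exactly where the curse of dimensionality appears and where the sparse-grid methods mentioned above come in.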
Max ERC Funding
1 432 769 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym ADEQUATE
Project Advanced optoelectronic Devices with Enhanced QUAntum efficiency at THz frEquencies
Researcher (PI) Carlo Sirtori
Host Institution (HI) UNIVERSITE PARIS DIDEROT - PARIS 7
Call Details Advanced Grant (AdG), PE3, ERC-2009-AdG
Summary The aim of this project is the realisation of efficient mid-infrared and THz optoelectronic emitters. This work is motivated by the fact that spontaneous emission in this frequency range is characterized by an extremely long lifetime when compared to non-radiative processes, giving rise to devices with very low quantum efficiency. To this end we want to develop hybrid light-matter systems, already well known in quantum optics, within optoelectronic devices that will be driven by electrical injection. With this project we want to extend the field of optoelectronics by introducing some of the concepts of quantum optics, particularly light-matter strong coupling, into semiconductor devices. More precisely, this project aims at the implementation of novel optoelectronic emitters operating in the strong coupling regime between an intersubband excitation of a two-dimensional electron gas and a microcavity photonic mode. The quasiparticles issued from this coupling are called intersubband polaritons. The major difficulties and challenges of this project do not lie in the observation of these quantum effects, but in their exploitation for a specific function, in particular an efficient electrical-to-optical conversion. To obtain efficient quantum emitters in the THz frequency range we will follow two different approaches: - In the first case we will try to exploit the additional characteristic time of the system introduced by the light-matter interaction in the strong (or ultra-strong) coupling regime. - The second approach will exploit the fact that, under certain conditions, intersubband polaritons have a bosonic character; as a consequence they can undergo stimulated scattering, giving rise to polariton lasers, as has been shown for excitonic polaritons.
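The strong-coupling regime between the intersubband excitation and the cavity mode is commonly captured by a two-coupled-oscillator model; a minimal sketch, purely illustrative and in arbitrary energy units:

```python
import numpy as np

def polariton_branches(e_cavity, e_isb, g):
    """Upper and lower polariton energies of the two-coupled-oscillator model.

    e_cavity: bare microcavity photon energy
    e_isb:    bare intersubband transition energy
    g:        light-matter coupling (half the vacuum Rabi splitting)
    """
    mean = 0.5 * (e_cavity + e_isb)
    detuning = 0.5 * (e_cavity - e_isb)
    split = np.sqrt(detuning**2 + g**2)   # eigenvalues of [[e_c, g], [g, e_isb]]
    return mean - split, mean + split

# At zero detuning the two branches anticross with a splitting of exactly 2g.
lp, up = polariton_branches(100.0, 100.0, 5.0)
```

Sweeping the cavity energy through resonance traces out the characteristic anticrossing that signals the strong-coupling regime.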
Max ERC Funding
1 761 000 €
Duration
Start date: 2010-05-01, End date: 2015-04-30
Project acronym ADMIRE
Project Atomic-scale Design of Majorana states and their Innovative Real-space Exploration
Researcher (PI) Roland WIESENDANGER
Host Institution (HI) UNIVERSITAET HAMBURG
Call Details Advanced Grant (AdG), PE3, ERC-2017-ADG
Summary Fault-tolerant topological quantum computation has become one of the most exciting research directions in modern condensed matter physics. As a key operation the braiding of non-Abelian anyons has been proposed theoretically. Such exotic quasiparticles can be realized as zero-energy Majorana bound states at the ends of one-dimensional magnetic nanowires in proximity to s-wave superconductors in the presence of high spin-orbit coupling. In contrast to previous attempts to realize such systems experimentally, based on the growth of semiconducting nanowires or the self-assembly of ferromagnetic nanowires on s-wave superconductors, we propose to design Majorana bound states in artificially constructed single-atom chains with non-collinear spin-textures on elemental superconducting substrates using scanning tunnelling microscope (STM)-based atom manipulation techniques. We would like to study at the atomic level the formation of Shiba bands as a result of hybridization of individual Shiba impurity states as well as the emergence of zero-energy Majorana bound states as a function of chain structure, length, and composition. Moreover, we will construct model-type platforms, such as T-junctions, rings, and more complex network structures with atomic-scale precision as a basis for demonstrating the manipulation and braiding of Majorana bound states. We will make use of sophisticated experimental techniques, such as spin-resolved scanning tunnelling spectroscopy (STS) at micro-eV energy resolution, scanning Josephson tunnelling spectroscopy, and multi-probe STS under well-defined ultra-high vacuum conditions, in order to directly probe the nature of the magnetic state of the atomic wires, the spin-polarization of the emergent Majorana states, as well as the spatial nature of the superconducting order parameter in real space. 
Finally, we will try to directly probe the quantum exchange statistics of non-Abelian anyons in these atomically precisely fabricated model systems.
Max ERC Funding
2 499 750 €
Duration
Start date: 2019-01-01, End date: 2023-12-31
Project acronym AdOMiS
Project Adaptive Optical Microscopy Systems: Unifying theory, practice and applications
Researcher (PI) Martin BOOTH
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE7, ERC-2015-AdG
Summary Recent technological advances in optical microscopy have vastly broadened the possibilities for applications in the biomedical sciences. Fluorescence microscopy is the central tool for investigation of molecular structures and dynamics that take place in the cellular and tissue environment. Coupled with progress in labeling methods, these microscopes permit observation of biological structures and processes with unprecedented sensitivity and resolution. This work has been enabled by the engineering development of diverse optical systems that provide different capabilities for the imaging toolkit. All such methods rely upon high fidelity optics to provide optimal resolution and efficiency, but they all suffer from aberrations caused by refractive index variations within the specimen. It is widely accepted that in many applications this fundamental problem prevents optimum operation and limits capability. Adaptive optics (AO) has been introduced to overcome these limitations by correcting aberrations and a range of demonstrations has shown clearly its potential. Indeed, it shows great promise to improve virtually all types of research or commercial microscopes, but significant challenges must still be met before AO can be widely implemented in routine imaging. Current advances are being made through development of bespoke AO solutions to individual imaging tasks. However, the diversity of microscopy methods means that individual solutions are often not translatable to other systems. This proposal is directed towards the creation of theoretical and practical frameworks that tie together AO concepts and provide a suite of scientific tools with broad application. This will be achieved through a systems approach that encompasses theoretical modelling, optical engineering and the requirements of biological applications. 
Additional outputs will include practical designs, operating protocols and software algorithms that will support next generation AO microscope systems.
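The modal aberration-correction principle at the heart of AO can be sketched numerically (a deliberately simplified, hypothetical example: a few low-order Zernike-like modes stand in for a real wavefront sensor and deformable mirror): the specimen-induced wavefront is fitted in a modal basis and the correction applies the negative of the fitted coefficients.

```python
import numpy as np

# Simplified AO sketch (hypothetical modes and values): an aberrated
# wavefront is expanded in a small modal basis; applying the negative
# of the fitted coefficients removes the aberration.
rng = np.random.default_rng(1)
npix = 64
y, x = np.mgrid[-1:1:npix * 1j, -1:1:npix * 1j]
r2 = x**2 + y**2
pupil = r2 <= 1.0

# A few low-order Zernike-like modes: defocus, two astigmatisms, coma-like
modes = np.stack([2*r2 - 1, x**2 - y**2, 2*x*y, (3*r2 - 2)*x]) * pupil

coeffs = rng.normal(0.0, 0.3, len(modes))        # unknown specimen aberration
wavefront = np.tensordot(coeffs, modes, axes=1)

# "Sensing" step: least-squares fit of the wavefront in the mode basis
A = modes.reshape(len(modes), -1).T
fitted, *_ = np.linalg.lstsq(A, wavefront.ravel(), rcond=None)

residual = wavefront - np.tensordot(fitted, modes, axes=1)
rms_before = np.sqrt(np.mean(wavefront[pupil]**2))
rms_after = np.sqrt(np.mean(residual[pupil]**2))
print(rms_before, rms_after)  # residual RMS drops to numerical noise
```

Real systems differ in exactly the ways the proposal targets: the aberration is not spanned by a small fixed basis, sensing is indirect or image-based, and the best modal basis depends on the microscope type.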
Max ERC Funding
3 234 789 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym ADOR
Project Assembly-disassembly-organisation-reassembly of microporous materials
Researcher (PI) Russell MORRIS
Host Institution (HI) THE UNIVERSITY COURT OF THE UNIVERSITY OF ST ANDREWS
Call Details Advanced Grant (AdG), PE5, ERC-2017-ADG
Summary Microporous materials are an important class of solid; the two main members of this family are zeolites and metal-organic frameworks (MOFs). Zeolites are industrial solids whose applications range from catalysis, through ion exchange and adsorption technologies to medicine. MOFs are some of the most exciting new materials to have been developed over the last two decades, and they are just beginning to be applied commercially.
Over recent years the applicant’s group has developed a new synthetic strategy to prepare microporous materials, the Assembly-Disassembly-Organisation-Reassembly (ADOR) process. In significant preliminary work the ADOR process has been shown to be an extremely important new synthetic methodology that differs fundamentally from traditional solvothermal methods.
In this project I will look to overturn the conventional thinking in materials science by developing methodologies that can target both zeolites and MOF materials that are difficult to prepare using traditional methods – the so-called ‘unfeasible’ materials. The importance of such a new methodology is that it will open up routes to materials that have different properties (both chemical and topological) to those we currently have. Since zeolites and MOFs have so many actual and potential uses, the preparation of materials with different properties has a high chance of leading to new technologies in the medium/long term. To complete the major objective I will look to complete four closely linked activities covering the development of design strategies for zeolites and MOFs (activities 1 & 2), mechanistic studies to understand the process at the molecular level using in situ characterisation techniques (activity 3) and an exploration of potential applied science for the prepared materials (activity 4).
Max ERC Funding
2 489 220 €
Duration
Start date: 2018-10-01, End date: 2023-09-30
Project acronym ADORA
Project Asymptotic approach to spatial and dynamical organizations
Researcher (PI) Benoit PERTHAME
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE1, ERC-2016-ADG
Summary The understanding of the spatial, social and dynamical organization of large numbers of agents is presently a fundamental issue in modern science. ADORA focuses on problems motivated by biology because, more than anywhere else, access to abundant, precise data has opened the route to novel and complex biomathematical models. The problems we address are written in terms of nonlinear partial differential equations. The flux-limited Keller-Segel system, the integrate-and-fire Fokker-Planck equation, kinetic equations with internal state, nonlocal parabolic equations and constrained Hamilton-Jacobi equations are among the equations under investigation.
The role of mathematics is not only to understand the analytical structure of these new problems, but it is also to explain the qualitative behavior of solutions and to quantify their properties. The challenge arises here because these goals should be achieved through a hierarchy of scales. Indeed, the problems under consideration share the common feature that the large scale behavior cannot be understood precisely without access to a hierarchy of finer scales, down to the individual behavior and sometimes its molecular determinants.
Major difficulties arise because the numerous scales present in these equations have to be discovered, and singularities appear in the asymptotic process, which yields deep compactness obstructions. Our vision is that the complexity inherent to models of biology can be enlightened by mathematical analysis and a classification of the possible asymptotic regimes.
However, an enormous effort is needed to uncover these equations' intimate mathematical structures and to bring them to the level of conceptual understanding they deserve, given the applications motivating these questions, which range from medical science and neuroscience to cell biology.
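As a concrete instance of the equations listed above, a minimal explicit finite-difference sketch of the classical (not flux-limited) 1D Keller-Segel system illustrates the divergence-form transport structure that conserves total cell mass (all parameters below are illustrative):

```python
import numpy as np

# Minimal explicit sketch of the classical 1D Keller-Segel system
# (the proposal studies flux-limited variants; parameters illustrative):
#   rho_t = D rho_xx - chi (rho c_x)_x ,   c_t = c_xx + rho - c
nx, L = 100, 10.0
dx, dt = L / nx, 1e-4
D, chi = 1.0, 0.5
x = np.linspace(0.0, L, nx, endpoint=False)
rho = 1.0 + 0.1 * np.cos(2.0 * np.pi * x / L)   # cell density
c = np.zeros(nx)                                 # chemoattractant

def lap(u):   # periodic Laplacian
    return (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2

def ddx(u):   # periodic centered first derivative
    return (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)

mass0 = rho.sum() * dx
for _ in range(1000):
    flux = chi * rho * ddx(c)                    # chemotactic flux
    rho = rho + dt * (D * lap(rho) - ddx(flux))
    c = c + dt * (lap(c) + rho - c)

# The divergence form of the transport term conserves total cell mass
print(abs(rho.sum() * dx - mass0))  # ~ 0 up to roundoff
```

The singularity formation (chemotactic blow-up) and its suppression by flux limitation, central to the project, occur in parameter regimes beyond this benign illustrative one.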
Max ERC Funding
2 192 500 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ADOS
Project AMPA Receptor Dynamic Organization and Synaptic transmission in health and disease
Researcher (PI) Daniel Georges Gustave Choquet
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), LS5, ERC-2013-ADG
Summary AMPA glutamate receptors (AMPAR) play key roles in information processing by the brain as they mediate nearly all fast excitatory synaptic transmission. Their spatio-temporal organization in the post synapse with respect to presynaptic glutamate release sites is a key determinant in synaptic transmission. The activity-dependent regulation of AMPAR organization is at the heart of synaptic plasticity processes underlying learning and memory. Dysfunction of synaptic transmission - hence AMPAR organization - is likely at the origin of a number of brain diseases.
Building on discoveries made during my past ERC grant, our new ground-breaking objective is to uncover the mechanisms that link synaptic transmission with the dynamic organization of AMPAR and associated proteins. For this aim, we have assembled a team of neurobiologists, computer scientists and chemists with a track record of collaboration. We will combine physiology, cellular and molecular neurobiology with development of novel quantitative imaging and biomolecular tools to probe the molecular dynamics that regulate synaptic transmission.
Live high content 3D SuperResolution Light Imaging (SRLI) combined with electron microscopy will allow unprecedented visualization of AMPAR organization in synapses at the scale of individual subunits up to the level of intact tissue. Simultaneous SRLI and electrophysiology will elucidate the intricate relations between dynamic AMPAR organization, trafficking and synaptic transmission. Novel peptide- and small protein-based probes used as protein-protein interaction reporters and modulators will be developed to image and directly interfere with synapse organization.
We will identify new processes that are fundamental to activity dependent modifications of synaptic transmission. We will apply the above findings to understand the causes of early cognitive deficits in models of neurodegenerative disorders and open new avenues of research for innovative therapies.
Max ERC Funding
2 491 157 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym ADREEM
Project Adding Another Dimension – Arrays of 3D Bio-Responsive Materials
Researcher (PI) Mark Bradley
Host Institution (HI) THE UNIVERSITY OF EDINBURGH
Call Details Advanced Grant (AdG), LS9, ERC-2013-ADG
Summary This proposal is focused in the areas of chemical medicine and chemical biology with the key drivers being the discovery and development of new materials that have practical functionality and application. The project will enable the fabrication of thousands of three-dimensional “smart-polymers” that will allow: (i). The precise and controlled release of drugs upon the addition of either a small molecule trigger or in response to disease, (ii). The discovery of materials that control and manipulate cells with the identification of scaffolds that provide the necessary biochemical cues for directing cell fate and drive tissue regeneration and (iii). The development of new classes of “smart-polymers” able, in real-time, to sense and report bacterial contamination. The newly discovered materials will find multiple biomedical applications in regenerative medicine and biotechnology ranging from 3D cell culture, bone repair and niche stabilisation to bacterial sensing/removal, while offering a new paradigm in drug delivery with biomarker triggered drug release.
Max ERC Funding
2 310 884 €
Duration
Start date: 2014-11-01, End date: 2019-10-31
Project acronym AdS-CFT-solvable
Project Origins of integrability in AdS/CFT correspondence
Researcher (PI) Vladimir Kazakov
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE2, ERC-2012-ADG_20120216
Summary Fundamental interactions in nature are well described by quantum gauge fields in 4 space-time dimensions (4d). When the strength of the gauge interaction is weak, Feynman perturbation techniques are very efficient for the description of most of the experimentally observable consequences of the Standard Model and for the study of high-energy processes in QCD.
But in the intermediate and strong coupling regime, such as at the relatively low energies in QCD, perturbation theory fails, leaving us with no reliable analytic methods (except Monte-Carlo simulation). The project aims at working out new analytic and computational methods for strongly coupled gauge theories in 4d. We will employ two important discoveries for that purpose: 1) the gauge-string duality (AdS/CFT correspondence), relating certain strongly coupled gauge Conformal Field Theories to weakly coupled string theories on Anti-de Sitter space; 2) the solvability, or integrability, of maximally supersymmetric (N=4) 4d super Yang-Mills (SYM) theory in the multicolor limit. Integrability made possible pioneering exact numerical and analytic results in the N=4 multicolor SYM at any coupling, effectively summing up all 4d Feynman diagrams. Recently, we conjectured a system of functional equations - the AdS/CFT Y-system - for the exact spectrum of anomalous dimensions of all local operators in N=4 SYM. The conjecture has passed all available checks. My project is aimed at understanding the origins of this still mysterious integrability. Deriving the AdS/CFT Y-system from first principles on both sides of the gauge-string duality should provide a long-awaited proof of the AdS/CFT correspondence itself. I plan to use the Y-system to study the systematic weak and strong coupling expansions and the so-called BFKL limit, as well as for the calculation of multi-point correlation functions of N=4 SYM. We hope for new insights into the strong coupling dynamics of less supersymmetric gauge theories and of QCD.
Max ERC Funding
1 456 140 €
Duration
Start date: 2013-11-01, End date: 2018-10-31
Project acronym ADSNeSP
Project Active and Driven Systems: Nonequilibrium Statistical Physics
Researcher (PI) Michael Elmhirst CATES
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARSOF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE3, ERC-2016-ADG
Summary Active Matter systems, such as self-propelled colloids, violate time-reversal symmetry by producing entropy locally, typically converting fuel into mechanical motion at the particle scale. Other driven systems instead produce entropy because of global forcing by external fields, or boundary conditions that impose macroscopic fluxes (such as the momentum flux across a fluid sheared between moving parallel walls).
Nonequilibrium statistical physics (NeSP) is the basic toolbox for both classes of system. In recent years, much progress in NeSP has stemmed from bottom-up work on driven systems. This has provided a number of exactly solved benchmark models, and extended approximation techniques to address driven non-ergodic systems, such as sheared glasses. Meanwhile, work on fluctuation theorems and stochastic thermodynamics has created profound, model-independent insights into dynamics far from equilibrium.
More recently, the field of Active Matter has moved forward rapidly, leaving in its wake a series of generic and profound NeSP questions that now need answers: When is time-reversal symmetry, broken at the microscale, restored by coarse-graining? If it is restored, is an effective thermodynamic description possible? How different is an active system's behaviour from a globally forced one?
ADSNeSP aims to distil from recent Active Matter research such fundamental questions; answer them first in the context of specific models and second in more general terms; and then, using the tools and insights gained, shed new light on longstanding problems in the wider class of driven systems.
I believe these new tools and insights will be substantial, because local activity takes systems far from equilibrium in a conceptually distinct direction from most types of global driving. By focusing on general principles and on simple models of activity, I seek to create a new vantage point that can inform, and potentially transform, wider areas of statistical physics.
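The particle-scale breaking of detailed balance described above can be made concrete with a minimal active Brownian particle sketch (all parameters are illustrative assumptions): self-propulsion with rotational diffusion gives ballistic motion at short times that coarse-grains into effective diffusion with D_eff = v0^2/(2*Dr) in 2D at long times, an elementary example of microscale activity being restored to equilibrium-like behaviour by coarse-graining.

```python
import numpy as np

# Minimal active Brownian particle (ABP) sketch. Self-propulsion at
# fixed speed v0 plus rotational diffusion Dr breaks detailed balance
# at the particle scale; all parameter values here are illustrative.
rng = np.random.default_rng(0)
n, steps, dt = 500, 2000, 0.01
v0, Dr = 1.0, 0.5

pos = np.zeros((n, 2))
theta = rng.uniform(0.0, 2.0 * np.pi, n)
for _ in range(steps):
    pos[:, 0] += v0 * np.cos(theta) * dt
    pos[:, 1] += v0 * np.sin(theta) * dt
    theta += np.sqrt(2.0 * Dr * dt) * rng.standard_normal(n)

# For t >> 1/Dr the persistent motion coarse-grains into ordinary
# diffusion with effective diffusivity D_eff = v0**2 / (2 * Dr) in 2D.
t_total = steps * dt
msd = np.mean(np.sum(pos**2, axis=1))
print(msd / (4.0 * t_total))  # approaches D_eff = 1.0 as t grows
```

The interesting regimes for the project are precisely those where such an effective equilibrium-like description fails, e.g. in interacting active systems exhibiting motility-induced phase separation.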
Max ERC Funding
2 043 630 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym AEDNA
Project Amorphous and Evolutionary DNA Nanotechnology
Researcher (PI) Friedrich SIMMEL
Host Institution (HI) TECHNISCHE UNIVERSITAET MUENCHEN
Call Details Advanced Grant (AdG), PE5, ERC-2015-AdG
Summary Amorphous and evolutionary DNA nanotechnology (AEDNA) explores novel conceptual directions and applications for DNA nanotechnology, which are based on intelligent, DNA-programmed soft hybrid materials, and the utilization of evolutionary principles for the optimization of nucleic acid nanocomponents.
Amorphous DNA nanotechnology first aims at the creation of cell-sized, DNA-programmed microgels – DNA cells – with sensor, computation, communication, and actuator functions. Interacting DNA cells will be arranged into chemical cell consortia and artificial tissues using microfluidics, micromanipulation and 3D bioprinting techniques. Spatially distributed chemical circuits will then be utilized to establish collective behaviors such as quorum sensing, pattern formation, and self-differentiation within these consortia and tissues. The approach will be further scaled up to produce multicomponent DNA gel compositions that become active and differentiate upon mixing.
In evolutionary nanotechnology, techniques derived from directed molecular evolution experiments will be applied to optimize the arrangement of functional nucleic acids on DNA and RNA nanoscaffolds. Compartmentalization and microfluidics will be utilized to screen for nucleic acid nanostructures capable of superstructure formation, and also for the development of ligand-sensitive components for molecular programming. An evolutionary approach will then be applied to amorphous DNA cells, resulting in DNA cell populations which contain individuals with different molecular identities.
The proposal will pave the way for the creation of macroscopic materials with DNA-programmed intelligence, resulting in novel applications for DNA nanotechnology and molecular programming in diverse fields such as environmental and biological sensing, biocatalysis, smart adaptive materials, and soft robotics.
Max ERC Funding
2 157 698 €
Duration
Start date: 2016-06-01, End date: 2021-05-31
Project acronym AEROCAT
Project Non-ordered nanoparticle superstructures – aerogels as efficient (electro-)catalysts
Researcher (PI) Alexander Eychmüller
Host Institution (HI) TECHNISCHE UNIVERSITAET DRESDEN
Call Details Advanced Grant (AdG), PE5, ERC-2013-ADG
Summary "AEROCAT aims to elucidate the potential of nanoparticle-derived aerogels in catalytic applications. The materials will be produced from a variety of nanoparticles available as colloidal solutions, among them metals and metal oxides. The resulting aerogels are extremely light, highly porous solids that have been shown, in many cases, to retain the important properties of the nanosized objects they consist of rather than simply those of the corresponding bulk solids. The aerogel materials will be characterized with respect to their morphology and composition, and their (electro-)catalytic properties will be examined in light of the inherent electronic nature of their nanosized constituents. Using the knowledge gained within the project, the aerogel materials will be further processed in order to exploit their full potential for catalysis and electrocatalysis.
From the vast variety of possible applications of nanoparticle-based hydro- and aerogels (thermoelectrics, LEDs, pollutant clearance, sensing, and others) we choose this strictly focused approach
(i) because of the paramount importance of catalysis for the chemical industry,
(ii) because we have successfully studied ethanol electrooxidation on a Pd-nanoparticle aerogel,
(iii) because we hold a patent on the oxygen reduction reaction in fuel cells with bimetallic aerogels,
(iv) and because we have obtained first and extremely promising results on the semi-hydrogenation of acetylene on a mixed Pd/ZnO-nanoparticle aerogel.
With this we are at the forefront of a research field whose impact can hardly be overestimated. We should quickly explore its potential and, on a short track, transfer the knowledge gained into pre-industrial testing."
Max ERC Funding
2 194 000 €
Duration
Start date: 2014-02-01, End date: 2019-01-31
Project acronym AestApp
Project The Aesthetics of Applied Theatre
Researcher (PI) Matthias Warstat
Host Institution (HI) FREIE UNIVERSITAET BERLIN
Call Details Advanced Grant (AdG), SH5, ERC-2011-ADG_20110406
Summary The project aims to systematically explore an entire field of current forms of theatre, which despite its outstanding cultural and political significance has so far largely been ignored by theatre studies. Over the past two decades, notwithstanding intense competition from television and electronic media, theatre has been able to reassert and even reinforce its relevance in many different parts of the world and in widely diverse cultural fields (politics, business, social work, development aid, health care, and education). This renewed relevance originates not in traditional, experimental, or commercial theatre but rather among the many different types of applied theatre, which set in motion constructive social processes while upholding theatre’s aesthetic claim. Theatre with clear social, political, or economic aims is experiencing an unprecedented boom. The study will analyse this cross-cultural trend against the background of new theories of the aesthetics of performances and rehearsal processes. This theatre studies approach promises precise insights into the aesthetic forms of applied theatre, which constitute the (hitherto barely researched) foundation of its political effects. It will furthermore bring to light the ethical issues of applied theatre: intense aesthetic experiences – often linked with risks when it comes to performances – do not readily fit in with the claim to restore children, youngsters, patients, and other target groups to health, integrity, and self-confidence through theatrical practice. The project aims to show how aesthetic, political, and ethical aspects interact in the practice of applied theatre. Investigations will focus on carefully selected case studies in Africa, Europe, the Middle East, and Latin America, whose comparison will make it possible for the first time to capture the worldwide landscape of applied theatre in its full diversity, but also in its overarching structures and interrelations.
Max ERC Funding
2 285 295 €
Duration
Start date: 2012-12-01, End date: 2017-11-30
Project acronym AFMIDMOA
Project "Applying Fundamental Mathematics in Discrete Mathematics, Optimization, and Algorithmics"
Researcher (PI) Alexander Schrijver
Host Institution (HI) UNIVERSITEIT VAN AMSTERDAM
Call Details Advanced Grant (AdG), PE1, ERC-2013-ADG
Summary "This proposal aims at strengthening the connections between the more fundamentally oriented areas of mathematics, like algebra, geometry, analysis, and topology, and the more application-oriented, more recently emerging disciplines of discrete mathematics, optimization, and algorithmics.
The overall goal of the project is to obtain, with methods from fundamental mathematics, new effective tools to unravel the complexity of structures like graphs, networks, codes, knots, polynomials, and tensors, and to get a grip on such complex structures by new efficient characterizations, sharper bounds, and faster algorithms.
In the last few years, there have been several new developments where methods from representation theory, invariant theory, algebraic geometry, measure theory, functional analysis, and topology found new applications in discrete mathematics and optimization, both theoretically and algorithmically. Among the typical application areas are networks, coding, routing, timetabling, statistical and quantum physics, and computer science.
The project focuses in particular on:
A. Understanding partition functions with invariant theory and algebraic geometry
B. Graph limits, regularity, Hilbert spaces, and low rank approximation of polynomials
C. Reducing complexity in optimization by exploiting symmetry with representation theory
D. Reducing complexity in discrete optimization by homotopy and cohomology
These research modules are interconnected by themes like symmetry, regularity, and complexity, and by common methods from algebra, analysis, geometry, and topology."
Max ERC Funding
2 001 598 €
Duration
Start date: 2014-01-01, End date: 2018-12-31
Project acronym AFRICA-GHG
Project AFRICA-GHG: The role of African tropical forests on the Greenhouse Gases balance of the atmosphere
Researcher (PI) Riccardo Valentini
Host Institution (HI) FONDAZIONE CENTRO EURO-MEDITERRANEOSUI CAMBIAMENTI CLIMATICI
Call Details Advanced Grant (AdG), PE10, ERC-2009-AdG
Summary The role of the African continent in the global carbon cycle, and therefore in climate change, is increasingly recognised. Despite the acknowledged importance of Africa in the global carbon cycle and its high vulnerability to climate change, there is still a lack of studies on the carbon cycle in representative African ecosystems (in particular tropical forests) and on the effects of climate on ecosystem-atmosphere exchange. In the present proposal we focus on these specific objectives: 1. Understand the role of African tropical rainforests in the GHG balance of the atmosphere and revise their role in global methane and N2O emissions. 2. Determine the carbon source/sink strength of African tropical rainforests in the pre-industrial era versus the 20th century by temporal reconstruction of biomass growth with biogeochemical markers. 3. Understand and quantify the variability of carbon and GHG fluxes across African tropical forests (west-east equatorial belt). 4. Analyse the impact of forest degradation and deforestation on carbon and other GHG emissions.
Max ERC Funding
2 406 950 €
Duration
Start date: 2010-04-01, End date: 2014-12-31
Project acronym AFRIGOS
Project African Governance and Space: Transport Corridors, Border Towns and Port Cities in Transition
Researcher (PI) Paul Christopher Nugent
Host Institution (HI) THE UNIVERSITY OF EDINBURGH
Call Details Advanced Grant (AdG), SH2, ERC-2014-ADG
Summary AFRIGOS investigates the process of 'respacing' Africa, a political drive towards regional and continental integration, on the one hand, and the re-casting of Africa's engagement with the global economy, on the other. This is reflected in unprecedented levels of investment in physical and communications infrastructure, and the outsourcing of key functions of Customs, Immigration and security agencies. AFRIGOS poses the question of how far respacing is genuinely forging institutions that are facilitating or obstructing the movement of people and goods; that are enabling or preventing urban and border spaces from being more effectively and responsively governed; and that take into account the needs of African populations whose livelihoods are rooted in mobility and informality. The principal research questions are approached through a comparative study of port cities, border towns and other strategic nodes situated along the busiest transport corridors in East, Central, West and Southern Africa. These represent sites of remarkable dynamism and cosmopolitanism, which reflects their role in connecting African urban centres to each other and to other global cities.
AFRIGOS considers how governance 'assemblages' are forged at different scales and is explicitly comparative. It works through 5 connected Streams that address specific questions: 1. AGENDA-SETTING is concerned with policy (re-)formulation. 2. PERIPHERAL URBANISM examines governance in border towns and port cities. 3. BORDER WORKERS addresses everyday governance emerging through the interaction of officials and others who make their livelihoods from the border. 4. CONNECTIVE INFRASTRUCTURE looks at the transformative effects of new technologies. 5. PEOPLE & GOODS IN MOTION traces the passage of people and goods and the regimes of regulation to which they are subjected. AFRIGOS contributes to interdisciplinary research on borderland studies, multi-level governance and the everyday state.
Max ERC Funding
2 491 364 €
Duration
Start date: 2016-01-01, End date: 2020-12-31
Project acronym AFTERTHEGOLDRUSH
Project Addressing global sustainability challenges by changing perceptions in catalyst design
Researcher (PI) Graham John Hutchings
Host Institution (HI) CARDIFF UNIVERSITY
Call Details Advanced Grant (AdG), PE4, ERC-2011-ADG_20110209
Summary One of the greatest challenges facing society is the sustainability of resources. At present, a step change in the sustainable use of resources is needed, and catalysis lies at the heart of the solution by providing new routes to carbon dioxide mitigation, energy security and water conservation. It is clear that new, high-efficiency, game-changing catalysts are required to meet the challenge. This proposal will focus on excellence in catalyst design, learning from recent step-change advances in gold catalysis and challenging established perceptions. Intense interest in gold catalysts over the past two decades has accelerated our understanding of gold particle-size effects, gold-support and gold-metal interactions, the interchange between atomic and ionic gold species, and the role of the gold-support interface in creating and maintaining catalytic activity. The field has also driven the development of cutting-edge techniques, particularly in microscopy and transient kinetics, providing detailed structural characterisation on the nano-scale and probing short-range and often short-lived interactions. By comparison, our understanding of other metal catalysts has remained relatively static.
The proposed programme will engender a step change in the design of supported-metal catalysts by exploiting the learning and the techniques emerging from gold catalysis. The research will be set out in two themes. In Theme 1, two established key grand challenges will be attacked, namely energy vectors and greenhouse gas control. Theme 2 will address two new and emerging grand challenges in catalysis, namely the effective low-temperature activation of primary carbon-hydrogen bonds, and CO2 utilisation, where instead of treating CO2 as a thermodynamic endpoint the aim will be to re-use it as a feedstock for bulk chemical and fuel production. The legacy of the research will be the development of a new catalyst design approach that will provide a toolbox for future catalyst development.
Max ERC Funding
2 279 785 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym AGATM
Project A Global Anthropology of Transforming Marriage
Researcher (PI) Janet CARSTEN
Host Institution (HI) THE UNIVERSITY OF EDINBURGH
Call Details Advanced Grant (AdG), SH5, ERC-2015-AdG
Summary This research will create a new theoretical vision of the importance of marriage as an agent of transformation in human sociality. Marriage globally is undergoing profound change, provoking intense debate and anxiety. These concerns refract wider instabilities in political, economic, and familial institutions. They signal the critical role of marriage in bringing together - and separating - intimate, personal, and familial life with wider state institutions. But we have little up-to-date comparative research or general theory of how marriage changes, or of the long-term significance of such change. Paradoxically, social scientific and public discourse emphasise the conservative and normative aspects of marriage. This underlines the need for a new theoretical frame that takes account of cultural and historical specificity to grasp the importance of marriage as both vehicle of and engine for transformation. AGATM overturns conventional understandings by viewing marriage as inherently transformative, indeed at the heart of social and cultural change. The research will investigate current transformations of marriage in two distinct senses. First, it will undertake an ethnographic investigation of new forms of marriage in selected sites in Europe, N. America, Asia, and Africa. Second, it will subject ‘marriage’ to a rigorous theoretical critique that will denaturalise marriage and reintegrate it into the new anthropology of kinship. Research on five complementary and contrastive sub-projects examining emerging forms of marriage in different locations will be structured through the themes of care, property, and ritual forms. The overarching analytic of temporality will frame the theoretical vision of the research and connect the themes. The resulting six monographs, journal articles, and exhibition will together revitalise the study of kinship by placing the moral, practical, political, and imaginative significance of marriage over time at its centre.
Max ERC Funding
2 297 584 €
Duration
Start date: 2017-01-01, End date: 2021-12-31
Project acronym AGNES
Project ACTIVE AGEING – RESILIENCE AND EXTERNAL SUPPORT AS MODIFIERS OF THE DISABLEMENT OUTCOME
Researcher (PI) Taina Tuulikki RANTANEN
Host Institution (HI) JYVASKYLAN YLIOPISTO
Call Details Advanced Grant (AdG), SH3, ERC-2015-AdG
Summary The goals are: 1. To develop a scale assessing the diversity of active ageing along four dimensions: ability (what people can do), activity (what people actually do), ambition (what valued activities people want to do), and autonomy (how satisfied people are with the opportunity to do valued activities); 2. To examine health and physical and psychological functioning as determinants, and the social and built environment, resilience and personal skills as modifiers, of active ageing; 3. To develop a multicomponent, sustainable intervention aiming to promote active ageing (methods: counselling, information technology, help from volunteers); 4. To test the feasibility and effectiveness of the intervention; and 5. To study cohort effects on the phenotypes on the pathway to active ageing.
“If You Can Measure It, You Can Change It.” Active ageing assessment needs conceptual progress, which I propose to make. A quantifiable scale will be developed that captures the diversity of active ageing, stemming from the WHO definition of active ageing as the process of optimizing opportunities for health and participation in society for all people, in line with their needs, goals and capacities as they age. I will collect cross-sectional data (N=1000, ages 75, 80 and 85 years) and model the pathway to active ageing with state-of-the-art statistical methods. By doing this I will create novel knowledge on the preconditions for active ageing. The collected cohort data will be compared to pre-existing cohort data collected 25 years ago to obtain knowledge about changes over time in the functioning of older people. A randomized controlled trial (N=200) will be conducted to assess the effectiveness of the envisioned intervention promoting active ageing through participation. The project will regenerate ageing research by launching a novel scale, by training young scientists, by creating new concepts and theory, and by producing evidence for the promotion of active ageing.
Max ERC Funding
2 044 364 €
Duration
Start date: 2016-09-01, End date: 2021-08-31
Project acronym AGNOSTIC
Project Actively Enhanced Cognition based Framework for Design of Complex Systems
Researcher (PI) Björn Ottersten
Host Institution (HI) UNIVERSITE DU LUXEMBOURG
Call Details Advanced Grant (AdG), PE7, ERC-2016-ADG
Summary Parameterized mathematical models have been central to the understanding and design of communication, networking, and radar systems. However, they often lack the ability to model intricate interactions innate in complex systems. On the other hand, data-driven approaches do not need explicit mathematical models for data generation and have a wider applicability at the cost of flexibility. These approaches need labelled data representing all the facets of the system's interaction with the environment. With the aforementioned systems becoming increasingly complex, with intricate interactions, and operating in dynamic environments, the number of system configurations can be rather large, leading to a paucity of labelled data. Thus, there are emerging networks of systems of critical importance whose cognition is not effectively covered by traditional approaches. AGNOSTIC uses the process of exploration through system probing and exploitation of observed data in an iterative manner, drawing upon traditional model-based approaches and data-driven discriminative learning to enhance functionality, performance, and robustness through the notion of active cognition. AGNOSTIC clearly departs from a passive assimilation of data and aims to formalize the exploitation/exploration framework in dynamic environments. The development of this framework in three application areas is central to AGNOSTIC. The project aims to provide active cognition in radar to learn the environment and other active systems, to ensure situational awareness and coexistence; to apply active probing in radio access networks to infer network behaviour towards spectrum sharing and self-configuration; and to learn and adapt to user demand for content distribution in caching networks, drastically improving network efficiency.
Although these cognitive systems interact with the environment in very different ways, sufficient abstraction allows cross-fertilization of insights and approaches motivating their joint treatment.
Max ERC Funding
2 499 595 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym AGRISCENTS
Project Scents and sensibility in agriculture: exploiting specificity in herbivore- and pathogen-induced plant volatiles for real-time crop monitoring
Researcher (PI) Theodoor Turlings
Host Institution (HI) UNIVERSITE DE NEUCHATEL
Call Details Advanced Grant (AdG), LS9, ERC-2017-ADG
Summary Plants typically release large quantities of volatiles in response to attack by herbivores or pathogens. I may claim to have contributed to various breakthroughs in this research field, including the discovery that the volatile blends induced by different attackers are astonishingly specific, resulting in characteristic, readily distinguishable odour blends. Using maize as our model plant, I wish to take several leaps forward in our understanding of this signal specificity and use this knowledge to develop sensors for the real-time detection of crop pests and diseases. For this, three interconnected work-packages will aim to:
• Develop chemical analytical techniques and statistical models to decipher the odorous vocabulary of plants, and to create a complete inventory of “odour-prints” for a wide range of herbivore-plant and pathogen-plant combinations, including simultaneous infestations.
• Develop and optimize nano-mechanical sensors for the detection of specific plant volatile mixtures. For this, we will initially adapt a prototype sensor that has been successfully developed for the detection of cancer-related volatiles in human breath.
• Genetically manipulate maize plants to release a unique blend of root-produced volatiles upon herbivory. For this, we will engineer gene cassettes that combine recently identified P450 (CYP) genes from poplar with inducible, root-specific promoters from maize. This will result in maize plants that, in response to pest attack, release easy-to-detect aldoximes and nitriles from their roots.
In short, by investigating and manipulating the specificity of inducible odour blends we will generate the necessary know-how to develop a novel odour-detection device. The envisioned sensor technology will permit real-time monitoring of pests and enable farmers to apply crop protection treatments at the right time and in the right place.
Max ERC Funding
2 498 086 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym AGRIWESTMED
Project Origins and spread of agriculture in the south-western Mediterranean region
Researcher (PI) Maria Leonor Peña Chocarro
Host Institution (HI) AGENCIA ESTATAL CONSEJO SUPERIOR DEINVESTIGACIONES CIENTIFICAS
Call Details Advanced Grant (AdG), SH6, ERC-2008-AdG
Summary This project focuses on one of the most fascinating events in the long history of the human species: the origins and spread of agriculture. Research over the past 40 years has provided an invaluable dataset on crop domestication and the spread of agriculture into Europe. However, despite the enormous advances in research, there are important areas that remain almost unexplored, some of immense interest. This is the case for the western Mediterranean region, where our knowledge is still limited (Iberian Peninsula) or almost non-existent (northern Morocco). The last few years have witnessed a considerable increase in archaeobotanical research, and the effort of a group of Spanish researchers working together on different aspects of agriculture has started to produce the first results. My proposal will approach the study of the arrival of agriculture in the western Mediterranean by exploring different interrelated research areas. The project involves the application of different techniques (analysis of charred plant remains, pollen and non-pollen microfossils, phytoliths, micro-wear analyses, isotopes, soil micromorphology, genetics, and ethnoarchaeology) which will help to define the emergence and spread of agriculture in the area, its likely place of origin, its main technological attributes, and the range of crop husbandry practices carried out. The interaction between the different approaches and methodologies involved will allow a greater understanding of the type of agriculture that characterized the first farming communities in the most south-western part of Europe.
Max ERC Funding
1 545 169 €
Duration
Start date: 2009-04-01, End date: 2013-03-31
Project acronym AHRIMMUNITY
Project The influence of Aryl hydrocarbon receptor ligands on protective and pathological immune responses
Researcher (PI) Brigitta Stockinger
Host Institution (HI) MEDICAL RESEARCH COUNCIL
Call Details Advanced Grant (AdG), LS6, ERC-2008-AdG
Summary The Aryl hydrocarbon receptor (AhR) is an evolutionarily conserved, widely expressed transcription factor that mediates the toxicity of a substantial variety of exogenous toxins, but is also stimulated by endogenous physiological ligands. While it is known that this receptor mediates the toxicity of dioxin, this is unlikely to be its physiological function. We have recently identified selective expression of AhR in the Th17 subset of effector CD4 T cells. Ligation of AhR by a candidate endogenous ligand (FICZ), a UV metabolite of tryptophan, causes expansion of Th17 cells and the induction of IL-22 production. As a consequence, AhR ligation will exacerbate autoimmune diseases such as experimental autoimmune encephalomyelitis. Little is known so far about the impact of AhR ligands on IL-17/IL-22 mediated immune defense functions. IL-22 is considered a pro-inflammatory Th17 cytokine, which is involved in the etiology of psoriasis, but it has also been shown to be a survival factor for epithelial cells. AhR is polymorphic and defined as a high- or low-affinity receptor for dioxin, leading to the classification of high- and low-responder mouse strains based on defined mutations. Similar polymorphisms exist in humans and, although human AhR is on the whole thought to be of low affinity, there are identified mutations that confer high-responder status. No correlations have yet been made with Th17-mediated immune responses in mice and humans. This study aims to investigate the role of AhR ligands and polymorphisms in autoimmunity as well as in protective immune responses, using both mouse models and human samples from normal controls as well as psoriasis patients.
Max ERC Funding
1 242 352 €
Duration
Start date: 2009-02-01, End date: 2014-01-31
Project acronym AIME
Project An Inquiry into Modes of Existence
Researcher (PI) Bruno Latour
Host Institution (HI) FONDATION NATIONALE DES SCIENCES POLITIQUES
Call Details Advanced Grant (AdG), SH2, ERC-2010-AdG_20100407
Summary AIME is an inquiry to make more precise what is lumped together under the confusing word “modernization”. The work done in the field of science studies (STS) on the progress and practice of science and technology has had the consequence of deeply modifying the definition of “modernity”, resulting in the provocative idea that “we (meaning the Europeans) have never been modern”. This is, however, only a negative definition. To obtain a positive rendering of the current European situation, it is necessary to start an inquiry into the complex and conflicting set of values that have been invented. This inquiry is possible only if there is a clear and shareable way to judge the differences in the set of truth-conditions that make up those conflicting sets of values. AIME offers a grammar of those differences based on the key notion of modes of existence. It then builds a procedure and an instrument to test this grammar in a selected set of situations where the definitions of the differing modes of existence are redefined and renegotiated. The result is a set of shareable definitions of what modernization has been in practice. This is important just at the moment when Europe has lost its privileged status and needs to be able to present itself in new ways to the other cultures and civilizations which are making up the world of globalization, with very different views on what it is to modernize themselves.
Max ERC Funding
1 334 720 €
Duration
Start date: 2011-09-01, End date: 2015-06-30
Project acronym AIR-NB
Project Pre-natal exposure to urban AIR pollution and pre- and post-Natal Brain development
Researcher (PI) Jordi Sunyer
Host Institution (HI) FUNDACION PRIVADA INSTITUTO DE SALUD GLOBAL BARCELONA
Call Details Advanced Grant (AdG), LS7, ERC-2017-ADG
Summary Air pollution is the main urban-related environmental hazard. It appears to affect brain development, although current evidence is inadequate, given the lack of studies during the most vulnerable stages of brain development and the lack of data on the brain anatomical structure and regional connectivity underlying these effects. Of particular interest is the prenatal period, when brain structures are forming and growing, and when the effect of in utero exposure to environmental factors may cause permanent brain injury. I and others have conducted studies focused on effects during school age, which could be less profound. I postulate that: pre-natal exposure to urban air pollution during pregnancy impairs foetal and postnatal brain development, mainly by affecting myelination; these effects are at least partially mediated by translocation of airborne particulate matter to the placenta and by placental dysfunction; and prenatal exposure to air pollution impairs post-natal brain development independently of urban context and post-natal exposure to air pollution. I aim to evaluate the effect of pre-natal exposure to urban air pollution on pre- and post-natal brain structure and function by following 900 pregnant women and their neonates with contrasting levels of pre-natal exposure to air pollutants by: i) establishing a new pregnancy cohort and evaluating brain imaging (pre-natal and neo-natal brain structure, connectivity and function) and post-natal motor and cognitive development; ii) measuring total personal exposure and inhaled dose of air pollutants during specific time-windows of gestation, as well as noise, paternal stress and other stressors, using personal samplers and sensors; iii) detecting nanoparticles in the placenta and evaluating its vascular function; iv) modelling causality and mediation mathematically, including a replication study in an external cohort.
The expected results will provide an impetus to implement policy interventions that genuinely protect the health of urban citizens.
Max ERC Funding
2 499 992 €
Duration
Start date: 2018-09-01, End date: 2023-08-31
Project acronym AIRSEA
Project Air-Sea Exchanges driven by Light
Researcher (PI) Christian George
Host Institution (HI) CENTRE NATIONAL DE LA RECHERCHE SCIENTIFIQUE CNRS
Call Details Advanced Grant (AdG), PE10, ERC-2011-ADG_20110209
Summary The scientific motivation of this project is the significant presence of organic compounds at the surface of the ocean. They link ocean biogeochemistry, through the physico-chemical processes near the water-air interface, with primary and secondary aerosol formation and evolution in the air aloft, and finally with the climate impact of marine boundary layer aerosols. However, their photochemistry and photosensitizer properties have only been suggested and discussed, never fully addressed, because they were beyond reach. This project goes significantly beyond this state of affairs through a combination of innovative tools and the development of new ideas.
This project is therefore devoted to new laboratory investigations of processes occurring at the air-sea interface, to predict the emission, formation and evolution of halogenated radicals and aerosols from this vast interface between oceans and atmosphere. It draws on fundamental laboratory measurements, marine science, surface chemistry and photochemistry, and is therefore interdisciplinary in nature.
It will lead to the development of innovative techniques for characterising chemical processing at the air-sea interface (e.g., a multiphase atmospheric simulation chamber and a time-resolved fluorescence technique). It will allow the assessment of new emerging ideas, such as a quantitative description of the importance of photosensitized reactions in the visible at the air-sea interface as a major source of halogenated radicals and aerosols in the marine environment.
This new understanding will improve our ability to describe atmospheric chemistry in the marine environment, which has a strong impact on the urban air quality of coastal regions (which are often highly populated), and also on climate change, by providing new input for global climate models.
Max ERC Funding
2 366 276 €
Duration
Start date: 2012-04-01, End date: 2017-03-31
Project acronym ALBUGON
Project Genomics and effectoromics to understand defence suppression and disease resistance in Arabidopsis-Albugo candida interactions
Researcher (PI) Jonathan Jones
Host Institution (HI) THE SAINSBURY LABORATORY
Call Details Advanced Grant (AdG), LS6, ERC-2008-AdG
Summary This project focuses on two questions about host/parasite interactions: how do biotrophic plant pathogens suppress host defence? And what is the basis for pathogen specialization on specific host species? A broadly accepted model explains resistance and susceptibility to plant pathogens. First, pathogens make conserved molecules (PAMPs), such as flagellin, that plants detect via cell surface receptors, leading to PAMP-Triggered Immunity (PTI). Second, pathogens make effectors that suppress PTI. Third, plants carry hundreds of Resistance (R) genes that detect an effector and activate Effector-Triggered Immunity (ETI). One effector is sufficient to trigger resistance. Albugo candida (Ac) (white rust) strongly suppresses host defence; Ac-infected Arabidopsis are susceptible to pathogen races to which they are otherwise resistant. Ac is an oomycete, not a fungus. Arabidopsis is resistant to races of Ac that infect brassicas. The proposed project involves three programs. First (genomics, transcriptomics and bioinformatics), we will use next-generation sequencing (NGS) methods (Solexa and GS-FLX) and novel transcriptomics methods to define the genome sequence and effector set of three Ac strains, as well as carrying out >40-fold deep resequencing of 7 additional Ac strains. Second (effectoromics), we will carry out functional assays using Effector Detector Vectors (Sohn, Plant Cell 19:4077 [2007]) with the set of Ac effectors, screening for enhanced virulence, for suppression of defence, for effectors that are recognized by R genes in disease-resistant Arabidopsis, and for host effector targets. Third (resistance diversity), we will characterize Arabidopsis germplasm for R genes to Ac, both for recognition of Arabidopsis strains of Ac and for recognition in Arabidopsis of effectors from Ac strains that infect brassica. This proposal focuses on Ac, but will establish methods that could discover new R genes in non-hosts against many plant diseases.
Max ERC Funding
2 498 923 €
Duration
Start date: 2009-01-01, End date: 2014-06-30
Project acronym ALEM
Project ADDITIONAL LOSSES IN ELECTRICAL MACHINES
Researcher (PI) Matti Antero Arkkio
Host Institution (HI) AALTO KORKEAKOULUSAATIO SR
Call Details Advanced Grant (AdG), PE8, ERC-2013-ADG
Summary "Electrical motors consume about 40 % of the electrical energy produced in the European Union. About 90 % of this energy is converted to mechanical work. However, 0.5-2.5 % of it goes to so-called additional load losses, whose exact origins are unknown. Our ambitious aim is to reveal the origins of these losses, build up numerical tools for modeling them, and optimize electrical motors to minimize the losses.
As the hypothesis of the research, we assume that the additional losses mainly result from the deterioration of the core materials during the manufacturing process of the machine. By calorimetric measurements, we have found that the core losses of electrical machines may be twice as large as comprehensive loss models predict. The electrical steel sheets are punched, welded together and shrink-fitted to the frame. This causes residual strains in the core sheets, deteriorating their magnetic characteristics. The cutting burrs make galvanic contacts between the sheets and form paths for inter-lamination currents. Another potential source of additional losses is the circulating currents between the parallel strands of random-wound armature windings. The stochastic nature of these potential sources of additional losses adds further challenge to the research.
We shall develop a physical loss model that couples the mechanical strains and electromagnetic losses in electrical steel sheets and apply the new model for comprehensive loss analysis of electrical machines. The stochastic variables related to the core losses and circulating-current losses will be discretized together with the temporal and spatial discretization of the electromechanical field variables. The numerical stochastic loss model will be used to search for such machine constructions that are insensitive to the manufacturing defects. We shall validate the new numerical loss models by electromechanical and calorimetric measurements."
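The stochastic treatment of manufacturing defects can be illustrated with a deliberately simplified Monte Carlo sketch (our illustration only; the loss model, its parameters and the strain distribution are hypothetical, not the project's coupled magneto-mechanical model):

```python
# Hypothetical toy model: core loss per sheet rises with a random residual
# strain from punching; we estimate the resulting loss distribution.
import random
import statistics

random.seed(1)

def core_loss(strain, base_loss=1.0, sensitivity=0.8):
    """Hypothetical loss model: punching strain degrades the steel."""
    return base_loss * (1.0 + sensitivity * strain)

# Residual strain magnitude drawn from a half-normal distribution (assumed).
samples = [core_loss(abs(random.gauss(0.0, 0.3))) for _ in range(10_000)]
mean = statistics.mean(samples)

# Defects only ever add loss, so the mean exceeds the nominal value --
# one way "additional losses" can hide from deterministic models.
print(mean > 1.0)  # True
```

A design search over such a model would then look for constructions whose loss distribution is least sensitive to the strain parameters.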
Max ERC Funding
2 489 949 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ALEXANDRIA
Project "Foundations for Temporal Retrieval, Exploration and Analytics in Web Archives"
Researcher (PI) Wolfgang Nejdl
Host Institution (HI) GOTTFRIED WILHELM LEIBNIZ UNIVERSITAET HANNOVER
Call Details Advanced Grant (AdG), PE6, ERC-2013-ADG
Summary "Significant parts of our cultural heritage are produced on the Web, yet only insufficient opportunities exist for accessing and exploring the past of the Web. The ALEXANDRIA project aims to develop models, tools and techniques necessary to archive and index relevant parts of the Web, and to retrieve and explore this information in a meaningful way. While the easy accessibility to the current Web is a good baseline, optimal access to Web archives requires new models and algorithms for retrieval, exploration, and analytics which go far beyond what is needed to access the current state of the Web. This includes taking into account the unique temporal dimension of Web archives, structured semantic information already available on the Web, as well as social media and network information.
Within ALEXANDRIA, we will significantly advance semantic and time-based indexing for Web archives using human-compiled knowledge available on the Web, to efficiently index, retrieve and explore information about entities and events from the past. In doing so, we will focus on the concurrent evolution of this knowledge and the Web content to be indexed, and take into account diversity and incompleteness of this knowledge. We will further investigate mixed crowd- and machine-based Web analytics to support long-running and collaborative retrieval and analysis processes on Web archives. Usage of implicit human feedback will be essential to provide better indexing through insights during the analysis process and to better focus harvesting of content.
The ALEXANDRIA Testbed will provide an important context for research, exploration and evaluation of the concepts, methods and algorithms developed in this project, and will provide both relevant collections and algorithms that enable further research on and practical application of our research results to existing archives like the Internet Archive, the Internet Memory Foundation and Web archives maintained by European national libraries."
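The time-based indexing idea can be sketched with a toy, hypothetical data structure (our own illustration, not the project's design): an inverted index whose postings carry validity intervals, so a query can be restricted to the archive's state at a given time.

```python
# Toy time-aware inverted index: term -> [doc_id, first_seen, last_seen].
from collections import defaultdict

index = defaultdict(list)

def add_snapshot(doc_id, text, timestamp):
    """Index one archived snapshot; extend the validity interval per term."""
    for term in set(text.lower().split()):
        for posting in index[term]:
            if posting[0] == doc_id:
                posting[1] = min(posting[1], timestamp)
                posting[2] = max(posting[2], timestamp)
                break
        else:
            index[term].append([doc_id, timestamp, timestamp])

def search(term, at_time):
    """Documents whose snapshots of this term span the queried time."""
    return [d for d, lo, hi in index[term.lower()] if lo <= at_time <= hi]

add_snapshot("page1", "Hannover web archive project", 2010)
add_snapshot("page1", "Hannover web archive project", 2014)
add_snapshot("page2", "archive of events", 2012)
print(search("archive", 2011))  # ['page1']
```

A production archive index would of course need positional postings, versioned documents and entity annotations on top of this skeleton.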
Max ERC Funding
2 493 600 €
Duration
Start date: 2014-03-01, End date: 2019-02-28
Project acronym ALEXANDRIA
Project Large-Scale Formal Proof for the Working Mathematician
Researcher (PI) Lawrence PAULSON
Host Institution (HI) THE CHANCELLOR MASTERS AND SCHOLARSOF THE UNIVERSITY OF CAMBRIDGE
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Mathematical proofs have always been prone to error. Today, proofs can be hundreds of pages long and combine results from many specialisms, making them almost impossible to check. One solution is to deploy modern verification technology. Interactive theorem provers have demonstrated their potential as vehicles for formalising mathematics through achievements such as the verification of the Kepler Conjecture. Proofs done using such tools reach a high standard of correctness.
However, existing theorem provers are unsuitable for mathematics. Their formal proofs are unreadable. They struggle to do simple tasks, such as evaluating limits. They lack much basic mathematics, and the material they do have is difficult to locate and apply.
ALEXANDRIA will create a proof development environment attractive to working mathematicians, utilising the best technology available across computer science. Its focus will be the management and use of large-scale mathematical knowledge, both theorems and algorithms. The project will employ mathematicians to investigate the formalisation of mathematics in practice. Our already substantial formalised libraries will serve as the starting point. They will be extended and annotated to support sophisticated searches. Techniques will be borrowed from machine learning, information retrieval and natural language processing. Algorithms will be treated similarly: ALEXANDRIA will help users find and invoke the proof methods and algorithms appropriate for the task.
ALEXANDRIA will provide (1) comprehensive formal mathematical libraries; (2) search within libraries, and the mining of libraries for proof patterns; (3) automated support for the construction of large formal proofs; (4) sound and practical computer algebra tools.
ALEXANDRIA will be based on legible structured proofs. Formal proofs should be not mere code, but a machine-checkable form of communication between mathematicians.
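To give a flavour of what a legible, machine-checkable proof looks like, here is a small structured proof in Lean, one interactive theorem prover among several (an illustration only, not tied to the project's own tool or libraries):

```lean
-- Commutativity of addition on the natural numbers, proved by induction,
-- with each case and each rewriting step stated explicitly.
theorem my_add_comm (m n : Nat) : m + n = n + m := by
  induction n with
  | zero => simp
  | succ k ih => rw [Nat.add_succ, ih, Nat.succ_add]
```

Every step here is both human-readable and checked by the machine, which is exactly the standard of communication the project argues formal proofs should meet.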
Max ERC Funding
2 430 140 €
Duration
Start date: 2017-09-01, End date: 2022-08-31
Project acronym ALGAME
Project Algorithms, Games, Mechanisms, and the Price of Anarchy
Researcher (PI) Elias Koutsoupias
Host Institution (HI) THE CHANCELLOR, MASTERS AND SCHOLARS OF THE UNIVERSITY OF OXFORD
Call Details Advanced Grant (AdG), PE6, ERC-2012-ADG_20120216
Summary The objective of this proposal is to bring together a local team of young researchers who will work closely with international collaborators to advance the state of the art of Algorithmic Game Theory and open new avenues of research at the interface of Computer Science, Game Theory, and Economics. The proposal consists mainly of three intertwined research strands: algorithmic mechanism design, price of anarchy, and online algorithms.
Specifically, we will attempt to resolve some outstanding open problems in algorithmic mechanism design: characterizing the incentive compatible mechanisms for important domains, such as the domain of combinatorial auctions, and resolving the approximation ratio of mechanisms for scheduling unrelated machines. More generally, we will study centralized and distributed algorithms whose inputs are controlled by selfish agents that are interested in the outcome of the computation. We will investigate new notions of mechanisms with strong truthfulness and limited susceptibility to externalities that can facilitate modular design of mechanisms of complex domains.
We will expand the current research on the price of anarchy to time-dependent games where the players can select not only how to act but also when to act. We also plan to resolve outstanding questions on the price of stability and to build a robust approach to these questions, similar to smoothed analysis. For repeated games, we will investigate convergence of simple strategies (e.g., fictitious play), online fairness, and strategic considerations (e.g., metagames). More generally, our aim is to find a productive formulation of playing unknown games by drawing on the fields of online algorithms and machine learning.
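The price of anarchy itself can be computed in a few lines for the textbook example of Pigou's two-link routing game (a standard illustration, not part of the project's own results): one unit of traffic, link 1 with constant cost 1, link 2 with cost equal to its own load.

```python
def total_cost(x2):
    """Total travel cost when a fraction x2 of the traffic uses link 2."""
    x1 = 1.0 - x2
    return x1 * 1.0 + x2 * x2  # load * per-unit cost on each link

# Nash equilibrium: everyone takes link 2, since its cost never exceeds 1.
nash = total_cost(1.0)  # = 1.0

# Social optimum: minimise total cost over the split (grid search).
opt = min(total_cost(i / 10000) for i in range(10001))  # 0.75 at x2 = 1/2

price_of_anarchy = nash / opt
print(round(price_of_anarchy, 4))  # 1.3333
```

The ratio 4/3 is the worst case for affine link costs; proving such bounds for richer classes of games is what price-of-anarchy research is about.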
Max ERC Funding
2 461 000 €
Duration
Start date: 2013-04-01, End date: 2019-03-31
Project acronym AlgoRNN
Project Recurrent Neural Networks and Related Machines That Learn Algorithms
Researcher (PI) Juergen Schmidhuber
Host Institution (HI) UNIVERSITA DELLA SVIZZERA ITALIANA
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Recurrent neural networks (RNNs) are general parallel-sequential computers. Some learn their programs or weights. Our supervised Long Short-Term Memory (LSTM) RNNs were the first to win pattern recognition contests, and recently enabled best known results in speech and handwriting recognition, machine translation, etc. They are now available to billions of users through the world's most valuable public companies including Google and Apple. Nevertheless, in lots of real-world tasks RNNs do not yet live up to their full potential. Although universal in theory, in practice they fail to learn important types of algorithms. This ERC project will go far beyond today's best RNNs through novel RNN-like systems that address some of the biggest open RNN problems and hottest RNN research topics: (1) How can RNNs learn to control (through internal spotlights of attention) separate large short-memory structures such as sub-networks with fast weights, to improve performance on many natural short-term memory-intensive tasks which are currently hard to learn by RNNs, such as answering detailed questions on recently observed videos? (2) How can such RNN-like systems metalearn entire learning algorithms that outperform the original learning algorithms? (3) How to achieve efficient transfer learning from one RNN-learned set of problem-solving programs to new RNN programs solving new tasks? In other words, how can one RNN-like system actively learn to exploit algorithmic information contained in the programs running on another? We will test our systems on existing benchmarks, and create new, more challenging multi-task benchmarks. This will be supported by a rather cheap, GPU-based mini-brain for implementing large RNNs.
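The gating mechanism behind LSTM's ability to bridge long time lags can be shown in a minimal, pure-Python cell (a textbook single-unit sketch with hand-picked weights, not the project's architecture):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h, c, W):
    """One LSTM step for scalar state; W holds (w_x, w_h, b) triples for
    the input, forget and output gates and the candidate cell value."""
    (wi, ui, bi), (wf, uf, bf), (wo, uo, bo), (wc, uc, bc) = W
    i = sigmoid(wi * x + ui * h + bi)                    # input gate
    f = sigmoid(wf * x + uf * h + bf)                    # forget gate
    o = sigmoid(wo * x + uo * h + bo)                    # output gate
    c_new = f * c + i * math.tanh(wc * x + uc * h + bc)  # cell update
    h_new = o * math.tanh(c_new)                         # exposed state
    return h_new, c_new

# With the forget gate saturated on and the input gate off, the cell carries
# its memory unchanged across steps, whatever the inputs -- the property
# that lets LSTMs remember over long time lags.
W = [(0, 0, -100), (0, 0, 100), (0, 0, 0), (0, 0, 0)]
h, c = 0.0, 0.7
for x in [1.0, -1.0, 0.5]:
    h, c = lstm_step(x, h, c, W)
print(round(c, 6))  # 0.7: the stored value survives
```

In a trained network the gate weights are learned rather than fixed, and the cell state is a vector; the memory-protection mechanism is the same.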
Max ERC Funding
2 500 000 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ALGSTRONGCRYPTO
Project Algebraic Methods for Stronger Crypto
Researcher (PI) Ronald John Fitzgerald CRAMER
Host Institution (HI) STICHTING NEDERLANDSE WETENSCHAPPELIJK ONDERZOEK INSTITUTEN
Call Details Advanced Grant (AdG), PE6, ERC-2016-ADG
Summary Our field is cryptology. Our overarching objective is to advance significantly the frontiers in
design and analysis of high-security cryptography for the future generation.
Particularly, we wish to enhance the efficiency, functionality, and, last-but-not-least, fundamental understanding of cryptographic security against very powerful adversaries.
Our approach here is to develop completely novel methods by
deepening, strengthening and broadening the
algebraic foundations of the field.
Concretely, our lens builds on
the arithmetic codex. This is a general, abstract cryptographic primitive whose basic theory we recently developed and whose asymptotic part, which relies on algebraic geometry, enjoys crucial applications in surprising foundational results on constant communication-rate two-party cryptography. A codex is a linear (error correcting) code that, when endowing its ambient vector space just with coordinate-wise multiplication, can be viewed as simulating, up to some degree, richer arithmetical structures such as finite fields (or products thereof), or generally, finite-dimensional algebras over finite fields. Besides this degree, coordinate-localities for which simulation holds and for which it does not at all are also captured.
Our method is based on novel perspectives on codices which significantly
widen their scope and strengthen their utility. Particularly, we bring
symmetries, computational and complexity-theoretic aspects, and connections with algebraic number theory, algebraic geometry, and algebraic combinatorics into play in novel ways. Our applications range from public-key cryptography to secure multi-party computation.
Our proposal is subdivided into 3 interconnected modules:
(1) Algebraic- and Number Theoretical Cryptanalysis
(2) Construction of Algebraic Crypto Primitives
(3) Advanced Theory of Arithmetic Codices
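The "coordinate-wise multiplication" property underlying arithmetic codices can be seen concretely in Reed-Solomon codes (a toy demonstration of the general principle, not the project's codex machinery): the coordinate-wise product of two codewords is itself a codeword of a larger Reed-Solomon code, encoding the product of the message polynomials.

```python
P = 97  # small prime field GF(97), chosen arbitrarily for the demo

def poly_eval(coeffs, x):
    """Evaluate a polynomial (low-order coefficients first) at x, mod P."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % P
    return acc

def rs_encode(coeffs, points):
    """Reed-Solomon encoding: evaluate the message polynomial at each point."""
    return [poly_eval(coeffs, a) for a in points]

def poly_mul(f, g):
    """Multiply two polynomials mod P (coefficient lists, low-order first)."""
    out = [0] * (len(f) + len(g) - 1)
    for i, a in enumerate(f):
        for j, b in enumerate(g):
            out[i + j] = (out[i + j] + a * b) % P
    return out

k = 3                       # message polynomials of degree < k
points = list(range(1, 8))  # n = 7 >= 2k - 1 evaluation points
f = [5, 1, 2]               # two arbitrary degree-2 messages
g = [7, 0, 3]

cw_product = [(x * y) % P for x, y in zip(rs_encode(f, points),
                                          rs_encode(g, points))]
# The coordinate-wise product equals the encoding of f*g: a codeword of the
# RS code of dimension 2k-1 on the same evaluation points.
assert cw_product == rs_encode(poly_mul(f, g), points)
```

This degree-doubling under multiplication is exactly the "degree" a codex tracks, and it is what makes such codes simulate field arithmetic in secure multi-party computation.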
Max ERC Funding
2 447 439 €
Duration
Start date: 2017-10-01, End date: 2022-09-30
Project acronym ALICE
Project Strange Mirrors, Unsuspected Lessons: Leading Europe to a new way of sharing the world experiences
Researcher (PI) Boaventura De Sousa Santos
Host Institution (HI) CENTRO DE ESTUDOS SOCIAIS
Call Details Advanced Grant (AdG), SH2, ERC-2010-AdG_20100407
Summary Europe sits uncomfortably on the idea that there are no political and cultural alternatives credible enough to respond to the current uneasiness or malaise caused by both a world that is more and more non-European and a Europe that increasingly questions what is European about itself. This project will develop a new grounded theoretical paradigm for contemporary Europe based on two key ideas: the understanding of the world by far exceeds the European understanding of the world; social, political and institutional transformation in Europe may benefit from innovations taking place in regions and countries with which Europe is increasingly interdependent. I will pursue this objective focusing on four main interconnected topics: democratizing democracy, intercultural constitutionalism, the other economy, human rights (right to health in particular).
In a sense, the European challenges are unique but, in one way or another, they are being experienced in different corners of the world. The novelty resides in bringing new ideas and experiences into the European conversation, showing their relevance to our current uncertainties and aspirations, and thereby contributing to facing them with new intellectual and political resources. The usefulness and relevance of non-European conceptions and experiences will be shown by un-thinking conventional knowledge through two epistemological devices I have developed: the ecology of knowledges and intercultural translation. By resorting to them I will show that there are alternatives, but they cannot be made credible and powerful if we go on relying on the modes of theoretical and political thinking that have dominated so far. In other words, the claim put forward by and worked through this project is that in Europe we don’t need alternatives but rather an alternative thinking of alternatives.
Max ERC Funding
2 423 140 €
Duration
Start date: 2011-07-01, End date: 2016-12-31
Project acronym ALK7
Project Metabolic control by the TGF-β superfamily receptor ALK7: A novel regulator of insulin secretion, fat accumulation and energy balance
Researcher (PI) Carlos Ibanez
Host Institution (HI) KAROLINSKA INSTITUTET
Call Details Advanced Grant (AdG), LS4, ERC-2008-AdG
Summary The aim of this proposal is to understand a novel regulatory signaling network controlling insulin secretion, fat accumulation and energy balance, centered around selected components of the TGF-β signaling system, including Activins A and B, GDF-3 and their receptors ALK7 and ALK4. Recent results from my laboratory indicate that these molecules are part of paracrine signaling networks that control important functions in pancreatic islets and adipose tissue through feedback inhibition and feed-forward regulation. These discoveries have opened up a new research area with important implications for the understanding of metabolic networks and the treatment of human metabolic syndromes, such as diabetes and obesity.
To drive progress in this new research area beyond the state-of-the-art it is proposed to: i) Elucidate the molecular mechanisms by which Activins regulate Ca2+ influx and insulin secretion in pancreatic β-cells; ii) Elucidate the molecular mechanisms underlying the effects of GDF-3 on adipocyte metabolism, turnover and fat accumulation; iii) Investigate the interplay between insulin levels and fat deposition in the development of insulin resistance using mutant mice lacking Activin B and GDF-3; iv) Investigate tissue-specific contributions of ALK7 and ALK4 signaling to metabolic control by generating and characterizing conditional mutant mice; v) Investigate the effects of specific and reversible inactivation of ALK7 and ALK4 on metabolic regulation using a novel chemical-genetic approach based on analog-sensitive alleles.
This is research of a high-gain/high-risk nature. It is poised to open unique opportunities for further exploration of complex metabolic networks. The development of drugs capable of enhancing insulin secretion, limiting fat accumulation and ameliorating diet-induced obesity by targeting components of the ALK7 signaling network will find a strong rationale in the results of the proposed work.
Max ERC Funding
2 462 154 €
Duration
Start date: 2009-04-01, End date: 2014-03-31
Project acronym ALKAGE
Project Algebraic and Kähler geometry
Researcher (PI) Jean-Pierre, Raymond, Philippe Demailly
Host Institution (HI) UNIVERSITE JOSEPH FOURIER GRENOBLE 1
Call Details Advanced Grant (AdG), PE1, ERC-2014-ADG
Summary The purpose of this project is to study basic questions in algebraic and Kähler geometry. It is well known that the structure of projective or Kähler manifolds is governed by positivity or negativity properties of the curvature tensor. However, many fundamental problems are still wide open. Since the mid-1980s, I have developed a large number of key concepts and results that have led to important progress in transcendental algebraic geometry. Let me mention the discovery of holomorphic Morse inequalities, systematic applications of L² estimates with singular hermitian metrics, and a much improved understanding of Monge-Ampère equations and of singularities of plurisubharmonic functions. My first goal will be to investigate the Green-Griffiths-Lang conjecture asserting that an entire curve drawn in a variety of general type is algebraically degenerate. The subject is intimately related to important questions concerning Diophantine equations, especially higher dimensional generalizations of Faltings' theorem - the so-called Vojta program. One can rely here on a breakthrough I made in 2010, showing that all such entire curves must satisfy algebraic differential equations. A second closely related area of research of this project is the analysis of the structure of projective or compact Kähler manifolds. It can be seen as a generalization of the classification theory of surfaces by Kodaira, and of the more recent results for dimension 3 (Kawamata, Kollár, Mori, Shokurov, ...) to other dimensions. My plan is to combine powerful recent results obtained on the duality of positive cohomology cones with an analysis of the instability of the tangent bundle, i.e. of the Harder-Narasimhan filtration. On these ground-breaking questions, I intend to go much further and to enhance my national and international collaborations.
These subjects already attract many young researchers and postdocs throughout the world, and the grant could be used to create even stronger interactions.
Max ERC Funding
1 809 345 €
Duration
Start date: 2015-09-01, End date: 2020-08-31
Project acronym ALLEGRO
Project Active large-scale learning for visual recognition
Researcher (PI) Cordelia Schmid
Host Institution (HI) INSTITUT NATIONAL DE RECHERCHE EN INFORMATIQUE ET AUTOMATIQUE
Call Details Advanced Grant (AdG), PE6, ERC-2012-ADG_20120216
Summary A massive and ever growing amount of digital image and video content is available today, on sites such as Flickr and YouTube, in audiovisual archives such as those of BBC and INA, and in personal collections. In most cases, it comes with additional information, such as text, audio or other metadata, that forms a rather sparse and noisy, yet rich and diverse source of annotation, ideally suited to emerging weakly supervised and active machine learning technology. The ALLEGRO project will take visual recognition to the next level by using this largely untapped source of data to automatically learn visual models. The main research objective of our project is the development of new algorithms and computer software capable of autonomously exploring evolving data collections, selecting the relevant information, and determining the visual models most appropriate for different object, scene, and activity categories. An emphasis will be put on learning visual models from video, a particularly rich source of information, and on the representation of human activities, one of today's most challenging problems in computer vision. Although this project addresses fundamental research issues, it is expected to result in significant advances in high-impact applications that range from visual mining of the Web and automated annotation and organization of family photo and video albums to large-scale information retrieval in television archives.
Max ERC Funding
2 493 322 €
Duration
Start date: 2013-04-01, End date: 2019-03-31
Project acronym ALLEGRO
Project unrAvelLing sLow modE travelinG and tRaffic: with innOvative data to a new transportation and traffic theory for pedestrians and bicycles
Researcher (PI) Serge Hoogendoorn
Host Institution (HI) TECHNISCHE UNIVERSITEIT DELFT
Call Details Advanced Grant (AdG), SH3, ERC-2014-ADG
Summary A major challenge in contemporary traffic and transportation theory is achieving a comprehensive understanding of pedestrian and cyclist behaviour. This behaviour is notoriously hard to observe, since sensors providing abundant and detailed information about the key variables characterising it have not been available until very recently. The behaviour is also far more complex than that of the much better understood fast mode. This is due to the many degrees of freedom in decision-making, the interactions among slow traffic participants, which are more involved and far less guided by traffic rules and regulations than those between car drivers, and the many fascinating but complex phenomena in slow traffic flows (self-organised patterns, turbulence, spontaneous phase transitions, herding, etc.) that are very hard to predict accurately.
With slow traffic modes gaining ground in terms of mode share in many cities, lack of empirical insights, behavioural theories, predictively valid analytical and simulation models, and tools to support planning, design, management and control is posing a major societal problem as well: examples of major accidents due to bad planning, organisation and management of events are manifold, as are locations where safety of slow modes is a serious issue due to interactions with fast modes.
This programme is geared towards establishing a comprehensive theory of slow mode traffic behaviour, considering the different behavioural levels relevant for understanding, reproducing and predicting slow mode traffic flows in cities. The levels deal with walking and cycling operations, activity scheduling and travel behaviour, and knowledge representation and learning. Major scientific breakthroughs are expected at each of these levels, in terms of theory and modelling, by using innovative (big) data collection and experimentation, analysis and fusion techniques, including social media data analytics, using augmented reality, and remote and crowd sensing.
Max ERC Funding
2 458 700 €
Duration
Start date: 2015-11-01, End date: 2020-10-31
Project acronym ALMA
Project Attosecond Control of Light and Matter
Researcher (PI) Anne L'huillier
Host Institution (HI) LUNDS UNIVERSITET
Call Details Advanced Grant (AdG), PE2, ERC-2008-AdG
Summary Attosecond light pulses are generated when an intense laser interacts with a gas target. These pulses are not only short, enabling the study of electronic processes at their natural time scale, but also coherent. The vision of this proposal is to extend temporal coherent control concepts to a completely new regime of time and energy, combining (i) ultrashort pulses (ii) broadband excitation (iii) high photon energy, allowing scientists to reach not only valence but also inner shells in atoms and molecules, and, when needed, (iv) high spatial resolution. We want to explore how elementary electronic processes in atoms, molecules and more complex systems can be controlled by using well designed sequences of attosecond pulses. The research project proposed is organized into four parts: 1. Attosecond control of light leading to controlled sequences of attosecond pulses We will develop techniques to generate sequences of attosecond pulses with a variable number of pulses and controlled carrier-envelope-phase variation between consecutive pulses. 2. Attosecond control of electronic processes in atoms and molecules We will investigate the dynamics and coherence of phenomena induced by attosecond excitation of electron wave packets in various systems and we will explore how they can be controlled by a controlled sequence of ultrashort pulses. 3. Intense attosecond sources to reach the nonlinear regime We will optimize attosecond light sources in a systematic way, including amplification of the radiation by injecting a free electron laser. This will open up the possibility to develop nonlinear measurement and control schemes. 4. Attosecond control in more complex systems, including high spatial resolution We will develop ultrafast microscopy techniques, in order to obtain meaningful temporal information in surface and solid state physics. Two directions will be explored, digital in line microscopic holography and photoemission electron microscopy.
Max ERC Funding
2 250 000 €
Duration
Start date: 2008-12-01, End date: 2013-11-30
Project acronym AlmaCrypt
Project Algorithmic and Mathematical Cryptology
Researcher (PI) Antoine Joux
Host Institution (HI) SORBONNE UNIVERSITE
Call Details Advanced Grant (AdG), PE6, ERC-2014-ADG
Summary Cryptology is a foundation of information security in the digital world. Today's internet is protected by a form of cryptography based on complexity theoretic hardness assumptions. Ideally, they should be strong to ensure security and versatile to offer a wide range of functionalities and allow efficient implementations. However, these assumptions are largely untested and internet security could be built on sand.
The main ambition of Almacrypt is to remedy this issue by challenging the assumptions through an advanced algorithmic analysis.
In particular, this proposal questions the two pillars of public-key encryption: factoring and discrete logarithms. Recently, the PI contributed to show that in some cases, the discrete logarithm problem is considerably weaker than previously assumed. A main objective is to ponder the security of other cases of the discrete logarithm problem, including elliptic curves, and of factoring. We will study the generalization of the recent techniques and search for new algorithmic options with comparable or better efficiency.
We will also study hardness assumptions based on codes and subset-sum, two candidates for post-quantum cryptography. We will consider the applicability of recent algorithmic and mathematical techniques to the resolution of the corresponding putative hard problems, refine the analysis of the algorithms and design new algorithm tools.
Cryptology is not limited to the above assumptions: other hard problems have been proposed to aim at post-quantum security and/or to offer extra functionalities. Should the security of these other assumptions become critical, they would be added to Almacrypt's scope. They could also serve to demonstrate other applications of our algorithmic progress.
In addition to its scientific goal, Almacrypt also aims at seeding a strengthened research community dedicated to algorithmic and mathematical cryptology.
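The hardness assumption behind the discrete-logarithm pillar questioned above can be illustrated with the classic baby-step giant-step method, a generic-group attack that solves g^x ≡ h (mod p) in roughly √p group operations. The sketch below is purely illustrative (toy parameters; real-world moduli are vastly larger) and is not a method from the proposal itself:

```python
# Baby-step giant-step: generic attack on the discrete logarithm problem.
# Toy parameters only; real cryptographic groups are astronomically larger.
from math import isqrt

def bsgs(g, h, p):
    """Find x with g**x == h (mod p), assuming a solution exists."""
    m = isqrt(p - 1) + 1
    # Baby steps: table of g**j mod p for j < m
    table = {pow(g, j, p): j for j in range(m)}
    # Giant steps: repeatedly multiply h by g**(-m) and look for a hit
    factor = pow(g, -m, p)  # modular inverse (Python 3.8+)
    gamma = h
    for i in range(m):
        if gamma in table:
            return i * m + table[gamma]
        gamma = gamma * factor % p
    return None

# Small demo: 2 is a generator mod 101, so a solution exists
x = bsgs(2, 33, 101)
assert pow(2, x, 101) == 33
```

The √p cost is exactly why key sizes are chosen so that √p operations are infeasible; the algorithmic advances mentioned in the summary threaten specific group families where much better than generic attacks exist.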
Max ERC Funding
2 403 125 €
Duration
Start date: 2016-01-01, End date: 2021-12-31
Project acronym ALPAM
Project Atomic-Level Physics of Advanced Materials
Researcher (PI) Börje Johansson
Host Institution (HI) KUNGLIGA TEKNISKA HOEGSKOLAN
Call Details Advanced Grant (AdG), PE5, ERC-2008-AdG
Summary Most technological materials have been developed by very expensive and cumbersome trial-and-error methods. On the other hand, computer-based theoretical design of advanced materials is an area where rapid and extensive developments are taking place. Within my group, new theoretical tools have now been established which are extremely well suited to the study of complex materials. In this approach, basic quantum mechanical theories are used to describe fundamental properties of alloys and compounds. The utilization of such calculations to investigate possible optimizations of certain key properties represents a major departure from the traditional design philosophy. The purpose of my project is to build up a new competence in the field of computer-aided simulations of advanced materials. The main goal will be to achieve a deep understanding of the behaviour of complex metallic systems under equilibrium and non-equilibrium conditions at the atomic level by studying their electronic, magnetic and atomic structure using the most modern and advanced computational methods. This will enable us to establish a set of materials parameters and composition-structure-property relations that are needed for materials optimization.
The research will be focused on fundamental technological properties related to defects in advanced metallic alloys (high-performance steels, superalloys, and refractory, energy-related and geochemical materials) and alloy phases (solid solutions, intermetallic compounds), which will be studied by means of parameter-free atomistic simulations combined with continuum modelling. As a first example, we will study the Fe-Cr system, which is of great interest to industry as well as in connection with nuclear waste. The Fe-Cr-Ni system will form another large group of materials under the aegis of this project. Special emphasis will also be placed on those Fe alloys which exist under extreme conditions and are possible candidates for the Earth's core.
Max ERC Funding
2 000 000 €
Duration
Start date: 2009-03-01, End date: 2014-02-28
Project acronym ALPHA
Project Alpha Shape Theory Extended
Researcher (PI) Herbert Edelsbrunner
Host Institution (HI) INSTITUTE OF SCIENCE AND TECHNOLOGY AUSTRIA
Call Details Advanced Grant (AdG), PE6, ERC-2017-ADG
Summary Alpha shapes were invented in the early 1980s, and their implementation in three dimensions in the early 1990s was at the forefront of the exact arithmetic paradigm that enabled fast and correct geometric software. In the late 1990s, alpha shapes motivated the development of the wrap algorithm for surface reconstruction, and of persistent homology, which was the starting point of rapidly expanding interest in topological algorithms aimed at data analysis questions.
We now see alpha shapes, wrap complexes, and persistent homology as three aspects of a larger theory, which we propose to fully develop. This viewpoint was a long time coming and finds its clear expression within a generalized
version of discrete Morse theory. This unified framework offers new opportunities, including
(I) the adaptive reconstruction of shapes driven by the cavity structure;
(II) the stochastic analysis of all aspects of the theory;
(III) the computation of persistence of dense data, both in scale and in depth;
(IV) the study of long-range order in periodic and near-periodic point configurations.
These capabilities will significantly deepen as well as widen the theory and enable new applications in the sciences. To gain focus, we concentrate on low-dimensional applications in structural molecular biology and particle systems.
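As a toy illustration of the alpha-shape construction this theory extends: a triangle belongs to the 2D alpha complex if it is a Delaunay triangle (its circumdisk contains no other input point) and its circumradius is at most alpha (radius-bound convention). For a tiny point set the Delaunay condition can be brute-forced, so this self-contained sketch (a hedged simplification, not the project's actual machinery) needs no geometry library:

```python
# Toy 2D alpha complex: keep triples whose circumdisk is empty of other
# points (Delaunay condition) and whose circumradius is at most alpha.
# Brute force over all triples; suitable only for tiny point sets.
import itertools
import math

def circumcircle(a, b, c):
    """Circumcenter and circumradius of triangle abc, or None if collinear."""
    ax, ay = a
    bx, by = b
    cx, cy = c
    d = 2.0 * (ax * (by - cy) + bx * (cy - ay) + cx * (ay - by))
    if d == 0:
        return None
    ux = ((ax * ax + ay * ay) * (by - cy) + (bx * bx + by * by) * (cy - ay)
          + (cx * cx + cy * cy) * (ay - by)) / d
    uy = ((ax * ax + ay * ay) * (cx - bx) + (bx * bx + by * by) * (ax - cx)
          + (cx * cx + cy * cy) * (bx - ax)) / d
    return (ux, uy), math.hypot(ax - ux, ay - uy)

def alpha_triangles(points, alpha):
    """Index triples of the alpha complex of `points` (brute force)."""
    tris = []
    for i, j, k in itertools.combinations(range(len(points)), 3):
        cc = circumcircle(points[i], points[j], points[k])
        if cc is None:
            continue
        (ux, uy), r = cc
        if r > alpha:
            continue
        # Delaunay test: no other point strictly inside the circumdisk
        if all(math.hypot(px - ux, py - uy) >= r - 1e-12
               for n, (px, py) in enumerate(points) if n not in (i, j, k)):
            tris.append((i, j, k))
    return tris

# Unit square: each corner triangle has circumradius sqrt(2)/2 ~ 0.707,
# and the four corners are cocircular, so all four triangles qualify.
square = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
assert len(alpha_triangles(square, 1.0)) == 4
assert len(alpha_triangles(square, 0.5)) == 0
```

Sweeping alpha from 0 upward yields the nested family of complexes whose changing topology is exactly what persistent homology records.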
Max ERC Funding
1 678 432 €
Duration
Start date: 2018-07-01, End date: 2023-06-30
Project acronym ALPROS
Project Artificial Life-like Processive Systems
Researcher (PI) Roeland Johannes Maria Nolte
Host Institution (HI) STICHTING KATHOLIEKE UNIVERSITEIT
Call Details Advanced Grant (AdG), PE5, ERC-2011-ADG_20110209
Summary Toroidal processive enzymes (i.e. enzymes/proteins that are able to thread onto biopolymers and to perform stepwise reactions along the polymer chain) are among the most fascinating tools involved in the clockwork machinery of life. Processive catalysis is ubiquitous in Nature, viz. DNA polymerases and endo- and exo-nucleases; it plays a crucial role in numerous events of the cell’s life, including most of the replication, transmission, expression and repair processes of the genetic information. In the case of DNA polymerases, the protein catalyst encircles the DNA and, whilst moving along it, makes copies of high fidelity. Although numerous works have been reported in relation to the synthesis of analogues of natural enzymes, comparatively few efforts have been made to mimic these processive properties. It is the goal of this proposal to rectify this oversight and unravel the essential components of Nature’s polymer catalysts. The individual projects are designed to specifically target the essential aspects of processive catalysis, i.e. rate of motion, rate of catalysis, and transfer of information. One project is aimed at extending the research into a processive catalytic system that is more suitable for industrial application. Two projects involve more farsighted studies and are designed to push the research well beyond the current boundaries, into the area of Turing machines and bio-rotaxane catalysts which can modify DNA in a non-natural process. The vision of this proposal is to open up the field of ‘processive catalysis’ and invigorate the next generation of chemists to develop information transfer and toroidal processive catalysts. The construction of synthetic analogues of processive enzymes could open a gate toward a large range of applications, ranging from intelligent tailoring of polymers to information storage and processing.
Max ERC Funding
1 603 699 €
Duration
Start date: 2012-02-01, End date: 2017-01-31
Project acronym ALREG
Project Analysing Learning in Regulatory Governance
Researcher (PI) Claudio Radaelli
Host Institution (HI) THE UNIVERSITY OF EXETER
Call Details Advanced Grant (AdG), SH2, ERC-2008-AdG
Summary This four-year interdisciplinary project addresses the question: what has been learned through the use of better regulation? Better regulation is a flagship policy on the Lisbon agenda for growth and jobs. Its aims are to provide new governance architectures for law-making, to increase the competitiveness of the regulatory environment, and to secure wide social legitimacy for multi-level systems of rules. Whilst most of the research has looked at how better regulation is changing, this project will produce findings on what has changed because of better regulation. Theoretically, the project will use (and significantly improve on) theories of policy learning. Empirically, it will cover Denmark, Italy, the Netherlands, Poland, the UK and the EU, including multi-level analysis and analysis by sector of regulation. Methodologically, the project will draw on comparative analysis of types of learning, experiments with regulatory policy-makers in six countries and the European Commission, large-n analysis of impact assessments, backward-mapping of legislation (to appraise the role played by better regulation in the formulation of laws in the UK and the EU), meta-analysis of case studies and co-production of knowledge with better regulation officers. Dissemination will target both stakeholders (i.e., policy officers, civil society organizations, and business federations) and academic conferences in political science, law, and risk analysis, with a major research monograph to be completed in year 4 and a final interdisciplinary conference.
Max ERC Funding
948 448 €
Duration
Start date: 2009-09-01, End date: 2013-09-30
Project acronym AMAIZE
Project Atlas of leaf growth regulatory networks in MAIZE
Researcher (PI) Dirk, Gustaaf Inzé
Host Institution (HI) VIB VZW
Call Details Advanced Grant (AdG), LS9, ERC-2013-ADG
Summary "Understanding how organisms regulate size is one of the most fascinating open questions in biology. The aim of the AMAIZE project is to unravel how growth of maize leaves is controlled. Maize leaf development offers great opportunities to study the dynamics of growth regulatory networks, essentially because leaf development is a linear system, with cell division at the leaf base followed by cell expansion and maturation. Furthermore, the growth zone is relatively large, allowing easy access to tissues at different positions. Four different perturbations of maize leaf size will be analyzed with cellular resolution: wild-type plants and plants having larger leaves (as a consequence of GA20OX1 overexpression), both grown under either well-watered or mild drought conditions. Firstly, a 3D cellular map of the growth zone of the fourth leaf will be made. RNA-seq of three different tissues (adaxial and abaxial epidermis; mesophyll) obtained by laser dissection at intervals of 2.5 mm along the growth zone will allow for high-resolution analysis of the transcriptome. Additionally, the composition of fifty selected growth regulatory protein complexes and the DNA targets of transcription factors will be determined at intervals of 5 mm along the growth zone. Computational methods will be used to construct comprehensive integrative maps of the cellular and molecular processes occurring along the growth zone. Finally, selected regulatory nodes of the growth regulatory networks will be further functionally analyzed using a transactivation system in maize.
AMAIZE opens up new perspectives for the identification of optimal growth regulatory networks that can be selected for by advanced breeding, or for which more robust variants (e.g. with reduced susceptibility to drought) can be obtained through genetic engineering. The ability to improve the growth of maize, and by analogy other cereals, could have a high impact in providing food security."
Max ERC Funding
2 418 429 €
Duration
Start date: 2014-02-01, End date: 2019-01-31